WLED

I heard a lot about “if you want to control LEDs, use WLED”. I never had a need to control LEDs (the strip type), but today I had some spare time and I tested it with a spare ESP8266 and a single WS2812B LED. WLED is not really made for single LEDs, but as a proof-of-concept it’ll have to do.

Can’t say much here though: the install went as expected, connecting to the access point it created was straightforward, setting up my WiFi credentials worked, and controlling the single LED via web interface or Android app was as easy as it can get. This was one of the easiest installs of any IoT thing I ever had.

Now I do have to get me some LED strips to play a bit more with it…

Update: Ok, got myself a 60-LED WS2812B strip, and the WLED effects look much better for obvious reasons. You can also split the 60 LEDs into segments and set their respective colors easily via curl:

❯ curl -s -X POST http://192.168.21.138/json -H "Content-Type: application/json" -d '{seg:[{},{},{col:[[0,0,100]]}]}' | jq .
{
  "success": true
}

The above sets the color of the 3rd segment to blue (assuming the effect is “Solid”). That’s way better and easier than any other API I’ve seen for WiFi-connected lights.
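
The same thing is easy to script. Here is a minimal Python sketch doing the equivalent of the curl call above via WLED’s /json/state endpoint (assuming the requests library is installed; the IP address and segment layout are from my setup, adjust to yours):

import requests

WLED = "http://192.168.21.138"  # same WLED device as above

# The two empty objects leave segments 0 and 1 untouched,
# the third entry sets segment 2 to blue.
state = {"seg": [{}, {}, {"col": [[0, 0, 100]]}]}

resp = requests.post(f"{WLED}/json/state", json=state, timeout=5)
print(resp.json())  # should print {'success': True}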


TP-Link Kasa KC120 – Streaming without Kasa

The main problems I have with IoT devices are:

  • They might send data home without me knowing about it
    • But I can monitor their traffic patterns, and if they send home way more data than expected, I can disconnect them
  • They might be vulnerable to exploits
    • But I can put them on a separate VLAN at home so they don’t see other devices unless I allow it (via firewall rules)
    • I can sometimes update the firmware (definitely a problem after a few years)
  • They stop working when the company turns off its servers
    • But that doesn’t matter if I am able to use them without Internet connectivity

Most Kasa products I own (power switches) are supported by various projects like Home Assistant or python-kasa, so turning a Kasa power switch on by myself is a simple task. Same for my LIFX light bulbs: there’s even an official API for those.
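
For example, toggling one of the plugs from the discovery output below takes only a few lines using python-kasa’s async API (a sketch; the IP address is one of my plugs, replace it with yours):

import asyncio
from kasa import SmartPlug

async def main():
    plug = SmartPlug("192.168.21.180")  # "Plug Three" from the discovery output below
    await plug.update()                 # fetch sysinfo and current state first
    print(plug.alias, "is on:", plug.is_on)
    await plug.turn_on()                # or: await plug.turn_off()

asyncio.run(main())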

The TP-Link KC120 camera however does not have any supported local API and contrary to my expectation, it does not support a local stream mode via a web browser interface. I can watch a live (and local) video stream via the Kasa application on the phone, but that functionality is at the mercy of TP-Link. I don’t like that.

The following are the steps to get local streaming (and recording) working for the KC120. With that it’s possible to do whatever I’d like with the stream: publish it on the Internet, process it via OpenCV, archive it locally, etc.

python-kasa

python-kasa does not support the camera, so you won’t see it during a normal discovery:

❯ kasa
No host name given, trying discovery..
Discovering devices on 255.255.255.255 for 3 seconds
== Plug Three - HS105(JP) ==
        Host: 192.168.21.180
        Device state: OFF

        == Generic information ==
        Time:         2022-05-03 11:37:55 (tz: {'index': 90, 'err_code': 0}
        Hardware:     2.1
        Software:     1.0.3 Build 210506 Rel.161924
        MAC (rssi):   10:27:F5:XX:XX:XX (-62)
        Location:     {'latitude': XX.0, 'longitude': XX.0}

        == Device specific information ==
        LED state: True
        On since: None

        == Modules ==
        + <Module Schedule (schedule) for 192.168.21.130>
        + <Module Usage (schedule) for 192.168.21.130>
        + <Module Antitheft (anti_theft) for 192.168.21.130>
        + <Module Time (time) for 192.168.21.130>
        + <Module Cloud (cnCloud) for 192.168.21.130>

== Plug One - HS105(JP) ==
        Host: 192.168.21.182
        Device state: OFF

        == Generic information ==
        Time:         2022-05-03 11:37:55 (tz: {'index': 90, 'err_code': 0}
        Hardware:     1.0
        Software:     1.5.8 Build 191125 Rel.135255
        MAC (rssi):   B0:BE:76:XX:XX:XX (-54)
        Location:     {'latitude': XX.0, 'longitude': XX.0}

        == Device specific information ==
        LED state: True
        On since: None

        == Modules ==
        + <Module Schedule (schedule) for 192.168.21.182>
        + <Module Usage (schedule) for 192.168.21.182>
        + <Module Antitheft (anti_theft) for 192.168.21.182>
        + <Module Time (time) for 192.168.21.182>
        + <Module Cloud (cnCloud) for 192.168.21.182>

But the camera shows up when adding the -d (debug) switch, although it is then ignored since the tool does not know how to handle this device type:

❯ kasa -d
No host name given, trying discovery..
Discovering devices on 255.255.255.255 for 3 seconds
DEBUG:kasa.discover:[DISCOVERY] ('255.255.255.255', 9999) >> {'system': {'get_sysinfo': None}}
DEBUG:kasa.discover:Waiting 3 seconds for responses...
[...]
DEBUG:kasa.discover:Unable to find device type from {'system': {'get_sysinfo': {'err_code': 0, 'system': {'sw_ver': '2.3.6 Build 20XXXXXX rel.XXXXX', 'hw_ver': '1.0', 'model': 'KC120(EU)', 'hwId': 'CBXXXXD5XXXXDEEFA98A18XXXXXX65CD', 'oemId': 'A2XXXX60XXXX108AD36597XXXXXX572D', 'deviceId': '80XXXX88XXXX76XXXX88XXXXX3AXXXXXXXXXXXB6', 'dev_name': 'Kasa Cam', 'c_opt': [0, 1], 'f_list': [], 'a_type': 2, 'type': 'IOT.IPCAMERA', 'alias': 'Camera', 'mic_mac': 'D80D17XXXXXX', 'mac': 'D8:0D:17:XX:XX:XX', 'longitude': XX, 'latitude': XX, 'rssi': -38, 'system_time': 1651545748, 'led_status': 'on', 'updating': False, 'status': 'configured', 'resolution': '720P', 'camera_switch': 'on', 'bind_status': True, 'last_activity_timestamp': 1651545210}}}}: Unable to find the device type field!
[...]

The important fields here are the deviceId and the MAC address: via the MAC address you can find out which IP address the camera got (if you use DHCP). In my case the camera’s IP address is 192.168.21.187.
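
As a side note, that discovery query on UDP port 9999 is easy to replicate: the payload is JSON, obfuscated with TP-Link’s well-documented XOR autokey “encryption” (initial key 171). Here is a rough Python sketch of what kasa does during discovery (my own reconstruction, not code from python-kasa):

import json
import socket

def encrypt(plain: bytes, key: int = 171) -> bytes:
    out = bytearray()
    for b in plain:
        key ^= b          # the cipher byte becomes the next key (autokey)
        out.append(key)
    return bytes(out)

def decrypt(data: bytes, key: int = 171) -> bytes:
    out = bytearray()
    for b in data:
        out.append(key ^ b)
        key = b
    return bytes(out)

query = json.dumps({"system": {"get_sysinfo": None}}).encode()
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.settimeout(3)
sock.sendto(encrypt(query), ("255.255.255.255", 9999))

try:
    while True:
        data, addr = sock.recvfrom(4096)
        print(addr[0], json.loads(decrypt(data)))
except socket.timeout:
    pass

The camera answers this broadcast too, which is where the debug output above comes from.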

nmap

A plain nmap scan shows only port 9999 open, which is the well-known TP-Link debug port. But a scan of all ports reveals more:

❯ sudo nmap -p- 192.168.21.187
Starting Nmap 7.80 ( https://nmap.org ) at 2022-05-03 11:51 JST
Nmap scan report for kc120.lan (192.168.21.187)
Host is up (0.012s latency).
Not shown: 65531 closed ports
PORT      STATE SERVICE
9999/tcp  open  abyss
10443/tcp open  unknown
18443/tcp open  unknown
19443/tcp open  unknown
MAC Address: D8:0D:17:XX:XX:XX (Tp-link Technologies)

Nmap done: 1 IP address (1 host up) scanned in 9.28 seconds

With that port information I found this article: https://medium.com/@hu3vjeen/reverse-engineering-tp-link-kc100-bac4641bf1cd. It’s about a slightly different camera model, but since the ports match, maybe more does too.

I followed it, but I could not get the authentication to work: the Kasa account password, which the article used, did not work for me. Time to do some ARP spoofing to see what the Android app uses to authenticate! Geistless did a great job explaining the steps he took.

My overall plan:

  1. Redirect the traffic from the Kasa app on the phone to my Linux machine (via arpspoof)
  2. Redirect the incoming HTTPS traffic to my HTTPS server (via iptables)
  3. Print the URL and headers for incoming HTTPS traffic which arrives at my HTTPS server

arpspoof

The dsniff package contains arpspoof:

❯ sudo apt install dsniff
[...]
❯ sudo setcap CAP_NET_RAW+ep /usr/sbin/arpspoof

My HTTPS Server

While the original author wrote an HTTPS server as part of his Rust learning, I created a Node.js version. But first we need keys; self-signed is fine:

❯ openssl genrsa -out key.pem
❯ openssl req -new -key key.pem -out csr.pem
❯ openssl x509 -req -days 999 -in csr.pem -signkey key.pem -out cert.pem
❯ rm csr.pem

Now the simple HTTPS server listening on port 8080:

const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem')
};

https.createServer(options, function (req, res) {
  console.log(req.url);
  console.log(req.headers);
  res.writeHead(200);
  res.end("");
}).listen(8080);

Some IP traffic routing rules to redirect all incoming TCP traffic on enp1s0 for ports 10443, 18443 and 19443 to port 8080:

❯ sudo iptables -t nat -A PREROUTING -i enp1s0 -p tcp --dport 10443 -j REDIRECT --to-port 8080
❯ sudo iptables -t nat -A PREROUTING -i enp1s0 -p tcp --dport 18443 -j REDIRECT --to-port 8080
❯ sudo iptables -t nat -A PREROUTING -i enp1s0 -p tcp --dport 19443 -j REDIRECT --to-port 8080
❯ sudo sysctl net.ipv4.ip_forward=1

Now run the https server and watch it display the URL and the headers for an incoming request on port 19443:

❯ node ./https.js

and to test, on another machine I ran:

$ curl -k -u admin:abc 'https://t621.lan:19443/test?a=3&b=5'

and this is the output of my https server:

/test?a=3&b=5
{
  host: 't621.lan:19443',
  authorization: 'Basic YWRtaW46YWJj',
  'user-agent': 'curl/7.68.0',
  accept: '*/*'
}

The basic authentication is base64 encoded. To decode:

❯ echo YWRtaW46YWJj | base64 -d
admin:abc

So that works. Now putting it all together.

  • Start the Kasa app on the phone. Make sure the KC120 is enabled and can display a live video stream. Stop the stream.
  • Have the iptables redirect rules in place. And IP forwarding in the kernel.
  • Start the HTTPS server.
  • Run arpspoof. 192.168.21.55 is the IP of the phone running the Kasa application, and 192.168.21.187 is the IP of the KC120.
❯ arpspoof -i enp1s0 -t 192.168.21.55 192.168.21.187
7c:d3:a:xx:xx:xx 38:78:62:xx:xx:xx 0806 42: arp reply 192.168.21.187 is-at 7c:d3:a:xx:xx:xx
  • On the mobile app, try to connect to the video stream of the KC120 again
  • You should now see some output of the HTTPS server:
/https/stream/mixed?video=H264&audio=G711
{
  authorization: 'Basic aXXXXXXXXXXXXXXXXM=',
  connection: 'keep-alive',
  'user-agent': 'Dalvik/2.1.0 (Linux; U; Android 10; H8296 Build/52.1.A.3.49)',
  host: '192.168.21.187:19443',
  'accept-encoding': 'gzip'
}

And then I finally had the authentication string the camera wanted!

❯ echo 'aXXXXXXXXXXXXXXXXM=' | base64 -d
MY_KASA_ACCOUNT:THE_CAMERA_PASSWORD

Turns out that the password to use was not the Kasa password: it’s a longish string of hex digits. That might be a KC120 specialty or it might depend on the firmware version. I cannot say since I have no KC100, but whatever the password is, it’s possible to find it out relatively easily using the above approach.

The Result: Local Streaming!

I can connect to the video stream! And with very little CPU usage too.

❯ curl -k -u 'MY_KASA_ACCOUNT:THE_CAMERA_PASSWORD' \
--ignore-content-length \
"https://192.168.21.187:19443/https/stream/mixed?video=h264&audio=g711&resolution=hd&deviceId=80XXXX88XXXX76XXXX88XXXXX3AXXXXXXXXXXXB6" \
--output - | ffmpeg -hide_banner -y -i - -vcodec copy kc120stream.mp4

To change the resolution, change it in the Kasa app. 1920×1080 (about 1.4 Mbit/s), 1280×720 (about 850 kbit/s) and 640×360 (about 350 kbit/s) are possible.
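
Since local archiving was one of my goals, here is a small Python sketch that simply wraps the curl | ffmpeg pipeline from above and writes hour-long, timestamped files (same placeholders for account, password and device ID as above; curl’s --ignore-content-length is kept from the command above):

import subprocess
import time

URL = ("https://192.168.21.187:19443/https/stream/mixed"
       "?video=h264&audio=g711&resolution=hd"
       "&deviceId=80XXXX88XXXX76XXXX88XXXXX3AXXXXXXXXXXXB6")
AUTH = "MY_KASA_ACCOUNT:THE_CAMERA_PASSWORD"

while True:
    outfile = time.strftime("kc120-%Y%m%d-%H%M%S.mp4")
    curl = subprocess.Popen(
        ["curl", "-sk", "-u", AUTH, "--ignore-content-length", URL, "--output", "-"],
        stdout=subprocess.PIPE,
    )
    # -t 3600 makes ffmpeg stop after one hour, then the loop starts a new file
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-y", "-i", "-", "-vcodec", "copy", "-t", "3600", outfile],
        stdin=curl.stdout,
    )
    curl.terminate()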

TODO

  • There is no audio coming from the camera, although audio works in the Kasa app.
  • It would also be nice to understand how to change the camera’s configuration (e.g. the resolution), but it’s ok to set that once via the Kasa app.
  • What values do the parameters video, audio and resolution support?

U2F on the CLI

U2F works well and easily via a web browser, but you can also use it directly on the command line. You “just” have to implement the USB protocol part of U2F, namely talk to /dev/hidrawX.

u2fcli does exactly that, and it built and worked on my R2S (ARMv8):

harald@r2s2:~/git$ git clone git@github.com:mdp/u2fcli.git
Cloning into 'u2fcli'...
remote: Enumerating objects: 57, done.
remote: Total 57 (delta 0), reused 0 (delta 0), pack-reused 57
Receiving objects: 100% (57/57), 19.26 KiB | 1.20 MiB/s, done.
Resolving deltas: 100% (21/21), done.
harald@r2s2:~/git$ cd u2fcli
harald@r2s2:~/git/u2fcli$ go mod init u2fcli
go: creating new go.mod: module u2fcli
go: to add module requirements and sums:
        go mod tidy
harald@r2s2:~/git/u2fcli$ go mod tidy
go: finding module for package github.com/flynn/u2f/u2ftoken
go: finding module for package github.com/flynn/hid
go: finding module for package github.com/mdp/u2fcli/cmd
go: finding module for package github.com/flynn/u2f/u2fhid
go: finding module for package github.com/spf13/cobra
go: found github.com/mdp/u2fcli/cmd in github.com/mdp/u2fcli v0.0.0-20180327171945-2b7ae3bbca08
go: found github.com/flynn/hid in github.com/flynn/hid v0.0.0-20190502022136-f1b9b6cc019a
go: found github.com/flynn/u2f/u2fhid in github.com/flynn/u2f v0.0.0-20180613185708-15554eb68e5d
go: found github.com/flynn/u2f/u2ftoken in github.com/flynn/u2f v0.0.0-20180613185708-15554eb68e5d
go: found github.com/spf13/cobra in github.com/spf13/cobra v1.2.1
harald@r2s2:~/git/u2fcli$ go build
harald@r2s2:~/git/u2fcli$ ls
cmd  go.mod  go.sum  LICENSE  main.go  README.md  u2fcli

Permissions for /dev/hidrawX need to be granted:

harald@r2s2:~/git/u2fcli$ sudo chmod a+rw /dev/hidraw0

And now a full cycle of register (once), sign+verify (log in):

harald@r2s2:~/git/u2fcli$ ./u2fcli reg --challenge MyComplexChallenge --appid https://test.com
Registering, press the button on your U2F device #1 [Yubico Security Key by Yubico]{
  "KeyHandle": "-374aUcG7iWqVc5rsX8jE_8yr1iS-EEDdt106-CAKec90Gg1VVK9dv5E_JmZRIyKVaas9vhLVHb7zbbJ6rNltg",
  "PublicKey": "BHBwVKLRYZZKZGaL96FQtzis8i01M2DMw4IQwuMIKbWa2dZJSC1GlXlYiWhycig4R3DdlipdR675o_e4QfpI-UU",
  "RegisteredData": "-374aUcG7iWqVc5rsX8jE_8yr1iS-EEDdt106-CAKec90Gg1VVK9dv5E_JmZRIyKVaas9vhLVHb7zbbJ6rNltjCCAr4wggGmoAMCAQICBHSG_cIwDQYJKoZIhvcNAQELBQAwLjEsMCoGA1UEAxMjWXViaWNvIFUyRiBSb290IENBIFNlcmlhbCA0NTcyMDA2MzEwIBcNMTQwODAxMDAwMDAwWhgPMjA1MDA5MDQwMDAwMDBaMG8xCzAJBgNVBAYTAlNFMRIwEAYDVQQKDAlZdWJpY28gQUIxIjAgBgNVBAsMGUF1dGhlbnRpY2F0b3IgQXR0ZXN0YXRpb24xKDAmBgNVBAMMH1l1YmljbyBVMkYgRUUgU2VyaWFsIDE5NTUwMDM4NDIwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAASVXfOt9yR9MXXv_ZzE8xpOh4664YEJVmFQ-ziLLl9lJ79XQJqlgaUNCsUvGERcChNUihNTyKTlmnBOUjvATevto2wwajAiBgkrBgEEAYLECgIEFTEuMy42LjEuNC4xLjQxNDgyLjEuMTATBgsrBgEEAYLlHAIBAQQEAwIFIDAhBgsrBgEEAYLlHAEBBAQSBBD4oBHzjApNFYAGFxEfntx9MAwGA1UdEwEB_wQCMAAwDQYJKoZIhvcNAQELBQADggEBADFcSIDmmlJ-OGaJvWn9CqhvSeueToVFQVVvqtALOgCKHdwB-Wx29mg2GpHiMsgQp5xjB0ybbnpG6x212FxESJ-GinZD0ipchi7APwPlhIvjgH16zVX44a4e4hOsc6tLIOP71SaMsHuHgCcdH0vg5d2sc006WJe9TXO6fzV-ogjJnYpNKQLmCXoAXE3JBNwKGBIOCvfQDPyWmiiG5bGxYfPty8Z3pnjX-1MDnM2hhr40ulMxlSNDnX_ZSnDyMGIbk8TOQmjTF02UO8auP8k3wt5D1rROIRU9-FCSX5WQYi68RuDrGMZB8P5-byoJqbKQdxn2LmE1oZAyohPAmLcoPO4wRgIhANIZ7Q_cty_UkWigyQ7Ot0pC0egyI_eSUJ52Hge95vz1AiEAzf7hX_XvNQvoPQ2IvjgJjUkV3wvDPctkac2Z_8fRaik"
}

harald@r2s2:~/git/u2fcli$ ./u2fcli sig --appid https://test.com --challenge SomethingElse --keyhandle "-374aUcG7iWqVc5rsX8jE_8yr1iS-EEDdt106-CAKec90Gg1VVK9dv5E_JmZRIyKVaas9vhLVHb7zbbJ6rNltg"
Authenticating, press the button on your U2F device
{
  "Counter": 50,
  "Signature": "AQAAADIwRQIhALlZyMmormC2b9JCaOXYAdKq4wvpdKg4wMu68fLgXmclAiADDHbFxKrm5eYCoCvC-m1vEEegXzWHfwuPLpUh81qHoA"
}

harald@r2s2:~/git/u2fcli$ ./u2fcli ver --appid https://test.com --challenge SomethingElse --publickey "BHBwVKLRYZZKZGaL96FQtzis8i01M2DMw4IQwuMIKbWa2dZJSC1GlXlYiWhycig4R3DdlipdR675o_e4QfpI-UU" --signature "AQAAADIwRQIhALlZyMmormC2b9JCaOXYAdKq4wvpdKg4wMu68fLgXmclAiADDHbFxKrm5eYCoCvC-m1vEEegXzWHfwuPLpUh81qHoA"
Signature verified

M5Stack & AWS IoT

Received my AWS IoT EduKit from M5Stack. First impression: it’s an improvement over the original M5Stack I have: the display is nicer, and power and reset are now two separate buttons instead of one. The front buttons are touch-sensitive now instead of physical ones, though I’m not sure whether that is an improvement or just cost cutting.

Time to test this via the Blinky Hello World example!

First notes: the instructions clash with any Python environment you might already have set up. They recommend using Miniconda, which is not needed if you already have a virtual environment. Just activate your environment:

$ cd ~/git
$ git clone -b release/v4.2 --recursive https://github.com/espressif/esp-idf.git
$ cd esp-idf
$ . ./install.sh
ERROR: This script was called from a virtual environment, can not create a virtual environment again
$ . ./export.sh 
Detecting the Python interpreter
Checking "python" ...
Python 3.8.5
"python" has been detected
Adding ESP-IDF tools to PATH...
Using Python interpreter in /home/harald/venv/bin/python
Checking if Python packages are up to date...
Python requirements from /home/harald/git/esp-idf/requirements.txt are satisfied.
Added the following directories to PATH:
  /home/harald/git/esp-idf/components/esptool_py/esptool
  /home/harald/git/esp-idf/components/espcoredump
  /home/harald/git/esp-idf/components/partition_table
  /home/harald/git/esp-idf/components/app_update
Done! You can now compile ESP-IDF projects.
Go to the project directory and run:

  idf.py build

You can ignore the error after executing . ./install.sh: all requirements should have been installed into your virtual environment as expected.

In case you get odd errors, delete ~/.espressif/ as it contains the compiler toolchain. See also here if your code crash-loops.

The AWS CLI tools need to be installed and configured (of course). Then finally the fun starts:

$ cd ~/git
$ git clone https://github.com/m5stack/Core2-for-AWS-IoT-EduKit.git
$ cd Core2-for-AWS-IoT-EduKit/Blinky-Hello-World/utilities/AWS_IoT_registration_helper/
$ pip install -r requirements.txt
$ python registration_helper.py -p /dev/ttyUSB0
[...lots of lines...]
Manifest was loaded successfully

That’ll throw an error if you use a Python version other than 3.7. The fix is simple: remove the Python version check (the sys.version_info block in check_environment(), shown below) from registration_helper.py:

def check_environment():
    """Checks to ensure environment is set per AWS IoT EduKit instructions

    Verifies Miniconda is installed and the 'edukit' virtual environment
    is activated.
    Verifies Python 3.7.x is installed and is being used to execute this script.
    Verifies that the AWS CLI is installed and configured correctly. Prints
    AWS IoT endpoint address.
    """
    conda_env = os.environ.get('CONDA_DEFAULT_ENV')
    if conda_env == None or conda_env == "base":
        print("The 'edukit' Conda environment is not created or activated:\n  To install miniconda, visit https://docs.conda.io/en/latest/miniconda.html.\n  To create the environment, use the command 'conda create -n edukit python=3.7'\n  To activate the environment, use the command 'conda activate edukit'\n")
    print("Conda 'edukit' environment active...")

    if sys.version_info[0] != 3 or sys.version_info[1] != 7:
        print(f"Python version {sys.version}")
        print("Incorrect version of Python detected. Must use Python version 3.7.x. You might want to try the command 'conda install python=3.7'.")
        exit(0)
    print("Python 3.7.x detected...")

    aws_iot_endpoint = subprocess.run(["aws", "iot", "describe-endpoint", "--endpoint-type", "iot:Data-ATS"], universal_newlines=True, capture_output=True)
    if aws_iot_endpoint.returncode != 0:

Now you can find the device in the AWS IoT console, and you can also get its endpoint from the AWS CLI:

$ aws iot describe-endpoint --endpoint-type iot:Data-ATS
{
    "endpointAddress": "c7xxxxxxxxxx0f-ats.iot.ap-northeast-1.amazonaws.com"
}

Use idf.py menuconfig to configure the endpoint name and the WiFi connection, then idf.py build flash monitor -p /dev/ttyUSB0 to build, flash, and connect to the serial port to watch the output. You should then see incoming MQTT messages arrive.

You make the LEDs blink by publishing to CLIENT_ID/blink, and you stop the blinking by publishing to the same topic again. If you don’t know your CLIENT_ID, look it up on the device’s display.
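
Publishing from a script via boto3 works too (a sketch; CLIENT_ID is a placeholder for the ID shown on the display, and your AWS credentials and region must match the account the device is registered in):

import boto3

client_id = "CLIENT_ID"  # placeholder: use the ID shown on the device display

# region must match where the thing was registered (see the endpoint above)
iot = boto3.client("iot-data", region_name="ap-northeast-1")

# any message on CLIENT_ID/blink toggles the blinking
iot.publish(topic=f"{client_id}/blink", qos=1, payload=b"{}")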

Given that AWS charges you for IoT traffic like those messages, don’t keep it messaging all day long. At 1 message/s it takes only about 5 days and 19 hours (500,000 seconds) to use up the 500k free messages.