step-ca and ACME

For my home Kubernetes installation, I guess it’s time to enable TLS. I can’t use Let’s Encrypt for this since my internal network is not reachable from the Internet, and while I have a workaround for that problem, I’d rather use my internal Certificate Authority via step-ca.

It’s actually simpler than I thought, mainly because the documentation I found first included options which were not explained at all. Turns out they are indeed fully optional… Thus, on the CA server, do:

step ca provisioner add acme --type ACME

This adds the ACME provisioner to the ~/.step/config/ca.json file:

{
  "type": "ACME",
  "name": "myacme",
  "forceCN": true,
  "claims": {
     "maxTLSCertDuration": "12h",
     "defaultTLSCertDuration": "2h
  }
}

The first two items were added by the above command; the rest were added by me and are optional. Restart step-ca:

harald@r2s1:~$ sudo systemctl restart step-ca
harald@r2s1:~$ systemctl status step-ca
● step-ca.service - Step Certificates
     Loaded: loaded (/etc/systemd/system/step-ca.service; enabled; vendor preset: enabled)
     Active: active (running) since Tue 2021-05-11 18:27:39 JST; 15s ago
   Main PID: 547880 (step-ca)
      Tasks: 8 (limit: 998)
     Memory: 10.5M
     CGroup: /system.slice/step-ca.service
             └─547880 /usr/local/bin/step-ca /home/harald/.step/config/ca.json --password-file /home/harald/.step/pass/key_pass.txt

To create a new certificate on a different machine which doesn’t run an HTTP server on port 80 (so certbot’s standalone authenticator can use it):

❯ sudo REQUESTS_CA_BUNDLE=$(step path)/certs/root_ca.crt \
    certbot certonly --standalone  \
    --server https://ca.lan:8443/acme/acme/directory
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Plugins selected: Authenticator standalone, Installer None
Enter email address (used for urgent renewal and security notices)
 (Enter 'c' to cancel): my.mail@some.mail.server

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Please read the Terms of Service at None. You must agree in order to register
with the ACME server. Do you agree?
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
(Y)es/(N)o: y

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Would you be willing, once your first certificate is successfully issued, to
share your email address with the Electronic Frontier Foundation, a founding
partner of the Let's Encrypt project and the non-profit organization that
develops Certbot? We'd like to send you email about our work encrypting the web,
EFF news, campaigns, and ways to support digital freedom.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
(Y)es/(N)o: n
Account registered.
Please enter in your domain name(s) (comma and/or space separated)  (Enter 'c'
to cancel): m75q.lan
Requesting a certificate for m75q.lan
Performing the following challenges:
http-01 challenge for m75q.lan
Waiting for verification...
Cleaning up challenges

IMPORTANT NOTES:
 - Congratulations! Your certificate and chain have been saved at:
   /etc/letsencrypt/live/m75q.lan/fullchain.pem
   Your key file has been saved at:
   /etc/letsencrypt/live/m75q.lan/privkey.pem
   Your certificate will expire on 2021-05-11. To obtain a new or
   tweaked version of this certificate in the future, simply run
   certbot again. To non-interactively renew *all* of your
   certificates, run "certbot renew"
 - If you like Certbot, please consider supporting our work by:

   Donating to ISRG / Let's Encrypt:   https://letsencrypt.org/donate
   Donating to EFF:                    https://eff.org/donate-le

And now let’s renew:

❯ sudo openssl x509 -in /etc/letsencrypt/live/m75q.lan/fullchain.pem -noout -text | grep After
            Not After : May 11 11:16:01 2021 GMT
❯ sudo REQUESTS_CA_BUNDLE=$(step path)/certs/root_ca.crt \
    certbot renew --server https://ca.lan:8443/acme/acme/directory
[...]
❯ sudo openssl x509 -in /etc/letsencrypt/live/m75q.lan/fullchain.pem -noout -text | grep After
            Not After : May 11 11:17:04 2021 GMT

This is documented here. Note that the certificate ends up in /etc/letsencrypt/live/ and, because root ran the command, you need root to get it out. Not the way it should be, but this was more a test of the ACME provisioner in step-ca.
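
As a quick check that a server using one of these certificates is actually trusted, a Node.js test works too. This is only a sketch: it assumes a service listens on m75q.lan:443 with the new certificate, and that the step root CA sits at the usual $(step path)/certs/root_ca.crt location.

// Minimal TLS sanity check against a server using a step-ca issued certificate.
// Node.js does not use the system trust store by default, so the internal
// root CA is passed explicitly. Host name and CA path are assumptions.
const https = require('https');
const fs = require('fs');

const options = {
  hostname: 'm75q.lan',
  port: 443,
  path: '/',
  ca: fs.readFileSync('/home/harald/.step/certs/root_ca.crt'),
};

https.get(options, (res) => {
  console.log(`TLS handshake OK, HTTP status: ${res.statusCode}`);
}).on('error', (err) => {
  console.error(`TLS check failed: ${err.message}`);
});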

Protocol Buffers and Node.js

Kafka can encode a message via Avro or Protocol Buffers, which are both binary formats. They are comparable to each other (see here), but since gRPC uses Protocol Buffers and it seems it can do anything Avro can do (plus more), maybe it’s a good time to dig into it a bit.

Here is a web page which can decode a ProtoBuf message. Nice for debugging.

Google’s docs for ProtoBuf for JavaScript are here. Here’s a quick working example:

const messages = require('./addressbook_pb');
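// (addressbook_pb.js is the generated code; it comes from addressbook.proto,
// e.g. via: protoc --js_out=import_style=commonjs,binary:. addressbook.proto)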

let message = new messages.Person();

message.setName("Harald K");
message.setId(321);
message.setEmail("harald@some.email.com");

let phone1 = new messages.Person.PhoneNumber();
phone1.setNumber("03-1122-3355");
phone1.setType(messages.Person.PhoneType.WORK);

let phone2 = new messages.Person.PhoneNumber();
phone2.setNumber("090-5566-7788");
phone2.setType(messages.Person.PhoneType.HOME);

message.addPhones(phone1);
message.addPhones(phone2);

console.log("message object:");
console.log(JSON.stringify(message));

console.log("message as JSON:");
console.log(JSON.stringify(message.toObject()));

let binBuffer=Buffer.from(message.serializeBinary());
console.log("Binary serialized:");
console.log(JSON.stringify(binBuffer));
console.log("And in hex:");
let s=""
for (const i of binBuffer) {
    s+=i.toString(16).padStart(2, '0')+" ";
}
console.log(s);

// Now let's convert the binary ProtoBuf message into a proper object again

let message2 = messages.Person.deserializeBinary(binBuffer);
console.log("Converted from binary:");
console.log(JSON.stringify(message2));
console.log("...and as Object:");
console.log(JSON.stringify(message2.toObject()));

The addressbook.proto is from here.
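
For reference, the relevant part of that addressbook.proto looks roughly like this (reconstructed from the official tutorial, so treat it as a sketch rather than a verbatim copy):

syntax = "proto3";
package tutorial;

message Person {
  string name = 1;
  int32 id = 2;
  string email = 3;

  enum PhoneType {
    MOBILE = 0;
    HOME = 1;
    WORK = 2;
  }

  message PhoneNumber {
    string number = 1;
    PhoneType type = 2;
  }

  repeated PhoneNumber phones = 4;
}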

Kafka & Schema

Once you understand how Kafka works, it’s easy to find use-cases for it. To understand it better, though, a local install helps a lot, plus some interactive tools and libraries to produce and consume data.

  1. Follow the Confluent quick start Docker demo
  2. Configure zoe
  3. See users and pageviews data via zoe
  4. Consume the same data via KafkaJS
  5. Make KafkaJS use the schema registry via confluent-schema-registry

Zoe config file. Note the KafkaAvroDeserializer.

❯ cat ~/.zoe/config/default.yml
---
clusters:
  default:
    props:
      bootstrap.servers: "t620.lan:9092"
      key.deserializer: "org.apache.kafka.common.serialization.StringDeserializer"
      value.deserializer: "io.confluent.kafka.serializers.KafkaAvroDeserializer"
      key.serializer: "org.apache.kafka.common.serialization.StringSerializer"
      value.serializer: "org.apache.kafka.common.serialization.ByteArraySerializer"
    registry: ${SCHEMA_REGISTRY:-http://t620.lan:8081}
    groups:
      mygroup: my-group-id
    topics:
      users:
        name: "users"
        subject: "users-value"
runners:
  default: "local"

View users via zoe:

❯ zoe --silent --cluster default topics consume users
{"registertime":1501533472288,"userid":"User_8","regionid":"Region_1","gender":"FEMALE"}
{"registertime":1511144207405,"userid":"User_4","regionid":"Region_7","gender":"FEMALE"}
{"registertime":1500937323185,"userid":"User_8","regionid":"Region_9","gender":"FEMALE"}
{"registertime":1492141732118,"userid":"User_5","regionid":"Region_8","gender":"FEMALE"}
{"registertime":1509714843903,"userid":"User_3","regionid":"Region_4","gender":"OTHER"}

And now via Node.js:

// Use with https://docs.confluent.io/platform/current/quickstart/cos-docker-quickstart.html
// And its users producer

const { Kafka } = require('kafkajs')
const { SchemaRegistry } = require('@kafkajs/confluent-schema-registry')

const kafka = new Kafka({ clientId: 'my-app', brokers: ['t620.lan:9092'] })
const registry = new SchemaRegistry({ host: 'http://t620.lan:8081/' })
const consumer = kafka.consumer({ groupId: 'test14-group' })

const run = async () => {
  await consumer.connect()
  await consumer.subscribe({ topic: 'users', fromBeginning: true })

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      // const decodedKey = await registry.decode(message.key)
      const decodedKey = message.key.toString();
      const decodedValue = await registry.decode(message.value)
      console.log({ decodedKey, decodedValue })
      // console.log(`message=${JSON.stringify(message)}`)
    },
  })
}

run().catch(console.error)

which prints out users like

{
  decodedKey: 'User_2',
  decodedValue: Users {
    registertime: 1516972272723,
    userid: 'User_2',
    regionid: 'Region_5',
    gender: 'FEMALE'
  }
}
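
Producing works the same way in the other direction: look up the schema in the registry, encode the payload, then send it with KafkaJS. A minimal sketch, assuming the users-value subject from the zoe config above and that writing test data into the users topic is acceptable in this demo setup:

const { Kafka } = require('kafkajs')
const { SchemaRegistry } = require('@kafkajs/confluent-schema-registry')

const kafka = new Kafka({ clientId: 'my-app', brokers: ['t620.lan:9092'] })
const registry = new SchemaRegistry({ host: 'http://t620.lan:8081/' })
const producer = kafka.producer()

const produce = async () => {
  await producer.connect()

  // Latest schema id registered under the subject "users-value"
  const schemaId = await registry.getLatestSchemaId('users-value')

  // Encode into the Confluent wire format (magic byte + schema id + Avro payload)
  const value = await registry.encode(schemaId, {
    registertime: Date.now(),
    userid: 'User_9',
    regionid: 'Region_1',
    gender: 'OTHER',
  })

  await producer.send({
    topic: 'users',
    messages: [{ key: 'User_9', value }],
  })

  await producer.disconnect()
}

produce().catch(console.error)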

Note for ARM64 users

While I usually use my ThinkCentre for development work, I have a small ARMv8 machine which I use for small stuff. Node.js runs very well on it and so does Python.

However, to install the Python package confluent-kafka, I am supposed to install the latest librdkafka-dev from here, and that package is amd64 only. So no Python confluent-kafka on ARM64, unfortunately.

OBS and Fun With WebEx

At work we use WebEx for video conferencing. Not the worst choice. I like video conferencing as I think it makes meetings better and more fun, and it’s also nice to see people once in a while when working from home. Then on March 15 this came up:

https://dilbert.com/strip/2021-03-15

This Dilbert strip immediately made me think: I want that too! But how?

Our WebEx is limited by corporate policies, so no fancy backgrounds and certainly no closing-credits plugins, if such a thing even exists. There had to be a way though: it’s a technical problem, so there must be a technical solution.

Searching around a bit led me to OBS, which is normally used to stream to Twitch/YouTube/etc. or to record things like lesson videos. And it has a “Virtual Camera” feature which does exactly what you think it does: it presents its output as a new video camera device which other programs can use as their video input.

And it works! Except in WebEx. Turns out WebEx does not like this. However, WebEx does not mind an NDI stream as an input, and OBS can output such a stream via the NDI Tools.

So all I need now is:

  1. Have OBS set up with NDI Tools
  2. Create 2 scenes in OBS: my webcam’s picture and the same plus scrolling text (AKA the closing credits)
  3. Enable NDI Output (in the Tools menu of OBS)
  4. In WebEx application, choose the NDI camera as video source
  5. When you want the closing credits to show, switch the scene

There’s of course a lot more potential fun inside OBS.

Side Effects

As an unexpected side effect of learning about OBS, I fixed my lighting for video conferences (key/fill/back lights), made myself stand out from the background more, learned how boring cameras are when they statically point at you, improved the sound via some audio filters, and looked into the world of shaders, which is super interesting, but I think that’s too deep into that rabbit hole…

Fun with PIV with my YubiKey 3 Neo

Turns out that my マイナンバーカード is not the only thing which can do things like signing files, and PIV is the official(?) standard for this. My old YubiKey 3 Neo can do that too, thanks to the yubico-piv-tool.

And it’s basically the same as with the マイナンバーカード, except that I have to set everything up myself.

As on my マイナンバーカード, there are 2 slots for 2 different keys and certificates:

  • Slot 9a for identification
  • Slot 9c for signing

Creating them is simple. I’ll just show the commands for the signing slot 9c:

❯ yubico-piv-tool -s9c -AECCP256 -agenerate -o f2-9c.pub
❯ yubico-piv-tool -s9c -S'/CN=Harald Kubota/OU=Home/O=lan/' -averify -arequest -i f2-9c.pub -o f2-9c.csr
Enter PIN: 
Successfully verified PIN.
Successfully generated a certificate request.
# I need a DNS name. And 8760h is about 1 year.
❯ step ca sign --set=dnsNames='["test5.lan"]' --not-after=8760h f2-9c.csr f2-9c.crt
✔ Provisioner: myCA@home (JWK) [kid: IFXxmmZDCX76WMNbFfUoBOBZdubx0SG45Jsd0VGxaz1]
✔ Please enter the password to decrypt the provisioner key: 
✔ CA: https://ca.lan:8443
✔ Certificate: f2-9c.crt
❯ yubico-piv-tool -s9c -aimport-certificate -i f2-9c.crt
Successfully imported a new certificate.

And here is how to sign a file and verify the signature:

❯ yubico-piv-tool -averify-pin --sign -s9c -HSHA256 -AECCP256 -i test.txt -o test.signature
Enter PIN: 
Successfully verified PIN.
Signature successful!

❯ yubico-piv-tool -s9c -aread-certificate >f2-9c.crt
❯ openssl x509 -pubkey -in f2-9c.crt -noout > f2-9c.pub
❯ openssl dgst -sha256 -verify f2-9c.pub -signature test.signature test.txt
Verified OK
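
The same verification also works in Node.js, in case openssl isn’t at hand. A small sketch, assuming the files from above (the public key f2-9c.pub, test.txt and test.signature) are in the current directory:

// Verify the ECDSA P-256 / SHA-256 signature created on the YubiKey.
// yubico-piv-tool writes a DER-encoded ECDSA signature for EC keys,
// which is also what Node's crypto.verify() expects by default.
const crypto = require('crypto');
const fs = require('fs');

const publicKey = crypto.createPublicKey(fs.readFileSync('f2-9c.pub'));
const data = fs.readFileSync('test.txt');
const signature = fs.readFileSync('test.signature');

const ok = crypto.verify('sha256', data, publicKey, signature);
console.log(ok ? 'Verified OK' : 'Verification failed');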

マイナンバーカード fun!

Got a Smart Card reader, so the fun can begin!

PDF Signing

This is the most common use case, so it’s the best documented one.

For Windows you need the following (note that the software involved is old, non-Unicode, and not well tested on non-Japanese PCs):

  • Optional: Import the root certificate as documented in the English description.
  • The PDF signing software from here.
  • Usage is simple enough. Worked on my first try.
  • To test, get Acrobat Reader DC. Don’t forget to un-check all the unwanted extra software.
  • In Acrobat, go to Edit/Preferences, click the More… button under Signatures/Verification, and enable Windows Integration.
  • Alternatively, import the user signing CA certificate into Acrobat Reader so it trusts it.

Now the previously signed PDF should show up in Acrobat as signed.

Non-PDFs

Far more interesting (for me) is signing arbitrary files to prove that they are “mine”.

myna is a simple program for Windows, Linux and Mac to do all the basic things the マイナンバーカード can do. Usage is simple (Windows here):

> myna.exe jpki cms sign -i INPUTFILE -o OUTPUTFILE
> myna.exe jpki cms verify FILE

Note that both commands need the マイナンバーカード available. The OUTPUTFILE is in PKCS#7 format. Verification via openssl is possible:

$ openssl cms -verify -inform DER -in FILE -CAfile jpki.pem

The jpki.pem is the PEM-encoded root certificate from your マイナンバーカード. In return, you don’t need the card itself at this point.

If you want to see who signed:

$ openssl pkcs7 -print_certs -inform DER -in test-signed.p7m -noout
subject=C = JP, L = Tokyo-to, L = Xxx-shi, CN = 2xxxxxxxxxxxxxxxxxxxxxxxxxxA

issuer=C = JP, O = JPKI, OU = JPKI for digital signature, OU = Japan Agency for Local Authority Information Systems

# Many more cert details:
$ openssl pkcs7 -print_certs -inform DER -in test-signed.p7m -noout -text

# Actual verify:
$ openssl cms -verify -inform DER -in test-signed.p7m -CAfile jpki.pem -cmsout -print
CMS_ContentInfo:
  contentType: pkcs7-signedData (1.2.840.113549.1.7.2)
  d.signedData:
    version: 1
    digestAlgorithms:
        algorithm: sha1 (1.3.14.3.2.26)
        parameter: <ABSENT>
    encapContentInfo:
      eContentType: pkcs7-data (1.2.840.113549.1.7.1)
      eContent:
        0000 - 4d 6f 64 65 6c 73 50 61-74 68 3d 22 4d 6f 64   ModelsPath="Mod
[...]
        00b4 - 76 65 32 44 22 0a                              ve2D".
    certificates:
      d.certificate:
        cert_info:
          version: 2
          serialNumber: 4xxxxxx5
          signature:
            algorithm: sha256WithRSAEncryption (1.2.840.113549.1.1.11)
            parameter: NULL
          issuer: C=JP, O=JPKI, OU=JPKI for digital signature, OU=Japan Agency for Local Authority Information Systems
          validity:
            notBefore: Feb 15 19:11:22 2021 GMT
            notAfter: Xxx xx 14:59:59 2025 GMT
          subject: C=JP, L=Tokyo-to, L=Xxx-shi, CN=2xxxxxxxxxxxxxxxxxxxxxxxxxxA
          key:
[...]

You should be able to recognize the signing serial number: that’s the user certificate on the マイナンバーカード.

Other Notes

There are 4 certificates on the card:

  • user certificate
  • the CA’s public certificate which signed the user certificate
  • user signing certificate
  • the CA’s public certificate which signed the user signing certificate

openssl asn1parse shows the internal structure of the data, but a lot of it is hidden inside “OCTET STRING” blobs like this:

 1637:d=8  hl=3 l= 176 cons: SEQUENCE
 1640:d=9  hl=2 l=   3 prim: OBJECT            :X509v3 Authority Key Identifier
 1645:d=9  hl=3 l= 168 prim: OCTET STRING      [HEX DUMP]:3081A580144DE017DE4B7F473DCD867A62D38B134ACE83558AA18186A48183308180310B3009060355040613024A50310D300B060355040A0C044A504B4931233021060355040B0C1A4A504B4920666F72206469676974616C207369676E6174757265313D303B060355040B0C344A6170616E204167656E637920666F72204C6F63616C20417574686F7269747920496E666F726D6174696F6E2053797374656D7382040132C4AB
 1816:d=8  hl=2 l=  29 cons: SEQUENCE
 1818:d=9  hl=2 l=   3 prim: OBJECT            :X509v3 Subject Key Identifier

You can decode the data inside such a binary blob like this:

harald@r2s1:~/t$ openssl asn1parse -inform DER -in test-signed.p7m -strparse 1645
    0:d=0  hl=3 l= 165 cons: SEQUENCE
    3:d=1  hl=2 l=  20 prim: cont [ 0 ]
   25:d=1  hl=3 l= 134 cons: cont [ 1 ]
   28:d=2  hl=3 l= 131 cons: cont [ 4 ]
   31:d=3  hl=3 l= 128 cons: SEQUENCE
   34:d=4  hl=2 l=  11 cons: SET
   36:d=5  hl=2 l=   9 cons: SEQUENCE
   38:d=6  hl=2 l=   3 prim: OBJECT            :countryName
   43:d=6  hl=2 l=   2 prim: PRINTABLESTRING   :JP
   47:d=4  hl=2 l=  13 cons: SET
   49:d=5  hl=2 l=  11 cons: SEQUENCE
   51:d=6  hl=2 l=   3 prim: OBJECT            :organizationName
   56:d=6  hl=2 l=   4 prim: UTF8STRING        :JPKI
   62:d=4  hl=2 l=  35 cons: SET
   64:d=5  hl=2 l=  33 cons: SEQUENCE
   66:d=6  hl=2 l=   3 prim: OBJECT            :organizationalUnitName
   71:d=6  hl=2 l=  26 prim: UTF8STRING        :JPKI for digital signature
   99:d=4  hl=2 l=  61 cons: SET
  101:d=5  hl=2 l=  59 cons: SEQUENCE
  103:d=6  hl=2 l=   3 prim: OBJECT            :organizationalUnitName
  108:d=6  hl=2 l=  52 prim: UTF8STRING        :Japan Agency for Local Authority Information Systems
  162:d=1  hl=2 l=   4 prim: cont [ 2 ]

Got my マイナンバーカード!

For anyone outside Japan this is probably not of any interest. Please pass. Nothing to see here.

For me it was interesting though: this is a smart card which can also use NFC, and that raises a lot of questions: How does it work? What data is inside? Can I look at it? Can other people look at it (without the PIN)? Why does it have 2 different PINs?

Things I learned in half a day since I got my MyNumber card:

  • It can do NFC too.
  • This app works to read data off the card. Including pictures, certificates and other stuff. Interesting.
  • This site explains the file system structure and other internals of the card. Very interesting.
  • That unrelated app works great to read my Suica/PASMO card. And both apps work and they figure out which card is for which app. Neat.
  • Here is a 5 year old article about how to use the data via its PKCS#11 API. And how to use this for ssh with OpenSC. I did that a while ago with a YubiKey. I prefer the YubiKey form factor a lot.
  • I’ve got so many gadgets at home, but no Smart Card reader/writer. I should get this fixed so I can read the certificates with my PC. It makes paying tax via e-Tax much easier too.

Google Cloud Platform

My AWS Certified Solutions Architect – Professional certification is expiring in June! Since renewing it is a bit boring, it’s a great reason to get to know GCP better. I generally like their way of thinking more, and today I understood why:

  • AWS has DevOps as their focus point for many products
  • GCP has the developer as the focus point for many products

Of course there’s plenty of overlap, but the philosophy is fundamentally different. That might just be my opinion, though. It would explain why I am more comfortable with AWS given my sysadmin background, but more curious about GCP (as a wanna-be small-scale developer).

Pub/Sub

Besides creating VMs, message queues are traditionally one of the easiest ways to interact with a cloud environment. In GCP this is Pub/Sub. And it’s easy.

  1. Create a Topic. With a schema (to keep yourself sane).

Schema (AVRO):

{
  "type": "record",
  "name": "Avro",
  "fields": [
    {
      "name": "Sensor",
      "type": "string"
    },
    {
      "name": "Temp",
      "type": "int"
    }
  ]
}
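
Pub/Sub validates published messages against the topic’s schema server-side, but it can be handy to check a payload locally first. A sketch using the avsc npm package (my addition, not part of the original setup):

// Local check that a payload matches the Avro schema above (npm install avsc).
const avro = require('avsc');

const type = avro.Type.forSchema({
  type: 'record',
  name: 'Avro',
  fields: [
    { name: 'Sensor', type: 'string' },
    { name: 'Temp', type: 'int' },
  ],
});

console.log(type.isValid({ Sensor: 'Kitchen', Temp: 19 }));  // true
console.log(type.isValid({ Sensor: 'Kitchen', Temp: 'x' })); // false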

Then you can publish via gcloud (thanks to Pavan for providing a working example):

❯ gcloud pubsub topics publish Temp --message='{"Sensor":"Storage","Temp":9}'

And in Node.js:

const {PubSub} = require('@google-cloud/pubsub');

function main(
  topicName = 'Temp',
  data = JSON.stringify({Sensor: 'Living room', Temp: 22})
) {

  const pubSubClient = new PubSub();

  async function publishMessage() {
    const dataBuffer = Buffer.from(data);

    try {
      const messageId = await pubSubClient.topic(topicName).publish(dataBuffer);
      console.log(`Message ${messageId} published.`);
    } catch (error) {
      console.error(`Received error while publishing: ${error.message}`);
      process.exitCode = 1;
    }
  }

  publishMessage();
}

process.on('unhandledRejection', err => {
  console.error(err.message);
  process.exitCode = 1;
});

main(...process.argv.slice(2));
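
And the consuming side in Node.js, as a sketch. It assumes a subscription on the topic already exists; the name Temp-sub is made up here:

const {PubSub} = require('@google-cloud/pubsub');

function main(subscriptionName = 'Temp-sub') {
  const pubSubClient = new PubSub();

  // Stream messages from the subscription and acknowledge each one.
  const subscription = pubSubClient.subscription(subscriptionName);

  subscription.on('message', message => {
    console.log(`Received ${message.id}: ${message.data.toString()}`);
    message.ack();
  });

  subscription.on('error', error => {
    console.error(`Received error: ${error.message}`);
    process.exitCode = 1;
  });
}

main(...process.argv.slice(2));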

And with plumber:

# Subscribe
❯ plumber read gcp-pubsub --project-id=training-307604 --sub-id=Temp2-sub -f

# Publish
❯ plumber write gcp-pubsub --topic-id=Temp --project-id=training-376841 --input-data='{"Sensor":"Kitchen","Temp":19}'
