WebAuthn appears simple on the surface, but the more you dig into it, the more its complexity will surprise you.
Without further ado, let's dig into the code. It's written so that you can simply open the browser dev tools and run it in the console!
let credential = await navigator.credentials.create({
  publicKey: {
    challenge: Uint8Array.from(
      "random-string-from-server", c => c.charCodeAt(0)),
    rp: {
      name: "Try it in the console!",
      id: "dev.to"
    },
    user: {
      name: 'John Doe',
      displayName: 'Johny',
      id: Uint8Array.from("for-the-device-to-identify-user", c => c.charCodeAt(0))
    },
    pubKeyCredParams: [] // if empty, either ES256 or RS256 will be used by default
  }
});
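As an aside, if you prefer to spell out what that empty pubKeyCredParams list falls back to, the explicit form would look roughly like this (in the COSE registry, -7 is ES256 and -257 is RS256):

pubKeyCredParams: [
  { type: 'public-key', alg: -7 },   // ES256
  { type: 'public-key', alg: -257 }  // RS256
]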
This is the bare minimum. There are many more options that can be used, but it should be OK for the default use case. If you run this snippet, a browser-specific popup should appear and ask you to prove your identity. In my case (German locale) it looks like this:
Depending on your case, you might use the device directly, your smartphone nearby, or some security key to prove your identity.
How exactly you prove your identity depends on the device capabilities, the OS, the browser... At some point, you will have to use either some biometric or a PIN on the device. This device is then called the "authenticator" and will produce a cryptographic private/public key pair.
The private key will be kept secret, stored on the device and protected by your biometric or PIN code. The public key will ultimately be sent to the server so that it can authenticate you next time.
So, what's the result of this call? A "PublicKeyCredential" and the start of your headaches. ;) It's not some JSON that you can send over, it's an object with encoded byte buffers.
PublicKeyCredential {
  id: 'AQtKmY-...',
  rawId: <ArrayBuffer>,
  response: {
    attestationObject: <ArrayBuffer>,
    clientDataJSON: <ArrayBuffer>
  },
  authenticatorAttachment: 'cross-platform',
  type: 'public-key'
}
And not only are there byte buffers: the attestationObject in particular is tricky, since it is encoded in the exotic CBOR format. So you cannot even decode it without an external library.
There were several discussions in the working group of this specification arguing in favor of plain JSON instead of these impractical byte buffers and CBOR encoding, for example https://github.com/w3c/webauthn/issues/1362
However, they simply stated:
On the call of 2020-01-22 it was decided that the use of ArrayBuffers is reflecting W3C direction as we understand it and that revisiting that would be too much.
So, well, the burden falls on everyone wanting to use this instead. :/
Instead of trying to decode stuff on the client side, a common approach is to encode the bytes as text using base64url since several parts of the protocol already use base64url too.
Here is a method to do so:
function base64url(buffer) {
  // standard base64, then '+' -> '-', '/' -> '_', and trailing '=' padding stripped
  return btoa(String.fromCharCode(...new Uint8Array(buffer)))
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}
You can also directly test it:
credential.id === base64url(credential.rawId) // => true
So one way to simply send it over to the server is as follows.
let jsonCred = {
  id: credential.id,
  clientData: base64url(credential.response.clientDataJSON),
  attestation: base64url(credential.response.attestationObject),
  type: credential.type
}
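As a minimal sketch, assuming a hypothetical /webauthn/register endpoint (the route name and payload shape are entirely up to your server implementation):

// Sketch only: '/webauthn/register' is a made-up endpoint, not part of WebAuthn.
await fetch('/webauthn/register', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(jsonCred)
});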
This would leave the burden of the tricky CBOR decoding of the attestation to the server. But first, let's analyze the content of the response.
- id: the id of the generated key pair
- rawId: the same in "raw" format
- response.attestationObject: a cryptic object in CBOR format containing, among others, the public key
- response.clientDataJSON: an encoded JSON object containing the challenge, origin, type and crossOrigin flag. This should be preserved in its original form since it is also the signed data.
- authenticatorAttachment: either 'platform' (authenticated through the device itself) or 'cross-platform' (authenticated through an external device)
- type: always 'public-key' for this kind of authentication. I guess it is to provide leeway for future extensions.
Here is an example of the credential.response.clientDataJSON object.
let utf8Decoder = new TextDecoder('utf-8');
let clientData = JSON.parse(utf8Decoder.decode(credential.response.clientDataJSON));
{
  type: 'webauthn.create',
  challenge: 'cmFuZG9tLXN0cmluZy1mcm9tLXNlcnZlcg',
  origin: 'https://dev.to',
  crossOrigin: false
}
Note here that the challenge this time is the base64url encoded version of the original challenge.
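You can check this directly in the console, reusing the base64url helper from above:

// the bytes we passed as challenge, base64url encoded, match clientData.challenge
base64url(Uint8Array.from("random-string-from-server", c => c.charCodeAt(0)))
// => 'cmFuZG9tLXN0cmluZy1mcm9tLXNlcnZlcg'

On the server, checking that this challenge (and the origin) matches what you issued is part of the verification.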
The credential.response.attestationObject is much trickier. It is CBOR encoded, and the credential public key nested inside it is itself CBOR encoded again. Once fully decoded, it may look like the following.
attestationObject: {
  "fmt": "none",
  "attStmt": {},
  "authData": {
    "rpIdHash": "f95bc73828ee21f9fd3bbe72d97908013b0a3759e9aea3dae318766cd2e1ad",
    "flags": {
      "userPresent": true,
      "reserved1": false,
      "userVerified": true,
      "reserved2": "0",
      "attestedCredentialData": true,
      "extensionDataIncluded": false
    },
    "signCount": 0,
    "attestedCredentialData": {
      "aaguid": "0000000000000000",
      "credentialIdLength": 65,
      "credentialId": "175c2594733607c41453211d5fe399c5962f60f06d64d832434a2db1bb731f1d9ed319c11a1137d7afbad5a11898855bd128cc4197320f47e7b9ba782c782",
      "credentialPublicKey": {
        "kty": "EC",
        "alg": "ECDSA_w_SHA256",
        "crv": "P-256",
        "x": "pwRJ454Zmb3Na0ESt0poCsCPoXOWbwFuU5gkZEhNKnI=",
        "y": "srbE2S9h8Yn25B4shJgwd1geqcAmm8wAphluPsJ0Uto="
      }
    }
  }
}
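As a rough sketch of what that decoding involves, assuming a decodeCbor(bytes) function provided by a CBOR library of your choice (the browser has no built-in CBOR support), the authData part is a packed binary structure that has to be sliced by hand:

// Sketch only: decodeCbor(bytes) is assumed to come from an external CBOR library.
function parseAttestationObject(attestationObjectBuffer) {
  // first CBOR decode: { fmt, attStmt, authData }
  const attestation = decodeCbor(new Uint8Array(attestationObjectBuffer));
  const authData = new Uint8Array(attestation.authData);
  const view = new DataView(authData.buffer, authData.byteOffset, authData.byteLength);

  const rpIdHash = authData.slice(0, 32);       // SHA-256 hash of the rp.id
  const flags = authData[32];                   // bit flags: user present, user verified, ...
  const signCount = view.getUint32(33, false);  // big-endian signature counter

  // only present when the "attested credential data" flag (bit 6) is set
  const aaguid = authData.slice(37, 53);
  const credentialIdLength = view.getUint16(53, false);
  const credentialId = authData.slice(55, 55 + credentialIdLength);

  // second CBOR decode: the credential public key is a COSE key
  // (it actually uses integer labels such as 1=kty, 3=alg, -1=crv, -2=x, -3=y)
  const credentialPublicKey = decodeCbor(authData.slice(55 + credentialIdLength));

  return { fmt: attestation.fmt, attStmt: attestation.attStmt,
           rpIdHash, flags, signCount, aaguid, credentialId, credentialPublicKey };
}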
However, the WebAuthn "Standard" just tells the "authenticators" to put whatever is necessary inside to verify it. Its structure is not standardized. And, well, each device/OS sees it differently. As a consequence, there is a multitude of different "attestation statement formats", each with its own way to be parsed and validated. Luckily, some other people faced the same challenge and wrote some great articles about it!
- Introduction to verifying assertions
- Packed
- FIDO-U2F
- Android Keystore
- Android SafetyNet
- TPM
- Apple Anonymous Attestation
The takeaway is that you cannot easily verify this by yourself. Not by a long shot. This is due to the large diversity of "attestations", some with outdated or exotic formats, and the underlying complexity of the whole thing. That is why the next article in this series will focus on server side libraries to validate/verify these "attestations".