Use of CBOR, and Uint8Array/ArrayBuffer
craigfrancis opened this issue · 14 comments
Forgive me for a simplistic question, but why does WebAuthn use CBOR encoding?
This is an API for users to do a single authentication, so there is pretty much no performance benefit to using a binary format; it just makes the API harder for developers to understand, easier to make mistakes with, and requires every website to pull in a 3rd party library to parse it (not good for security).
Couldn't this API simply return a JavaScript object, with the values ready to be sent back to the server?
And because the binary data needs to be sent to/from the server, the use of Uint8Array/ArrayBuffer is a pain - can't it use base64 encoding? It's supported in every language, can easily be sent in a POST request, and is already used in other fields (e.g. the Client Data challenge is base64 encoded, but the Credential Creation challenge is not).
I started with this ugly mix of PHP and (unsafe-inline) JavaScript to provide the challenge:
```html
<script nonce="<?= htmlentities($script_nonce) ?>" integrity="haha, nope">
    challenge_uInt8 = new Uint8Array([<?= htmlentities(implode(',', unpack('C*', $challenge))) ?>]);
</script>
```
It's much easier/safer to pass a base64 encoded value to JavaScript with a `data-` attribute:

```html
<input type="button" ... data-challenge="<?= htmlentities(base64_encode($challenge)) ?>" />
```
Unfortunately I then need to convert this base64 value to something WebAuthn understands:
```javascript
function base64_to_uint8array(base64) {
    var binary = window.atob(base64),
        array = new Uint8Array(new ArrayBuffer(binary.length));
    for (var k = (binary.length - 1); k >= 0; k--) {
        array[k] = binary.charCodeAt(k);
    }
    return array;
}

var challenge_base64 = this.getAttribute('data-challenge'),
    challenge_uInt8 = base64_to_uint8array(challenge_base64);
```
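One caveat that comes up later in this thread: several WebAuthn fields use base64url (RFC 4648 §5) rather than classic base64, so a variant that normalises base64url before decoding might look like this (a sketch; the helper name is mine, and `atob` here assumes a browser or Node 16+):

```javascript
function base64url_to_uint8array(base64url) {
    // Map the URL-safe alphabet back to classic base64, then re-add padding.
    var base64 = base64url.replace(/-/g, '+').replace(/_/g, '/');
    while (base64.length % 4) {
        base64 += '=';
    }
    var binary = atob(base64),
        array = new Uint8Array(binary.length);
    for (var k = 0; k < binary.length; k++) {
        array[k] = binary.charCodeAt(k);
    }
    return array;
}
```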
And it's worse getting the credentials.create() response into something usable:
```javascript
var decoder = new TextDecoder('utf-8'),
    clientData = JSON.parse(decoder.decode(result.response.clientDataJSON));

var attestationData = CBOR.decode(result.response.attestationObject); // A whole CBOR parsing library needed for this.

var dataView = new DataView(new ArrayBuffer(2));
var idLenBytes = attestationData.authData.slice(53, 55); // Great, another binary format to parse.
idLenBytes.forEach(function(value, index) {
    dataView.setUint8(index, value);
});
var credentialIdLength = dataView.getUint16(0);

var credentialId = attestationData.authData.slice(55, (55 + credentialIdLength));
var publicKeyBytes = attestationData.authData.slice(55 + credentialIdLength);
var publicKeyObject1 = CBOR.decode(publicKeyBytes.buffer); // And CBOR is back again.

var publicKeyObject2 = {
    'id': result.id,
    'type': result.type,
    'client_type': clientData.type,
    'client_origin': clientData.origin,
    'client_challenge': clientData.challenge,
    'key_type': publicKeyObject1[1], // 2 = Elliptic Curve; using more magic numbers for keys and values, does this save a few bytes somewhere?
    'key_algorithm': publicKeyObject1[3], // -7 = ECDSA with SHA256
    'key_curve_type': publicKeyObject1[-1], // 1 = P-256
    'key_curve_x': btoa(String.fromCharCode.apply(null, publicKeyObject1[-2])),
    'key_curve_y': btoa(String.fromCharCode.apply(null, publicKeyObject1[-3]))
};

return JSON.stringify(publicKeyObject2); // Finally, something that can be sent to the server.
```
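For what it's worth, the slicing above could be gathered into one small helper; this is only a sketch, assuming the AT flag is set and ignoring any extension data (the same assumption flagged later in this thread), with offsets per the fixed authenticator data layout:

```javascript
function parseAuthData(authData) { // authData: Uint8Array
    var view = new DataView(authData.buffer, authData.byteOffset, authData.byteLength);
    var credentialIdLength = view.getUint16(53); // big-endian
    return {
        rpIdHash:           authData.slice(0, 32),
        flags:              authData[32],
        signCount:          view.getUint32(33),   // big-endian
        aaguid:             authData.slice(37, 53),
        credentialIdLength: credentialIdLength,
        credentialId:       authData.slice(55, 55 + credentialIdLength),
        publicKeyBytes:     authData.slice(55 + credentialIdLength)
    };
}
```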
As an aside, would the non-JavaScript approach be able to skip most of this extra processing?
You might like GitHub's webauthn-json library, which should alleviate some of your pains with the API.
> why does WebAuthn use CBOR encoding? [...] Couldn't this API simply return a JavaScript object, with the values ready to be sent back to the server?
The main reason is that the API is designed to work with dedicated authenticator hardware, which benefits from a compact binary representation. The client does in fact transform the binary response from the authenticator into a more ergonomic object format before returning it to the RP - but since some of the response is cryptographically signed, those signed parts (`authData`, `attStmt`, `clientDataJSON`, `sig`) have to remain in the exact binary format they were in when signed. The authenticator data format is the way it is in large part to maintain compatibility with U2F authenticators.
There's been some discussion about adding more web-friendly versions of some of the data, but it was decided against in favour of reducing client complexity and data duplication.
> the use of Uint8Array/ArrayBuffer is a pain - can't it use base64 encoding?
I guess it could, but on the other hand `Uint8Array`/`ArrayBuffer` are native JavaScript types while base64 strings are not. I think there is value in minimizing the number of places where we need to specify which exact variant of base64 to use - we already have some divergence between base64url (e.g., `CollectedClientData.challenge`) and classic base64 (e.g., `android-safetynet` attestation).
> the Client Data challenge is base64 encoded, but the Credential Creation challenge is not
The `CollectedClientData.challenge` is base64 encoded because it gets written into a JSON string to be signed by the authenticator. I agree the `PublicKeyCredential{Creation,Request}Options.challenge` will most likely come from a JSON object in the first place, but that is not the only possibility.
> As an aside, would the non-JavaScript approach be able to skip most of this extra processing?
Perhaps, but please note that the discussion of a non-JavaScript approach is only hypothetical at this point - it's currently not on any roadmap.
By the way, your `base64_to_uint8array` function doesn't actually base64 decode the input string, so you're going to have mismatches when you compare the challenge from `clientDataJSON` with the original input challenge. Which of course supports your thesis that the API is difficult to use. But again, webauthn-json should help.
Thanks for the suggestion, but I'm not sure webauthn-json is helpful in this case.
Basically I don't want every website needing these extra dependencies, especially ones written in TypeScript (introducing the need for Node.js to compile them).
While I'm not completely happy with my implementation yet, I feel like the WebAuthn API should be easier to use - ideally allowing the returned data to be sent directly to the server as normal JSON / Base64 encoded data:
https://github.com/craigfrancis/webauthn-tidy/blob/master/tidy/js/create.js?ts=4
https://github.com/craigfrancis/webauthn-tidy/blob/master/tidy/js/check.js?ts=4
This would also make the server side code considerably easier to implement:
https://github.com/craigfrancis/webauthn-tidy/blob/master/tidy/
But I must confess, I haven't looked at attestation during create() yet, so I'm not sure if my suggestion will work (I've just set it to 'none', mostly because I don't think I need it).
And thanks for noting the mixing of base64 and base64url - that tripped me up when taking `result.id` from `navigator.credentials.create()`, then later using it for the `publicKey.allowCredentials.id` used with `navigator.credentials.get()`.
As to my `base64_to_uint8array()` function (well, copied from Goran.it, as I've never needed to use Uint8Array before), it seems to be working, so I wonder if I'm using it in a different way than you're expecting?
I should note that this is the 3rd time I've tried to implement WebAuthn, and each time I have to leave it due to other work (I don't want to put something on my websites that I do not understand).
As a random aside, I used JSON reviver/replacer functions and a JSON format rule to allow for client/server communication over JSON. When my rule matched, I converted between the wire base64 value format and the needed/returned Uint8Arrays.
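That reviver/replacer idea can be sketched like this (the `b64:` prefix is just a hypothetical marker convention, not anything from the spec; `btoa`/`atob` assume a browser or Node 16+):

```javascript
function uint8array_replacer(key, value) {
    // Serialise any binary value as a marked base64 string.
    if (value instanceof ArrayBuffer) {
        value = new Uint8Array(value);
    }
    if (value instanceof Uint8Array) {
        return 'b64:' + btoa(String.fromCharCode.apply(null, value));
    }
    return value;
}

function uint8array_reviver(key, value) {
    // Restore marked base64 strings back into Uint8Arrays.
    if (typeof value === 'string' && value.indexOf('b64:') === 0) {
        var binary = atob(value.substring(4)),
            array = new Uint8Array(binary.length);
        for (var k = 0; k < binary.length; k++) {
            array[k] = binary.charCodeAt(k);
        }
        return array;
    }
    return value;
}
```

Usage would then just be `JSON.stringify(options, uint8array_replacer)` on one side and `JSON.parse(json, uint8array_reviver)` on the other.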
> As to my `base64_to_uint8array()` function (well, copied from Goran.it, as I've never needed to use Uint8Array before), it seems to be working, so I wonder if I'm using it in a different way than you're expecting?
Oh, sorry, I thought `window.atob` just converted a string value to bytes, not base64 decoded it. Never mind, then.
Taking some of the ideas from the thread on using WebAuthn without JavaScript, I've created something that could be done via the browser itself.
It uses the idea of including the public key in the result from `create()` - so you don't have every website needing to mess around with CBOR encoding. This does introduce some duplicate data, including the `flags` and `signCount`, but that's relatively small. The main focus is on making WebAuthn much easier for websites to implement, and I suspect most websites would use these parsed values (as I understand it, you only need the binary form if you are using attestation, and I don't think I've seen a single website do that so far).
It avoids the issue where the length of `attestedCredentialData` is difficult to get right - the implementations I've seen so far assume there is no extension data present, e.g.

```javascript
credentialPublicKey = authData.slice(55 + credentialIdLength); // Missing [end]
```
By converting all of the Uint8Array/ArrayBuffer values directly to base64 encoding, it's much easier to provide values (e.g. the `publicKey.user.id`), and have a response that can be sent to the server.
And a minor thing: my implementation has replaced the base64url encoding of the `response.id` with normal base64 encoding, but that's just to make it easier for programming languages that don't support RFC 4648 base64url by default.
This is my first working version (3rd attempt at trying to understand WebAuthn, which I don't think reflects well on the current complexity):
https://github.com/craigfrancis/webauthn-tidy/tree/master/html
The HTML is at the bottom of the PHP scripts.
I should note that the PHP code is intentionally trying to remain as simple as possible, so it's not pulling in a framework, or using objects, types, etc.
On the call of 2020-01-22 it was decided that the use of ArrayBuffers is reflecting W3C direction as we understand it and that revisiting that would be too much. However, it does appear that implementations that don't care about attestation could be saved from worrying about CBOR if browsers were to parse out the public-key. The group believes that's worth pondering further and we're going to close this issue and open one focused on that.
I do see your point, especially in regards to backwards compatibility, but ArrayBuffers are a pain to send to the server (which is where they ultimately need to go).
Which is why I'm intrigued by the suggestion of an HTML element, one that does not need any JavaScript, and could be polyfilled by JavaScript on older browsers (addressing the backwards compatibility issue).
I think that HTML element should make a POST request to the server, using JSON encoding, and base64 encoding for any binary data.
At the moment I'm thinking a normal `<input type="submit" name="auth" value="Create" />` (so something appears for older browsers, when a polyfill fails), then adding one of two new attributes (`credential-create=""` or `credential-get=""`) where their content sets the options (also using JSON).
Then, along with the public key being in an easier to use format (maybe PEM), it would be nice to have easy access to all of the other values (rpIdHash, flags, and signCount).
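Those flag booleans are just the individual bits of the single flags byte in the authenticator data; deriving them could be sketched like this (the function name is mine, bit positions per the authenticator data layout):

```javascript
function parseFlags(flagsByte) {
    return {
        'UP':    (flagsByte & 0x01) !== 0, // User Present
        'RFU1':  (flagsByte & 0x02) !== 0,
        'UV':    (flagsByte & 0x04) !== 0, // User Verified
        'RFU2a': (flagsByte & 0x08) !== 0,
        'RFU2b': (flagsByte & 0x10) !== 0,
        'RFU2c': (flagsByte & 0x20) !== 0,
        'AT':    (flagsByte & 0x40) !== 0, // Attested credential data included
        'ED':    (flagsByte & 0x80) !== 0  // Extension data included
    };
}
```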
Request:
```javascript
$options = {
    "publicKey": {
        "rp": {
            "name": "Test Website",
            "id": "example.com"
        },
        "user": {
            "id": "MTIzNA==", // A Base64 encoded value (1234), not Uint8Array
            "name": "craig@example.com",
            "displayName": "Craig Francis"
        },
        "challenge": "txHXB+K0cQFlWLhBOd0jvHSBCd4aJv8I5X0Z7U7ElGU=", // A Base64 encoded value, not Uint8Array
        "pubKeyCredParams": [
            {
                "type": "public-key",
                "alg": -7
            }
        ],
        "timeout": 10000,
        "attestation": "none",
        "excludeCredentials": [
        ],
        "userVerification": "discouraged"
    }
}
```
```html
<form action="/path/" method="post">
    <input type="submit" name="auth" value="Create" credential-create="<?= htmlentities(json_encode($options)) ?>" />
</form>
```
POST Response:
```javascript
$auth = {
    "id": "mGYJM5RrXM1bwWlIvOewnjOAJ1Y4OmmDyMZ5tkdJCcWCay1RktHcfQvpDB4OIw9UsqntFx1FGJDCugyQTTFnrg",
    "type": "public-key",
    "auth": {
        "rpIdHash": "afb64c14d8723ef066d1e108dd60adec30447611664958a5587cdf806ba5ab6b",
        "flags": {
            "UP": true,
            "RFU1": false,
            "UV": false,
            "RFU2a": false,
            "RFU2b": false,
            "RFU2c": false,
            "AT": true,
            "ED": false
        },
        "signCount": 0,
        "attestedCredentialData": {
            "aaguid": "AAAAAAAAAAAAAAAAAAAAAA==",
            "credentialId": "mGYJM5RrXM1b",
            "publicKey": {
                "type": 2,
                "algorithm": -7,
                "curve_type": 1,
                "curve_x": "uELJlQrFdsxGjthRcbrcNwMKDGbsaEoP4T5T6JBdGQM=",
                "curve_y": "XBZY+ZCfmnQia65ZO17sHuD0FkUoAwIbE39G/EfChjI=",
                "pem": "-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEuELJlQrFdsxGjthRcbrcNwMKDGbs\naEoP4T5T6JBdGQNcFlj5kJ+adCJrrlk7Xuwe4PQWRSgDAhsTf0b8R8KGMg==\n-----END PUBLIC KEY-----"
            }
        },
        "extensions": null
    },
    "response": {
        "clientDataJSON": "eyJjaGFsbGVuZ2UiOiJ0eEhYQi1LMGNRRmxXTGhCT2QwanZIU0JDZDRhSnY4STVYMFo3VTdFbEdVIiwib3JpZ2luIjoiaHR0cHM6Ly9icm93c2VyLndlYmF1dGhuLmVtbWEuZGV2Y2YuY29tIiwidHlwZSI6IndlYmF1dGhuLmNyZWF0ZSJ9",
        "attestationObject": ""
    }
}
```
@craigfrancis, were you able to convert the below to PEM format?

```javascript
var publicKeyObject1 = CBOR.decode(publicKeyBytes.buffer); // And CBOR is back again.
```
@greenpau, have a look at the new-ish `getPublicKey()` method, where the browser returns the value in DER format (not quite PEM, but it's easy to convert).
I've also got a basic polyfill for browsers that don't support this new method (discussion).
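For reference, converting the DER (SPKI) bytes from `getPublicKey()` to PEM is just base64 plus header/footer lines; a sketch (the function name is mine, and `btoa` assumes a browser or Node 16+):

```javascript
function der_to_pem(derArrayBuffer) {
    // Base64 encode the DER bytes, then wrap at 64 characters per line.
    var base64 = btoa(String.fromCharCode.apply(null, new Uint8Array(derArrayBuffer)));
    return '-----BEGIN PUBLIC KEY-----\n' + base64.match(/.{1,64}/g).join('\n') + '\n-----END PUBLIC KEY-----';
}
```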
Alternatively, my "webauthn-tidy" repo includes 3 implementations:
- "new" shows my (fairly un-tested) polyfill in use;
- "tidy" is how I'm currently using WebAuthn (JS doing as much work as possible, to tidy up the API);
- "html" you can ignore (I was experimenting how a browser could do all of this with HTML elements only, the JS is filling in what the browser could do automatically).
> have a look at the new-ish getPublicKey() method
@craigfrancis, thank you for the pointer!
FYI, I am learning the API. I am "stuck" at the registration phase.
Hopefully, with your pointer, I will get it going. Thank you so much!
Probably not the best place to discuss here, so I've created Issue 78 on your project.
As a mere spectator and humble web developer, I can only shake my head when reading this. Of course every web developer would pull their hair out when seeing something like Uint8Array/ArrayBuffer/CBOR, and would by far favor base64 encoded data! It's so obvious, so clear. What's the point of having everyone write libraries around it, instead of making it usable directly?
> On the call of 2020-01-22 it was decided that the use of ArrayBuffers is reflecting W3C direction as we understand it and that revisiting that would be too much. However, it does appear that implementations that don't care about attestation could be saved from worrying about CBOR if browsers were to parse out the public-key. The group believes that's worth pondering further and we're going to close this issue and open one focused on that.
From an external and impartial point of view, this looks like shifting implementation effort from a few "providers" to countless "normal" web developers. Such an interface feels antiquated despite aiming to be the future of authentication. I'm sure it would be really highly appreciated by all web developers out there if plain JSON structures were used.
> From an external and impartial point of view, this looks like shifting implementation effort from a few "providers" to countless "normal" web developers. Such an interface feels antiquated despite aiming to be the future of authentication. I'm sure it would be really highly appreciated by all web developers out there if plain JSON structures were used.
Then you're in luck, check out #1683 where more recent discussion has taken place on what could be added to the API to make WebAuthn consumable in the front end without any third-party packages.
Others seem satisfied enough with the API changes I've proposed; I just need to start the ball rolling on an actual PR to add such functionality to L3. Feedback is welcome in there if you have any comments on what's been proposed.