PROPOSAL: Add support for general (hardware backed) cryptographic signatures and key exchange #263
Comments
Hey @certainlyNotHeisenberg 👋 Thanks for the proposal and background. I agree it could potentially make sense to add this to the Web Crypto spec, if there is implementer interest for it. The Web Crypto API already has a concept of non-extractable/exportable keys, which is of course useful for keys stored in hardware. There may be some open issues that need to be thought through, but at first glance, from the perspective of the spec I think it should be doable to add this; the majority of the effort would be on the side of implementing it.
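For reference, this is what non-extractable keys look like in today's API (a minimal sketch; whether the browser actually keeps such a key in hardware is an implementation detail the page cannot observe):

```ts
// Generate a signing key whose private material can never be exported to JS.
async function nonExtractableSigningDemo() {
  const keyPair = await crypto.subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" },
    /* extractable */ false,
    ["sign", "verify"],
  );

  const data = new TextEncoder().encode("message to sign");
  const signature = await crypto.subtle.sign(
    { name: "ECDSA", hash: "SHA-256" },
    keyPair.privateKey,
    data,
  );

  // Exporting the private key would reject with an InvalidAccessError,
  // but the key can still be used for operations like sign() above:
  // await crypto.subtle.exportKey("jwk", keyPair.privateKey); // rejects
  return signature;
}
```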
Glad to hear it @twiss. How do we go about gauging/evidencing implementer interest? I assume the main implementers are the major browsers?
@sleevi might know if there would potentially be interest for this in Chromium?
In general, we consider such proposals harmful to the open web. Such proposals are, effectively, super-cookies that can persist beyond the browser's ability to manage, offer cross-browser venues for tracking, be used to harm the device/media independence of the Web, remove users' agency (in favor of the hardware vendor), and are fundamentally incompatible with the Web's security model.

This was by far the most contentious issue of the WebCrypto WG, and it's very intentional that WebCrypto declined to explicitly take this on in scope. The language in the WebCrypto spec explicitly warning developers about this was a reflection of the fundamental challenges here.

Although we have explored such APIs for situations that are independent of the web security model (e.g. extensions/apps provided via other mechanisms), a general purpose API is something we've long seen as problematic to user safety (such APIs can functionally brick devices if pursuing parity with the underlying OS APIs), and equally problematic to explain from a permissions point of view. For example, we don't expose such APIs as a prompt, but rather require that the device owner take explicit steps to allowlist specific extensions when considering such APIs.

There are obviously qualifications and complications here, as WebAuthn is an example, both positive and negative, of exploring more scoped capabilities. But that's not a general purpose API, as noted.

Given that this is a broad idea, and not a concrete proposal, I don't think we can say much more than the broad problems. The best step to something concrete is https://discourse.wicg.io/ and working through an explainer. However, there are years of discussion about the challenges of this proposal on the WebCrypto WG mailing list. While my reply here is very blunt and direct in summing up the opposition points, there is a lot more nuance previously captured on the list that is useful context. This especially considers a number of "but what if" responses to the concerns raised above and trying to work through them to their logical conclusion.
@sleevi Confused by your response:
Honestly, the problems of the current spec seem far more troubling than any issues hardware storage would cause! With the current spec, keys are kept in browser storage vulnerable to malware and physical takeover, keys are not necessarily zeroized after references to them are erased, and keys are not persisted long term because they're lost when browser storage is cleared. So basically, they're totally unusable for most security conscious use cases. Hence why all such use cases currently rely only on mobile apps and external hardware keys. And hence why so few of those use cases have broad adoption — they ask far too much of everyday users.
I find this hard to understand. Do you think it's problematic for user safety and permissions that mobile apps can do the hardware-backed cryptography I described here? And that it's standardized across Android and iOS and the hundreds of different devices that run those operating systems? And that these apps are Internet-connected and often work alongside parallel web apps? Those systems have worked well for many years at this point and involve clearly defined ways of asking for and receiving user permission.

I humbly admit that I know far less about this than you and that I don't know the details of all the community discussions you mentioned. I am a relative newcomer making this proposal, and maybe I'm naive. I understand your bluntness — it must be frustrating to have new people chiming in on topics you've thought about for years! But at the very least, I hope this serves as a useful sign of interest from someone outside the working group.

I think there's a growing tide of interest because more and more people need secure, usable cryptography for what they're building, and it's frustrating to not be able to use the widely available hardware in user devices through web apps. Personally (and please don't take this the wrong way), I think it's a shame that 10+ years into the rise of Web 3 we still don't have web cryptography that's truly secure and hardware-backed. And I think it's a shame that the WebCrypto/WebAuthn working groups aren't leading the charge in making that available. These specs would ideally represent the future of technology, and I think supporting hardware-backed cryptography on the web would be a fantastic step in that direction.

I'm ready and eager to join any effort to push the specs in this direction. And of course I want to do so with the utmost attention to detail around security, privacy, etc.
I'm not an insider either, but I've spent a few years looking at how to integrate WebAuthn (YubiKey) into an overall system that, like you say, would be "truly secure and hardware backed," and I was also initially surprised that there wasn't more direct support for the key management problem. Reading @sleevi's reply, I think I see the fundamental problem; perhaps it can be paraphrased this way:

Authentication from the server's (the web's) point of view solves one very specific problem: to ascertain that 'you' (a specific human being) are "somewhere on the other side" of the network (from the perspective of the server/service), and that you are in practice approving the use of "everything on your side" - whatever hardware, operating system, browser, application, etc., you are using - with respect to whatever service you are requesting. That's it.

Providing security within the context that you are connecting from (e.g., if you trust the OS but not the browser) is not a solvable problem, because if you were to do that, you would end up with yet another system - where the same issue would arise, recursively. Hence the Secure Enclave in an iPhone not being available (in any equivalency) to the mobile web - because then you would need hardware to secure that extension per se; the iPhone security measures can only be viewed as simply a bigger YubiKey.

You can think of it as a geometric problem - there's only one dimension between "you" (e.g. your finger tapping the YubiKey to confirm you're "there") and the (final) endpoint of an online service (say, encrypted cloud storage). What WebAuthn solves is offering a way of "connecting" the two (extreme) end points with a baseline trust model. But you cannot do a general solution for any intermediate point - hence the result that you would inevitably break the open web model. You need to safeguard "your end", the service can safeguard "their" end, WebAuthn can connect the two, and the jungle in-between can be jumped over.
@petersmagnusson Thanks for your commentary on this. I agree with what you're saying but don't see how it invalidates this proposal.

No such "endpoint" — whether a Yubikey, platform authenticator, or user inputting a password — is 100% secure, but that's understood. That fact doesn't mean such endpoints can't be used to perform cryptographic operations. It also doesn't mean that some aren't more secure than others: clearly Yubikeys and platform authenticators are far more secure than passwords for all sorts of reasons.

The whole point of WebAuthn is that it standardizes access to platform authenticators so they can be used alongside or as alternatives to roaming authenticators like Yubikeys. That doesn't guarantee security within the context of the user's device: that security is only as good as the platform authenticator system, and it may vary across platforms and individual devices. But that doesn't mean there's no value to WebAuthn. On the contrary, there's enormous value, because something like 6.4 billion people have smartphones, likely with platform authenticators baked in, and almost no everyday users have roaming authenticators.

Currently, web apps that rely on cryptography have no choice but to do that cryptography in the browser, where it isn't secure, or force users to use roaming authenticators. That's why most Web 3 apps aren't usable without something like a Ledger hardware wallet. The result is that such web apps are totally unusable for the vast majority of people. But the irony is, those people already have devices that contain the hardware necessary for doing the needed cryptography securely! With WebAuthn, web apps are already able to use that hardware for the narrow use case of authentication. Why not for general cryptographic operations?
I am working on an application where access to hardware-backed cryptographic operations (encryption and signing) would improve the security posture for all its users and thus make them safer on the web. Full stop.

In fact, not providing access to strong cryptography hooked into the conveniences of platform [biometric] HSM support actually cements the grip platform vendors have over the authentication and security space. It locks the world down so that the only places where top-tier security experiences (e.g. WebAuthn) can be implemented are in the platforms/browsers themselves (and I'm cynically not surprised they'd want to keep it that way under the guise of "user safety and anti-tracking", or maybe, even more simply, pride).

In reality, access to strong platform-backed cryptographic primitives would enable e.g. soft authenticator/credential managers to exist with security that is isomorphic to that provided by the browser/platform implementation and not third-rate. Generally it would enable rich experiences where a software application is able to keep a user's soft keys encrypted at rest and have them decrypted and used on demand by HSM-backed keys subject to biometric unlock requirements... thus freeing users from depending on any one platform or vendor to manage these keys for them and instead allowing them to opt into using whatever software agent they prefer (an example might be a "web 3" "wallet" application where a soft on-chain identity needs to be secured across different application sites running on a variety of platforms).

It's not just about a "server's" understanding of a user. It's increasingly common to see applications where the server knows next to nothing about a user (or the server might not really even exist) and instead the client/agent application performs security operations locally using user-managed keys. These keys need to be protected, and asking users for a password on every use is an abundantly crappy experience.

The "user safety" concerns seem like they should be solved by the user agent, perhaps by requiring user interaction and/or consent to access functionality that might be abused to track users. Safari/WebKit does this, for example. It's just not the web's problem (whatever that means, because as pointed out native apps use "the web" too, so we're really conflating concerns here) that "good cryptography implies maintaining a stable identifier". That's just fundamental to good security. Let applications and users decide whether and how to deploy good cryptographic technology rather than tell people it's off limits because what if somebody could track you, and what about that.

In reality, users will just download the app because they can use a face scan to log in to their bank, and just forget about the password-laden browser version of the experience. And I think that's what we've seen happen. There's no reason that web apps and browser extensions should not be granted access to the proposed functionality. Advertising companies have really abused and traumatized everyone, haven't they?
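As a concrete illustration of the "soft keys encrypted at rest, unlocked on demand" pattern described above, here is a minimal sketch using only today's Web Crypto API. Note that the wrapping key below is non-extractable but, in current browsers, not guaranteed to live in an HSM or be gated by biometrics; that gap is exactly what this thread is asking for.

```ts
// Wrap an application-managed ("soft") symmetric key under a non-extractable
// wrapping key, so only the wrapped blob is ever persisted.
async function protectSoftKey(softKey: CryptoKey) {
  // softKey is assumed here to be an extractable symmetric key (e.g. AES-GCM).
  const wrappingKey = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    /* extractable */ false,
    ["wrapKey", "unwrapKey"],
  );

  const iv = crypto.getRandomValues(new Uint8Array(12));
  const wrapped = await crypto.subtle.wrapKey("raw", softKey, wrappingKey, {
    name: "AES-GCM",
    iv,
  });

  // `wrapped` and `iv` can be persisted (e.g. in IndexedDB); `wrappingKey` can
  // be stored there too as a CryptoKey object without exposing its bytes.
  return { wrappingKey, wrapped, iv };
}

// Later, recover the soft key on demand without it ever sitting around in
// plaintext storage.
async function unlockSoftKey(wrappingKey: CryptoKey, wrapped: ArrayBuffer, iv: Uint8Array) {
  return crypto.subtle.unwrapKey(
    "raw",
    wrapped,
    wrappingKey,
    { name: "AES-GCM", iv },
    { name: "AES-GCM" },
    /* extractable */ false,
    ["encrypt", "decrypt"],
  );
}
```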
It's 2022 and browsers still lack hardware-backed security. What if someone wants to perform an operation like "Sign my vote" using a Yubico NFC key on a phone? It's a real pity and shame that security is not priority number one in browsers...
@stalek71 Just saying "it's 2022" doesn't really help the process move forward, given that nobody has made a concrete proposal yet. That being said, I want to try to carve out a feature subset that might be useful and that explicitly tries to avoid the "supercookie" issue, by considering only generating (and perhaps importing) hardware-backed keys, and using those for cryptographic operations, while declaring the use of existing hardware keys out of scope. One way that could work is:
@sleevi That should solve the supercookie concern, right? Or do you see other issues? FWIW, in theory a browser could already implement

The other advantage would be that if the HSM is separate from the device, and might go away at any time, having a hint from the application that it's OK if the key goes away might be valuable. Thus, another possibility might be to add another parameter to

But on platforms where an HSM is always available, perhaps that again isn't necessary, and the browser could just always store non-extractable keys on the HSM, if it wants to.
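Purely as an illustration of the kind of extra parameter being floated here: nothing below exists in the Web Crypto spec or in any browser today, and the `hardwareProtection` name and its values are invented for this sketch. The carve-out is that a page can only ask for a freshly generated, non-extractable, hardware-bound key; it can never enumerate or address pre-existing hardware keys.

```ts
// HYPOTHETICAL: `hardwareProtection` is not part of any spec or implementation.
// Unknown dictionary members are ignored by current browsers, so this call
// would simply produce an ordinary non-extractable key today.
async function generateHardwareBackedKey() {
  const keyPair = await crypto.subtle.generateKey(
    {
      name: "ECDSA",
      namedCurve: "P-256",
      hardwareProtection: "required", // invented hint: "required" | "preferred"
    } as EcKeyGenParams,
    /* extractable */ false,
    ["sign", "verify"],
  );
  // The page only ever holds an opaque CryptoKey for a newly generated key and
  // never learns a hardware identifier, which is the point of the carve-out
  // aimed at avoiding the "supercookie" concern.
  return keyPair;
}
```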
I think in general it would be great to have hardware-backed security available in the browser. Regarding
Hey! Any news about this?
I don't have any context on WebCrypto itself, but in general I think this is an important facility the web needs to offer, especially as we crack down on fingerprinting and other types of tracking. To @sleevi's "supercookie" point above, I see two paths for providing hardware-backed keys which could (in theory) fit within the browser privacy threat model:
To what extent could the use cases being discussed here meaningfully be addressed by the above? E.g., if we could reliably associate a hardware-backed key with an origin and ensure we reset it whenever a user clears cookies for that origin, would that be good enough?
Marking that it is 2024-04-11 now (an ol' s'kool bump)
+1
The PRF extension (https://w3c.github.io/webauthn/#prf-extension) allows deriving secret keys from the enclave that can be used for other purposes outside the enclave. However, I don't think this is widely supported yet.
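For anyone wanting to experiment with it, here is a rough sketch of how the PRF extension can feed Web Crypto today. Support varies by browser and authenticator, the credential must have been created with `extensions: { prf: {} }`, and the salt and HKDF parameters below are just illustrative choices.

```ts
// Derive a symmetric key from a WebAuthn authenticator via the PRF extension,
// then hand it to Web Crypto. The authenticator evaluates a PRF over the salt;
// the page only sees the 32-byte output, never the key inside the enclave.
async function deriveKeyFromAuthenticator(credentialId: BufferSource, salt: BufferSource) {
  const assertion = (await navigator.credentials.get({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      allowCredentials: [{ type: "public-key", id: credentialId }],
      userVerification: "required",
      extensions: { prf: { eval: { first: salt } } },
    },
  })) as PublicKeyCredential;

  // Older TypeScript DOM typings may not know about `prf`, hence the cast.
  const prf = (assertion.getClientExtensionResults() as any).prf;
  if (!prf?.results?.first) throw new Error("PRF extension not supported here");

  // Use the PRF output as input keying material for HKDF, then derive AES-GCM.
  const ikm = await crypto.subtle.importKey("raw", prf.results.first, "HKDF", false, [
    "deriveKey",
  ]);
  return crypto.subtle.deriveKey(
    { name: "HKDF", hash: "SHA-256", salt: new Uint8Array(0), info: new Uint8Array(0) },
    ikm,
    { name: "AES-GCM", length: 256 },
    /* extractable */ false,
    ["encrypt", "decrypt"],
  );
}
```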
I think you should be able to test this using a simple web example, for example as described here:
Here's an explainer I wrote up on how we might be able to achieve this in Brave (and standardize it further in a WG): https://github.com/brave-experiments/hardware-backed-webcrypto
@kdenhartog Did you consider including encryption as a use case?
Yes, it's my intention to do so, but I didn't write that up when I first documented the idea. I still need to add an example of it, and I've opened an issue to track it. The one limitation I foresee is that it will be limited to what the TPM supports, so that may force certain algorithms, e.g. AES-128 vs. AES-256.
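A minimal sketch of how a page could cope with that constraint, assuming a hardware-backed implementation that rejects unsupported key lengths with a NotSupportedError (today's software implementations accept both sizes, so this is purely illustrative):

```ts
// Try AES-256 first and fall back to AES-128 if the backing hardware
// only supports the smaller key size.
async function generateBestAesKey(): Promise<CryptoKey> {
  for (const length of [256, 128]) {
    try {
      return await crypto.subtle.generateKey(
        { name: "AES-GCM", length },
        /* extractable */ false,
        ["encrypt", "decrypt"],
      );
    } catch {
      // NotSupportedError (or similar): try the next key length.
    }
  }
  throw new Error("AES-GCM is not available at any supported key length");
}
```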
Makes sense; it will still be very useful for encrypting things under a key that is secured by an HSM.
BACKGROUND AND CONTEXT
TECHNICAL TL;DR
PROPOSAL
Add support for general cryptographic signatures and key exchange, backed by either hardware native to the device or an external hardware key. Since WebCrypto already standardizes general cryptographic operations, this would be a straightforward extension of the current spec and could likely repurpose material from the WebAuthn spec. In a simple sense, this proposal is "WebAuthn + WebCrypto", i.e. the hardware access WebAuthn standardizes combined with the general cryptographic signatures (and key exchange?) WebCrypto standardizes.
The user experience can closely match both current WebAuthn implementations and mobile app cryptography flows:
So, from the user's perspective, it's as simple as e.g. passing a biometric check. Everything else is invisible. This is exactly how mobile apps leverage hardware-backed cryptography today.
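To make that concrete from the developer side, here is a hypothetical sketch of what a biometric-gated signature could look like. The `userVerification` member below is invented for illustration and exists in no spec; the idea is that the browser/OS, not the page, owns the prompt.

```ts
// HYPOTHETICAL: crypto.subtle.sign has no `userVerification` option today.
// The intent is that the OS shows its own biometric/PIN prompt and the hardware
// only releases the signature after the user passes it, mirroring how platform
// authenticators already behave for WebAuthn.
async function signWithUserVerification(privateKey: CryptoKey, payload: Uint8Array) {
  return crypto.subtle.sign(
    {
      name: "ECDSA",
      hash: "SHA-256",
      userVerification: "required", // invented member, not part of EcdsaParams
    } as EcdsaParams,
    privateKey,
    payload,
  );
}
```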
USE CASES
This proposal amounts to enabling more secure cryptographic signatures and key exchange, so the use cases include the vast array of applications of those cryptographic methods! For example:
And on and on...
DIFFICULTIES