
PROPOSAL: Add support for general (hardware backed) cryptographic signatures and key exchange #263

Open
ghost opened this issue May 7, 2021 · 23 comments


@ghost

ghost commented May 7, 2021

BACKGROUND AND CONTEXT

TECHNICAL TL;DR

  • WebAuthn currently supports cryptographic signatures only for authentication. The signed data are challenges randomly generated by relying parties (RPs).
  • WebAuthn is unique in that the cryptography it supports can leverage secure hardware — both hardware embedded in devices through platform authenticators and external hardware keys. This is different than all other web cryptography options like WebCrypto, which provide no standardized access to hardware. Hardware provides far stronger security guarantees than cryptography done in the browser.
  • There's a desire from the community for general hardware backed cryptographic signatures and key exchange, which could be used for a wide range of applications far beyond authentication (see below for examples). It's conceivable to use WebAuthn as it stands to enable some version of this (e.g. by passing a hashed document instead of a random challenge to be signed, to enable document signing — see the sketch after this list), but such use is far from what the spec was intended for and complicates the security model (e.g. because the hashed document is deterministic, not random).
  • Hardware backed key exchange (e.g. ECDH) can enable more secure encryption, since a symmetric encryption key can be encrypted with an asymmetric key stored in hardware.
  • In terms of leveraging device hardware (e.g. the Secure Enclave in an iPhone), there's a puzzling gap between mobile and web. Mobile apps can easily leverage the hardware for general cryptography (including general signatures and key exchange) using the device OS. But web apps have no such ability. Achieving feature parity between cryptography on mobile and web would greatly simplify development of new apps and make achieving widespread adoption far easier.
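
As a concrete illustration of the third bullet: the sketch below uses today's WebAuthn API to pass a document hash as the challenge. It assumes an existing credential whose raw ID is available as `credentialId`, and is only meant to show why this repurposing is awkward, not to recommend it.

```js
// Repurposing hack sketched above: use a document hash as the WebAuthn
// "challenge" instead of an RP-generated random value. Mechanically this
// works, but the "challenge" becomes deterministic, which is outside what
// the spec intends.
const doc = new TextEncoder().encode("contract text ...");
const challenge = await crypto.subtle.digest("SHA-256", doc);

const assertion = await navigator.credentials.get({
  publicKey: {
    challenge,                                                     // document hash, not a nonce
    allowCredentials: [{ type: "public-key", id: credentialId }],  // assumed existing credential
    userVerification: "required",
  },
});
// assertion.response.signature covers authenticatorData || SHA-256(clientDataJSON),
// and clientDataJSON embeds the base64url-encoded "challenge" above.
```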

PROPOSAL
Add support for general cryptographic signatures and key exchange, backed by either hardware native to the device or an external hardware key. Since WebCrypto already standardizes general cryptographic operations, this would be a straightforward extension of the current spec and could likely repurpose material from the WebAuthn spec. In a simple sense, this proposal is "WebAuthn + WebCrypto", i.e. the hardware access WebAuthn standardizes combined with the general cryptographic signatures (and key exchange?) WebCrypto standardizes.

The user experience can closely match both current WebAuthn implementations and mobile app cryptography flows:

  • The user is prompted to pass a platform authenticator check (e.g. Face ID on an iPhone) or insert a security key (e.g. a Yubikey). The check can create a "session" during which the user doesn't have to pass additional checks (e.g. 10 minutes of not needing to do Face ID again), with the limiting case being a session of zero duration so the user has to pass a check for every cryptographic function call.
  • (Behind the scenes:) The RP triggers the hardware to create an asymmetric key pair (ideally with some control over the algorithm/curve).
  • (Behind the scenes:) The RP passes data to the hardware to be signed. The hardware signs the data with the private key, never exposing it to the RP, and returns the result. Or, the RP triggers key exchange (e.g. using ECDH).

So, from the user's perspective, it's as simple as e.g. passing a biometric check. Everything else is invisible. This is exactly how mobile apps leverage hardware backed cryptography today.
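
To make the key-exchange half concrete, here is roughly what it already looks like with plain WebCrypto today, using software keys. This is just a sketch of the existing API: under this proposal the call pattern could stay essentially the same, but the private keys would be generated and held in hardware.

```js
// Software-only ECDH with today's WebCrypto, for illustration. The proposal
// would keep this shape but have the private keys live in hardware,
// non-extractable and gated by an authenticator check.
const alice = await crypto.subtle.generateKey(
  { name: "ECDH", namedCurve: "P-256" }, false, ["deriveKey"]);
const bob = await crypto.subtle.generateKey(
  { name: "ECDH", namedCurve: "P-256" }, false, ["deriveKey"]);

// Each side derives the same AES-GCM key from its private key and the other
// side's public key; the private keys are never exposed to the page.
const shared = await crypto.subtle.deriveKey(
  { name: "ECDH", public: bob.publicKey },
  alice.privateKey,
  { name: "AES-GCM", length: 256 },
  false,
  ["encrypt", "decrypt"]
);
```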

USE CASES
This proposal amounts to enabling more secure cryptographic signatures and key exchange, so the use cases include the vast array of applications of those cryptographic methods! For example:

  • All current and future Web3 dApps, which rely on signatures for every operation (@cybercent has pointed this out in their comments here and in their proposal here)
  • Document signing
  • Encrypted cloud data storage
  • Secure peer-to-peer messaging
  • Data integrity protection
  • Transaction non-repudiation
  • Symmetric encryption protected by asymmetric signing

And on and on...

DIFFICULTIES

  • The WebAuthn WG decided against pursuing this, and I can understand why. The name "WebAuthn" and the group's charter reflect WebAuthn's current, narrower scope (authentication). It would be difficult to extend the spec to include use cases beyond authentication without changing the name.
  • Previous, somewhat similar efforts stalled out about five years ago. For example, see the Hardware Based Secure Services features and the now dormant Hardware Based Secure Services Community Group.
@twiss
Member

twiss commented May 8, 2021

Hey @certainlyNotHeisenberg 👋 Thanks for the proposal and background. I agree it could potentially make sense to add this to the Web Crypto spec, if there is implementer interest for it. The Web Crypto API already has a concept of non-extractable/exportable keys, which is of course useful for keys stored in hardware. There may be some open issues that need to be thought through, but at first glance, from the perspective of the spec I think it should be doable to add this; the majority of the effort would be on the side of implementing it.

@ghost
Author

ghost commented May 11, 2021

Glad to hear it @twiss. How do we go about gauging/evidencing implementer interest? I assume the main implementers are the major browsers?

@twiss
Member

twiss commented May 17, 2021

@sleevi might know if there would potentially be interest for this in Chromium?

@sleevi
Contributor

sleevi commented May 17, 2021

There's a desire from the community for general hardware backed cryptographic signatures and key exchange, which could be used for a wide range of applications far beyond authentication (see below for examples).
Add support for general cryptographic signatures and key exchange, backed by either hardware native to the device or an external hardware key.

In general, we consider such proposals harmful to the open web. Such proposals are, effectively, super-cookies that can persist beyond the browser’s ability to manage, offer cross-browser venues for tracking, be used to harm the device/media independence of the Web, remove users’ agency (in favor of the hardware vendor), and are fundamentally incompatible with the Web’s security model.

This was by far the most contentious issue of the WebCrypto WG, and it’s very intentional that WebCrypto declined to explicitly take this on in scope. The language in the WebCrypto spec explicitly warning developers about this was a reflection of the fundamental challenges here.

Although we have explored such APIs for situations that are independent of the web security model (e.g. extensions/apps provided via other mechanisms), a general purpose API is something we’ve long seen as problematic to user safety (such APIs can functionally brick devices if pursuing parity with the underlying OS APIs), and equally problematic to explain from a permissions point of view. For example, we don’t expose such APIs as a prompt, but rather require that the device owner take explicit steps to allowlist specific extensions when considering such APIs.

There are obviously qualifications and complications here, as WebAuthn is an example, both positive and negative, of exploring more scoped capabilities. But that’s not a general purpose API, as noted.

Given that this is a broad idea, and not a concrete proposal, I don’t think we can say much more than the broad problems. The best step to something concrete is https://discourse.wicg.io/ and working through an explainer. However, there are years of discussion about the challenges of this proposal on the WebCrypto WG mailing list. While my reply here is very blunt and direct in summing up the opposition points, there is a lot more nuance previously captured on the list that is useful context. In particular, that discussion works through a number of “but what if” responses to the concerns raised above and follows them to their logical conclusion.

@ghost
Author

ghost commented May 17, 2021

@sleevi Confused by your response:

  • How are signatures necessarily super-cookies? Sure, there should be some sort of nonce involved so a client can't just pass in the same data everywhere and deterministically get back signatures. But that's an implementation detail, not a reason not to support signatures. In Section 7 under "Super-cookies", the spec says "care must be taken when introducing future revisions to this API or additional cryptographic capabilities, such as those that are hardware backed." I completely agree with that, but "care must be taken" doesn't mean it should never be done.
  • How would this remove the user's agency in favor of the hardware vendor? The user could choose to use a platform authenticator, which leverages hardware of a device they chose to use/purchase, or they could use an external hardware key. In each case, the user can choose among a wide range of hardware types and vendors.
  • Related to the previous bullet, how does this harm the device independence of the Web? I understand and support the desire to keep the Web functional across a wide range of devices, but it's certainly not independent from devices in any strict sense. The Web wouldn't exist if not for the devices enabling the Internet. And not all devices in the world can participate. They have to interoperate with a set of protocols. This hardly seems unreasonable.

Honestly, the problems of the current spec seem far more troubling than any issues hardware storage would cause! With the current spec, keys are kept in browser storage vulnerable to malware and physical takeover, keys are not necessarily zeroized after references to them are erased, and keys are not persisted long term because they're lost when browser storage is cleared. So basically, they're totally unusable for most security conscious use cases. That's why all such use cases currently rely only on mobile apps and external hardware keys. And that's why so few of those use cases have broad adoption — they have to ask for way too much of everyday users.

problematic to user safety (such APIs can functionally brick devices if pursuing parity with the underling OS APIs), and equally problematic to explain from a permissions point of view

I find this hard to understand. Do you think it's problematic for user safety and permissions that mobile apps can do the hardware backed cryptography I described here? And that it's standardized across Android and iOS and the hundreds of different devices that run those operating systems? And that these apps are Internet connected and often work alongside parallel web apps? Those systems have worked well for many years at this point and involve clearly defined ways of asking for and receiving user permission.


I humbly admit that I know far less about this than you and that I don't know the details of all the community discussions you mentioned. I am a relative newcomer making this proposal, and maybe I'm naive. I understand your bluntness — it must be frustrating to have new people chiming in on topics you've thought about for years!

But at the very least, I hope this serves as a useful sign of interest from someone outside the working group. I think there's a growing tide of interest because more and more people need secure, usable cryptography for what they're building, and it's frustrating to not be able to use the widely available hardware in user devices through web apps.

Personally (and please don't take this the wrong way), I think it's a shame that 10+ years into the rise of Web 3 we still don't have web cryptography that's truly secure, hardware backed. And I think it's a shame that the WebCrypto/WebAuthn working groups aren't leading the charge in making that available. These specs would ideally represent the future of technology, and I think supporting hardware backed cryptography on the web would be a fantastic step in that direction.

I'm ready and eager to join any effort to push the specs in this direction. And of course I want to do so with the utmost attention to detail around security, privacy, etc.

@petersmagnusson

petersmagnusson commented Sep 17, 2021

I'm not an insider either, but I've been spending a few years looking at how to integrate Webauthn (Yubikey) into an overall system that, like you say, would be "truly secure and hardware backed," and I was also initially surprised that there wasn't more direct support for the key management problem. Reading @sleevi's reply, I think I see the fundamental problem; perhaps it can be paraphrased this way:

Authentication from the server's (the web's) point of view solves one very specific problem: to ascertain that 'you' (a specific human being) are "somewhere on the other side" of the network (from the perspective of the server/service). And that you are in practice approving the use of "everything on your side" - whatever hardware, operating system, browser, application, etc, that you are using, with respect to whatever service you are requesting.

That's it. Providing security within the context that you are connecting from (e.g., if you trust the OS but not the browser, etc.) is not a solvable problem, because if you were to do that, you would end up with yet another system - where the same issue would arise, recursively. Hence the Secure Enclave in an iPhone not being available (in any equivalent form) to the mobile web - because then you would need hardware to secure that extension per se; the iPhone security measures can only be viewed as simply a bigger Yubikey.

You can think of it as a geometric problem - there's only one dimension between "you" (eg your finger tapping the Yubikey to confirm you're "there") and the (final) endpoint of an online service (say, encrypted cloud storage). What webauthn solves is offering a way of "connecting" the two (extreme) end points with a baseline trust model. But you cannot do a general solution for any intermediate point - hence the result that you would inevitably break the open web model. You need to safeguard "your end", and the service can safeguard "their" end, and webauthn can connect the two, and the jungle in-between can be jumped over.

@ghost
Author

ghost commented Sep 17, 2021

@petersmagnusson Thanks for your commentary on this. I agree with what you're saying but don't see how it invalidates this proposal. No such "endpoint" — whether a Yubikey, platform authenticator, or user inputting a password — is 100% secure, but that's understood. That fact doesn't mean such endpoints can't be used to perform cryptographic operations. Nor does it mean that some aren't more secure than others: clearly Yubikeys and platform authenticators are far more secure than passwords for all sorts of reasons.

The whole point of WebAuthn is that it standardizes access to platform authenticators so they can be used alongside or as alternatives to roaming authenticators like Yubikeys. That doesn't guarantee security within the context of the user's device: that security is only as good as the platform authenticator system is, and it may vary across platforms and individual devices. But that doesn't mean there's no value to WebAuthn. On the contrary, there's enormous value because something like 6.4 billion people have smartphones, likely with platform authenticators baked in, and almost no everyday users have roaming authenticators.

Currently, web apps that rely on cryptography have no choice but to do that cryptography in the browser, where it isn't secure, or force users to use roaming authenticators. That's why most Web 3 apps aren't usable without something like a Ledger hardware wallet. The result is that such web apps are totally unusable for the vast majority of people. But the irony is, those people already have devices that contain the hardware necessary for doing the needed cryptography securely! With WebAuthn, web apps are already able to use that hardware for the narrow use case of authentication. Why not for general cryptographic operations?

@dcow

dcow commented Jan 11, 2022

I am working on an application where access to hardware backed cryptographic operations (encryption and signing) would improve the security posture for all its users and thus make them safer on the web. Full stop.

In fact, not providing access to strong cryptography hooked into the conveniences of platform [biometric] HSM support actually cements the grip platform vendors have over the authentication and security space. It locks the world down so that the only places where top tier security experiences e.g. webauthn can be implemented are in the platforms/browsers themselves (and I'm cynically not surprised they'd want to keep it that way under the guise of "user safety and anti-tracking", or maybe, even more simply, pride).

In reality, access to strong platform backed cryptographic primitives would enable e.g. soft authenticator/credential managers to exist with security that is isomorphic to that provided by the browser/platform implementation and not 3rd rate. Generally it would enable rich experiences where a software application is able to keep a user's soft keys encrypted at rest and have them decrypted and used on demand by HSM-backed keys subject to biometric unlock requirements... thus freeing users from depending on any one platform or vendor to manage these keys for them and instead allowing them to opt into using whatever software agent they prefer (an example might be a "web 3" "wallet" application where a soft on-chain identity needs to be secured across different application sites running on a variety of platforms).

It's not just about a "server's" understanding of a user. It's increasingly common to see applications where the server knows next to nothing about a user (or the server might not really even exist) and instead the client/agent application performs security operations locally using user-managed keys. These keys need to be protected, and asking users for a password on every use is an abundantly crappy experience.
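
To make that pattern concrete, here's a rough sketch using standard wrapKey/unwrapKey calls. The hardware-backed, biometric-gated wrapping key (`hwKey`) and the extractable soft signing key (`softSigningKey`) are assumed to exist; nothing here is actually hardware-backed with today's WebCrypto, which is exactly the gap being discussed.

```js
// Wrap the user's soft key for storage at rest; unwrap it on demand after
// the (assumed) biometric gate on `hwKey`. Standard WebCrypto calls.
const iv = crypto.getRandomValues(new Uint8Array(12));

const wrapped = await crypto.subtle.wrapKey(
  "jwk", softSigningKey, hwKey, { name: "AES-GCM", iv });

// ...persist `wrapped` and `iv` wherever the agent keeps its state...

const restored = await crypto.subtle.unwrapKey(
  "jwk", wrapped, hwKey, { name: "AES-GCM", iv },
  { name: "ECDSA", namedCurve: "P-256" },
  false, ["sign"]);
```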

The "user safety" concerns seem like they should be solved by the user-agent, perhaps by requiring user interaction and/or consent to access functionality that might be abused to track users. Safari/WebKit does this, for example. It's just not the web's problem (whatever that means, because as pointed out native apps use "the web" too, so we're really conflating concerns here) that "good cryptography implies maintaining a stable identifier". That's just fundamental to good security. Let applications and users decide whether and how to deploy good cryptographic technology rather than telling people it's off limits because somebody might be able to track them. In reality, users will just download the app because they can use a face scan to log in to their bank, and forget about the password-laden browser version of the experience. And I think that's what we've seen happen. There's no reason that web apps and browser extensions should not be granted access to the proposed functionality. Advertising companies have really abused and traumatized everyone, haven't they?

@stalek71

stalek71 commented May 30, 2022

It's 2022 and browsers still lack hardware backed security. What if somebody wants to perform an operation like "Sign my vote" using a Yubico NFC key on a phone? It's really a pity and a shame that security is not priority No. 1 in browsers...

@twiss
Member

twiss commented Jun 14, 2022

@stalek71 Just saying "it's 2022" doesn't really help the process move forward, given that nobody has made a concrete proposal yet.


That being said, I want to try to carve out a feature subset that might be useful and explicitly tries to avoid the "supercookie" issue, by considering only generating (and perhaps importing) hardware-backed keys and using those for cryptographic operations, and declaring the use of pre-existing hardware keys out of scope.

One way that could work (sketched in code after this list) is:

  • Define a generateHardwareKey(algorithm, keyUsages) function, which is similar to generateKey but extractable is always false, and the key is stored on an HSM (or Trusted Platform Module, Secure Enclave, or similar)
  • (Maybe importHardwareKey as well?)
  • Explicitly don't offer a way to query keys on the HSM
  • Allow storing the key in IndexedDB instead (same as normal Web Crypto keys), which stores a reference to the HSM key
  • The browser is allowed to remove data in IndexedDB (same as always), and remove the HSM key
  • If the HSM is removed or unavailable while the key object is still in IndexedDB, raise an OperationError on usage of the key
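
A usage sketch of the above, purely hypothetical — generateHardwareKey doesn't exist today, and idbPut/idbGet stand in for trivial IndexedDB wrappers:

```js
// Generate a non-extractable, HSM-resident signing key (hypothetical API).
const keyPair = await crypto.subtle.generateHardwareKey(
  { name: "ECDSA", namedCurve: "P-256" },
  ["sign", "verify"]
);

// Store it like any other CryptoKey; only a reference to the HSM key lands
// in IndexedDB. The browser may clear it along with other site data.
await idbPut("keys", "signing-key", keyPair);

// Later: load the reference and use it. If the HSM key is gone, sign()
// rejects with an OperationError.
const restored = await idbGet("keys", "signing-key");
const signature = await crypto.subtle.sign(
  { name: "ECDSA", hash: "SHA-256" },
  restored.privateKey,
  new TextEncoder().encode("hello")
);
```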

@sleevi That should solve the supercookie concern, right? Or do you see other issues?


FWIW, in theory a browser could already implement generateKey(algorithm, extractable = false, keyUsages) in this way (especially if it's reasonably confident that the HSM won't go away). So the main reason for defining a separate generateHardwareKey is if the application wants to be certain that the key is hardware-backed. However, I would expect that for most applications, if generateHardwareKey is not available or throws, they would fall back to a normal generateKey invocation, anyway - so perhaps a separate function isn't needed.

The other advantage would be that if the HSM is separate from the device, and might go away at any time, having a hint from the application that it's OK if the key goes away might be valuable. Thus, another possibility might be to add another parameter to generateKey (and possibly importKey) for that, e.g. generateKey(algorithm, extractable = false, keyUsages, preferHardwareBacked = true). That way, it's backwards compatible, and it's clear that it's just a hint to the browser, instead of a requirement to generate a hardware-backed key (or not). And for the applications that really need to know, we could expose a hardwareBacked property on the generated key.

But - on platforms where an HSM is always available, perhaps that again isn't necessary, and the browser could just always store non-extractable keys on the HSM, if it wants to.
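
For illustration, the backwards-compatible variant might look like this from application code. The extra argument and the hardwareBacked property are hypothetical; existing browsers would simply ignore the extra argument and return a software-backed key.

```js
const keyPair = await crypto.subtle.generateKey(
  { name: "ECDSA", namedCurve: "P-256" },
  false,                 // extractable
  ["sign", "verify"],
  true                   // preferHardwareBacked (hypothetical hint)
);

if (keyPair.privateKey.hardwareBacked !== true) {  // hypothetical property
  // The application decides whether a software-backed key is acceptable.
  console.warn("No HSM available; key is software-backed");
}
```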

@matthiasgeihs

I think in general it would be great to have hardware-backed security available in the browser.

Regarding generateKey and generateHardwareKey:

  • I think it would be important that API consumers can detect whether their key is stored on a hardware device or just in memory.
  • Also it would be important to detect whether keys are really persisted long-term or whether they may be wiped when the user clears browser storage (see the note after this list).
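
On the persistence point, the closest thing available today is the Storage API's persistence mechanism, which only reduces the risk of automatic eviction; user-initiated clearing still wipes site data either way. A sketch:

```js
// Best effort today: check/request persistence for this origin's storage
// (which would include any IndexedDB-stored key references).
const alreadyPersistent = await navigator.storage.persisted();
const granted = alreadyPersistent || await navigator.storage.persist();
console.log(granted ? "storage marked persistent" : "storage may be evicted");
```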

@irux

irux commented Apr 14, 2023

Hey! Any news about this?

@RByers

RByers commented Oct 2, 2023

I don't have any context on WebCrypto itself, but in general I think this is an important facility the web needs to offer, especially as we crack down on fingerprinting and other types of tracking. To @sleevi's "supercookie" point above, I see two paths for providing hardware-backed keys which could (in theory) fit within the browser privacy threat model:

  1. Higher-level with browser mediation and meaningful user consent. IMHO this is what WebAuthn (and its downstream uses like SPC) provides, as well as emerging ideas like IdentityCredential.

  2. Lower-level without browser mediation, but within the browser privacy model - i.e. tied to 1P site state (where clearing cookies for an origin would also reset/invalidate such a key). This is where I think there may be an opportunity for WebCrypto to expand. There's also some work on Device Bound Session Credentials which could potentially address some use cases here.

To what extent could the use cases being discussed here meaningfully be addressed by the above? E.g. if we could reliably associate a hardware-backed key with an origin and ensure we reset it whenever a user clears cookies for that origin, would that be good enough?

@ptdecker

It's 2022 and browsers still lack hardware backed security. What if somebody wants to perform an operation like "Sign my vote" using a Yubico NFC key on a phone? It's really a pity and a shame that security is not priority No. 1 in browsers...

Marking that it is 2024-04-11 now (an ol' s'kool bump)

@cybercent

(image attachment: IMG_2525)

@nebolax

nebolax commented Jun 5, 2024

+1

@matthiasgeihs

https://w3c.github.io/webauthn/#prf-extension

The PRF extension allows deriving secret keys from the enclave that can be used for other purposes outside the enclave. However, I don’t think this is widely supported yet.
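
For anyone curious what that looks like in practice, here is a minimal sketch. It assumes an existing credential (raw ID in `credentialId`) that was registered with the PRF extension enabled; support still varies by browser and authenticator.

```js
// Ask the authenticator to evaluate its PRF over a fixed salt during the
// assertion; the 32-byte output can seed a symmetric key outside the enclave.
const salt = new TextEncoder().encode("encryption-key-v1");

const assertion = await navigator.credentials.get({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    allowCredentials: [{ type: "public-key", id: credentialId }], // assumed
    extensions: { prf: { eval: { first: salt } } },
  },
});

const prf = assertion.getClientExtensionResults().prf;
if (prf?.results?.first) {
  const aesKey = await crypto.subtle.importKey(
    "raw", prf.results.first, "AES-GCM", false, ["encrypt", "decrypt"]);
  // aesKey can now encrypt/decrypt data bound to this credential.
}
```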

@yawn

yawn commented Jun 6, 2024

Do you know if the presence of this extension (regardless of CTAP) can be tested easily? A cursory look seems to indicate that Chrome and FF support it?

@matthiasgeihs

I think you should be able to test this using a simple web example, as for example described here:
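
Roughly, the probe boils down to requesting the extension at registration time and checking whether the client reports it as enabled (a sketch only; details vary, and this does create a real throwaway credential):

```js
const cred = await navigator.credentials.create({
  publicKey: {
    rp: { name: "prf-probe" },
    user: { id: new Uint8Array(16), name: "probe", displayName: "probe" },
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
    extensions: { prf: {} },
  },
});
const prfSupported = cred.getClientExtensionResults().prf?.enabled === true;
```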

@kdenhartog
Member

Here's an explainer I wrote up on how we might be able to achieve this in Brave (and standardize it further in a WG): https://github.com/brave-experiments/hardware-backed-webcrypto

@matthiasgeihs

@kdenhartog Did you consider including encryption as a use case?

@kdenhartog
Member

Yes, it would be my intention to do so, but I didn't write that at the beginning when I first documented the idea. I still need to add an example of it and have opened an issue to track it. The one limitation I foresee is that it will be limited to what the TPM supports, so that may force certain algorithms (e.g. AES-128 vs AES-256).

@matthiasgeihs

Makes sense. It will still be very useful for encrypting data under a key that is secured by an HSM.
