Eran Hammer-Lahav, the former editor of the OAuth 2.0 specification, announced that he would no longer serve as editor of the standard in a harshly critical blog post entitled OAuth 2.0 and the Road to Hell, in which he made a number of key criticisms of the specification, the meat of which is excerpted below:

Last month I reached the painful conclusion that I can no longer be associated with the OAuth 2.0 standard. I resigned my role as lead author and editor, withdrew my name from the specification, and left the working group. Removing my name from a document I have painstakingly labored over for three years and over two dozen drafts was not easy. Deciding to move on from an effort I have led for over five years was agonizing.

There wasn’t a single problem or incident I can point to in order to explain such an extreme move. This is a case of death by a thousand cuts, and as the work was winding down, I’ve found myself reflecting more and more on what we actually accomplished. At the end, I reached the conclusion that OAuth 2.0 is a bad protocol. WS-* bad. It is bad enough that I no longer want to be associated with it. It is the biggest professional disappointment of my career.

All the hard fought compromises on the mailing list, in meetings, in special design committees, and in back channels resulted in a specification that fails to deliver its two main goals – security and interoperability. In fact, one of the compromises was to rename it from a protocol to a framework, and another to add a disclaimer that warns that the specification is unlikely to produce interoperable implementations.

When compared with OAuth 1.0, the 2.0 specification is more complex, less interoperable, less useful, more incomplete, and most importantly, less secure.

To be clear, OAuth 2.0 at the hands of a developer with deep understanding of web security will likely result in a secure implementation. However, at the hands of most developers – as has been the experience from the past two years – 2.0 is likely to produce insecure implementations.

Given that I’ve been professionally associated with OAuth 2.0 over the past few years, from using OAuth 2.0 as the auth method for the SkyDrive APIs to acting as an advisor for the native support of OAuth 2.0 style protocols in the Web Authentication Broker in Windows 8, I thought it would be useful to provide some perspective, as an implementer and user of the protocol, on what Eran has written.

The Good: Easier to work with than OAuth 1.0

I’ve been a big fan of web technologies for a fairly long time. The great thing about the web is that it is the ultimate distributed system: you cannot make assumptions about any of the clients accessing your service, as people have tended to do in the enterprisey world of the past. This encourages technologies to be as simple as possible to reduce sources of friction, which has led to the rise of drop dead simple protocols like HTTP and data formats like JSON.

One of the big challenges with OAuth 1.0 is that it pushed a fairly complex and fragile set of logic onto app developers who were working with the protocol. This blog post from the Twitter platform team on the most complicated feature in their API bears this out:

Ask a developer what the most complicated part of working with the Twitter API is, and there's a very good chance that they'll say OAuth. Anyone who has ever written code to calculate a request signature understands that there are several precise steps, each of which must be executed perfectly, in order to come up with the correct value.

One of the points of our acting on your feedback post was that we were looking for ways to improve the OAuth experience.

Given that there were over 750,000 registered Twitter developers last year, this is a lot of pain to spread out across their ecosystem. OAuth 2.0 greatly simplifies the interaction model between clients and servers by eliminating the requirement to use signed request signatures as part of the authentication and authorization process.
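To make the difference concrete, here is a minimal sketch (in Python, with illustrative parameter values; the URL and secrets are made up) of the OAuth 1.0-style HMAC-SHA1 signing dance the Twitter team is describing, next to the OAuth 2.0 bearer-token equivalent. Every encoding and sorting step in the 1.0 version must be executed perfectly or the server rejects the request:

```python
import base64
import hashlib
import hmac
import urllib.parse

def oauth1_signature(method, url, params, consumer_secret, token_secret):
    """OAuth 1.0-style HMAC-SHA1 request signing: percent-encode every
    parameter, sort them, build the signature base string, then HMAC it
    with a key derived from the two secrets."""
    encoded = sorted(
        (urllib.parse.quote(k, safe=""), urllib.parse.quote(str(v), safe=""))
        for k, v in params.items()
    )
    param_string = "&".join(f"{k}={v}" for k, v in encoded)
    base_string = "&".join(
        urllib.parse.quote(part, safe="")
        for part in (method.upper(), url, param_string)
    )
    signing_key = (
        urllib.parse.quote(consumer_secret, safe="")
        + "&"
        + urllib.parse.quote(token_secret, safe="")
    )
    digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def oauth2_header(access_token):
    """The OAuth 2.0 bearer-token equivalent: one header, no signing."""
    return {"Authorization": f"Bearer {access_token}"}
```

The asymmetry is the whole point: in the 2.0 model all of the per-request cryptographic work disappears, which is exactly why it is easier on developers (and, as discussed below, also why it leans so heavily on TLS).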


The Bad: It’s a framework not a protocol

The latest draft of the OAuth 2.0 specification has the following disclaimer about interoperability:

OAuth 2.0 provides a rich authorization framework with well-defined security properties.  However, as a rich and highly extensible framework with many optional components, on its own, this specification is likely to produce a wide range of non-interoperable implementations.

In addition, this specification leaves a few required components partially or fully undefined (e.g. client registration, authorization server capabilities, endpoint discovery).  Without these components, clients must be manually and specifically configured against a specific authorization server and resource server in order to interoperate.

What this means in practice for developers is that learning how one OAuth 2.0 implementation works is unlikely to help you figure out how another compliant one behaves, given the degree of latitude implementers have. Thus the likelihood that you can take the authentication/authorization code you wrote with a standard library like DotNetOpenAuth against one OAuth 2.0 implementation, point it at a different one by changing only a few URLs, and expect things to work is extremely low.
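To illustrate the kind of variance in play, here is a sketch of the per-provider quirks a client library ends up hard-coding. The provider names, endpoints, and quirk choices below are hypothetical, but they mirror real differences seen across implementations (for example, some early providers returned form-encoded token responses where others returned JSON):

```python
import json
import urllib.parse

# Hypothetical per-provider configuration; every field here is something
# the spec leaves open, so a "compliant" client must encode it by hand.
PROVIDERS = {
    "provider_a": {
        "auth_endpoint": "https://a.example/oauth/authorize",
        "token_endpoint": "https://a.example/oauth/token",
        "token_response": "json",          # JSON body
        "token_transport": "bearer_header",
    },
    "provider_b": {
        "auth_endpoint": "https://b.example/dialog/oauth",
        "token_endpoint": "https://b.example/oauth/access_token",
        "token_response": "form_encoded",  # access_token=...&expires=...
        "token_transport": "query_param",
    },
}

def parse_token_response(provider, body):
    """Even parsing the token exchange response is provider-specific:
    the same logical step needs different code per implementation."""
    if PROVIDERS[provider]["token_response"] == "json":
        return json.loads(body)["access_token"]
    return urllib.parse.parse_qs(body)["access_token"][0]
```

Changing a few URLs is the easy part; it is differences like the response format and token transport that break the "just repoint it" approach.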

In practice I expect this to not be as problematic as it sounds on paper, simply because at the end of the day authentication and authorization are a small part of any API story. In general, most people will still get the Facebook SDK, Live SDK, Google Drive SDK, etc. for their target platform to build their apps, and it is never going to be true that those are portable between services. For services that don’t provide multiple SDKs, the rest of the APIs will still be so different that having to change auth code will not be a big deal to the developer.

That said, it is unfortunate that one cannot count on a degree of predictability across OAuth 2.0 implementations.

The Ugly: Making the right choices is left as an exercise for the reader

The biggest whammy in the OAuth 2.0 specification, which Eran implies is the reason he decided to quit, is hinted at at the end of the aforementioned disclaimer:

This framework was designed with the clear expectation that future work will define prescriptive profiles and extensions necessary to achieve full web-scale interoperability.

This implies that there are a bunch of best practices for utilizing a subset of the protocol (i.e. prescriptive profiles) that are yet to be defined. As Eran said in his post, here is a list of places where there are no guidelines in the spec:

  • No required token type
  • No agreement on the goals of an HMAC-enabled token type
  • No requirement to implement token expiration
  • No guidance on token string size, or any value for that matter
  • No strict requirement for registration
  • Loose client type definition
  • Lack of clear client security properties
  • No required grant types
  • No guidance on the suitability or applicability of grant types
  • No useful support for native applications (but lots of lip service)
  • No required client authentication method
  • No limits on extensions

There are a number of places where it would be a bad idea for an implementer to skip a feature, such as token expiration, without considering the security implications. In my day job, I’ve also been bitten by the lack of guidance on token string sizes, with some of our partners making assumptions about token size that later turned out to be inaccurate, which led to scrambling on both sides.

My advice for people considering implementing OAuth 2.0 on their service is to ensure there is a security review of whatever subset of the features you are implementing before deploying the service at large. If you can’t afford or don’t have security people on staff, then at minimum I’d recommend picking one of the big guys (e.g. Google, Facebook or Microsoft) and implementing the same features they have. They have people on staff whose job is to figure out which combination of OAuth 2.0 features is secure, which beats picking and choosing without a frame of reference.
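One concrete guardrail such a security review should catch, given that bearer tokens are effectively plaintext credentials, is refusing to accept them at all on non-TLS requests. A minimal sketch (the function and its parameters are illustrative, not from any particular framework):

```python
def extract_bearer_token(request_is_https, authorization_header):
    """Accept a bearer token only when the request arrived over TLS.
    Without per-request signatures, TLS is the only thing standing
    between the token and anyone sniffing the wire."""
    if not request_is_https:
        raise ValueError("bearer tokens must only be accepted over HTTPS")
    scheme, _, token = authorization_header.partition(" ")
    if scheme != "Bearer" or not token:
        raise ValueError("malformed Authorization header")
    return token
```

This is the exact failure mode one of the commenters below describes: bearer tokens feel easy precisely because nothing breaks when you forget the TLS requirement, until someone captures a token.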

Now Playing: Notorious B.I.G. - You're Nobody Till Somebody Kills You

Tuesday, 31 July 2012 14:32:49 (GMT Daylight Time, UTC+01:00)
>> The Good: Easier to work with than OAuth 1.0

I'd argue that this is actually a bad feature. In an attempt to make it easy, there have been so many security holes left that a Hummer can be driven through them. OAuth 2.0 as currently designed is a recipe for disaster. I've seen way too many sites that think "hey, bearer tokens are cool, no sigs to worry about," and then they develop an API that does not require SSL/TLS. Just because something is difficult does not mean it's not worth doing... sometimes it's difficult for a reason.
Alex
Tuesday, 31 July 2012 18:55:17 (GMT Daylight Time, UTC+01:00)
As someone who wrote an implementation of an oAuth 1.0a consumer in native PowerShell, I can say that oAuth in general is a royal pain in the ass to work with. oAuth 2.0 is much easier, but most likely less secure as a result.

Oh, and did I mention how much it SUCKS that you have to invoke a web browser, just to obtain an access token? This is absolutely horrible for the end user experience, for Windows desktop apps, native Android apps, and automation scripts (eg. PowerShell).

I think the whole oAuth project should be abandoned and restarted from scratch. I'd rather just plug in my username and password, and be forced to "trust" each app that I choose to input it to, since you have to with oAuth ANYWAY.
Wednesday, 01 August 2012 04:03:59 (GMT Daylight Time, UTC+01:00)
Excellent commentary. Can't agree more with your Bad and the Ugly.
Wednesday, 01 August 2012 12:31:02 (GMT Daylight Time, UTC+01:00)
I also agree that oauth2 is fine http://homakov.blogspot.com/2012/08/saferweb-oauth2a-or-lets-just-fix-it.html
Saturday, 04 August 2012 19:07:07 (GMT Daylight Time, UTC+01:00)
I think oAuth2 is great but i did get worried when i read "If you can’t afford or don’t have security people on staff ..."

That pretty much accounts for the target market of this protocol - the big guys have always gone their own way anyway in most cases (which then becomes a "standard").

I think if this is the case then priority #1 for the oAuth group (and all associated) is to create an official secure pattern combination rather than relying on folk to shop around to try and uncover what other people did.
Monday, 06 August 2012 06:46:08 (GMT Daylight Time, UTC+01:00)
Sometimes when I can't sleep I read your blog. Works like a charm!
Tony
Monday, 06 August 2012 21:53:39 (GMT Daylight Time, UTC+01:00)
"a bad protocol. WS-* bad."

Damn, that's some strong language he's using there.
The Claw
Tuesday, 07 August 2012 15:07:01 (GMT Daylight Time, UTC+01:00)
I agree that oauth2 is fine but Can't agree more with your Bad and the Ugly