Written by Brett Henderson

The previous blog, Entitlements In Context, provided some context on how IAM (Identity and Access Management) fits into typical DigIO solutions.

It made some mention of the OAuth 2.0 standards without going into much detail. These standards are an important piece of the access management puzzle. They don’t provide all the answers, but they do provide a useful set of building blocks to base solutions on. This article will call out a number of these standards, describe how they have evolved, and highlight some key areas you should be aware of. Hopefully you’ll find it useful.

Let’s dive in.

Standards

Standards have evolved somewhat organically over the years, and understanding their evolution helps to identify which parts are most relevant and which should no longer be referenced. Let’s walk through them in order of appearance.

1999

  • RFC-2246: The TLS Protocol
  • RFC-2459: Internet X.509 Public Key Infrastructure
  • RFC-2616: Hypertext Transfer Protocol — HTTP/1.1

A number of relevant standards were published in 1999, including HTTP/1.1, X.509 PKI, and TLS. These still underpin a large part of the Internet we see today and form a foundation that subsequent standards build on. TLS and X.509 provide the mechanisms that let us be sure we’re talking to the right server (preventing impersonation) and that nobody is eavesdropping on the conversation.

TL;DR HTTPS/TLS is a secure foundation upon which today’s Internet is built.

2010

  • RFC-5849: The OAuth 1.0 Protocol

As the interactions between users and organisations became more complicated, there was a desire for a standard that would make it easier to secure delegated access without sharing passwords with third parties. In other words, we often want to allow a system to act on our behalf with another system and with a limited set of privileges. OAuth 1.0 was the first standard that attempted to address this. It was a transport-independent protocol that did not rely on HTTPS/TLS. While it was generally secure, it was difficult to implement due to its cryptographic underpinnings and was not widely adopted.

TL;DR OAuth 1.0 is no longer used.

2012

  • RFC-6749: The OAuth 2.0 Authorization Framework (website)
  • RFC-6750: The OAuth 2.0 Authorization Framework: Bearer Token Usage

Based on the learnings of OAuth 1.0, OAuth 2.0 took a very different approach, delegating much of its security to HTTPS/TLS. It has ultimately become the most popular standard for Internet-based authorisation today.

It uses token-based authorisation, where clients include access tokens on all requests to resource servers. There are multiple flows (a.k.a. grant types) that may be used to obtain access tokens from authorisation servers. Each is intended for a different purpose, and it’s important to know how to choose between them. The flows are:

  • Authorization Code: This flow allows a “confidential” client (e.g. a server-side component) to obtain a token with a user’s permission.
  • Implicit: This flow allows a “public” client (e.g. a single page app or mobile client) to obtain a token with a user’s permission. At the time of publication this flow was known to be relatively weak from a security perspective, but it was included due to the lack of better alternatives. We’ll get to those alternatives soon.
  • Client Credentials: This flow allows a client system to obtain a token without user involvement. In other words, it’s used for system to system connectivity (a token request for this flow is sketched below this list).
  • Resource Owner Password Credentials: This is a simple mechanism that allows a client to directly request a token with a user’s username and password. In short, this shouldn’t be used.
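
To make the token-request shape concrete, here is a minimal sketch of the Client Credentials flow in TypeScript, assuming a Node 18+ runtime with global fetch. The token endpoint, client identifiers and scope are hypothetical placeholders, and a client secret is shown only because it is what RFC-6749 itself describes; the 2021 section below covers a stronger client authentication option.

    // Minimal Client Credentials grant (RFC-6749 section 4.4) sketch.
    // All identifiers and URLs below are hypothetical placeholders.
    const tokenEndpoint = 'https://auth.example.com/oauth2/token';

    async function getClientCredentialsToken(): Promise<string> {
      const response = await fetch(tokenEndpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({
          grant_type: 'client_credentials',
          client_id: 'my-client-id',         // placeholder
          client_secret: 'my-client-secret', // see the 2021 section for a stronger option
          scope: 'orders:read',              // placeholder scope
        }),
      });
      if (!response.ok) {
        throw new Error(`Token request failed: ${response.status}`);
      }
      const { access_token } = await response.json();
      return access_token as string;
    }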

Another interesting point to note is that the JWT specification did not yet exist at the time of publication, and OAuth 2.0 does not specify an access token format. So while we often use JWT-based access tokens in practice, clients must treat them as opaque. Only resource servers may make assumptions about the token format.
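
That opacity is easy to respect in practice: the client simply attaches the access token as a bearer credential (RFC-6750) and never parses it. A minimal sketch, assuming the same Node 18+ runtime and a hypothetical resource server URL:

    // RFC-6750 bearer token usage sketch: the client forwards the token verbatim.
    async function getOrders(accessToken: string): Promise<unknown> {
      const response = await fetch('https://api.example.com/orders', { // placeholder URL
        headers: { Authorization: `Bearer ${accessToken}` },
      });
      if (!response.ok) {
        throw new Error(`Request rejected: ${response.status}`);
      }
      return response.json(); // the client never inspects the token contents
    }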

TL;DR OAuth 2.0 supports delegated authorisation. Use Authorization Code flow whenever possible for user-based authentication, and use the Client Credentials flow for all system to system connectivity. Read on for alternatives to the Implicit flow for public clients.

2014

  • OIDC: OpenID Connect

OAuth 2.0 supports authorisation of requests to API endpoints but does not provide the client with details of the user identity. Remember from above that the access token itself is opaque and mustn’t be interpreted by the client.

OpenID Connect extends OAuth 2.0 by defining an additional token called the identity token. The identity token is a JWT containing the user’s identity. At the time of OpenID Connect’s publication the JWT specification was not final, but it was available as a sufficiently complete draft.
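
As a sketch of how a client might read the identity token, the snippet below uses the jose npm package. The issuer, JWKS URL and client ID are hypothetical placeholders; in a real integration the JWKS location comes from the provider’s discovery metadata, and claims such as the nonce should also be checked.

    import { createRemoteJWKSet, jwtVerify } from 'jose';

    // Verify an OpenID Connect identity token and read the user's identity (sketch).
    // The issuer and client ID are hypothetical placeholders.
    const issuer = 'https://auth.example.com';
    const jwks = createRemoteJWKSet(new URL(`${issuer}/.well-known/jwks.json`));

    async function readIdentity(idToken: string) {
      const { payload } = await jwtVerify(idToken, jwks, {
        issuer,                   // must match the expected identity provider
        audience: 'my-client-id', // must match this client's registered ID
      });
      return { subject: payload.sub, email: payload.email };
    }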

TL;DR OpenID Connect provides clients with details of the user’s identity.

2015 (May)

  • RFC-7519: JSON Web Token (JWT)

JWT defines a token format that allows a JSON payload to be signed and/or encrypted, and encoded in a manner that is URL-safe. In the context of OAuth 2.0 and OpenID Connect, tokens are always signed but are not usually encrypted.
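
To make the format concrete, a signed JWT is three base64url-encoded segments (header, payload, signature) joined by dots. The sketch below only decodes the first two segments for inspection; it deliberately does not verify the signature, which real code must always do before trusting any claim.

    // Decode the structure of a signed JWT (JWS compact serialisation) for inspection only.
    // This does NOT verify the signature; never trust claims read this way.
    function inspectJwt(token: string): { header: unknown; payload: unknown } {
      const [header, payload] = token
        .split('.')
        .slice(0, 2)
        .map((part) => JSON.parse(Buffer.from(part, 'base64url').toString('utf8')));
      return { header, payload }; // e.g. header.alg identifies the signing algorithm
    }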

TL;DR JWT is used for all identity tokens and most access tokens.

2015 (September)

  • RFC-7636: Proof Key for Code Exchange by OAuth Public Clients

Proof Key for Code Exchange is an enhancement to the OAuth 2.0 Authorization Code flow that makes it suitable for public clients and is much more secure than the older Implicit flow. This technique is usually abbreviated as PKCE and pronounced “pixy”. The full flow name is thus Authorization Code with PKCE.
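
A rough sketch of the pieces a public client generates before starting the flow, assuming a Node-style runtime; the endpoint, client ID and redirect URI are hypothetical placeholders. The challenge travels on the authorisation request, while the verifier stays with the client and is sent later with the token request.

    import { randomBytes, createHash } from 'node:crypto';

    // PKCE (RFC-7636) sketch: generate a code verifier and its S256 challenge.
    const codeVerifier = randomBytes(32).toString('base64url');
    const codeChallenge = createHash('sha256').update(codeVerifier).digest('base64url');

    // Build the authorisation request URL. All identifiers are placeholders.
    const authorizeUrl = new URL('https://auth.example.com/oauth2/authorize');
    authorizeUrl.search = new URLSearchParams({
      response_type: 'code',
      client_id: 'my-public-client',
      redirect_uri: 'https://app.example.com/callback',
      scope: 'openid profile',
      code_challenge: codeChallenge,
      code_challenge_method: 'S256',
      state: randomBytes(16).toString('base64url'),
    }).toString();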

TL;DR Public clients should use the Authorization Code with PKCE flow.

2017

  • RFC-8252: OAuth 2.0 for Native Apps

At the time this standard was released it was common for mobile applications to embed login screens within the application, typically as an embedded web view. This approach has a number of issues: the application can access the user’s credentials, and existing identity provider sessions are not recognised, which prevents SSO (single sign-on) from working. This standard makes it explicit that the native browser must be used for authentication flows.

TL;DR Mobile apps must use the native browser for authentication flows. Embedded web views are not acceptable.

2021

  • FAPI: Financial-grade API Security Profile 1.0 – Part 2: Advanced

With the advent of Open Banking initiatives around the world, the Financial-grade API Security Profile defines OAuth 2.0 and OpenID Connect implementation guidelines that meet the security requirements of banking APIs exposing sensitive consumer data.

The standards emphasise stronger mechanisms for Client Authentication. Client Authentication is critical to both the Authorization Code flow used for delegated user authorisation, and the Client Credentials flow used for system to system authorisation. Specifically, authenticating with a shared Client Secret must not be used.

The private_key_jwt client authentication mechanism is now my preferred choice. It was originally defined by the OpenID Connect standard but was not well known or supported until recently. It provides a number of benefits (a short sketch of the mechanism follows the list below), including:

  • Secure secret distribution is eliminated. Clients can share their public key with the authorisation server over insecure channels without compromise, so long as the key is verified as untampered (e.g. by confirming a key fingerprint over a separate channel).
  • Credential rotation can be performed without an outage. Multiple public keys can be registered with the authorisation server, allowing two active keys to co-exist while the client rotates its active private key.
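
As flagged above, here is a minimal sketch of private_key_jwt client authentication for the Client Credentials flow, using the jose npm package. The private key loading, token endpoint, client ID and scope are hypothetical placeholders, and the signing algorithm is just one reasonable choice.

    import { SignJWT, importPKCS8 } from 'jose';
    import { randomUUID } from 'node:crypto';

    // private_key_jwt client authentication sketch (defined by OpenID Connect).
    // privateKeyPem, the endpoint, client ID and scope are hypothetical placeholders.
    async function getTokenWithPrivateKeyJwt(privateKeyPem: string): Promise<string> {
      const tokenEndpoint = 'https://auth.example.com/oauth2/token';
      const clientId = 'my-client-id';
      const key = await importPKCS8(privateKeyPem, 'PS256');

      // Short-lived, single-use assertion signed with the client's private key.
      const clientAssertion = await new SignJWT({})
        .setProtectedHeader({ alg: 'PS256' })
        .setIssuer(clientId)
        .setSubject(clientId)
        .setAudience(tokenEndpoint)
        .setIssuedAt()
        .setExpirationTime('5m')
        .setJti(randomUUID())
        .sign(key);

      const response = await fetch(tokenEndpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({
          grant_type: 'client_credentials',
          client_id: clientId,
          client_assertion_type: 'urn:ietf:params:oauth:client-assertion-type:jwt-bearer',
          client_assertion: clientAssertion,
          scope: 'orders:read', // placeholder scope
        }),
      });
      const { access_token } = await response.json();
      return access_token as string;
    }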

TL;DR Use private_key_jwt for client authentication in both the Client Credentials flow and the Authorization Code flow. Do not use a client secret.

2022

  • CDR: Australian Consumer Data Standards – Security Profile
  • OAuth 2.1: The OAuth 2.1 Authorization Framework (website)

I mention these standards because they represent the latest advice and industry momentum. For the most part they are aligned with and refer to standards that I’ve already mentioned above.

The Australian Consumer Data Standards currently include the Open Banking and Open Energy APIs and will likely expand to include other domains such as telecommunications. Any solution that makes use of these APIs will need to be compliant with these standards. In addition, we are already seeing organisations use these standards as the basis for their own solution designs.

The OAuth 2.1 standard is currently in draft. As we have seen, OAuth 2.0 forms the foundation for a number of more recent standards, but many of these augment or replace aspects of the original. OAuth 2.1 consolidates and simplifies the most commonly used features of OAuth 2.0.

TL;DR CDR and OAuth 2.1 standards provide consolidated guidance on how to securely implement OAuth 2.0.

Conclusion

In this article we’ve discussed:

  • The key standards that define the OAuth 2.0 ecosystem.
  • The evolution of these standards and how advice has shifted and become more specific and secure over time.
  • Some key features from each standard to help understand what it contains and why it is important.
  • How the latest standards consolidate the OAuth 2.0 learnings and advice of the last 10 years.

In the next blog we’ll explore an area that is largely unaddressed by the standards: the implementation of fine-grained resource authorisation checks.