The Many Faces of Authentication
A White Paper

August 3, 2021 / Stefan H. Farr

Contents

  1. Based on what is being used
  2. Based on how it is performed
  3. Based on the number of samples
  4. Based on where it is performed
  5. Based on "Who" or "What" we authenticate
  6. Closing note

The concept of identity is fundamental to every human-inhabited environment. We are not "The Borg"; we participate in society as individuals and we contribute to the whole, both as consumers and as creators, with time, effort and ideas. Beyond the philosophical debate of "who am I", from a practical standpoint identity is nothing more than a reference to which all these deeds and debts can be uniquely attributed.

Authentication is simply the process of establishing that identity, for whatever purpose, but generally we associate the process with security- or privacy-related topics: access to certain information, resources or privileges within a certain circle, decision power, responsibility and such. Authentication takes many forms, and it is such a pervasive process that we often don't even realize we are doing it. The moment you hear a sound or see a face, your brain automatically assigns an identity to its source. Whenever you place a key in the hole and turn it, you are authenticating yourself to the barriers put in place to restrict access to that room. It is often when you don't recognize something, when you are unable to establish identity, that you realize something is out of place. This is because we are extremely good at establishing identity: we take and aggregate every piece of information and swiftly draw a conclusion, yes or no.

In cyberspace, authentication has a very special and important role. This is an environment where all things are very, very close (topologically speaking). Unlike in the real world, one can be anywhere within seconds. There are a lot of actors and a lot of noise, we have poor sensors in cyberspace, and information is scarce and unreliable. To complicate matters further, identity information needs to traverse the barriers between reality and cyberspace, and all of this requires specialized processes that are meant to ensure the quality of the authentication act.

What makes authentication secure, or insecure? How do things connect in cyberspace and then back into the real world? As with every complex environment, outcomes do not depend only on the components but also on the connections between components and on the system as a whole. In this paper, I am going to perform a short analysis of these processes according to certain distinctive aspects that characterize them. Hopefully, by the end of it we will all have a better understanding of this process which, at the end of the day, we call online authentication.

1

Based on what is being used

There are many elements that can be used to identify people over the internet, but they all fall into three categories: something you know, something you have and something you are. If you have heard of multi-factor authentication you have probably also heard of these categories, and if you have ever authenticated online, you have most likely used at least some, if not all, of them.

Each of these types has benefits and drawbacks. When it comes to authentication, there is no one-size-fits-all; different circumstances require different approaches, or even multiple approaches put together in such a way that one eliminates the shortcomings of the other, in order to maximize the accuracy and efficiency of the authentication process.

Something you know

Authentication elements in this category are based on unique knowledge: knowledge that is exclusive to the identified entity, to such a degree that nothing else in this world can know or guess it, hence the proof of identity. If that knowledge ceased to be exclusive and could be correlated with both the authenticator and the authenticated, the assumption of identity attribution would be annulled.

The most representative elements of this category are the password and its derivatives, PINs and passphrases. When used properly, passwords can be extremely secure; they are impossible to reach while stored only in your brain. That said, they are also easy to forget, so they have a reliability problem. If you need to remember many passwords and change them often, this problem escalates radically, and if you fall into the trap of writing them down somewhere, you practically lose all the benefits they have: essentially they no longer fall into the category of what you know.

Something you have

Another category that is also common in everyday life is the possession of some special, totally unique object that can in turn make you unique. The most common such object is a key. Keys are great because they can easily hold complex information that is very hard to guess or forge, whether they are physical or electronic. They do, however, have the disadvantage that they are not with you by default. You need to carry them around, otherwise you lose access. You need to protect them, and they are relatively easy to misplace. If that were to happen, or if you failed to keep them continuously secure, they could be replicated without your knowledge. At that stage, the entire complexity and power of the key is undermined.

Something you are

Biometric information offers the strongest proof that you are actually the person present at the device. It is also very easy to carry around, because it consists of recognizable features of you: fingerprint, face, retina, iris, heat pattern, and so on. The problem with biometric information is that it is extremely sensitive. It cannot be changed the way passwords can. If biometric data is obtained by a malicious entity, it can be used repeatedly to impersonate the owner, with no possibility of remediating the problem. By this I am not necessarily referring to the possibility that somebody will find your fingerprint somewhere, although that may be a problem too, but rather to the fact that somebody may get hold of the digitized version of your biometric information, which they can then use to "replay" your authentication (essentially impersonating you).

It is fundamentally important to use biometric authentication in conjunction with other authentication mechanisms that can make use of it without actually having to exchange the biometric data per se, a topic that brings us to another major differentiator of the authentication process: whether identity is proven at a distance (without sending the secret) or by actually sending the data.

2

Based on how it is performed

There is a lot of mystery surrounding Public Key Infrastructures (PKI) and the underlying technology, public key cryptography. Mathematically, this is a very complex problem, but we don't need to understand the math behind it in order to understand the logical flow and the benefits it brings relative to other mechanisms of authentication. There are other cool features PKI offers, like creating chains of proof, but in this short brief I will focus strictly on this particular aspect of the technology, which is otherwise a powerful ecosystem of identity and trust.

The "Public Key" expression in the name comes from the fact that the building block of a PKI construct consists of two keys rather than one, which is what we are used to with most things that are locked. The two keys are mathematically connected in such a way that if one key encrypts a piece of data, the only thing in this world that can decrypt it is the matching key of the pair. Because of this, you will often hear the explanation that "one key locks while the other key unlocks", which is true, but it fails to capture one of the most magical aspects of public key cryptography: like quantum entanglement or remote sensing, you can use a key at a distance, without having to send it.
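To make the pairing concrete, here is a minimal sketch of the "one key locks, the other unlocks" property. It assumes the third-party Python cryptography package and RSA keys purely for illustration; nothing in this paper prescribes that particular library or algorithm.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Generate a mathematically linked pair of keys.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"a piece of data"
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Anyone holding the public key can "lock" (encrypt) the data...
    ciphertext = public_key.encrypt(message, oaep)

    # ...but only the holder of the matching private key can unlock it.
    assert private_key.decrypt(ciphertext, oaep) == message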

We hate passwords and we love them. We hate changing them and trying to figure out new ones every time, and we are not keen on typing them at every login either, but when it comes to using more complex mechanisms, with potentially added friction, we often prefer falling back to the good old password. It is important for business to move smoothly, so we often choose the risk over the friction. But the problem is not with the password itself; it is with the way passwords are used today, the way pretty much every site authenticates:

Exchanging the key

To prove your identity at the client end, browsers will send your pre-established challenge (a secret) to the server, which then compares it with what it has on record; if they match, you are in, otherwise access is denied. This, however, means you are actually sending "your secret" to the other party every single time you authenticate, and frankly there are too many things that can go wrong for something not to go wrong with this model. For this model to work:

  1. the secret has to be protected in transit, every single time it is sent
  2. the service has to store and handle the secret securely, indefinitely
  3. the secret must never leak, be guessed, or be reused with another, potentially less careful, service

We don't need any deep analysis to realize that these requirements are simply never going to be fulfilled. We use all sorts of hacks to compensate for these inadequacies, like sending two different passwords via different channels, but the truth is that the model itself is simply wrong. Secrets are not meant to be shared with untrusted parties. Imagine using the same model with something like biometrics, something you cannot change in case it is stolen. It would be a fatal mistake.
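To illustrate the model (not to endorse it), here is a minimal sketch of shared-secret verification on the service side, assuming the service at least stores a salted hash rather than the plain secret; the function names and parameters are purely illustrative. Even in this best case, the plain secret still travels to the service on every login.

    import hashlib
    import secrets

    def enroll(password: str) -> dict:
        # The service keeps a salt and a salted hash of the shared secret.
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return {"salt": salt, "digest": digest}

    def verify(record: dict, presented_password: str) -> bool:
        # The client has just sent the secret itself; the service re-derives
        # the hash and compares it, in constant time, with what it has on file.
        candidate = hashlib.pbkdf2_hmac(
            "sha256", presented_password.encode(), record["salt"], 600_000
        )
        return secrets.compare_digest(candidate, record["digest"])

    record = enroll("correct horse battery staple")
    assert verify(record, "correct horse battery staple")
    assert not verify(record, "guess")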

Proving ownership

Going back to PKI now, instead of looking at it the mainstream way (one key locks and the other opens), imagine the pair of keys such that presenting a token which can be unlocked by one of the keys is solid proof that you possess the other key. Technically it is the same thing, but conceptually it is very different.

Let's say you have such a pair of keys, and one of them (which we label as public) you share with the world, among others with an online service provider, who associates that public key with your account on that service. Whenever you visit, the provider will ask you to encrypt a random challenge and send back the encrypted version. If what you send back matches the challenge after it has been decrypted with your public key, the service provider knows you have the private key. All through this time, the private key has never left your allegorical pocket: you have not shown it to anybody, you haven't sent it anywhere, you have simply proved you have it, remotely, just like a psychic, only using mathematics as evidence.
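The same flow can be sketched with digital signatures, which is how this idea is typically realized in practice. The sketch below is an assumption-laden illustration: it uses the third-party Python cryptography package and Ed25519 keys, neither of which is mandated by anything in this paper.

    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # On your device: generate the pair once; the private key never leaves it.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()   # this is the part you share

    # On the service: it stored your public key at registration time and now
    # issues a fresh random challenge for this login attempt.
    challenge = os.urandom(32)

    # On your device: you answer the challenge by signing it with the private
    # key and send back only the signature.
    signature = private_key.sign(challenge)

    # On the service: verifying with the public key proves that whoever
    # answered holds the matching private key, which was never transmitted.
    try:
        public_key.verify(signature, challenge)
        print("authenticated: the client holds the private key")
    except InvalidSignature:
        print("access denied")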

Things are profoundly different with this authentication model. Since you do not need to send your key to prove you have it, and because one key cannot be inferred from the other, you don't have to worry about using your key to authenticate:

  1. the private key cannot be intercepted in transit, because it never travels
  2. a breach at the service reveals nothing usable, because the service only holds the public key
  3. the same key can be used even in untrusted environments without losing its value

Given these properties, it is not difficult to understand that when it comes to authenticating within uncertain, potentially hostile environments, the authentication of choice should be PKI. It is not a question of good or bad, but rather a question of being good or bad for a certain purpose.

3

Based on the number of samples

Given that each authentication mechanism has weaknesses and strengths, it makes a lot of sense to use each mechanism in such a way, and in such an environment, that its strong aspects are the ones that matter. When one mechanism cannot offer that on its own, it also makes sense to use multiple mechanisms in conjunction, such that the strength of one eliminates the weakness of the other; in other words, to chain authentication mechanisms, to use multi-factor authentication. Chaining authentications is almost always synonymous with using multiple factors. This is common sense, as using multiple authentications of the same factor gives very little benefit: the weaknesses and strengths are the same, so they cannot improve on one another.

There has been a lot of talk lately about multi-factor authentication, but there is some confusion surrounding the subject as to what really counts as multi-factor. It is generally understood that two factors means the user needs to authenticate two times (for example a password and an SMS), but this need not be the case. What really counts is the number of distinct factors taken from a subject, the number of distinct guarantees of identity provided, rather than the number of times one authenticates.

Single factor

The overwhelming majority of services over the internet use the classical username + password authentication, which, due to a number of aggravating circumstances (based on how it is being used), has also become the most insecure of all forms of authentication. It is what we generally mean by "single-factor authentication", but really single factor means the service relies on only one of the three authentication forms mentioned earlier (secret, key or biometric).

For example, a PKI-based authentication, such as a USB signing key, would still be single-factor authentication if used on its own, but it would be much more secure, as the secret is not shared.

Multi factor

The idea of multi-factor authentication is to increase security by capturing multiple aspects of the entity being identified. The most common form of multi-factor authentication is 2FA (two-factor authentication). For example, a hard-token authentication that is protected by a PIN is considered multi-factor because it performs two identifications: the token itself is proof of key, but it needs to be unlocked with a PIN, which is proof of knowledge. So even though we send a single piece of information, it contains two factors in itself, because it cannot be produced unless the user proves both.

Conversely, consider the situation in which you use your phone to sign in online, you store your passwords in your browser, and you use a one-time password (OTP) tool that is also on your phone, or receive an out-of-band SMS (on your phone, evidently). It does not matter that you log in twice (once with username and password and then with the OTP or SMS); it is still considered single factor, because effectively you only prove that you have the phone. Once the password is stored in the browser it is no longer a secret, it is a key, and so is the OTP. So you essentially present two keys, both of which depend on the same access factor. If somebody has your phone, they have your account.

If, however, there is a PIN or a fingerprint unlock configured on your phone, the entire authentication does become two-factor, not because of the username / password and the PIN, but because the OTP tool provides a one-time login key (proving ownership of the phone) and you need another factor (secret or biometric) to get to it. You could effectively skip the first step of the login (username + password) and go straight for the OTP with PIN unlock, and it would still be two-factor, even if you effectively provide login information to the site only once.

In fact, strictly speaking, the password effectively becomes zero-factor after being used just once. Because the password "travels", and it does so through a hostile environment, there is no guarantee that it has not been compromised in transit, or on the service side for that matter, and therefore there is no guarantee of identity either. Among non-PKI authentication methods, only a one-time password can be considered one true factor of authentication, because the service does not store it and it cannot be reused for long if captured in transit.
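As an illustration of why a one-time password behaves more like a true factor, here is a minimal sketch of a time-based OTP generator in the spirit of RFC 6238. The seed, period and digit count below are illustrative assumptions; the point is that only short-lived codes, never the long-term seed held on the device, travel to the service at login time.

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(seed_b32: str, period: int = 30, digits: int = 6) -> str:
        # Derive the current one-time code from a device-held seed.
        key = base64.b32decode(seed_b32, casefold=True)
        counter = int(time.time()) // period          # changes every `period` seconds
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
        value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(value % (10 ** digits)).zfill(digits)

    # The seed is provisioned once to the phone and to the service; afterwards
    # each derived code is valid only for a single, short time window.
    print(totp("JBSWY3DPEHPK3PXP"))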

These slight misunderstandings add inconsistencies to the process and make the authentication landscape very insecure. Sometimes even the added friction of 2FA is just that, added friction, because it brings no security benefit to the table.

4

Based on where it is performed

Probably the least understood aspect of authentication is that it can be performed in different parts of the networking protocol stack, and, like every other aspect, each placement has advantages, disadvantages and a great deal of confusion around it. To understand the distinction, let's start by looking at a few examples which are fairly familiar, but which we don't usually think of as authentication in the normal sense.

Router to router

One of the most popular protocols where such authentication happens is Wi-Fi. To connect to a Wi-Fi router your computer needs to know the network ID (which is most often broadcast by the router) and the passphrase. Essentially, anybody who knows the passphrase can connect to the router, which means this type of authentication has very little attribution and few identity elements in it, but it does ensure relatively well that nobody can intercept the radio waves and listen in on the communication. Being part of a data link layer protocol, by placement (where it is performed) it sits at OSI Layer 2.

If the router also uses MAC address filtering (not very secure) or Network Access Control (NAC) authentication (PKI-based and very secure), then this authentication can also provide good device attribution, which means only select devices in the world can connect to the router. That said, it has a very narrow scope, in the sense that it can only take you to the nearest router. In a similar way, the GSM mobile protocol identifies the phone (device) within the network by way of a SIM card, but just the same, it can only take you (authentication-wise) as far as your mobile carrier. This is the limitation of any authentication placed in the data link layer.

Address to address

Another popular protocol that involves authentication and encryption is the VPN. Although it is regarded more as an anonymizing protocol in popular culture, the VPN is in fact a very strong point-to-point (IP address to IP address) privacy protocol. It essentially creates an encrypted communication tunnel between two endpoints which may be far apart, many routers apart (so to speak). To achieve this, the protocol was built upon the IP protocol, which puts it at OSI Layer 3, the network layer. In a multi-tenant environment, VPN providers will use PKI to identify the connecting devices (like your computer or mobile phone), which provides extremely strong device authentication, but the VPN protocol is very rigid. Although it can use PKI to authenticate your device, it can only connect your computer to another network endpoint (a VPN bridge). This makes it useful in situations where one wants to connect to local area networks in a secure way, or to hide one's IP address (the tunneled connection masks your IP address behind the VPN provider's IP address). However, it is not useful for logging into applications, because everybody coming from behind the same VPN would essentially have the same identity.

Application to application

One of the least known authentication mechanisms is two-way SSL, which uses PKI to authenticate application endpoints to each other at the transport (connection) layer, OSI Layer 4. Although it is fairly flexible from a technology perspective and extremely secure, it proved logistically difficult to distribute certificates (PKI keys) at the scale needed by today's internet, so this authentication was mostly ignored outside strict enterprise or government environments. It has been substituted with Layer 5+ (application level) authentication protocols, which are designed to authenticate people rather than computers.
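For the sake of illustration, here is a minimal sketch of what requiring client certificates (two-way SSL) looks like on the server side, using Python's standard ssl module. The file names, CA bundle and port are placeholders, not anything prescribed in this paper.

    import socket
    import ssl

    # Server-side TLS context that *requires* a client certificate.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="server.crt", keyfile="server.key")  # server identity
    context.load_verify_locations(cafile="trusted-clients-ca.pem")        # CA that signs client certs
    context.verify_mode = ssl.CERT_REQUIRED   # no valid client certificate, no connection

    with socket.create_server(("0.0.0.0", 8443)) as server:
        with context.wrap_socket(server, server_side=True) as tls_server:
            # The TLS handshake itself performs the authentication: a peer
            # without a certificate signed by the trusted CA never reaches
            # the application code at all.
            conn, addr = tls_server.accept()
            print("authenticated client:", conn.getpeercert().get("subject"))
            conn.close()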

Connection level vs. application level

It is easy to notice that as we move up the protocol stack, the range and granularity of the authentication mechanism grow, so one would be tempted to conclude that for maximum flexibility we need to authenticate as high as possible in the protocol stack:

Computer <-------- L2 -------> Router
Computer <------------ L3 ------------> Computer
Computer <---------------- L4 ----------------> Application
Human    <-------------------- L5 + ---------------------> Application

This has sadly been the mantra of authentication for many years, and sure enough Layer 5+ authentications were the first type of authentication to appear. Only when the industry realized how important it is to apply authentication at lower layers were they gradually introduced. It is a pity, because there is a fundamental distinction between Layer 5+ authentications and those below, one which is extremely (with a very big E) useful from a security perspective, and an entire internet is not making use of it.

These lower layer authentication mechanisms are part of the communication protocol itself. What this means is that if there is no authentication, there is no communication either. One cannot establish communication with a VPN server unless one knows the encryption key, which makes well-configured VPN setups practically unhackable. Not being able to communicate with something in cyberspace is analogous to not being able to touch something in reality. For example, you cannot tamper with a locked door if it lies on the other side of the Grand Canyon.

In contrast with this, Layer 5+ protocols perform authentication within the application itself. Therefore, anybody is free to communicate with the application even in an unauthenticated state. To put it in perspective, this would be like not having the key to a door, but the door being within your reach, so you are free to see if there is a crack wide enough for you to fit through, whether there is a weak lock, or whether somebody hid the key under the mat.

We have lately gotten used, especially in the SaaS space, to applications having no perimeter defense. Indeed, with Layer 5+ authentications it is impossible to draw a hard line, and therefore the defense perimeter is infinitely large. Every line of code, bespoke or originating in any of the third-party libraries, may become the crack in your allegorical door. In a world where software is built hastily, where complexity is so great that about 80% of each piece of software is pre-built by someone else, and where the overwhelming focus is functionality while security and testing receive very little attention, the positioning of authentication in the protocol stack can make the difference between a secure application and one that is potentially vulnerable at every corner. Under these circumstances, the question of where authentication happens is in fact a question of having or not having a perimeter.

5

Based on "Who" or "What" we authenticate

We often think of identity as a uniquely human trait, but in cyberspace there are many actors that perform actions and need access to resources and are not human: applications accessing databases, microservices authenticating to one another, B2B services, bots, personal digital assistants, not to mention the realm of IoT devices. All these business-to-business actors need and perform authentication, and they do so on behalf of an organization or individual, without a human present to validate those authentications.

On the opposite side of the story we have humans (client to business), who use devices to visit online services (there is really no other way), but we consider it irrelevant to authenticate the browser or device itself, because we have the human interacting live with the service, and so the entire process is offloaded to that human. The two processes are distinct from the perspective of who "stores and presents the authentication information", but otherwise in both cases there is an unavoidable chain of dependency:

  1. There is a service storing some resource
  2. There is a cyberspace actor performing the access to the resource
  3. There is a real-world owner of the resource stored at the service

When using current authentication methodologies in either of the two distinct authentication cases, automated vs. user-driven, only two of these three dependencies are preserved:

  1. In the automated (machine-to-machine) case, the service and the acting device are authenticated, but the real-world owner behind the device is not verified.
  2. In the user-driven case, the service and the human owner are authenticated, but the device actually performing the access is not verified.

In each case we lose the capacity to observe the entire chain, and therefore the two authentication cases also become two distinct authentication processes based on "who" we authenticate. The reason the two scenarios are treated differently by the industry is that not every authentication works for both actors. By its nature, being part of the communication protocol itself, connection-level authentication can only be performed by a machine. If a person were to use PKI as a means of authentication, they would have to perform it at the application level. Conversely, it would be impossible for a device to use biometric authentication, because it simply acts in the absence of the user. The same is true for multi-factor authentication, which would effectively block an automated process.

From a methodology perspective the nuances are subtle, but they are very important from a security perspective, because they create exploitable scenarios:

  1. Because the device is never verified in the user-driven case, a stolen credential can be replayed from any device, anywhere.
  2. Because the owner is never verified in the automated case, a compromised device or credential can keep acting on the owner's behalf without their knowledge.

These exploit avenues are very difficult to compensate for. Mitigations introduce great friction, and in some cases they are so complicated that it is not feasible to entrust the user with managing them. This can make online systems quite vulnerable to ingenious attacks. Due to their dual nature (being both clients and servers), IoT deployments are essentially crippled from both perspectives: they are forced to use application level authentication and built-in passwords (both to identify themselves and to authenticate administrators), which is why IoT security is in such a dire situation. Sadly, with the increasing sophistication of attacks, the growing need for automation, AI agents and the increased need for user mobility, SaaS security is not very far behind in the vulnerability statistics.

6

Closing note

There are a lot of scenarios, and combinations of scenarios, that come into play in online authentication and identity systems. Covering all aspects in such a way that neither security nor convenience is sacrificed is an arduous task, and it requires analysis at the deepest level, accounting for everything: technology stack, actors, processes, the data being accessed, endpoint devices, and many, many more components. Luckily, there is a technological answer for everything; it rests only on our understanding of how these answers come together to meet the process requirements without sacrificing security. My hope is that the present document will help you better understand how different authentication technologies are spread across the technology stack in your environment, and make it easier to recognize which of them offer maximum security at the points where your processes intersect the technology stack. Hopefully this will help you ask the right questions when you pick your next vendor, or help you handpick and combine authentication technologies when you build your next tool, in such a way as to give you and your customers maximum security and convenience.

I am Stefan, the CTO of identityplus, a company that works on the frontiers of identity and authentication. Feel free to reach out to us with your hard questions. We cannot guarantee we'll have a solution for every one of them, but we would certainly enjoy dissecting them.
