The Shifting Privacy Left Podcast

S1E2: "The Magic of Zero Knowledge Biometrics" with Dave Burnett (ZeroBiometrics)

November 01, 2022 | Season 1, Episode 2 | Debra J Farber / Dave Burnett
(Episode Transcription)

This week, I'm joined by Dave Burnett, VP of Strategy at ZeroBiometrics, to discuss his company's cutting-edge approach to using one's face to biometrically authenticate to systems w/o storing personal data, preventing breaches. We discuss current approaches to deploying biometric authentication, unpack the surrounding privacy & security challenges, and explore his company's tech & why it may enable the biometrics industry to leapfrog current hurdles, as there's now a privacy-preserving method of biometric authentication.

------
Thank you to our sponsor, Privado, the developer-friendly privacy platform.
------

Rather than iterate on older technology, ZeroBiometrics approached its biometric authentication tech with a clean-sheet design. As a result, they created tech that captures no personal data - not even a biometric. Thus, it doesn't know what someone looks like and doesn't save personal data to authenticate them. In our conversation, Dave pulls back the curtain on this magical-sounding tech and shares compelling examples of how ZeroFace enables privacy-preserving biometric identification, verification, authentication, and account recovery.

The expansion of biometrics is unstoppable at this point. Security risks and privacy issues are too significant, and global legislation can't keep up. Dave illustrates why we can't keep working within the old biometric paradigm if we want to protect our identities and personal data and explains how his team works to bridge the gap between technologists and end-users.

Listen to the episode on Apple Podcasts, Spotify, iHeartRadio, or on your favorite podcast platform.

Topics Covered:

  • Key challenges as we evolve from mobile biometrics to other use cases.
  • Technical & policy differences that affect privacy.
  • How industry leaders like Apple have approached facial & fingerprint biometrics.
  • How ZeroFace authenticates you w/o knowing what you look like.
  • Addressing privacy usability challenges in the crypto space.
  • ZeroBiometrics's impressive metrics for false acceptance (FAR) & false rejection (FRR) rates.
  • How a ZeroFace QR code can radically change the way we travel, ship goods & authenticate to our devices.


Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn


Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Debra Farber  00:00

I’m Debra J Farber. Welcome to The Shifting Privacy Left Podcast - where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans… and to prevent dystopia. Each week, we’ll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems.

Debra Farber  00:27

On today's episode, we welcome Dave Burnett, VP of Strategy at ZeroBiometrics, a company that invented a privacy-preserving way to use facial biometrics for easy authentication to devices and experiences. He's held leadership roles at notable companies like Fingerprint Cards, Indicativ, Identity Devices, Nok Nok Labs, The FIDO Alliance, Symantec, & PGP Corporation.

In this episode, we discuss:

  • Key challenges as we evolve from mobile biometrics to other use cases.
  • Technical & policy differences that affect privacy.
  • How industry leaders like Apple have approached facial & fingerprint biometrics.
  • How ZeroFace authenticates you w/o knowing what you look like at all.
  • Addressing privacy usability challenges in the crypto space.
  • ZeroBiometrics's impressive metrics for false acceptance (FAR) & rejection rates (FRR), and
  • How a ZeroFace QR code can radically change the way we travel, ship goods & authenticate to our devices.

Enjoy the episode.


Debra Farber  01:51

Welcome, everyone, to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J. Farber. Today, I'm delighted to welcome my next guest, Dave Burnett - identity, biometrics, and security leader, and VP of Strategy at ZeroBiometrics - to discuss his team's major innovation in the biometric space. Hello, Dave, welcome.


Dave Burnett  02:13

Thanks so much! It's a delight to be here. I'm happy to talk with you about privacy, and of course, about biometrics and some of the innovations that we're bringing to the market.


Debra Farber  02:22

Awesome. It's a really interesting space. I worked in biometrics a little bit back when I was at Visa. I was on their Public Policy team, and biometric authentication was even the topic of a paper I worked on. So, I have a lot of interest in this space. But, I also know how complex it can be…where biometric enrollment, for instance, is really much more difficult to achieve than the authentication piece. You know, who's the authority that declares these biometrics are my biometrics - Debra Farber's biometrics - right? So, that initial enrollment is usually done by a trusted authority of some sort, like a government or, you know, something else like a bank. How would you present this topic of biometrics? Can you give an overview of the biometrics work that you're doing at ZeroBiometrics, for the uninitiated?


Dave Burnett  03:11

Sure. Like you, I've been in and around the biometric space for quite a long period of time, even long before the days that biometrics were common on our cell phones. And, the good news for all of your listeners - and frankly, whenever we as industry professionals talk about biometrics - is that everyone today pretty much understands the experience of using biometrics because they're on all our phones. That's how we unlock our phones and protect a variety of data that we use online. And, some of our banks use the biometrics that are on our phones to authenticate us. So, it's great that people understand the experience, but it's important to understand a little bit more about how they work, because there's a sea change starting to happen in the biometric space, where more and more biometric authentication is not happening on our devices. It's happening on back-end servers that don't have the same levels of security, privacy, and control that our mobile devices have.


So, let's start with just some basics about the biometrics under the hood, if you will. I'll keep this simplified and I'm just going to focus on what we would call the “world of visible biometrics” (physical modalities) - the things that you can see about someone. You know, their eyes, their face, or their fingerprints. There are other biometric technologies (behavioral modalities) that get into things like your blood vein patterns, or how you walk or the way you type on your keyboard. Those are interesting, but they're not really the types of technology that have the same kinds of privacy considerations or concerns. 


So, for any type of world-visible biometric, they all work in a very similar way. You start with taking a picture of what is going to be your biometric identifier. It could be your face. It could be your finger. It could be your palm shape. It could be your iris. It varies based on what's called the modality - the biometric modality. The picture is captured in a registration event (or, some call it an enrollment event) and saved, usually in some sort of digitized form (not a raw picture that we can see, but a format that's easily interpretable and processable by a computer). Then, when you come back at a later time to verify that you are you, another picture is taken and compared to the first picture. That creates a probabilistic match, with some sort of confidence that your registration image - again, eyes, finger, face - matches the verification image that you take at a later point in time.
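
To make that enroll-and-compare model concrete, here is a minimal sketch of the traditional flow in Python. The `extract_template` function is a hypothetical stand-in for a real feature extractor (e.g., a trained face-embedding network); the key takeaway is that the stored template is itself biometric PII and must be protected.

```python
import numpy as np

def extract_template(image: np.ndarray) -> np.ndarray:
    # Hypothetical stand-in: real systems use trained models, not raw pixels.
    # Returns a fixed-length, unit-normalized vector summarizing the image.
    vec = image.astype(np.float64).flatten()
    return vec / np.linalg.norm(vec)

def enroll(image: np.ndarray) -> np.ndarray:
    # The saved template IS personal data in the traditional model.
    return extract_template(image)

def verify(stored: np.ndarray, new_image: np.ndarray,
           threshold: float = 0.9) -> bool:
    # Probabilistic match: cosine similarity against a tuned threshold.
    return float(np.dot(stored, extract_template(new_image))) >= threshold
```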


Now, I said something that isn't precisely true - and a lot of people don't even fully understand this - but it's a very important point. I didn't mention identity in what I just said. I said, "You are you," right? It's verifying that you are you, but all the computer's really doing is just verifying that it took a picture of something, and then compared it at a later time. And, those two things match. So, when your phone verifies you with biometrics (finger or face), that's all it's doing. Any sort of linkage to your identity is happening separately. But, the data that is used is actually quite sensitive, because - it's easiest to understand with face, but it applies to finger and it applies to eyes as well - that data is you, right? That's personally identifying information (PII). That's part of why our phones have so much hardware-based security in them - in order to protect that data. You don't want to lose that data; it's sensitive. You only have one face. You only have one set of fingers. These are sensitive things if they get lost, and that's one of the key challenges as we shift from mobile phone-centered biometrics to ever-increasing use of biometrics elsewhere. Whether it's server side, whether it's for a time and attendance machine, or any number of other use cases, we don't necessarily have those security guarantees when you go beyond the mobile platform.


Debra Farber  07:32

It’s fascinating that there's such a distinction between biometrics on the mobile phone versus other use cases. Can you unpack a little bit why there's a privacy challenge to surmount when it comes to server-side or the time and attendance type of applications?


Dave Burnett  07:50

Absolutely. This is a very detailed topic, but to summarize it: there's a set of technical differences, and then there are - I would call them almost policy or user differences - that both have implications for privacy. So, in the simplest scenario, your mobile phone has secure hardware on it. Generally speaking, the more reputable, big-brand mobile phone manufacturers talk about the security model that they've implemented to keep your biometric data safe on your device. But, if you're using something else, you don't know what that technical infrastructure is. Let's say a time and attendance machine; it's not uncommon to use face or fingerprint biometrics for recognition on time and attendance machines. You have no idea what the security architecture is, and you don't even really have a way of verifying that you've provided consent for your biometrics to be used in that particular way. Now, consent is a policy issue. It's a legal issue. In the United States, unlike the European Union and some other countries, we don't have consistent laws across all the states requiring that you obtain consent from a person before their biometric data is used. But, there are definitely public policy considerations that come into play here - as well as the technical security considerations - because you don't know what that security architecture is like. Then, on top of that, you've got different risk profiles, because you don't know what could possibly happen or what the use is going to be beyond that one device that you might register with.


Debra Farber  09:33

So, how would you say the people who are building in the identity and biometrics space are approaching this from a threat model perspective? Like, how are they approaching biometrics in some of these new tech settings - whether it's, you know, for the metaverse, or in the Web3 space, or just traditional settings besides mobile? Obviously, we know how people are doing it with mobile phones. Apple is considered to be almost a gold standard when it comes to biometrics. How would you describe how they deployed a privacy-by-design architecture, before we dive deeper into, you know, ZeroBiometrics's approach?


Dave Burnett  10:14

Sure. So Apple is well regarded for privacy, and with their mobile devices, they have consistently been on the bleeding edge of privacy design. And in fact, you know, their biometric architecture hasn't - yet, at least - suffered from any security breaches, which is actually different from some of the other device manufacturers that have had security breaches, where information that should be kept private (whether it's your biometrics or your private keys for cryptographic operations) has been potentially subject to compromise.


So Apple's architecture is pretty straightforward. But, before we get into that, I think it's important to keep in mind that Apple (like every other phone manufacturer) is dealing with this classic, historical architecture that I described around how biometrics work: take a picture, save it, take another picture in the future, compare the pictures. Right? And it's digitized in various formats, and what the picture is changes. Like, Apple uses infrared light, and then they use an infrared dot projector on your face to get a 3D representation of your face. But, fundamentally, it's still the same technology. This is important because this architecture is actually really old, right? If you go back to when biometric search was first invented back in the 1960s - especially for face searches - it was done on computers that each took up a huge room. There was no Internet; security was mostly physical. There wasn't a concept of networking and networked devices and adversarial attacks, except for someone physically breaking into the building itself.


So, there's been a huge design challenge that persists all the way through to today, and there aren't too many things in our digital world that haven't changed pretty fundamentally over the course of 60 years. I mean, there are very few pieces of tech that are 60 years old at all that we're using today, much less something that goes back, at a fundamental design level, to such a long time ago. Which doesn't mean that there hasn't been innovation in how biometrics work or their accuracy, right? There have been huge advancements in the accuracy of different modalities, for instance, but they all still rely upon this fundamental model that inherently involves saving personal information about you.


Now, Apple, being Apple, they set a very high standard for how this subsystem in their phones and their tablets works. For their implementation, it doesn't really matter whether it's face or finger; it's the same basic technical architecture. They have a subsystem and literally a dedicated chip. They call it a secure enclave, but that's just a fancy branding term for what's known as a secure element in the industry. This is a dedicated microprocessor (different from the chip that runs your phone, makes your phone calls, and cruises the web). It's dedicated, isolated from the rest of the operating system, and it's actually tamper-resistant. If someone opened up the phone and tried to decap the secure element, you know, it would break, and it would destroy the data as well. So, it's isolated.


That's one of the security things that they do. They have controlled interfaces for accessing it, and they don't allow the data to be copied out. So, data goes into the secure enclave. It can be compared within the secure enclave, but they've designed it so that it can't go out. That's the high-level overview at the architecture level, but they've done something else, which is also really valuable. They've published security white papers about how their architecture works, and they've made that publicly available for third parties to review and understand the implementation and design. That's just a good security practice, and it shows the level of seriousness that they take in building out their phone architecture.


Debra Farber  14:20

That makes sense. So, I think that there's also something else at play: the idea that we've got local processing and edge computing that enables it to be really fast. But, because it's stored locally on your phone, as consumers - as the individual humans who are the end users of this technology - it's really nice to know that we have this in our possession and that Apple does not have access to the data that's put in there. Would you agree with that statement?


Dave Burnett  14:47

I do. It is really useful. The unfortunate thing, though, is that in order to take this old architecture and make it safe, Apple and other phone manufacturers have come up with similar designs - sometimes using secure elements, sometimes not. The data goes in, but it can't come out. What that means is that if you have two devices - let's say an iPhone and an iPad - you have to register your biometrics twice. That's an acceptable compromise, but it's still a usability tradeoff. And that's the issue, right? Whenever any sort of biometric technology is used, there's always been a historical tradeoff between privacy and usability. Every vendor and every company makes those tradeoff decisions. Apple’s pretty public about what those tradeoffs are. Not every other company is as public as Apple is about what their tradeoffs have been.


Debra Farber  15:45

It's probably why they've earned so much trust, even though they're very aggressive with their advertising and tracking. Apple's architectural stance and transparency with the public around a lot of other things - especially around the hardware - it's amazing how much goodwill that has helped them build with the public and, thus, a lot of trust. They're one of the most trusted companies out there, I'd say. But that said, can you tell us: how is the expanding usage of biometrics creating new privacy and security risks?


Dave Burnett  16:17

Well, it starts with a desire for convenience, first and foremost, right? You know, we've all gotten used to the experience of biometrics on our devices. So there's a desire - whether it's a service provider or a hardware manufacturer - to extend that same kind of biometric convenience to experiences other than just your phone. You're starting to see, for instance, more and more door locks that have fingerprint sensors on them. I recently booked a hotel for a weekend vacation, and part of what they wanted me to do was verify my identity as part of the booking process. So, I literally had to let my web camera take a picture of my face, then hold up my driver's license to the web camera so that they could take a picture of it, and then they compared the face on the ID photo to my face.

And, you're also seeing that kind of model happening with systems like the Internal Revenue Service here in the United States. It's just becoming more and more pervasive because the experience can be so heavily streamlined and easy - especially when the alternative is creating a username and a password and other kinds of high-friction experiences - and that's why we're seeing biometrics expand outward. Again, though, it's an old model; it's fundamentally insecure because you're saving personally identifying information, and we don't have the same level of transparency or trust that every manufacturer or every company that's using biometrics is actually using them in a safe and responsible way from a security, or even a public policy, perspective.


Debra Farber  18:08

You know, that's a great point, because I think the most recent example for me is, you know, trying to play around in the crypto space and signing up for a Web3 crypto wallet or even a self-sovereign identity wallet. Right now, you have to remember a 12- or 24-word seed phrase or put it somewhere safe, you know. It's a very secretive set of words, right? You don't have to use the actual words; you could use the token itself. I could see that it's more of a security issue, because you want to make sure that you have access to your own account in the future, but you also don't want someone else to have access to it. It's also partly a privacy challenge, in that it's not easy to use. It's not very usable. When you have many wallets in different places - you know, different organizations - you're starting to have different relationships with different companies, where each requires a login in a certain way.


And you're like, "How many 24-word passcodes can I possibly have before I forget all of them, lose some of them, and don't have access to my assets?" So, I really see this as being a challenge now, because, for KYC (Know Your Customer) laws, I'm seeing exchanges require the same type of scenario that you're talking about: "Show us all your identification. We're going to capture it, take pictures, and document it for compliance." And, while I'm not personally against KYC, it is adding a lot of friction, and it is not very usable today. I mean, it's preventing a lot of people from even getting into the crypto space to begin with. So, I guess the question is: what do we do about it? What is ZeroBiometrics doing? I know you're about to tell us, but it definitely sounds like magic! I'm hoping you'll demystify for the audience that it's not magic; it's a well-designed approach that appears magical.


Dave Burnett 19:54

Right? Well, our iPhones seemed magical when they first shipped, right? When the iPhone 1.0 shipped, and we were all used to mobile phones that just had a keypad on them, or maybe a BlackBerry or a Palm that had a cell phone inside of it - any kind of a big shift in technology, the new experience can seem magical. So, thinking about all of these challenges that we'd experienced in our respective careers in biometrics (because I've been in the industry for quite a long time, as have others on the core ZeroBiometrics team), we looked at a lot of the challenges that we've talked about: the existing model; how you protect the biometric information; how you trust that the user of the technology you provide isn't making mistakes in how they deploy it that then create security exposure. We came to the conclusion that we needed to approach biometric authentication using a clean-sheet design. The existing model that we've talked about - Apple's done a great job of putting a security boundary around it - that model is just too fraught with risk. There are just too many technical challenges with making that model secure, especially at scale. And when I say scale, I don't mean scale in terms of the number of devices. I mean the number of companies that are deploying the technology. Apple is one company deploying biometrics effectively to their customers. You start talking about many banks, or multiple digital wallets, all looking at deploying some form of biometric technology.


You get into real scaling issues, especially on the security side, and it would be important for your audience to be aware (if it wasn't completely clear) that the folks who sell the biometric technology - the raw biometric technology - they're not security folks, right? They don't think about or have any security considerations at all, right? What they're doing is selling you a tool, and it works. Yet, it needs security boundaries around it to be safe. They focus on the fact that the technology works and is accurate. They have no exposure to or knowledge about how to make it safe. Generally speaking, that's up to the company that deploys it. Apple happens to know how to make technology safe; therefore, their implementation is good. They also invented FaceID using a variety of different technologies, so it was easier for them to make it safe. But, they didn't invent fingerprint sensors; they had to figure out how to make fingerprint sensors safe, and then they just adapted that security model. So, if I'm a bank, or I'm making a door lock, or I'm making a time and attendance machine, or I'm the hotel chain that I described earlier where I had to verify my face, I have to figure out how to use the technology in a safe way - especially if I'm deploying it inside of my company or inside of my product.


Debra Farber  23:03

That makes a lot of sense to me. I hadn't really thought about that at an industry level - that a lot of the salespeople, you know, they're not necessarily security and privacy folks. They're salespeople. They might be talking about security, but it's really about the limitations or capabilities of the product; in reality, they are not threat modeling for the use case of a specific deployment. That's why it's so important that companies that are buying solutions understand how to threat model for privacy as well. So fascinating.


Dave Burnett 23:36

To be very clear - and you're hitting the nail on the head - the folks that sell the biometric technology focus on accuracy and "security through accuracy": you need to have a population of this size before you would find a potentially matching fingerprint or iris or face. And, the numbers…


Debra Farber  24:00

a false match…


Dave Burnett 24:01

Yes…"false accept" or "false match" are the terms that are commonly used, and different modalities have different accuracy levels, or different false acceptance rates / false rejection rates. But that's what they compete on, right? That's not the security of the bigger product that they're a component part of. That's not their problem; that's out of scope for what most biometrics companies do.


Dave Burnett 22:33

Having sold fingerprint sensors into some early door locks that were being made (that whole trend really started in Asia and is only now starting to come to the United States), I was astonished at how many door locks were being made with literally no security hardware inside them whatsoever. Your templates, your biometric data - just stored in the raw, you know. Anyone who opened up the lock could put some probes onto the chip and copy the data corresponding to your fingerprints right off. So, it's not a theoretical problem. It's a very real problem, especially as you look to scale up and scale up and scale up.


So, we decided we can't use the existing model. There was a need for a fundamental design shift: focus on privacy; focus on scalability, meaning that many companies can use the technology; embed a lot of security principles in the product; and make it hard for companies to make mistakes that somehow compromise your data. So, we had these very fundamental design goals as we designed the technology that we now call ZeroBiometrics, and the product is ZeroFace, which is commercially available today. What we did was design a system that doesn't know what you look like at all; it saves no information about what you look like in order to authenticate you. That may be the magic part that you're referring to.


Debra Farber  25:57

That's the magic part! That's the part that's like, “Yes! It just works!” So please, yeah, de-mystify us, because I know there's going to be a lot of feedback after this episode or maybe even requests to demo the product. It is just, it seems like it solves so many problems that it can't possibly be the magic we're all hoping for. But it seems like it is to me - I've kicked the tires a bit. And, I'm excited for you to kind of tell us a little bit more.


Dave Burnett 26:26

I'm happy to. So, the most important thing for your listeners to be aware of is what we have brought to market. Yes, we have our own innovation and some proprietary technology that we've built, but we're standing on the shoulders of giants, and the core concepts that we use in our solution are well understood in computer science in general. Let's start with something very simple: how can we recognize your face without knowing what you look like? The answer boils down to a concept called a "zero knowledge proof," often given the acronym ZKP. It's a way of me asserting to you that something is true without having to reveal the confidential information that I have. So it's saying, "This is me; this is my face; this is who I am," without actually revealing what my face looks like. There are a lot of interesting videos about this on YouTube. There are some great science articles about it. There are folks much more learned than I am who can speak very eloquently to this. But basically, the way we work is that we use your face as the secret for the zero knowledge proof, and what we save is the mathematical formula that can be used to verify that your face was actually used. Zero knowledge proofs ultimately wind up doing some form of computation. We do a form of computation.


And as long as we actually generate the right result based on what we see, then it's you. But let's go into a little bit more detail here, because it's a very important concept; that's one level down from how the technology works. So, traditional biometrics - I know we've covered this a couple of times - takes a look at your face…saves a picture. We do something very different: we look at your face and we identify these interesting points on your face that have biometric information. Where are they located? It could be the corners of your eyes, or it could be your mouth, or it could be your eyebrows. There's a lot of what we call "entropy" - data that we can recover. When we look at a face, we don't record any of what we see, but we do record the most distinctive places to look for information about your face. Then, we inject that into our core formula and create, essentially, a customized zero knowledge proof for your face. The result is what's called in cryptography a "hash." Ultimately, we use SHA256, which is a very well-known and highly trusted hashing algorithm, to do a one-way conversion into a result - a 256-bit string that describes the correct result if this formula is applied to my face - my zero knowledge proof for my face. And, if my face was registered, the zero knowledge proof for my face is different, Debra, than if we take our product, ZeroFace, and point it at your face. In most scenarios, if I take your formula and apply it to my face, I'll get a hash, but I'm not going to get the correct hash result that describes me - the result of this analysis and computational process that is unique to my face. So, every person that gets exposed to ZeroFace has a unique hash value that is generated, and we match literally by comparing hashes: the enrollment (or registration) hash versus the verification hash.
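
ZeroBiometrics' exact derivation is proprietary, but a toy sketch can illustrate the shape of what Dave describes: reduce the face to a repeatable bit string, hash it with SHA-256, and compare digests. Everything here (the `quantize_features` helper, the grid step) is a hypothetical illustration; making the derivation reliably repeatable across scans is precisely the hard part such a system has to solve.

```python
import hashlib

def quantize_features(landmarks: list[float], step: float = 0.05) -> bytes:
    # Hypothetical quantizer: snap noisy measurements to a coarse grid so
    # repeated scans of the same face yield identical bytes. Doing this
    # reliably across scans is the hard, proprietary part of a real system.
    return bytes(int(v / step) & 0xFF for v in landmarks)

def enroll(landmarks: list[float]) -> str:
    # Only the digest is stored; SHA-256 is one-way, so the digest cannot
    # be inverted to recover anything about the face.
    return hashlib.sha256(quantize_features(landmarks)).hexdigest()

def verify(stored_digest: str, landmarks: list[float]) -> bool:
    # Exact digest comparison: any drift in the derived bits changes
    # the hash completely.
    return hashlib.sha256(quantize_features(landmarks)).hexdigest() == stored_digest
```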


Debra Farber  30:21

So, are you saying that you're able to use biometrics to authenticate someone to, I don't know, a crypto wallet, without completely replacing something like BIP39 - that 12 or 24 words?


Dave Burnett  30:36

Yes. 


Debra Farber  30:36

Wow. And you're able to deploy this on…what else? What were some other…you know, I don't wanna say use cases…but what are some other technologies you can interface with to enable biometric authentication?


Dave Burnett  30:49

Yes. We absolutely can verify that your face is the face that we've seen without saving any information about your face, and the hash value that we generate can then be used to perform a variety of cryptographic operations - as a seed into various cryptographic functions. You mentioned one of them: BIP39, which is the technology that's used for generating 12-word or 24-word recovery seeds (or root seeds) for wallets. The way we work in that scenario is that we can take our hash - if we know what the right answer is for the seed of a wallet - and regenerate and transform the root hash that we generate from your face into the actual BIP39 hash. This is a very complicated way of saying: if you forget the password to your wallet, you don't have to remember or find the 12 or 24 words. You just scan your face to recover your wallet, or you just use your face to authenticate into your wallet.
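
As a sketch of that wallet-recovery idea (not ZeroBiometrics' actual code): BIP39 defines a standard mapping from 256 bits of entropy to a 24-word mnemonic, so a reproducible face-derived digest can stand in for entropy that would otherwise have to be written down. This assumes the reference `mnemonic` Python package.

```python
import hashlib
from mnemonic import Mnemonic  # pip install mnemonic (reference BIP39 library)

def digest_to_mnemonic(face_digest: bytes) -> str:
    # 32 bytes (256 bits) of entropy maps to a 24-word BIP39 mnemonic.
    # Because the digest regenerates identically from the same face,
    # the wallet seed can be rebuilt instead of memorized.
    assert len(face_digest) == 32
    return Mnemonic("english").to_mnemonic(face_digest)

# Illustrative only: a real system would derive this from the live scan.
fake_digest = hashlib.sha256(b"stand-in for a face-derived bit string").digest()
print(digest_to_mnemonic(fake_digest))  # 24 recoverable words
```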


But, there's something else that we can do, and I'll give you a wallet example and then we can talk about other scenarios. So, cryptographic wallets use private keys in order to sign different transactions. You know, in a cryptocurrency scenario, you might be transferring money from person A to person B. In a distributed ledger scenario, you might be approving a transaction or signing an event - take smart contracts, for instance: you might be signing a smart contract to demonstrate that you agree to the terms of that contract. What's happening in both of those cases is that you are using cryptographic operations where the proof that you are you is possession of secret data - in this case, what's called a private key, a value that only you are supposed to have access to in order to perform the signing operations.


Now, for Apple (to go back to that gold standard), when you do different types of key operations - when you generate private keys, these secret keys, on an iPhone - those are saved in the same security architecture that saves your face data. In cryptocurrency use cases in particular, loss of those keys is highly risky; you've got to keep those keys safe. There have been instances of keys being stolen and money being transferred, because if someone's got that secret data (that signing key), then they can go out and conduct transactions, and it's impossible to tell that it wasn't me that approved those transactions.


Well, there's an important thing that we do here around signing keys as well, and it upends the security model as much as ZeroBiometrics and ZeroFace upend it on the biometric side. Every time we see your face, we can reliably generate the exact same hash value. We can use that to create signing keys. Well, that means that you no longer need to save those signing keys when they're not being used. That's the old model, right? You create these keys, you save them, you put them in some sort of secure, trusted device, and then when you log back in using a long password, then you can sign transactions.


Why do it that way anymore, right? We know it's you. We see your face. We've verified it and we've got the hash that is unique to you. So then, go generate your signing key, dynamically recreating it. So, every single time you log in, we can make those signing keys available again - reconstituting them out of thin air, essentially, really reconstituting them from your face. Then, you can go right ahead, and this is probably clear, this is a super huge change, right? Your keys don't exist when you're not logged in. They just don't exist anywhere in the world. They can't be stolen because they don't exist, and frankly…


Debra Farber  34:59

That just blows my mind!


Dave Burnett  35:01

…Yeah, and you can't be subject to a subpoena and be forced to turn over your private keys, because they don't exist. 
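
A hedged sketch of how "keys that don't exist" can work in principle: derive the private key deterministically from the stable face-derived hash each session, so it never needs to be stored. This example uses HKDF and Ed25519 from the Python `cryptography` package; the digest and context label are illustrative, not ZeroFace's actual scheme.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def derive_signing_key(face_digest: bytes, context: bytes) -> Ed25519PrivateKey:
    # Same digest + same context -> byte-identical seed, hence the same
    # key pair every session. Nothing is persisted in between.
    seed = HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=context).derive(face_digest)
    return Ed25519PrivateKey.from_private_bytes(seed)

# The key is reconstituted at login, used, and discarded.
key = derive_signing_key(b"\x01" * 32, b"wallet-signing-v1")
signature = key.sign(b"transfer 1 coin to Bob")
```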


Debra Farber  35:07

So, that's a challenge right now. If you use your biometrics on your iPhone, for instance, the law basically says - the Supreme Court says - that your face and your biometric are, you know, in the public domain. So, you can be forced to use your face to open up your phone for a search at the border, for instance. But here, in this instance, it seems like there'd be a good argument that, since your keys don't exist when they're not being used, you cannot be compelled to share them.


Dave Burnett  35:34

That's right; they don't exist. You provide them. And, you know, for personal devices, even in the United States, the laws are different at border crossings than they are when you're physically deep inside the territory, right? You have very few rights at the border, even as an American citizen. Once you're in the country, you can have different rights; and frankly, courts are split on whether you can be compelled to provide a biometric under court order. Certainly, you can't be forced to provide a PIN or a password.


Debra Farber  36:05

Right. Something you know is not necessarily considered public, but your face is. So, the argument goes, you're not secure in that, right?


Dave Burnett  36:14

Even your finger, right? This is part of why, on phones, things time out, you know. So, if you haven't used FaceID or TouchID in a certain amount of time, you have to default back to a PIN passcode to keep your data more safe. And of course, we support PIN as well for those kinds of security considerations. But the important thing - and the really interesting thing, again - is that security and authentication are shifting away from just our phones; more and more devices and more and more services are adopting biometrics to create more seamless user experiences. So, if they've got keys that they've created to represent or protect your data, well, then they're in possession of those keys, whether those keys are on a door lock, or on a bank server, or what have you. When you're using ZeroFace, it doesn't have to work that way. They don't have to store keys to protect your data. They can get the key from you, when you're authenticated, to unlock your data, as opposed to the other way around.


So, a service provider wouldn't even be able to respond to a subpoena without the user actually being present to regenerate those keys. That's one of the privacy issues that we address, and it's not just around biometrics; it's around system security in general, right? Okay, maybe my phone is private and harder to get access to, but you know, there are all sorts of scenarios where people would go to service providers to subpoena or get access to your private data. Let's set aside government use cases. Let's talk about criminal hackers that can break into bank systems or IT systems at any number of companies to get access to sensitive data. Well, we can not only get rid of all the sensitive biometric data, we can make sure that the company has no keys that encrypt your personal data on their service. So, if a hacker breaks into their network, they might steal some encrypted files, but the company that they stole your data from - that was holding your data in trust - doesn't even have the keys.


So, it's two paradigm shifts for the price of one with our technology, right? We completely get rid of the personally identifying information and the security risks around storing face information. We also add revocable features - revocability of your face data - and all sorts of other very interesting modern things. Like, we make it possible for you to expire your biometric after a certain amount of time, so it has to be refreshed. You might give a bank, for instance, permission to use your biometrics for a year, or for a month, or some set period of time, and know that even if everything blows up, they're not able to use it past the point in time for which you permissioned that custodial use of your biometric data.


Debra Farber  39:14

Well, that's really great - giving you the power to choose when your data is used or not. And, having the power to revoke it is just key, especially with GDPR being exported around the world and people having more rights, and given current trends where everything is becoming more decentralized and humans have more ability to control their own data because it's living on devices that they control.


Dave Burnett  39:39

Yes - and if it's in a cloud service, that's part of why we talk as much about the security aspects of our product as the biometric aspects. Because today, let's say you're using some cloud company to save your documents up in the cloud…you know, it could be Dropbox, it could be Google, it could be any one of a number of vendors, right? They are responsible for protecting your data. Well, we can shift that locus of control back to you by saying, "No, the way your data is protected is with keys that are derived from your face, that they don't get to have access to, and that don't even exist when you're not there." So, we really offer not just a biometric solution that doesn't know what you look like - a big change on the privacy side - but also a radical change on the security side by not requiring keys to be saved. And that has massive implications for cloud services, as well as, you know, crypto wallets and other Web3 and distributed ledger use cases.
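
To illustrate that cloud-storage point (a sketch under assumptions, not ZeroBiometrics' actual API): if the data-encryption key is derived client-side from the face hash, the service only ever holds ciphertext and has no key to hand over or lose. This uses AES-GCM and HKDF from the Python `cryptography` package.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def data_key(face_digest: bytes) -> bytes:
    # 256-bit AES key derived on the client; the server never sees it.
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"cloud-storage-v1").derive(face_digest)

def encrypt_for_upload(face_digest: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)
    ct = AESGCM(data_key(face_digest)).encrypt(nonce, plaintext, None)
    return nonce + ct  # the server stores this blob; it holds no keys

def decrypt_after_login(face_digest: bytes, blob: bytes) -> bytes:
    # The key is regenerated from a fresh scan, used, and discarded.
    return AESGCM(data_key(face_digest)).decrypt(blob[:12], blob[12:], None)
```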


Debra Farber  40:46

It's honestly so innovative. So, we're talking biometrics: what are your metrics for false positives? Because, like you were saying before, everyone is always competing over who has the best authentication based on these metrics. So, what do companies typically state as their false positive metrics, and what are ZeroBiometrics's?


Dave Burnett  41:08

Sure. There are two terms that are commonly used to describe biometric system performance. One is a measure of accuracy, and it's called the false acceptance rate (FAR): how likely is the system to confuse Debra for Dave, right? Usually that's expressed as a ratio - you know, the likelihood of finding someone in a large population who looks like me. You have to have a certain size of population before you're going to find someone else that looks enough like me that it might trick the system, and different biometric modalities all operate within different expected performance ranges. So, with fingerprint, there's one chance in 50,000 - sometimes it's as much as one in 100,000 - that someone might have a matching fingerprint, or one close enough that it would trick or fool the system.


Debra Farber  42:03

That's considered a high bar? Like, a good biometric has a false acceptance rate of one in 100,000 people?


Dave Burnett  42:11

Yes. And in fact, you know how we use four-digit PINs on our ATM cards here in the United States? That's roughly equivalent - it's usually considered to be about one in 10,000, sometimes a little bit higher. It just depends on how you do the math. But those are the rough metrics. So, one in 50,000 or 100,000 - that's the range for fingerprint. For face, that used to be much worse. Rewind 5, 6, 7, 8 years, and it was far lower quality, which is frankly why fingerprints came to our phones first: the image recognition, the face recognition, the AI engines - all the various models that were historically used in that whole approach to making face recognition work - were just not very good. Now face has gotten pretty decent, and it's usually around one in a million. There's one company that has gotten much higher, to about 1 in 125 million. But, because we did this complete clean-sheet design, we got to leapfrog in performance - this often happens when you do a clean-sheet design. With the testing that we did with CSIRO - Australia's national science agency, kind of their equivalent to the U.S.'s NIST - our performance was one in a billion. That means you'd have to have a population of 1 billion people before you would find someone with a similar-enough face to mine, or to yours, or to anyone else's to create a false match.
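
To put those ratios in perspective, the chance of at least one false accept among N independent impostor comparisons is 1 - (1 - FAR)^N. A quick back-of-the-envelope check using the rates quoted above:

```python
def p_any_false_accept(far: float, population: int) -> float:
    # Probability of at least one false match in `population` independent tries.
    return 1 - (1 - far) ** population

rates = {"fingerprint (1 in 50k)": 1 / 50_000,
         "typical face (1 in 1M)": 1 / 1_000_000,
         "claimed ZeroFace (1 in 1B)": 1 / 1_000_000_000}
for label, far in rates.items():
    # Chance of a false match somewhere among 1 million impostor attempts.
    print(f"{label}: {p_any_false_accept(far, 1_000_000):.4f}")
# fingerprint -> ~1.0000, typical face -> ~0.6321, ZeroFace -> ~0.0010
```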


Debra Farber  43:47

That's just insanely impressive! It's partly why I said what you've created here feels like magic. But obviously, you've got the bona fides and the details to back up the technology claims. It's a crazy number!


Dave Burnett  44:00

It is a crazy, crazy number, and I don't want to soft-pedal it, because it was a deep research exercise that we did with that agency; they didn't just measure performance, they actually went in to validate the science claims - like the statement that I made earlier that we don't know what you look like. The idea that we're not saving information that describes your face - that's something that was actually validated by their deep review of our tech. So, it's not just performance; they were assessing the science behind what we were doing as well. It's still a crazy number, right? It's still a crazy, crazy number, and I'm used to getting that reaction.


When I talk with folks in the industry, or their colleagues, or customers, there's a lot of, "How? How can you possibly do that?" But I've yet to have anyone - no matter how deep they've dug into the technology - find a fatal flaw or error. We scrubbed that particular research really carefully before we went public with it, for obvious reasons.


Debra Farber  45:04

That makes a lot of sense; and, I know you've told me in the past, too, that you were very skeptical of these claims yourself, having been in the identity and security space for so long. Then, when you kicked the tires and looked under the hood, it sounded like you were just as impressed as I am now.


Dave Burnett  45:23

Yeah, your listeners would probably find this interesting. I mean, it's one thing if this invention had come out of Google or Apple; we would kind of believe it on the surface, right, because of massive R&D budgets and deep science and millions and millions of dollars and tens of thousands of PhDs working in the trenches day and night. But, it was a little company out of Singapore that brought this technology to market, and so I was quite skeptical, being kind of a grizzled old veteran in the identity and biometric space. At first, "skeptical" is the polite word for it, and I spent a fair amount of time talking with the CTO, but more importantly, doing adjacent research. There's plenty of information out there about zero knowledge proofs and how they work, and I could see the conceptual application to biometrics. There's been quite a bit of research published by various academic researchers that talks about the core concepts, which are somewhat different in our implementation.


You know, there's unique IP certainly in what we're doing, but the concepts have been studied and explored in depth since the early 2000s. In many ways, what we are is the first implementation of a new model for doing biometrics, and I'm sure there will be many other companies in the future that adopt models similar to ours. So, we're on the bleeding edge, but we're the first of many to come, because of the privacy issues that we all face. And, the expansion of biometrics is unstoppable at this point. The security risks are too significant. The privacy issues are too significant. The legislation is happening too fast. We can't stay with this old model. It's like saying, "No, I still want to drive a mule-powered wagon while the rest of us are getting on jet planes." So, we're the maker of the first jet plane; there will be many others. I think, in some period of time - it might take 5 years, 10 years, or 15 years - you won't find the existing model of biometrics being used anymore. There are just too many problems with it.


Debra Farber  47:44

Oh, thank you for unpacking all of that. What are some of the interesting use cases that this new technology can really help with? I've gone through your website, and I'm just going to list the categories of use cases now, because we only have so much time left. We've got: authentication, identification, verification, safe biometric portability, account recovery (which we talked a little bit about before), QR code authentication, and biometric encryption. There's a whole bunch of scenarios here, but I really love that QR code authentication example that you've shared with me in the past. Can you talk about the airport luggage tag scenario?


Dave Burnett  48:23

Sure, I'm happy to. It's my favorite use case to talk about - not because we have any intention of building this product, but because solving this problem is not possible with traditional biometrics, at least not in a safe way, and certainly not in a way that's fully anonymized and privacy-preserving. Especially pre-pandemic, I was on flights all the time, right? I was rarely at home for more than a couple of weeks before I'd be back on the road, and while I was fortunate that I never had a piece of baggage stolen, it's surprising, both in the U.S. and globally, just how little security there is around luggage. In theory, it's a secure space at airports, but it's super easy for anyone to just walk up, look like a passenger, grab a bag, and roll it off the carousel. So, it's an interesting problem that you're not going to solve by throwing bodies at it. It doesn't happen often enough to be solved by throwing people at it - not even in low-cost parts of the world where labor is fairly cheap. Frankly, what's lost usually isn't valuable enough. It's inconvenient, but it's not like you've got $5,000 or $10,000 worth of stuff in that luggage - and if you were dumb enough to put that much valuable stuff in there, that's your fault, right? That's the way the whole model is designed.


But, if you want to solve that problem, that's where we can do some very interesting things with ZeroFace. Because we break those two paradigms, we don't have to worry about the security of your biometric information, and we don't have to worry about any of the key data or anything else. Now, while we're not saving biometric information about you, we do have to save that zero knowledge proof that I mentioned earlier, and we save it in a file called a ZeroMap. The ZeroMap file is not directly readable or processable; you can't look inside of it and make any sense out of it. Besides that, we wrap it up in security layers. We AES-encrypt it, and we have all sorts of integrity checks inside the file itself - which, by the way, is very different. Going back to traditional biometrics, you just get a raw set of biometric data that's saved by the system. Again, you have to put security around it. So, this is an example of something that we've done differently.


Our ZeroMap files are inherently encrypted, they have internal checksums, and there's internal encryption beyond the outer layer…I mean, it's a layered set of defenses around the data, even though that data doesn't contain any biometric information about you. We still make it safe. And by making it safe, we enable a couple of things. We can copy your ZeroMap file to any device that you want to use. You can copy it from your PC to your phone, or from your phone to your tablet. The challenge with existing biometrics is that you have to enroll in different places. That's not true with ZeroFace; you enroll once, and then you can push your ZeroMap file to any place that you want it to be - including your baggage tag. One of the things that you can do is take a version of your ZeroMap, encode it in a 2D barcode or QR code, and provide it to your airline, and then they can print that on your baggage tag.


Even though the file is encrypted and protected, if someone managed somehow to break the outer layer, we have internal flags to restrict the usage of the tag. So, for instance, we can say that this QR code - this version of your ZeroMap - is only good for 48 hours. You give the QR code to your airline the evening before you fly; it's good for 48 hours, and after that, it can't be used anywhere. Then, you can also geocode it (say, "Okay, it's only going to be good at my departure airport and my destination airport."). So, when you arrive, you can take your bag off the carousel, and your baggage tag has all the information that's needed to do that zero knowledge proof for your face. It doesn't have any information about you. That QR code doesn't know that it's Dave; it's just, "Here's a zero knowledge proof for somebody's face. I don't know whose face it is, but here's the zero knowledge proof." Then, you can walk up to a little turnstile with gates and present the QR code. The machine can scan it, then scan your face, compare the resulting hashes, and if it's you, the gate opens and you are allowed through.


Or, frankly, you can just put it on a mobile phone, because even a random check - you know, one security officer doing a random check using a mobile phone - can deter a lot of theft. Or, let's say you want to do something that's not uncommon (especially out of European airports): a certain amount of checking before you board to make sure you're really the ticket holder. They usually grab a couple of people and say, "Let me check that you're really you; let me see your passport; let me see your ticket; and let me see your carry-on luggage," and then cross-check all of those things. Well, here, you could just do it very quickly: let me scan the tag that's on your luggage, and let me scan your face really quickly. Then boom - yep, it's you. You've proven with this QR code that you are the owner of this physical, real-world item. And it's anonymous. And the turnstile reader that I was talking about doesn't even have to have any ability to save data; it could be running in read-only mode. It doesn't need network access because - we didn't even talk about this - there's no server-side back end to our technology. It's designed to all run on an endpoint device, whether that's a mobile phone, turnstile reader, door lock, or what have you.
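
As a hypothetical sketch of what such a tag payload could look like (the real ZeroMap format is proprietary): the proof blob rides together with the expiry and airport restrictions inside one AES-GCM-sealed package, so the restrictions can't be stripped without breaking the integrity check, and the result is base64-encoded for printing as a QR code.

```python
import base64, json, os, time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def make_tag(zeromap_blob: bytes, key: bytes, origin: str,
             destination: str, ttl_hours: int = 48) -> str:
    # Restrictions are sealed inside the ciphertext with the proof blob.
    inner = {"zeromap": base64.b64encode(zeromap_blob).decode(),
             "expires": int(time.time()) + ttl_hours * 3600,
             "airports": [origin, destination]}
    nonce = os.urandom(12)  # key must be 16/24/32 bytes for AES-GCM
    ct = AESGCM(key).encrypt(nonce, json.dumps(inner).encode(), None)
    return base64.b64encode(nonce + ct).decode()  # string to encode as a QR

def check_tag(token: str, key: bytes, here: str) -> bytes:
    raw = base64.b64decode(token)
    inner = json.loads(AESGCM(key).decrypt(raw[:12], raw[12:], None))
    assert time.time() < inner["expires"], "tag expired"
    assert here in inner["airports"], "wrong airport"
    return base64.b64decode(inner["zeromap"])  # feed to the face-match step
```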


Debra Farber  54:22

So, it opens up so much in the usable privacy and security space. My brain is, right now, you know, going through so many potential use cases…this can really help with the friction that we have today, or just fix unsafe ways of verifying things that have potential vulnerabilities in the process. So, this has been a fascinating discussion, Dave. I'm so glad you were able to join us. Give us a call to action for the audience we have today. Do you want to give a shout-out to your website?


Dave Burnett  54:53

Oh, sure. It's ZeroBiometrics.com. And you know, we love talking with folks about the technology. Of course, we like talking to potential customers just as much as we like talking to folks that are curious about the technology, or skeptical about it, or who need to understand the security implications. It doesn't have to be a customer engagement. We're not just on a mission to launch our company, generate revenue, and be commercially successful. We're really on a mission to help the whole industry see that there's a new way of doing biometric authentication that is fundamentally privacy-preserving and has significant data protection implications through the use of the ZeroKey concept - having keys that you can delete and then recreate. We're really on a mission to raise awareness about this new way of doing biometrics, because it's high time that we could trust all of the systems that we use to identify us to keep us safe, to keep us private - to know that we don't have to worry about data breaches or misuse of our information. And, in this world where so much financial data has been routinely breached and lost, the expansion of traditional biometrics means that it's just a matter of time before biometric data breaches are as common as financial data breaches. We can't let that happen, and it will happen if we continue using the old paradigm…if we insist on using donkey-hauled wagons as our primary method of biometric transport, then we're going to have a lot of biometric data breaches. But it's NOT necessary.


Debra Farber  56:46

Well, it's great to know that there's a solution that will bring us into the future where we won't have to think about biometric data breaches happening as often as we've seen financial ones. Dave, thank you again for joining us. Until next Tuesday, everyone, we'll be back with interesting content and with another great guest. 


Outro

Debra Farber  57:06

Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website ShiftingPrivacyLeft.com where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado, the developer friendly privacy platform and sponsor of this show. To learn more, go to Privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.
