The Shifting Privacy Left Podcast

S1E1: "Guardians of the Metaverse" with Kavya Pearlman (XRSI)

October 24, 2022 · Season 1, Episode 1 · Debra J Farber / Kavya Pearlman


Welcome to the first episode of Shifting Privacy Left. To kick off the show, I’m joined by Kavya Pearlman, Executive Director of the eXtended Reality Safety Initiative (XRSI), to discuss current challenges associated with extended reality (XR), the XRSI Privacy & Safety Framework, and the importance of embedding privacy into today’s technology.

---------
**Thank you to our sponsor, Privado, the developer-friendly privacy platform**
---------

In our conversation, Kavya describes her vision for bridging the gap between government & technologists. While consulting for Facebook back in 2016, she witnessed first-hand the impacts on society when technology risks are ignored or misunderstood. As XR technology develops, there’s a dire need for human-centered safeguarding and designing for privacy & ethics.

We also discuss what it’s been like to create standards while the XR industry is still evolving, and why it’s crucial to influence standards at the foundational code level. Kavya also shares her advice for builders of immersive products (developers, architects, designers, engineers, etc.) and what she urges regulators to consider when making laws for Web3 tech.


Listen to the episode on Apple Podcasts, Spotify, iHeartRadio, or on your favorite podcast platform.


Topics Covered:

  • The story behind XRSI, its mission & overview of key programs.
  • The differences between "XR" & the "metaverse."
  • XRSI's definitions for new subsets of "personal data" w/in immersive experiences: biometrically-inferred data & psychographically-inferred data.
  • Safety, privacy & ethical implications of XR data collection & use. 
  • Kavya explains the importance of the human in the loop.

Check out XRSI:

Guest Info (Kavya Pearlman):






Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.


INTRO
I’m Debra J Farber. Welcome to The Shifting Privacy Left Podcast - where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans…and to prevent dystopia. 
 
Each week, we’ll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. 
 
Our very first podcast guest is Kavya Pearlman, a self-described cyberguardian, CEO & Co-founder of the eXtended Reality Safety Initiative (XRSI), and my good friend. In this episode, we discuss:

  • Kavya’s background and what drove her to build XRSI;
  • The definition of XR and what it has to do with the term "metaverse";
  • New subsets of personal data unique to XR: i.e., biometrically-inferred data and psychographically-inferred data;
  • Risks to people’s privacy & safety and the ethical implications of XR data collection;
  • Kavya’s emphasis on the need to design with “humans in the loop” and how to proactively protect privacy and autonomy when building metaverse tech; and
  • Info on participating in Metaverse Safety Week this December. 

Enjoy the episode.


Debra Farber  01:30

Welcome to The Shifting Privacy Left Podcast. I'm your host and resident privacyguru, Debra J. Farber. And I am so excited to be here today to kick off my very first episode. Every Tuesday, we'll have lively discussions with privacy technologists, innovators, and ecosystem makers who are working tirelessly to make products, services, and experiences safe for humans by embedding privacy, data protection, and security into the design, architecture, and development - all before code is ever shipped. Of course, I want to thank my sponsors over at Privado, the developer-friendly privacy platform, who made this opportunity possible, and our community partner, XRSI. You can learn more about Privado at privado.ai, and you can learn more about XRSI at xrsi.org.


Debra Farber  02:22

And in fact, speaking of XRSI, today it's my great pleasure to introduce my very first guest and cyberguardian, Kavya Pearlman, the founder and CEO of the eXtended Reality Safety Initiative, or XRSI, our community partner. Kavya has a BS in Computer Application and Programming from the GBM Institute of Technology and Management, and a Master's degree in Information & Network Security from DePaul University, Chicago. She's the former Information Security Director for Linden Lab, makers of one of the very first online virtual worlds, Second Life, and of the VR platform Sansar. So, she has a breadth of experience here. Through the creation of the umbrella organization XRSI, Kavya has pulled in global XR pioneers, visionaries, technologists, and ethicists, including myself, launching engaging projects such as The Medical XR Council, ReadyHackerOne, The XRSI Child Safety Initiative, Metaverse Reality Check, The Cyber XR Coalition, and - the program that I lead - The XRSI Privacy & Safety Framework (XRSI-PSF). Kavya, welcome!


Kavya Pearlman  03:35

Thank you. Congratulations, and oh my god, this is so exciting! I feel truly honored to be part of this, Debra. I mean, we've been talking about it for so long. So let's do this.


Debra Farber  03:49

Let's do it. Let's do it. Okay, so obviously, you've got all this breadth of experience, you know, you've worked at Linden Lab, which is, you know, even I've checked out Second Life. And you know, way back when, and it still has, so many users. Tell me a little bit about what inspired you to create XRSI after having worked in the virtual reality (VR) and augmented reality (AR) space, which is what we collectively call "XR," (just for our listeners here), right? What inspired you to create XRSI and how would you sum up the overall mission of the org?


Kavya Pearlman  04:23

For sure. As you know, Debra, this goes back to my early intuition around my career that one day, I want to be able to bridge the gap between governments and technologists. Mostly, the people who make decisions about cyber warfare or cybersecurity oftentimes don't understand the consequences for humans, societies, and (in general) countries altogether.

So, I remember back in 2016 - that's when it really started to hit me, back when I was working at Facebook doing third-party security during the U.S. Presidential Election - the impact on humans and societies when the risks around technology are ignored or just not understood. And thereafter, when I assumed the role of Director of Information Security at Linden Lab, that prototypical metaverse that is Second Life - the oldest existing virtual world - had its own unique challenges, and a tremendous amount of regulatory obligations that we had to comply with.

So, I had to get creative together with the Chief Compliance Officer there, Scott Butler, one of my really good friends. We started to think in innovative ways to safeguard those virtual worlds, and they had very unique and interesting aspects that we had to take into account. So, this is where I started to think about how the world is going to look different because they were also developing this VR platform called Sansar. And, I was trying to provide tech policy guidance & cybersecurity guidance to that platform as well.

So, as soon as I left Linden Lab, it was definitely something that was on my mind to continue to drive towards my aspiration of bridging the gap between what is understood by governments and what is understood by stakeholders: what the technology really needs at the code level, and how we engineer better. And that's where XRSI came into being. First, I met security researcher Ibrahim Baggili, who is a Senior Advisor at XRSI. He was hacking virtual reality! I was really inspired by his work, and I took his work and some of the novel cyber attacks he demonstrated with his team at the University of New Haven to bring to the world, and I announced this mission to help build safety and inclusion in "extended reality" ecosystems, or "emerging tech" ecosystems. Nowadays, we kind of have more common words for it - "Web3," or "Metaverse" - and those are the emerging tech ecosystems we are noticing. So yeah, those things inspired me, because I could see the world is gonna look a lot different. And we need to get ready for what is to come.


Debra Farber  07:13

Amazing. And before I go on, could you just give us a quick overview of the work you did for Facebook at that time? It was in 2016, you said? And it was a contract, right? Not as an employee?


Kavya Pearlman  07:24

Right, so I was a consultant, but privileged. It goes back to, you know, the controls that were not that mature during that time - I pretty much had the privileges that an employee would. You know, we're talking about 2016, and as everybody knows about what happened during that time, the risk posture of this company, Facebook, had elevated to handling nation-state attacks, and that hadn't been seen before. So, this was a totally different ballgame. And I had a front-row seat, too.


Debra Farber  08:01

That's so interesting! Honestly, I wish I was a fly on the wall in those meetings. So going back to XRSI, how would you sum up the overall mission of the org? I understand now what inspired you to create it. But what's the mission? What are the main goals for the organization?


Kavya Pearlman  08:18

So, we believe that the world needs something more proactive. Thus far, every - let's say - nonprofit organization that is trying to make a "better internet" generally talks about policy and is not necessarily integrating it with how engineers should code or proactively get in front of some of these issues. So, this is where I think XRSI is unique. We are trying to proactively discover things that are risky. And it is very challenging, because people can't even agree on the terms, or what should be called this or that. What is XR? What is the metaverse? So we start with creating a common baseline understanding and move all the way to establishing some proactive measures, iterating constantly on that knowledge. So, as the technology develops, there is a dire need to proactively safeguard people from these new technologies.

And that's where XRSI is uniquely positioned. We have over 160 advisors and volunteers worldwide who are actively curating research from different perspectives and angles, and then putting it together in the form of various white papers, all the way to a comprehensive framework - which you're leading - the XRSI Privacy & Safety Framework. That's one of my favorite programs. But you know, there are several other programs, because it's almost like an onion whose layers just keep peeling, and at the core of it is human-centeredness: safeguarding, ethics, and privacy.

But when you look at it from various angles, we've got to think about - oh, the impact on humans' brains! So there is this medical aspect that we address; we have created The Medical XR Council to try to understand various biometric and medical XR-related things. Then there is the aspect of, oh, if all the media platforms are saying things that may or may not be trusted, where do we go to find the trust? We're like, okay, we need a media platform. So ReadyHackerOne came into existence. A special focus on children is necessary; the kind of technologies and convergences we're dealing with are very unique, and we haven't prepared well to address those things. So The Children's Safety Initiative was formed. The Cyber XR Coalition specifically addresses the diversity and inclusion aspects that stem from algorithmic biases and from lack of representation in virtual environments.

So, all of these collective things are looked at from a multidisciplinary perspective to create an overall comprehensive effort to safeguard - and that's why I keep saying "safety" and "safeguard," because it's way beyond just security. It's way beyond just physical security. We have to really think about what's so unique about these emerging technologies, and that's what remains at the forefront. A couple of concepts I would even call out - "human in the loop" and "societies in the loop" - are ones we didn't take into account back in 2016. And so now we have to, and we must.


Debra Farber  11:37

That's awesome. And for our listeners, I want to let them know about the XRSI Privacy & Safety Framework. I came on board to help XRSI this past January - however many months ago that is. Kavya and I met on International Privacy Awareness Day (January 28th, 2022), so it was just well-timed. It almost feels like some sort of divine timing, where the "cyberguardian" and the "privacyguru" meet for the first time and all this wonderful magic and sparks happen! Kavya told me all about what she's doing at the organization and how well-formed the various programs already were, and she had already helped lead version one of this framework. So, I am now helping with the iteration of XRSI PSF version 2, where we're getting a lot more into the depths of the controls that you select for the various risks.

This framework really is a comprehensive risk management framework that can be used by businesses of any size. We've right-sized it for: 1) small emerging startup teams that are working on the metaverse; 2) medium-sized companies in high-growth mode; or 3) enterprises putting together a program, platform, or hardware for the metaverse, which have a more systematized way of addressing risks. Obviously, all three types of organizations wouldn't have the same level of staff and ability to mature their controls for privacy and safety. So, we've right-sized guidance that would help organizations achieve the goal of risk identification first, and then mitigation.

We were inspired to put this framework together by the NIST Privacy Framework, but we've added so much more. We liked the way that they organized the flywheel of objectives that we want to achieve, so we took from that and made other improvements that are very specific to the XR space. And if anyone listening really wants to get involved, please do reach out to me afterwards; my email address for this particular program is debra@XRSI.org. Of course, XRSI is only one of the organizations I work with, so I have my own email address as well for my company, Principled LLC: debra@principledadvisor.com.


Kavya Pearlman  13:57

Now that you've reminded me of the time that we met, I don't think I've ever mentioned to you what really attracted me to you. In the back of my mind, as I was listening to you through the Twitter Spaces event, I was like, "Oh my gosh!" And I don't know if you remember, but right after this I had sent a tweet - there was some kind of a privacy discussion around XR - and I was like, I only wonder if there was a privacy guru. Because, you know, the kind of territory you were trying to address is uncharted territory, and experience alone is not enough. It really requires wisdom. And, you know, we stayed in that Twitter Spaces for hours and hours.

That's exactly what I saw with you: wisdom! And "guru" is such an appropriate word for that; that wisdom is necessary to what you're doing today, which is to inform people to shift left for privacy. And until I met you, I hadn't heard any other expert looking at privacy from that perspective - from the perspective of what must happen at the engineering level, at the architecture level - and talking about all that detail. Another aspect was that you had a pretty good understanding of Hedera and decentralized architecture, and in XRSI PSF version 2.0 we wanted to explore the intersection of decentralization and XR, so it was so perfect. I don't know if I've ever mentioned it to you, but this was like, "Oh my god, we need this privacy guru for XR!" - and thankfully you accepted. And there we are, just trying to address - and hopefully soon launch - version 2.0 of the PSF.


Debra Farber  15:40

I honestly had no idea you were literally thinking that to yourself. And of course, you are Indian, so you know "guru" actually is used in place of "teacher," right? It's so funny, and it felt like divine timing to me as well. I chose the moniker "privacyguru" (by the way, she's referring to my Twitter handle, one of my private email addresses; I use it everywhere on the web, for my short links, my LinkedIn) back when Gmail was still in its infancy and still invite-only, in 2004, when I was in law school. And it's amazing how, I guess, I wanted to be a privacy guru, but I've now grown into actually being that role that I set for myself. Just reflecting back, I do find that fascinating, and I'm glad you found me! I'm glad we found each other. I mean, I feel like this is a virtual hug we're having right now that everyone's listening in to.

So let's dive deeper into the conversation. I said that XR means "extended reality" and defined it for our listeners, but how would you give the full definition? When we're talking about XR - what's XR?


Kavya Pearlman  16:50

XR, or extended reality, is generally used as an umbrella term. The way I like to describe it to most people is that we are extending our realities to go into another dimension. And when the metaverse - this next iteration of the internet - is in full effect, that's the dimension I would say we will extend our reality into. So, XR really is an umbrella term. We have our perceived reality: the objects and the environment that we can see. But anytime we enhance that - whether it is using virtual reality, which is a fully digitized environment you can interact with using head-mounted goggles; or augmented reality, which is augmenting digital objects on top of the real world, so your perceived world is now enhanced; or any other combination, maybe using a desktop experience - we are really extending our dimension of presence and experiencing realities. That's where this kind of umbrella term serves us really well. If you look at Second Life, people are literally creating an extension of their original life, in different forms, different avatars. And that's where XR is quite a unique umbrella term, referring to a spectrum of reality that is beyond just our perceived reality at the moment. That's what I think XR is.


Debra Farber  18:28

Great. So the thing is, we keep hearing about "the metaverse," but we also hear about VR, AR, and XR as the umbrella term. And to me, it seems like a lot of companies using the term "metaverse" are really using it to market and rebrand virtual reality or augmented reality. You know, I'm seeing a lot of VC firms and large tech companies - I mean, look at Meta even, right? "The metaverse company" is kind of how they're branding themselves. They're investing billions and billions of dollars in the pursuit of creating immersive experiences. So how do you define what the metaverse is? And how does that align or diverge from what XR is?


Kavya Pearlman  19:02

Good question, and a very timely question, because a lot of people have sort of tried to define that. I personally don't really care whether people call it the "M-word" or some other "S-word" - simulated reality or whatever. What I really focus on is the convergence we are referring to when we call something the metaverse: essentially, a confluence of so many different technologies that will enable us to extend reality into various different dimensions - dimensions that we can't really see with the naked eye. That's why we need these devices.

So, technologies like AR, VR, 5G, edge networks, improved graphics and computer hardware, IoT devices, etc. - a lot of different capabilities would go into creating a fully-networked, interoperable, next iteration of the internet, essentially, where we would have various different properties. Interoperability would be one big one. Immersion - the ability to immerse ourselves - is another. Currently, we experience the Internet at a distance; even handheld, you're a little bit dissociated. But the next iteration, or the metaverse, is much more immersive, or at least gives you the option to immerse yourself - sometimes even being almost indistinguishable from reality; almost like replacing reality, but not entirely.

So I think the metaverse is going to become this successor state to mobile computing. And even though the term comes from popular culture, I really hope it's not going to be that pop-culture, dystopian nightmare. That's where we come into play: we're trying to make sure that we don't just create the dystopian nightmare that was documented in Snow Crash, where people are addicted, they've kind of lost their sense of reality, and there are all these problems. We are trying to proactively build a better internet than the one depicted in that pop-culture book, Snow Crash.


Debra Farber  21:30

Amazing. I say, "Amazing!" because you've just touched on a bunch of things that I want to raise. I have a lot to say. So, first I'll point out that I have definitely read Snow Crash. It is my fiancé's favorite book. It's by Neal Stephenson, a futurist who wrote the cyberpunk novel that originated the concept of the metaverse. What's fascinating to me is how people - usually tech bros - will look at dystopian novels, find some technology really interesting, and then want to bring it to life, only seeing the positives in it and not really putting the effort into preventing the dystopia that was obviously part of the novel they picked it up from. I feel like the metaverse is a great example of that. So that's number one.

The second is that people seem to conflate things. They seem to define the metaverse as one place that they are building - like, "I am building a metaverse" - but to me, the metaverse, as you say, is the next stage of the Internet. You don't have "an" Internet; you have "the" Internet: a connection of many different web servers and websites, all of the independently-owned and -operated assets accessible through an interface like a browser. And so the metaverse, to me, seems like the end state, where the difference from the Internet today is that it's going to have different levels of immersion, right? You have an immersive experience. It's not just what we have in many things today. You might have, "Oh, look at how this sofa looks in your actual room before you buy it" - we have that, and that's going to be a precursor to what's in the metaverse, but I wouldn't say that the app that allows you to do that is a metaverse, right? What are your thoughts on that, or on how people talk about the metaverse?


Kavya Pearlman  23:08

So I want to be very clear on this. The XRSI definition, to me, is the simplest for everyone: the "metaverse" is the interconnected - and interoperable - virtual worlds, where we can actually go from one particular virtual world, whether it is augmented, mixed, or however we extend our reality, into other virtual worlds as well. Hopefully, over time, this will become more of a common understanding. In fact, for that matter, we are working on a particular standard; as you know, XRSI is a standards-developing organization. We're working together with a few other standards bodies as well to potentially provide a standard definition and put that standard out there. It's quite a challenge - and not just with the metaverse. We saw that with XR also; initially, people were like, "What is XR? What is VR?" and I used to go around the world sort of teaching them what makes "extended reality." And so now we are fighting that same war of words with "metaverse."

But it's really important to know that Meta - or, for that matter, any of the platforms - is not the metaverse. Second Life is not the metaverse. Roblox or Fortnite - no matter how sophisticated they are, whether they have a component of commerce, whether they have a component of social immersion - that's not the metaverse. The metaverse is only one, and it is evolving.

As we experience more and more convergence of these technologies, the unique aspect that I'm looking at is: hey, as these technologies are coming together, what about safety? What about privacy? What about identity protection in these environments where, you know, you can be an embodied avatar? What happens to your identity on the front end? On the back end? What about ethical decisions? Because we need persistence and interoperability, and that means we need to share lots of data across companies. How does that all work out? And how do we make sure that the "human in the loop" or "societies in the loop" are not undermined, or minorities are not targeted or murdered because they just didn't know that they were sharing so much more data about themselves?


Debra Farber  25:47

Right. So that actually brings up the segue into two new categories of "personal data" that XRSI defines. I know through my work with you and XRSI that a truly immersive experience necessitates the collection of tons of personal data. The two new areas of focus - really subsets of personal data that are part of the vast data collection of the metaverse - are going to be: biometrically-inferred data and psychographically-inferred data. Can you elaborate a little on these terms and what they mean, and then perhaps give us a few examples?


Kavya Pearlman  26:26

Sure. Excellent. XRSI is one of the first organizations that defined XR data. XR data includes what we call "personally identifiable information" in the United States, or "personal data" according to GDPR. But what we say is that it includes way more: it includes biometrically-inferred data and the sensor data needed to enable six degrees of freedom and to create presence, persistence, and immersion. So any kind of data - inferred data, sensor data - that is required to create that presence, persistence, & immersion, we collectively call XR data. Now, within that XR data, there are these two subcategories that are so important to understand, as well as potentially even put into law, because they're going to impact how we treat that data. So let's look at biometrically-inferred data: a collection of datasets that is inferred from behavioral, physical, or biometric identification techniques or other nonverbal communication. This could be an inference of your gender identity; an inference of mental workload or health status; cognitive ability; whether the person is sleepy; what about their religious or cultural background?

So PII essentially doesn't directly reveal, "Oh, this person is of this religion," but one could infer those things. One could infer skills and abilities, all the way to physical health and mental health. And then there is the second type of data that we're talking about, which is psychographically-inferred data. This segment specifically addresses the intersection of XR and neurotechnologies. This is the data that results from inference of neurological, psychological, or behavioral patterns. This kind of data is really just people or companies inferring the responses of humans to various types of stimuli - the neurological responses. By collecting and aggregating these types of data, one could quite accurately establish a sort of psychographic profile and infer someone's emotional state, or even their physical state of mind. That's where dissecting the data into these two categories can allow us to inform engineers as to what kind of protections are needed in specific contexts, because those protections definitely depend on the context.

In the medical context, for example, we want this data to be shared within a certain network, like a healthcare network, but we don't want it shared beyond that - at least not in the United States. So again, the context matters. In certain jurisdictions, if you share that data with an insurance provider, you could be denied coverage, because they know your health condition before you even know about it. So those kinds of protections have to be built in. Not only do engineers have to understand this when they're coding, but also the laws have to take it into account and provide necessary guidance that will allow people to seek remedy if any of their biometrically-inferred data or psychographically-inferred data is somehow putting them at risk or causing harm of any kind.


Debra Farber  29:48

So this actually reminds me to reiterate to the technical folks who are listening, and to remind them that when you're risk modeling for your new product or service or new idea you're bringing to market, it's not about risk modeling for the company itself, right? It's not about how you reduce risk to the company, though that might be part of your job; that's absolutely valid to look at. But the key part of the risk assessments that you need to be doing from a privacy and safety perspective is about risks to the human that's using your product or service. And so, you often talk about the concept of "human in the loop." Can you unpack that a little for us? What do you mean by this phrase?


Kavya Pearlman  30:33

Yeah, and this is a very significant aspect. We discovered it during our research for version 1.1. We never published anything on it, but what the research revealed, as we were meeting with several multidisciplinary stakeholders, is: hey, all the frameworks that currently exist talk about various risks coming from networks, nodes, servers, and even cyber attacks (they're only addressing that sort of attack surface), but with respect to extended reality and these emerging technologies, something was missing. In fact, back in 2016, too, we had "democracy at risk" and "societies at risk"; this thing had started to come into play. Actually, I would have to go back to 2011, when the Egyptian revolution (Arab Spring) happened and people were using technology to coordinate with other humans to bring down a dictator; that's where we saw a society essentially using technology to influence massive political outcomes. This is where things get really, really interesting.

What we're talking about now is that there's a human in the loop, giving away potentially all of their data, as if they're literally part of that network. And that's what is so unique about the metaverse: the human experience itself is part of the equation, and we need to protect and safeguard that. Especially when it comes to children, it's even more critical, because children don't necessarily have autonomy; they depend on parents. Likewise, what we are risking here, when we say the "human in the loop" and the "society in the loop" are at risk, is human autonomy. It goes beyond privacy, where, you know, privacy is like, hey, I want to remain private and protect my information. Here, we're talking about our mental cognition, our thoughts, our thought patterns; those are all at risk - and essentially human autonomy, human free will, human agency - because we're seeing companies who are already taking eye tracking data, gaze, pose, all this data, along with facial tracking data. So where does this lead...?


Debra Farber  32:46

...also voice data...


Kavya Pearlman  32:49

Exactly. Voice data and patterns, too. So all of this could lead to humans inevitably being highly influenced by this other intersection. And that's why I keep saying convergence: because there is this other intersection with artificial intelligence, which could be used to persuade human beings to make decisions that they don't necessarily want to make. They would think it's their decision, but they'd be influenced by all these algorithms or surrounding factors. And that's why I keep saying we ourselves are the "human in the loop": when we put on these glasses, or when we experience these new emerging tech ecosystems, we need to be aware of that. Whether we use it or not - that's the other part, too. Even as bystanders, we have to worry about our privacy and our autonomy, because we don't necessarily have to put on the headset; one could still be recorded by other people who are using these technologies, just by coming in contact with them.


Debra Farber  33:46

You know, you make such excellent points around agency. I just want to stress: your thought patterns, how you're thinking about certain things. What excites you, because of the dilation of your pupils and the data that is being collected. What scares you, because of the data collected by sensors in the room or on your body that pick up other feedback you're giving off, other signals. Can you get more private than that? Can you get more personal than how you think about things? I don't think so. I think it's the essence of what you are as an individual. And privacy, I believe, is a subset of agency; it's those personal moments, those personal thoughts, that we want free from government, but also free from businesses, being embedded into our thought processes.

So I think this is some of the most sensitive data that can ever be collected. And it's really scary that it could be weaponized against you, even through things that seem innocuous, like small nudges and dark patterns that get you to do things you would not have otherwise done if not for the slight manipulation that the company might have done, whether it's an algorithm or just the experiential environment that they choose to present to you. 


Kavya Pearlman  35:06

We have invented the ability to influence moods, influence emotions, show mental imagery...I mean, all these things are being done, but at a sort of not-so-dangerous level. We talked about this convergence; this is really that "human in the loop" issue that we talk about: basically being targeted from every angle and every aspect.


Debra Farber  35:29

Indeed. So I definitely want to talk about more positive things here, as opposed to just talking about all the risks. And, you know, I don't want this podcast to be only about fear, because I think there are a lot of exciting aspects about this technology. But I do first think it's important that we talk about some of the major threat vectors in the metaverse, both physical threats, as well as psychological threats. And if you could give some examples to just illustrate your point, that would also be really helpful.


Kavya Pearlman  35:56

Yeah. So actually, Debra, I think it's very positive that we talk about these things, because what we're really talking about is trust. I want to be able to trust the technology that is being used today to perform coordinated, assisted surgeries. At the Medical XR Council at XRSI, there are people who are practically utilizing augmented reality (AR) devices, HoloLenses, to perform surgical operations or to train clinicians. That's where we are now.

I see it as very positive when I talk about these threats, and when I'm able to unpack or discover new cyber attacks; it gives me such hope that we discovered it first and we understand it now. And now, literally, as you know, we're writing an Advisory to the FTC just this week, advising what should go into new lawmaking / rule-making. So I see it as a very positive thing, even though some may think, "Oh, my God - threats and dangers and etcetera!" Look, we are the CyberGuardian and PrivacyGuru. We're not pushing the danger on you. We're really removing the danger from the technology, or the threats, proactively. So we're staying ahead of the bad guys, and I see this as a very positive thing.


Debra Farber  37:11

Exactly. I mean, obviously, that's why I find inspiration in doing this work. You make an excellent point in reframing it around trust, because that's really what this is about. No metaverse-related product or service is ever going to take off if no one trusts it. And there are some companies out there, like Meta, that have to earn that trust, because there's an element of distrust as a result of their previous privacy approaches and snafus. So thank you for that; I really do appreciate it.

I also want to point out two things. One, I think the metaverse is not here yet. I think it's going to take a minimum of 15-20 years - this is me personally - before we have anything close to an immersive experience that people are using on a daily basis, beyond, say, a phone call with someone who's got an avatar, in a phone booth or some other small space, or other little pieces of the metaverse that might be being built. The challenge right now is being able to render the immersive experience using the technologies out there today. We don't have the processing power yet to render the metaverse in the way most people are envisioning it, or the way the endgame is being sold to us. We've got components of it: people in the web3 space using NFTs to transfer value in and out of various virtual reality spaces. And there are all these different - I don't want to say fiefdoms, although it can feel that way at times, but all these different - attempts at building portions of what will eventually be the metaverse. So that's the first thing I want to ask you: how far down the road do you think it'll be before we actually have truly immersive experiences? Do you agree it'll take 15-20 years to even get there?

The 2nd point I was going to make is that I'm so inspired by the fact that, unlike at any other time in the existence of the world, we've never had so many experts of different varieties coming together to design a new phase of the web. We've got economists mixed with web and cloud experts, mixed with privacy experts and security experts. And the fact that it's going to take 15 to 20 years to get there provides us with this opportunity right now to say, "Okay, if this is what you're going to build, here are all the risks; here are the pitfalls to avoid; here's the right way to think about building with privacy and safety by design and by default." So I see that as the big opportunity - and that's what inspires me: that we have time to really get in front of the problem. We have time to not only educate the businesses, but to educate the regulators on what is possible today, where we're going in the future, and how we help them scale regulations to drive the ethical considerations that need to be put in place. Okay, please comment on that, or answer the question about the timing of the metaverse.


Kavya Pearlman  40:06

I generally refuse to answer these types of questions about how far along we are or what's going to happen. The reason why I tend not to claim anything is that I'm not a futurist; I'm just a researcher who collects information and tries to anticipate what's going to happen. But based on the research so far and the things that I know, we are moving from a linear progression of technology to a more exponential one. An example I would take is Stable Diffusion, the artificial intelligence algorithm that literally revolutionized the way art is done. In a matter of 30 days, you saw Stable Diffusion impacting every possible use case, even being incorporated within Canva (which is an application for designing everything from simple banners to sophisticated designs for various use cases). So when we say 15 years, we are probably discounting that exponential factor. And then the second factor is climate change.

It's like what we saw with COVID: there are some unaccounted-for factors that we still don't know. To me, it feels very intuitive. Humans are so intuitive and so smart that we prepare for things we have maybe not confronted just yet - kind of like we had Zoom ready when COVID happened. I mean, nobody really planned to have 1.8 billion-plus children move online all of a sudden, but we did confront those realities, even though child sexual harassment online increased manyfold. And that's the hope here. So whether it is 2 years or 5 years, or whether it is 10 years - I wouldn't think 15, because of all the things that I mentioned.

But yeah, whatever that timeframe is, you're so right: we have a window of opportunity, so we need to get these building blocks right. Because of what we are putting together right now - these interdependent virtual worlds, converging with artificial intelligence, NFTs, and all these other technologies - we now have the opportunity to get it right at the level of code. And that's why, again, I really love the idea that you're introducing and hammering on, Debra: this shifting privacy left is really trying to move towards where the code is happening and define privacy there - influencing the privacy aspects, all the way to retaining control of our agency, at the code level, and designing with safety by design and by default. That's what I think. But I don't know; I haven't seen the future.


Debra Farber  43:07

You're right, of course. It's just a matter of, you know, you're drinking from the firehose of information around the metaverse; I'm drinking from the firehose of information about privacy and privacy innovation generally. I'm now expanding that to include the metaverse, but it's a lot of info to follow. What we are able to do, though, is what we're good at: humans are great at pattern recognition, at identifying trends - trends of decentralization, trends of humans wanting their agency back and not wanting to feel like a handful of corporations have all the power and are extorting value out of them while they get nothing back. So, I named this podcast show "Shifting Privacy Left" because privacy has been focused for many years - I've been in privacy for 17 years and I've watched it grow up - too much on the legal space, too much on lawyers and the legal function, since the very beginning. I am a lawyer by training (I have a JD), so I feel pretty good about making the statement that there has been too much focus on the legal function owning privacy within organizations.

As a result, it took so long for us to culturally shift left in businesses to the engineering function. There's a whole variety of reasons why this is the case; maybe we can discuss them on another podcast. It's because of the over-focus on legal and then governance, risk, & compliance (GRC): building those policies and embedding your procedures to make sure you're able to effectuate any rights that somebody has. If they say, "I want you to delete data about me," or "Show me what data you have about me" - there are these great rights that individuals now have as a result of GDPR. GDPR requirements are being exported around the world, including into California's CCPA, and now a host of other states. Eventually, I'm sure every state in the U.S. is going to have a similar law. So I don't think there'll ever be a federal law! Yes, I'm just throwing that statement in there with no commentary other than that. <laughs>

But businesses very often get stuck with the compliance paper chase. And, you know, none of the policy rules you have are actually going to affect the personal data itself; they're not going to actually safeguard anything unless somebody technical gets involved: an experienced architect designs how personal data flows and how the various components of the system you're creating will be structured, and the privacy engineer looks at the code to make sure everything about it accounts for how data is shared and how the algorithms use it. None of this is going to be solved by the GRC function by itself; it has to be part of a formal privacy-by-design and privacy engineering program.

We've been talking about this for a long time; the GDPR requires the 7 privacy-by-design principles to be considered. It just hasn't really happened enough, and so the goal of my podcast is really to scream from the rooftops, "Here's some advice! Here are some people working in that space! Here are some privacy technologies to consider!" - for technologists and innovators bringing new products to market, to make sure that they're building with privacy and safety by design and by default.

So given that - and I know we don't have much time left - I have this question and then one more for you: what advice would you give to the builders of immersive products and services, and you know, what would you urge regulators to consider when regulating the metaverse?


Kavya Pearlman  46:37

For sure. I think that's a very large question, but I'll try to sum it up in maybe three points. My very first piece of advice is: adopt The XRSI Privacy & Safety Framework. Within this framework, we are capturing the very things that, back in the day as a Head of Security or Information Security Director, I would hand over to a Senior Director of Engineering or an Application Security Engineer. It literally captures everything, all the way to the potential nuances that must go into the legal landscape (which hasn't really happened yet for the metaverse). That's why this framework is quite unique: it provides guidance from the developer level all the way to the regulatory aspects.

My second recommendation is to learn about the technology. Let's say you're not an XR Engineer, or you're not really working in the metaverse field just yet. Well, my advice to you is: get busy learning about it, because this is where the world is moving. Whether you're a privacy professional or a cybersecurity professional, if the next iteration of the Internet is 3D and immersive, then we'd better get a handle on what we're talking about here - starting with basic terminologies, all the way to plugging into some kind of bootcamp or taking some professional coursework.

And my final recommendation is to take part in these interventions. One that is being led by XRSI is Metaverse Safety Week, which is in December. With this intervention, year after year, we're trying to bring different stakeholders - regulators, lawmakers, technologists, big tech organizations, etc. - together to talk about how we are going to safeguard these converging technologies. So if you're listening and interested, that's where I would direct your attention. Go listen to, or be part of, Metaverse Safety Week - 10th to 15th of December - and figure out where in this ecosystem you fit in.

Even as a bystander, you want to be aware of your rights. You want to be aware of the consequences that your children, or you yourself, might have to face. It could be because you are not white, or are a minority, or, you know, because your data is now being captured by your company because they are using XR devices for training. These things lead to many unintended consequences, and that's where we're going to be discussing all of that. So those are my three recommendations, Debra. I know there's a lot more, but that's where I direct my energy: mostly to the Framework and to Metaverse Safety Week, and essentially encouraging everyone to learn about this new domain.


Debra Farber  49:40

Amazing, amazing. I just want to double click here on Metaverse Safety Week. It's 5 separate days of really wonderful content from some real pioneers in the space, including some regulators - and in a moment, maybe you could highlight a few people you'd want to call out, and tell people where to go to learn more about Metaverse Safety Week. I do believe it's free for anyone to attend - correct? Exactly. Yeah. So this is literally for anyone. If you want to just learn about, let's say, human rights in the metaverse, I believe that's the first day of the Metaverse Safety Week conference. So you can tune in for one day, two days, up to five days; there's definitely different content for each day of the week. And I hope to see a lot of you there. If you're looking to sponsor Metaverse Safety Week, there's definitely opportunity there. XRSI can't continue to do its work without the amazing support from sponsors - which we have, but we'd love to add more organizations so that we can make a stronger community. I mean, you know, be part of this!


Kavya Pearlman  50:53

Yeah, I definitely want to give out the Metaverse Safety Week website: MetaverseSafetyWeek.org. And really, my tremendous gratitude to these amazing individuals, professionals, and regulators. We will have Congresswoman Lori Trahan, who is literally involved in children's online privacy reform, delivering a keynote on Day 3. We will hear from the Human Rights Commissioner of Australia, Lorraine Finlay, delivering a closing keynote on Day 2, the human-rights-in-the-metaverse focus day. And we actually have an astronaut, Sian Proctor, the first African American woman to pilot a civilian spaceship into space! I mean, she's such a hero, and I can't wait to meet her in virtual reality while she delivers the opening keynote on Day 2.

We'll hear from the UK's NHS, and we have the Head of Advisory at Cisco speaking. We have folks from the U.S. Air Force. We have AR pioneer Louis Rosenberg who, as most of you in XR know, was at one point a Chief Scientist at NASA and is still a pioneer and advisor today. And we have you, Debra, making a remarkable announcement around a new Shared Responsibilities Model for immersive experiences. We even have a few regulators from the UK and from Australia - of course, Julie Inman Grant, another superhero of online safety. I mean, the list is endless.

Go to MetaverseSafetyWeek.org and check it out. You can do so many things: you can show up and register as a participant; you can update your social media profile pictures to demonstrate support; you can sponsor; you can be a community partner. But most definitely, you can talk to people who may be impacted by this directly or indirectly. It's an opportunity for all of us to come together and discuss what the impacts are going to be and how we're going to stay ahead of them. So thank you for mentioning it.


Debra Farber  53:03

You're very welcome. You mentioned it is going to be held in a virtual reality space - because, you know, how could you have a metaverse conference without actually interacting in some sort of VR space? And it's free, and it's an opportunity to actually check out some new platforms and technology. Kavya, thank you so much. Honestly, it was my pleasure to interview you on my first podcast episode. I am absolutely positive that all the listeners will find value in what you've shared with us today, and I wish you all the best. I love working with you, so I know I'll see you soon. But how can anyone else reach out if they want to get in touch with you?


Kavya Pearlman  53:43

Thank you, Debra. Such an honor to be your first guest on the podcast. If any listener wants to reach out to me, I'm at Kavya@xrsi.org. Or just message Debra; we're already very connected - we have each other on Signal, etc. So, thank you again for allowing this brainstorm on how we address these technology convergences and the risks that come with them.

Congratulations again. Shift privacy left! I already love it!


Debra Farber  54:27

Well, thank you so much! And, until next week, everyone. We'll be back with great content next Tuesday. Talk to you then.

OUTRO

Debra Farber  54:31

Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, https://shiftingprivacyleft.com, where you can subscribe to updates so you’ll never miss an episode. While you’re at it, if you found this show valuable, go ahead and share it with a friend.

And, if you’re an engineer who cares passionately about privacy, check out Privado, the developer-friendly privacy platform. To learn more, go to Privado.ai.
 
Be sure to tune in next Tuesday for a new episode. Bye for now!
