Tech'ed Up

The State of Tech Policy • Dave Barmore

July 11, 2024 Niki Christoff

Runway Strategies Co-Founder Dave Barmore joins Niki in the studio for a deep dive into the wave of tech regulations coming from the states. They talk about gridlock at the federal level and why tech needs to pay attention to the states when it comes to pending policy regulations facing AI, data privacy, and social media.

"In terms of bills getting passed and laws regulating tech, whether that's AI, data privacy, social media, don't sleep on the states." -Dave Barmore

Transcript

Niki: I’m Niki Christoff and welcome to Tech'ed Up. Today's guest in the studio is fellow tech policy nerd and Washington consultant Dave Barmore. We're in the midst of a remarkably unproductive Congress, one that has passed a historically low number of bills even for an election year.

Given that, I'm talking to Dave today about what's happening in tech policy across the United States, where legislatures are moving to pass laws addressing privacy, AI, and social media. 

Dave: Thank you for having me. Excited to be here. 

Niki: Thank you for coming on the show, Dave. 

Dave: It's been so fun to watch you do this over the years, and it's surreal that I'm finally in the studio.

Niki: I know, especially since you live only a few blocks away. [chuckling] I feel like when I first started, you were like, “It's so weird to hear your voice on Spotify.”

Dave: I do my dog walks, and I have my, my list of podcasts that I listened to and Tech’ed Up is one of them.

Niki: Also, shout out to Dave's mom, who listens to this podcast. 

[both laugh] 

Niki: Does my mother? [cross-talk] Yeah, I know. Does my mother listen? She does not. Your mother does. [both chuckling] She's bringing Nebraska to the - she's so nice. 

Dave: She's very plugged in. 

Niki: She's super plugged in, and she was excited you were coming on. So, we'll get right into it.

Dave: Let's do it. 

Niki: Before we start talking tech policy, though, I want to talk a little bit about your career and how we know each other. So, you were one of Uber's very first policy hires. And I think people forget because Uber is so, y’know, ubiquitous now. It was illegal slash not legal 

[both chuckling]

in many states when you started there. And I think you oversaw, what, 15 specific laws in like 30 states that you were working on to get it legalized.

Dave: Yeah. So we go way back, as you mentioned; we were both in the trenches together at Uber. I was always based out of D.C., but as you did during your time there, I wore many hats. At the beginning of my time there, as the company expanded into new markets across the US, I would work with state and local policymakers to help them understand how ride-sharing was different than traditional taxi or black car livery services.

So that's really how I cut my teeth on state-level engagement across all 50 states, and I've really leveraged that network now at Runway Strategies, my boutique consultancy, to help clients engage at the state and local levels.

Niki: I try to explain when I send clients your way, because I don't do state public policy work. When I send folks your way, I say, like, “When you have a specific statehouse, there are the couple of local lobbyists who know everybody, whose kids play softball with everybody else's kids, and they can tell you sort of the background of what's happening and what's going to make it to the floor.”

When you have 50 states and multiple cities and a ton of legislation moving, other than trade associations, you're one of the few consultancies that I know of that actually manages that monitoring and engagement. 

Dave: Yeah, we're in D.C., so there are many, many, many firms that specialize in federal engagement.

But I think we are one of a few select groups that really hone in on that kind of multi-state, 50-state engagement. We have a network of local strategists and lobbyists that have those deep relationships with legislators. But I really think, before you get to that step of needing those lobbyists, we help our clients understand what states you need to prioritize, [Niki: mmhh] right?

Whether it's the big obvious ones like California and New York, or maybe it's ones like Wyoming that have been, I know, out there on a lot of crypto issues. [Niki: totally] So, I think: understand and strategize about which states you should be prioritizing.

Niki: There's definitely sleeper states, [Dave: yeah] states people might not obviously think of.

This is a good pivot into what's happening in Washington, which maybe then lays out what's happening in the states. We're in the middle of a sort of, I mean, we're always known for gridlock, but it's, it's, it's super bad right now. 

Dave: We're reaching historic levels of gridlock, I fear, in D.C. in many ways. It helps my case as I go to clients and say, “I'm not saying you shouldn't pay attention and monitor what's going on in D.C., but in terms of bills getting passed and laws regulating tech, whether that's AI, data privacy, social media? Don't sleep on the states. I think you need to really be paying attention to states that are moving at light speed on all these big-ticket items.”

Niki: And I want to go through those. I will say for Congress they have had a couple of bills that I thought like, “Oh, this is totally dormant. Nothing's going to happen.” Like TikTok - that was signed into law!  

Dave: Yeah. That was, I think, a surprise to many, um, but I think that's one example. Y’know, the White House executive order on AI. I think that really set a good foundation that a lot of the federal agencies are now looking to implement. Unfortunately, I think since April, there hasn't really been [chuckling] much momentum. You've got Republican leaders on the House side, Steve Scalise, coming out saying that he will not support any further AI legislation at the federal level.

I think we can expect very little developments here in D.C. going into the election. 

Niki: And we're just right on the brink of complete frenetic, deranged election season [Dave: campaign season] Sorry. Yes, campaign season. [Dave: Campaign season. Buckle up] Buckle up. 

Okay, so let's look at the states. 

Dave: Let's look at the states.

GDPR was really a global first, right, in setting that kind of standard. Following that, California was the first state, back in 2018, to pass the California Consumer Privacy Act, which went into effect in 2020, and that really started this, this cascade [chuckling] of states moving on their own bills.

So fast forward to today: you now have 20 states that have passed their own comprehensive data privacy bills, and so you have what's called a patchwork of state-based laws. And it's been interesting, because there are arguments on both sides of whether this kind of patchwork of state laws is a good thing or whether it's just a huge compliance and administrative burden for companies.

And if you think about tech being so ubiquitous, like, how realistic is it to think that your data should be treated differently, just crossing, y’know, geographic, like, state lines? I've seen some arguments in favor of this patchwork saying that, “Y’know, we need to be able to move at a quicker pace, and states allow that kind of innovation and policymaking to occur.” 

Just this year alone, we've seen seven states pass their own laws.

How are these laws going to be enforced? I think the big, sticky issue is what's called a private right of action, [chuckling] which I'm sure you've heard a lot about. At the federal level, that's being widely debated. So what that does is allow consumers to bring a private lawsuit against companies for violating the laws.

And so, I think that is a huge sticking point for the industry. And you'll see industry pushing back on a lot of these private rights of action. Vermont passed a state privacy bill; it passed both chambers and went to the governor. The governor ultimately vetoed that bill, citing concerns about the private right of action and its impact on businesses, small and large, in the state.

Niki: Okay, so there’s a lot that I want to unpack there, and I know we're just totally nerding out, but many of the people who listen to this podcast are tech policy people. [Dave: yeah, yeah] So, the private right of action: I think a lot of people, including staffers on the Hill, don't understand what that means, but essentially you give consumers a right to sue.

But what actually happens in practice is they are certified as a class, there's a class action lawsuit, it is a total boondoggle for trial attorneys, and then you get a check. 

I actually just got a check for, like, 2 dollars and 55 cents in the mail. I didn't even realize I was a member of this class. [chuckling] I must've signed up six years ago, right? So my damages were nominal, like, nothing, and then lawyers are getting money.

The idea is this can enforce laws, right? You violate the data privacy law, then you can get sued. There are, however, other ways. You can see my bias here. There are other ways -

[both laughing] 

Niki: Subtle! 

There are other ways you can enforce these laws. Like state attorneys general, right? We have enforcement agencies. That is actually their job to say, “Hey, you're, you're not compliant with the law. We're going to bring a lawsuit against you.” We're - and I know that many of those offices, certainly at the federal level, are under-resourced, but it's a real burden for small businesses, right? 

Facebook has, what, 600 in-house lawyers? Google, where I worked, they can handle this incoming litigation; if anything, it's sort of a competitive advantage. But if you're a smaller company and you suddenly are getting sued because there was a breach or you mishandled something, that's incredibly onerous.

So anyway, I'm against it. [chuckling] 

Dave: No, I think all those points stand. Again, the counterargument on the attorneys general side, as you pointed out, is, y’know, even at the state level, right, attorneys general's offices are highly under-resourced and won't have the capacity to carry out those, um, enforcement activities. A lot of the work that we do at Runway is making sure legislators understand that, as a lot of them are going after the big tech platforms, they're not thinking about the downstream effects of how this could impact, y’know, earlier-stage startups that are trying to compete with the large platforms.

And so, I think a lot of our work at Runway is trying to coalesce support amongst smaller startups and making sure they have a seat at the table as well.

Niki: Right. Well, and you can have a standard that is, y’know, something - we've been at a startup that, I don't want to talk about the standard of compliance we had.

[both laughing]

It was sort of defiant. 

Dave: Yeah, that's one way to put it. 

Niki: But I do think that they have different abilities to be compliant and they can be building compliance programs. But again, they're in the just starting stages, so you want to foster that competition because obviously antitrust is another big issue here. [Dave: Right] 

But I do think another thing we talked about is deletion, right? I'm truly frustrated; I'm kind of a privacy hawk. I want to delete more of my data online. I have gone webmaster by webmaster and deleted things before. [Dave: Yeah] And yet, when we were at Uber, people would talk about data retention: “Hey, you're keeping my location data indefinitely.”

And then we would have to explain: right, but if a crime has been committed, we need that. If there's an audit of a business using Uber, we need those literal receipts. And so, sometimes it's a good idea in theory. But again, the unintended consequences of overly broad deletion policies can kind of break the apps.

Dave: Right. And I think there are states that are moving on how to streamline that process for consumers so they're not having to go through entity by entity and making those requests. 

Niki: BlockShopper. I'm coming for you. [Dave: laughs] What's their deal? Why will they not delete stuff?

I don't even know who they are! [Dave: laughs] Anyway, that drives me crazy that my stuff is public. 

Dave: Uniformity across these state laws, I think, is really key, and I want to call out one group that I've always found very interesting.

So, the Uniform Law Commission. We're completely nerding out now.

Niki:  I actually don't know - yes, continue! 

Dave: They're a group of mostly volunteer attorneys, and their sole goal is to work with state legislators to help them create more uniform laws. Harmonization, definitions, how we're defining these different types of sensitive data, like, that's all going to be really important.

Niki: And we're talking about the business side, industry, but also about people: like, your privacy protection shouldn't really depend on what zip code [Dave: yeah] you're in, or your being able to understand the basics of what your rights are.

Dave: Yep. 

Niki: You know, sometimes if you go to a website, it'll say, like, “Here's the policy for everyone. But if you're a California resident, we have this extra box you can check because you have just a little more protection.” 

Dave: Right, right,  

Niki: And regardless of what I think about California's law, which, I think, was drafted hastily and is overly broad.

But why in California do you have different rights? 

So, the idea is, what you're talking about is, it's kind of like going from grassroots up [Dave: yeah] to get a standard, because we're not able to get top-down.

Dave: And let's not forget, state legislators have, y’know, conferences. They go to the National Conference of State Legislatures. They have big conferences where thousands of state legislators descend upon, y’know, cities like Miami, and they become notorious as, y’know, spring break for state legislators. [Niki: laughing]

But when they're not partying, they're also talking amongst each other and learning best practices, learning from states that have passed these laws regulating tech. How did that process go? How did they work with industry? How did they make sure to strike a balance and not stifle innovation?

You're on a national stage when dealing with state legislatures because of that reality that they're all talking to one another.

Niki: Right. And they are addressing a need and desire of the voters. So I ran a poll at the end of last year of a thousand likely voters, and when asked, “What areas do you think the government should regulate more?” privacy was at the very top. Online privacy. People are frustrated, and we'll get to social media, which I think is driving part of that.

But before we do. It's 2024. Let's talk AI. 

Dave: AI. An issue that, y’know, just even, what, two to three years ago wasn't at the forefront of, y’know, the general population or [chuckling] state and policymakers as well.

Niki: Congressman Will Hurd was one of the first guests on this podcast. He founded the AI Caucus. He was on the board of OpenAI. When I tell you he could not get people [chuckling] to pay attention, that was two years ago. 

Dave: It's crazy. And look, just this year alone, we've had over 400 bills introduced at the state level looking to regulate AI. [Niki: laughs]

400 bills in 40 states across the country. So, that just shows you the sheer amount of bills. We talked about attorneys general's offices not being resourced; well, a lot of these state legislators are part-time. They have very, very minimal staff.

And so, I think it's important for those in the industry to think about simply educating policymakers about not only the potential harms that could come from technologies like AI, but also what good could come from them, right? I think a lot of that is lost in the debate. And so, I think it's about having those kinds of conversations, educating a lot of these legislators who just have no clue about the technology.

Niki: Yeah! It is panic at the disco on this technology. And I think it's because the zeitgeist has embraced the robotic idea of Terminator. It's going to take jobs. It's going to make kids dumber. It's going, people won't be able to write. 

And then we miss things like: your MRIs are going to be much more accurate. There are so many. I think when we talk about efficiency, it's like talking about the national debt; nobody knows what it means to increase [chuckling] efficiency, but in our day-to-day lives - I don't know if you use AI at work at all?

Dave: I do!

Niki: I do, too, for things that are not my core competency, right?

That are just an administrative task that I can use AI to help with. I think education, and also just storytelling around the positives, is really missing.

Dave: I want to focus a little bit on some common threads across state AI regulation. Interestingly, this year Colorado was the first state to pass a comprehensive, risk-based AI framework. And this follows the EU passing their historic EU AI Act, which is now in its implementation phase. Similar to how California passed the first privacy law, Colorado is the first state here, which, little-known fact, Colorado was actually the first state to pass rideshare legislation.

Niki: I did not know that.

Dave: Everyone thinks it was California, but they passed it through the agency, through rule-making. So, Colorado was actually the first state to pass by statute.

Niki:  I did not know that.

Dave: Now, from ridesharing on to AI. I want to point something out: a lot of these state bills pass, and then they have an effective date.

So we're looking at this Colorado AI bill, which, to explain to the listeners at a very high level, puts in place safety protocols for developers and deployers of high-risk AI models: rules of the road for how those companies need to operate, and different disclosure requirements for consumers so they know how their information is being used in highly consequential decision-making, all of that.

The effective date for this Colorado bill is not until, I believe it's February 2026. 

Niki: Oh, wow!  

Dave: So, just think of the amount of advancement in the tech by then. It's gonna be a real challenge for these legislators to keep up with the speed of innovation.

Niki: So, not to go back to privacy, but take Texas’ privacy law, which I actually really like. I think it's a good, good law.

Dave: Goes into effect next week. 

Niki: Exactly. It's just about to go into effect.

I think it's a great law, but it does take time, because they need people to get prepped. But to your point, 2026? [Dave: laughing] I mean, I don't know. We're all sort of living in fast forward at 2x, 2.5x. So who knows what will happen?

Dave: Colorado's already committed to revisiting this during their 2025 session.

Move west to California, where a fascinating process is playing out. California, of all the states, has the most bills being pushed on AI: over 50 bills have been introduced. Some bills have garnered a lot of headlines, one by state legislator Scott Wiener looking to put into place kind of safety protocols for how these large language models can operate.

I think you've seen a ton of pushback. With California, it's so interesting because you have such a progressive state that's out there in the interest of, y’know, different union groups. You saw the drama with Scarlett Johansson. 

Niki: Yes. I'm on ScarJo's side with that, by the way!  

[both chuckling]

Dave: But you've seen, you know, involvement from organized labor, ensuring that their interests are heard with this debate. 

Niki: Including writers and actors! 

Dave: And then you throw in the fact that, I think from what I saw, five of the six largest AI companies are based out of California. [Niki: laughs]

So you see this huge grandstanding from all the industry players saying, “Y’know, if this law were to pass, we'll potentially have to leave the state, and, y’know, there's nothing holding us to having our headquarters be in California.”

I believe they tightened up some of the language to address the threshold application of the law, so that it wouldn't impact some of those smaller players and would just be targeted at the ones that are computing over millions and millions of data sets. So, uh.

Niki: Well, I know I'm offering a lot of opinions, which, in your business, you are probably not going to do [Dave: laughs], but I just feel like you're not really supposed to write laws to target individual companies. They're supposed to be generally applicable, and if a law doesn't work as generally applicable, you should rework the law, not just single out the big companies.

I think we could build in more sunsetting of laws, where you are forced to revisit them. And with AI, that to me seems really practical, because we know that we don't know [Dave: right, right] what the tech is going to be like; it's the known unknown or whatever. [chuckling]

So why not make it, once it goes into effect, just have a period where you're going to assess, like, “Did this break a bunch of stuff? Did it hurt small and medium businesses? Is it actually causing harm? Is this a harm we anticipated, but there never was actual harm?” Which I think a lot of these bills are, are built around.

Dave: I think even back a year or two ago, when we were just starting to fully grasp all the capabilities of generative AI, you saw a lot of states looking to implement task forces and, y’know, do study committees. States love to do a study committee. It shows that they're thinking about the issue, but they're not really implementing, y’know, the full rule of law on the issue [chuckling].

Colorado was significant in that it was the first actual comprehensive bill to regulate the technology. But I think, to your point, they're going to have to address, whether it's through a sunsetting clause, the speed at which these systems will advance and develop, and whether even state-level policymaking can keep up.

Niki: Yeah. So AI, obviously, 400 bills, you said, just this year. [Dave: Yeah] We'll see more next year, I'm sure.

Dave:  I'm sure. 

Niki: Okay, let's go on to last topic, social media. And specifically, you said this before we started recording: social media is really right now focused on kids. 

Dave: Children's online safety. So, I'd say AI is, like, the hottest issue.

I would say children's online safety is just below that. A ton of activity going on in D.C. You think back to, was it just earlier this year, they had the Senate Commerce hearing where they had Zuckerberg and all the different tech executives.

Niki:  He got up and apologized to people's parents.

Dave: It was, y’know, [Niki: awkward] all over the news. 

[both laughing]

Niki: Yes. Sorry. Yes. All over the news. 

Dave: You saw that. And that was around a federal framework, the Kids Online Safety Act. So again, that gets to how companies are using children's data. There's already a federal law in place, COPPA, which regulates data for users under 13. What do we do for those that are between 13 and 16 and 18 years old?

So I think you have states that are moving on their own statewide frameworks addressing how companies are targeting children, users under 16, and, in some states, this notion of addictive feeds, addictive social media platforms. Really interesting just getting to: how do you define what's addictive, right?

I think you've got the Surgeon General coming out saying that, y’know, there needs to be something similar to decades ago, when they implemented the tobacco disclosures. Like, does there need to be a similar -

Niki: Like a skull and crossbones? [Dave: chuckling] Your brain on social media? 

Dave: New York actually just adjourned a couple weeks back, in early June. But right before they gaveled out, they passed a bill called the SAFE for Kids Act. And again, this goes toward “How can government look to regulate these addictive social media platforms?” What you saw from industry was just a need to really think about how you're defining these terms and the scope.

To your point, like, how are they thinking about what types of companies are impacted? Just because the company uses algorithms to, y’know, enhance your experience on the app, does that mean that they're addictive?

Niki: I have four teenage boy roommates, 13, 13, 15, 18, and I'm constantly polling them on their online use. [chuckling] 

I know you're laughing cause it's weird I call them my roommates, but they are in fact my roommates. They are not my kids. 

[both laugh]

The 13-year-olds use Duolingo, which is gamified, and they're learning German. So they get a little note saying, “Hey, you've been streaking for 85 days. Don't forget to practice your German today.”

Duolingo is not a big company. I can't think of anyone who thinks that's a bad use of a notification on a smartphone to remind them to practice their German. 

On the other hand, the 15-year-old, who sometimes listens to this podcast [both chuckle], is completely addicted to TikTok. And TikTok is extremely sticky because it doesn't start with what you're following. It starts with a recommendation page that it knows you're going to like. 

And so, I mean, [whispering] those kids are spending hours and hours on TikTok. They are on Instagram and the content they get is very different than the content I get. And then, they're using Snapchat which allows them to see where their friends are and also is gamified for engagement. They want to streak with their friends. 

So it's this whole suite of apps; they're online all the time. Tons of YouTube videos, especially for the younger ones. And to your point, how do you know which part of it is harmful? That is how they socialize now, right? There's not a group of kids coming down the street, grabbing them, and then they go out. Like, they find each other online. That's how they socialize.

Even the actual algorithm itself and the notifications and the gamification isn't always quote-unquote bad or harmful, but sometimes it almost certainly is. 

Dave: All of that's correct, and I think a lot of the debate has been what role does government play in enforcing and regulating this type of activity and, y’know, how do parents play a role in this, right?

There's also constitutional concerns. You'll see industry pushing back, saying that systems such as age verification are actually infringing on young users’ First Amendment rights.

Niki: I don't; I just scoffed because they can also do math. [Dave: laughs] They can tell how old they need to be to be 18. [chuckling] They can subtract from 18. 

Dave: That is the counterargument, but I think you're seeing a lot of lawsuits being filed in federal courts all across the country after states like California, Utah, Arkansas, and New York have moved.

Industry's kind of prepping for how they want to respond there. I think it's kind of an interesting debate to have, not only at the state level but at the federal level, with KOSA and the different proposals that are being debated here in D.C.

Niki: It's interesting. We've heard TikTok talk about the bill where they're going to have to divest, and they say it's a First Amendment issue. And actually, the First Amendment standard is that the government can restrain speech in very limited ways if you have other places to speak freely, which I think is kind of obviously true here, but we'll see what happens with that case.

Dave: And I think you see a really interesting argument from a lot of civil liberties groups saying that a lot of, y’know, LGBTQ users need the community that they can find through these different social media apps, and, unfortunately, by involving parental consent rights, that could complicate how some vulnerable communities are able to find that refuge.

So, I think it's a hugely complicated issue, but I do think states are finally looking to address it. And so we'll see how it plays out. There's still plenty of time before these become the law of the land.

Niki: AI, social media and kids' protection, and privacy. These are the biggies.

Dave: Watch that space. Only nine states are still in session. A lot have gaveled out for the year. They're going into what is considered off-season. So they're prepping for 2025.

They're going to these conferences. I think Kentucky is the one in August for NCSL. They're going to be meeting with other lawmakers to, y’know, [Niki: drink bourbon] drink bourbon [Niki: chuckles] and hear, y’know, how'd the session go? What lessons did they learn? And then you'll really start to see lawmakers introduce bills in the fall, and, y’know, come January, when almost every state goes back into session, they have that sprint legislative working period.

Don't sleep on the states. We're already thinking about, y’know, a world post-November with the election.

Niki: Well, I appreciate you coming on. I think that you do have this niche specialty, which is not just that you monitor the 50 states from D.C., but that you do it exclusively for tech companies, which my business is also: exclusively tech. It's a little rare in D.C., but I send people to you all the time for this.

Dave: And this has been one of the best parts about the consulting space is getting to work and partner with groups like Christoff & Co. I'm honored to have been included as one of your trusted partners on your website. 

Niki: You are one of my trusted partners! I'm always sending people to you. I'm like, “I don't know what's happening in Tallahassee.  Go ask Dave Barmore. He'll know.”

Dave: It's been fun. So thank you for having me on today. I hope this was helpful for your listeners. 

Niki: Yes. Thanks for coming on.