AI50

Data-driven Policy and Innovation with AI

May 22, 2024 | Hanh Brown / Daniel Castro | Season 5 Episode 210

In this podcast episode, "DATA-DRIVEN POLICY AND INNOVATION WITH AI"
join us for an enlightening discussion with Daniel Castro, Director of the Center for Data Innovation. We will explore critical topics such as data governance, privacy policies, and the transformative power of data-driven innovation. Daniel will share valuable insights from his distinguished career, providing behind-the-scenes stories of impactful CDI initiatives and offering strategies for balancing innovation with public trust.

Listeners will gain a deeper understanding of data policy challenges, learn how to foster productive collaborations, and explore emerging opportunities across various sectors. This is a unique opportunity to stay ahead in the rapidly evolving landscape of data and technology. 

๐ŸŽ™ AI50 Podcast

๐Ÿ“น Want to receive our videos faster? SUBSCRIBE to our channel!

๐Ÿ‘‰ Visit our AI50 website

๐Ÿ‘‰ Schedule a demo

๐Ÿ“ฐ Receive our weekly newsletter

๐Ÿ‘‰ Follow Hanh Brown on LinkedIn

๐Ÿ› Follow AI50 Business Page

Find Daniel on LinkedIn: https://www.linkedin.com/in/danieldcastro/

Transcript


Hanh: 00:00:00
Hello, I'm Hanh Brown. Welcome to AI50, where your data meets innovation. At AI50, we skillfully deploy cutting edge AI technology to create large language model apps. These apps are tailored specifically for the needs and preferences of the 50 plus demographic. Our team of experts works, uh, diligently to understand the unique requirements and interests of this age group. And by leveraging the power of AI, we develop solutions that cater to their specific needs.

Hanh: 00:00:38
We believe that technology should be accessible to everyone regardless of age. So our goal is to empower the 50 plus demographic with tools that enhance their daily lives and keep them connected to the world around them. So, today's topic is data driven policy and innovation with AI, where we'll go into the intersection of data, technology, and public policy. So our special guest today is Daniel Castro. He's the director of the Center for Data Innovation,

Hanh: 00:01:15
a leading think tank studying how data and technology create new opportunities for people and businesses. Daniel's work focuses on the impact of emerging technologies and helping policy makers develop rules supporting responsible innovation. So Daniel bridges the gap between policy makers, tech companies, business leaders, and the public to help everyone understand the benefits and challenges of AI and data analytics. He champions responsible innovation, believing new technology should benefit

Hanh: 00:01:50
society while addressing potential risks. So, in our discussion, Daniel will share his unique perspective on data driven innovation and the future of technology policy. We'll explore his groundbreaking work at the Center for Data Innovation and his insights on fostering a responsible, inclusive future. So Daniel, welcome to the show.

Daniel: 00:02:15
Thanks so much for having me on.

Hanh: 00:02:17
Yeah. Thank you. Thank you so much. So I'm from Michigan and where are you calling from?

Daniel: 00:02:23
I'm calling from North Carolina, Winston Salem.

Hanh: 00:02:25
Okay. Nice. Nice. I know we talked a little bit before the show. We're, we're seeing some great weather here, so I think we both can relate to all the activities that are going on right now. So I, I thank you so much for your time. So to start things off, could you share with us something about yourself that many people might not know? It could be a unique experience or a

Hanh: 00:02:48
hidden talent or an interesting fact that helps us understand what has shaped your journey and perspective.

Daniel: 00:03:00
Yeah, it's a great question. I wish I had more hidden talents, or talents in general. A unique fact about me is I'm both American and Colombian. My father was Colombian, so I grew up in a dual national household. I think having a global perspective on issues, especially around, uh, technology and innovation, is something that's really useful, because you have to realize that there's, um, more than one way of understanding how a technology is going to be used. Um, different countries have different

Daniel: 00:03:29
perspectives on these issues. Uh, different demographics have different perspectives on these issues. I think, you know, having that experience of growing up in a multicultural household, you know, gives you new experiences and new ways of looking at things that are useful whenever you're confronted with new ideas. Um, sometimes you know that you should challenge those ideas and you should challenge conventional wisdom, um, because, you know, you've seen it challenged in your own life.

Hanh: 00:03:57
Well, thank you. Thank you for sharing that. So your work at the Center for Data Innovation is deeply rooted in advocating for data driven innovation and shaping public policy. So can you share with us how your personal journey, your values and beliefs have influenced your approach to these critical issues?

Daniel: 00:04:19
Yeah. So, you know, I grew up as someone who, um, loved, uh, technology from, you know, early stages. I was excited when we got our first, uh, computer in the household. Um, I, you know, uh, enjoyed programming, um, as a child, you know, even before it was kind of cool to be a software engineer, to do programming, right, the way it is now, when I think it's much more respected as a profession. Um, and so, you know, I always thought technology was something that was useful and beneficial for the world

Daniel: 00:04:46
and something that was, that was good. Um, and I think that has shaped a lot of the way I look at, um, technology and technology policy and where I interact with this, because I understand that, one, deeply understanding technology helps you understand how policy impacts the opportunities there. Um, if you have a very superficial knowledge of what technology can do, um, both its potential and its limitations, you can't, uh, you know, create effective laws and regulations around it. Um, and so, you know, seeing technology

Daniel: 00:05:19
and innovation as a force for good, I think, changes the way, you know, we encounter all the new innovations that we see in the world. Um, if you look at it as a force for good, when you see these changes, you're generally optimistic about what it means. It doesn't mean you have to be naive about recognizing that there could be trade offs and there could be challenges. Um, but, you know, generally recognizing that, uh, historically, technology has been a force for good, whether we're talking about, you know,

Daniel: 00:05:48
the printing press, gas lighting, um, and heating, you know, uh, railroads, automobiles, the internet, you know, computers today, and now AI, you know, these technologies had just a tremendously positive impact. Um, you know, the industrial revolution, I think about what it meant for, you know, the average worker and, you know, what their standard of living became, and obviously there were major challenges with improving workers' rights that we went through throughout history. But if you look at this

Daniel: 00:06:15
trajectory, it's been positive. So, you know, when I think about kind of, you know, when I approach new technologies, that's one of the biggest issues, you know, I'm looking at is, you know, is this continuing to be a force for good? And we have a deep history of recognizing that. Um, when it comes to data, though, I'd say, you know, the other part of this is, you know, I, um, I think that, you know, the famous maxim, you can't manage what you can't measure is really important. And data really starts empowering us

Daniel: 00:06:40
to make, you know, smarter decisions, better decisions, faster decisions. Um, and, you know, just whether it's, you know, in our individual lives, you know, deciding things like, you know, wearing a smart watch or a Fitbit and, and, you know, figuring out, okay, I'm going to, you know, exercise differently because I see this information or eat differently or, you know, improve my health, um, using data, um, you know, at the city level to, to, you know, better manage, uh, municipality or deliver services to citizens or at the organizational level. You know, we were talking about, you

Daniel: 00:07:13
know, companies that are really innovative because they can do more with data. I think that concept of, you know, you can't manage what you can't measure is hugely important and underpins a lot of, you know, how I look at, um, policy as well.

Hanh: 00:07:26
Great insight. And what a great time right now to convey that message, right? With AI and the speed of advancement and the hype, the fear, and those are very real, but there's also opportunities for embracing innovation. So that's great. So now, throughout your career, you've navigated some complex landscapes of data policy and technological advancements. So what have been some of the most valuable lessons that you've learned along the way?

Daniel: 00:07:59
Well, I think one is that, you know, progress is really hard, that, um, there's so many barriers to innovation, and that between the point at which a technology or innovation's potential is realized, you know, you can see how it would be beneficial, and the point where it's actually widely deployed in the world and generating those benefits, there's often a huge gap. Um, I think it's easy to look at, you know, some technologies, you know, smartphones, for example, and we

Daniel: 00:08:28
see this rapid, um, adoption cycle. I think a lot of people who study, uh, innovation technology, they might be familiar with the, the S curve of innovation, right? It starts kind of slow. And it rapidly accelerates and goes up quickly. And we've seen that, um, kind of curve be compressed over time. So if you think about something like, you know, radio, you know, what it took to go from 0 percent to, you know, nearly full 100 percent adoption, right? That was maybe, you know, 40

Daniel: 00:08:54
years. Television, um, you know, a little bit more compressed; uh, computers, it gets a little shorter. And then you get to something like smartphones, and you're talking about only a few years, right? It gets so much smaller. Then you have things like AI, where you had ChatGPT. If you look at that as an app, you know, one of the shortest adoption cycles in history: only a few months to reach, you know, a hundred million users.

Daniel: 00:09:18
That, you know, that makes it, I think, look to the average person that, um, change is happening so much faster now. Um, and in some areas, you know, some of these very consumer oriented places where we're just talking about adopting software, yes, change happens fast. When we talk about some of these, you know, bigger areas of life, you think about healthcare being transformed to be more digital. That takes decades, right? We've been investing billions of dollars in the United States, for example, just to

Daniel: 00:09:48
get hospitals and primary care physicians to use electronic health records. Um, the technology has been around since the 80s, and we were just talking about basically computers and digital records, but actually doing this so that it's useful for, uh, the patient, so that people's data is protected, um, so that, you know, providers have an incentive to adopt it and deploy it effectively and responsibly, to make sure it doesn't interfere with quality of care. All of those things have taken a tremendous amount of time, and even now that we are rolling it out, now they're focused

Daniel: 00:10:19
on, okay, now that the technology is there, how do we make sure that we're using it effectively to do things like reduce adverse medical events? How do we make sure that, you know, for example, when a doctor is, um, giving a prescription, if it's going to interact with another prescription, that that's identified automatically by the computer, and that when alerts are being sent to the doctors, we're not sending so many alerts that they just dismiss them, you know, getting to that whole usability side of technology.

Daniel: 00:10:45
So, you know, the lesson here again is just, you know, that progress can be so hard. It takes, I think, a lot longer than we expected and expect in most cases. So many people, I think, predicted self driving cars by 2020. Uh, you know, it's 2024. Uh, we do have, you know, some self driving cars on the road in very limited deployment, but we're still a long way from actually getting it fully out there. I think that's the lesson that we have to learn here.

Hanh: 00:11:10
I agree. I feel like when AI first came out for the general population, there was a lot of hype. There was not a full understanding, and the speed of change was daily. And not all of it was, in my mind, AI, so to speak, right? But now, in 2024, I think people are in execution mode. And as you execute everything that you're describing, privacy, security, transparency are huge. And in my mind, in order to do that successfully, obviously

Hanh: 00:11:42
data is the heart of it. But you, you also want to cultivate an AI ecosystem, because without that, you are really not reaping the benefits of AI.

Daniel: 00:11:54
That's right. And I think one of the things that's so important is when we look at these innovations, I mean, ChatGPT was released as a proof of concept more than anything, right? It wasn't released originally as, you know, here's a service everyone should subscribe to; it was really just showing, you know, the potential. You know, DALL-E, similar thing, to do image generation. And so, you know, it's one of those areas where it's, it's easy to be a critic, right? It's easy to go in and look at

Daniel: 00:12:20
basically a beta product and say, look at all the ways it can be used badly or all the mistakes it makes. Um, and you know, it's true, and it's kind of fine. And you know, there's, there's something, I think, just innately human in us kind of, uh, poking holes in new ideas, um, and being somewhat skeptical. Um, but there's also so many positive examples of how it's useful and how it's an improvement. Um, and I think it's important that, you know, we both, you know, put on that critical, you know, lens and

Daniel: 00:12:51
look and say, okay, how do we make sure that as we're widely deploying this, we address those risks, but also recognize that, you know, we need to not just make sure things don't go wrong, but also make sure things go right. How do we actually make sure the technology advances and is used widely for the, the beneficial purposes that, that are possibly out there?

Hanh: 00:13:13
Mm hmm. I agree with you. Conversations like this, in my mind, put it out there for others to be a part of that conversation, right? Because everybody's concerns are very legit, and they need to hear conversations like this to ensure that, hey, there's adaptation, there's change, there's lifelong learning, there's evolution. And that comes with, you know, kind of a rattling of your current scenario. So you really have to step out of that and adopt this lifelong learning in the age of AI.

Hanh: 00:13:46
Because I don't know how you're going to effectively live without it now.

Daniel: 00:13:50
That's right. I mean, you know, it is increasingly integrated into our lives. You know, if you have a phone, it will be automatically deployed. If you have a computer or a web browser, if you're using, you know, Microsoft Office or any of these apps, um, it will be, it will be part of us. Um, and I think one of the interesting things is, you know, where AI is so beneficial is how it can, um, allow us to, you know, interact with devices and information in an entirely natural way. You know, it's conversational now.

Daniel: 00:14:23
When we talk about, you know, before, it was always, you know, we have to teach people all these digital skills. And really what that meant was teach them to figure out how to work with a computer, instead of teaching the computer how to work with humans, right? And so now anyone, you know, anyone who can speak, who can sign, who can, you know, um, just express an emotion through their face or body language, you know, can communicate with these AI, uh, enabled systems, and I think that opens up so many possibilities for so many people, you know, um, who might otherwise

Daniel: 00:14:55
never really be able to sit down at the computer and maybe do research the way that, um, you know, people who are fully able, who have, you know, um, certain capabilities, can do. Maybe you're not literate and you want to interact with this. AI is just reducing so many barriers to, I think, creativity and jobs and interactivity. And I think that's amazing.

Hanh: 00:15:19
That's, you're spot on. And it really democratizes, it levels the playing field. I don't care where you are in the world, you now have access. I call it enterprise, even, right? Small businesses, whether it's 10 people or 10,000 people, you really have access to the same tools. And for the group that's near and dear to me, older adults, you know, there's the language barrier, let's say disabilities, and so forth. So now there's ways that you can have a conversation with your

Hanh: 00:15:52
phone, your chat application. And all of this is due to the recent GPT-4o release, multimodal integration with their APIs, and so forth. So it's exciting times, you know. It really levels the playing field for everybody. All ages, right?

Daniel: 00:16:12
That's right.

Hanh: 00:16:13
Yeah.

Daniel: 00:16:14
That's right. And there's, there's not that learning curve that I think, you know, we had before, where, you know, you had to kind of wait your turn, right? Because you had to have people coming in and showing you. So many of the people that are adopting this technology now, they didn't have anyone teach it. They just tried it. And once they tried it, they realized they didn't really need a teacher. They could learn just by exploring. So I think one of the really interesting things here is that,

Daniel: 00:16:41
um, you know, anyone who's curious about this can really start now. There's no, there's no barrier to starting to use some of these technologies today for many people.

Hanh: 00:16:52
I echo that. So the Center for Data Innovation has been involved in many projects and initiatives. Can you share a story of one that you're particularly proud of and why it holds significance for you?

Daniel: 00:17:07
Yeah. So one of our early projects was, um, thinking about open data. Um, you know, I started the center around 2014, uh, I guess 2012. And so one of the questions we had was, you know, um, one of President Obama's first executive orders was around opening up government, um, having more transparency in government, having government, um, be more digital and work for citizens. But it was an executive order. And you know, our question, um, that we

Daniel: 00:17:37
kind of posed originally was, you know, might President Obama be the last open government, um, open data president? And, and, you know, how can we make sure that doesn't happen? How can we make sure that, you know, this idea, which started with this administration, um, and had bipartisan support, was fully institutionalized? And so, you know, we looked at what it would take to really move forward and take that, um, executive order and turn it into legislation. And, um, through numerous conversations, um, on the Hill, um, with many

Daniel: 00:18:12
companies who were doing really interesting work around open data, academics, nonprofits, um, we worked, uh, together with a whole, um, coalition to build up, I think, broad support for what became the Open Government Data Act, um, which was passed into law, um, actually signed by President Trump, um, you know, many years later. So you think about the time it takes from, you know, ideas, let's take this idea and really, you know, formally, um, codify this in law, to getting it to something that has now been implemented. And now agencies have this

Daniel: 00:18:43
responsibility of publishing their enterprise data sets. So they say, these are all the data sets we have. Um, there's now standards around making them available. We have chief data officers who are focused on making this data available. And there's just been so much thought into thinking through how government agencies at the federal level, you know, create value through their data and make sure that it's put to useful purpose. And now we've seen that idea, of course, at the state and local level as well.

Daniel: 00:19:14
We've seen many businesses embrace open data. I think it's something that's really useful. Um, and we're also seeing, I think, this, you know, um, the spirit, um, reflected in all the innovation around AI. As you mentioned, AI is, you know, strong and useful when it can be trained on, um, large data sets. And a lot of this data is coming from government. And there's just huge opportunities to continue to make progress in this space. So that's an area, you know,

Daniel: 00:19:39
that we started initial work on. And I'm, you know, delighted to see that it kind of made its way all the way through. Um, and, you know, I hope to see other ideas like this in the future, where, you know, um, there's just so much that I think government can do hand in hand with industry and academia and nonprofits, um, to advance, you know, opportunities to leverage data and to make, you know, data driven decisions.

Hanh: 00:20:05
Awesome initiative. Now, one of the key challenges in promoting data driven innovation is balancing its potential benefits with the need to address public concerns and mitigate risk. So, how do you approach this delicate balance in your work?

Daniel: 00:20:21
So, I think the most important thing is not to let fear dominate. Um, I think some people are probably familiar with the, um, you know, the Gartner hype cycle for new technologies, where, um, at the beginning you have a technology come out and, um, not many people know about it, and it kind of reaches this kind of peak where, um, you, you hear a lot of claims that maybe aren't true. And then finally, you know, there's, they, they call it the trough of disillusionment, and then things kind of level out, right?

Daniel: 00:20:46
And so there's a lot of technologies that I think are overhyped. Um, on the, uh, kind of the, the counter side to that, I would say there's a technology panic cycle as well, where at the early stages of a technology, um, first there's just a few adopters who are, you know, kind of the innovators, and they're excited about it, but at some point, you know, fears about new technology outpace understanding of it. And I think we see that happen again and again with many new technologies. And we actually have done some

Daniel: 00:21:13
reports looking historically at various technologies and how they went through these cycles. Um, even things that, you know, I mean, now we look back on, you know, motor vehicles. They had these red flag laws, where, um, some jurisdictions said, um, because cars are so dangerous, the solution is going to be, um, before you drive a car, you just have someone walking in front of the car waving a red flag. Um, that's, you know, not a scalable solution, right? Imagine if every vehicle had someone

Daniel: 00:21:38
walking in front of it, waving a flag. Um, you know, there were concerns about, um, you know, the printing press. And when we started moving from the printing press to, um, pulp fiction books, people were saying, well, do we really want anyone to be able to publish their ideas? Do we want anyone to have access to pamphlets? What would, you know, those kind of pulp fiction books do to the minds of youth? You know, they're going to be reading, um, just, you know, these, uh, pure sci fi and fiction

Daniel: 00:22:03
things instead of the, you know, uh, esteemed, you know, historic, uh, English writers, for example. Um, so, you know, we've seen so many of the same fears, you know, fears about children, fears about impact on society, um, fears about, you know, the harmful effects, you know, people concerned about, um, you know, uh, when gas was used for lighting, the noxious fumes in people's homes and what that would do to their health. All of these same fears that we have about technology, um, you know, radio waves in our brain and what that

Daniel: 00:22:34
might impact, some very similar things. And so I think it's really important that we don't let fear dominate. It's also important that we be guided by evidence. So, you know, the point isn't to say that, you know, some technology might not be harmful or some technology might not have problems. It can, and we should always investigate that and look at that with an open mind, but we can't be motivated primarily by fear, um, because when we see fear being the dominant emotion, we tend to end up with, you know, bad laws

Daniel: 00:23:06
and regulations that hold us back. And in this case, particularly I think around AI, There's just so much opportunity and promise. We don't want to see that held back. We don't want to see, you know, the opportunities to use AI to, you know, find a new medical, uh, cure held back by 10 years for someone who needs that. We don't want to see AI that can be used to address loneliness or provide, you know, opportunities for people to age in place. We don't want to see that delayed by 10 years because that will affect so

Daniel: 00:23:35
many, you know, people who could use those benefits now. And so I think we have to, you know, continue to iterate and evolve and track risks and challenges, but not let the perfect be the enemy of the good. In these cases, look at opportunities where the technology is often not perfect, but maybe it's better than what we have today. And so, you know, use that as a metric for deciding, you know, when we move forward and when we slow things down.

Hanh: 00:24:01
Yeah. And you know, I love that what you said, don't let fear dominate, because let's say if we did, we might not be driving cars or flying right now; we'd be using horse and buggies and so forth. So we really need to adopt innovation, you know, with enthusiasm and caution. And I think the key is to be a part of the conversation and learn and grow and build trust in AI. And I always recommend for folks, when trying to use AI, to start small, you know,

Hanh: 00:24:32
with something that perhaps is so repetitive, uh, such a nuisance, that they would just love to not deal with it. So start small and gain confidence, and then you can go further from that.

Daniel: 00:24:47
That's a, that's a really great suggestion. Uh, and I wholeheartedly agree. I think there's, um, these opportunities to use AI to do something, um, that you don't like to do, right. And, you know, that's one of those things where if you use it in ways that make your life better, you know, that's, you know, it's a win, and it's always going to be a win. One of the things I use it for is, um, I like to have help writing messages to people to make

Daniel: 00:25:18
sure I phrase things, you know, kindly, appropriately. Um, and I love using the technology to just quickly, instantaneously run through an email and say, is this the best way to say this? And you get that answer. And it's, it's incredibly useful.

Hanh: 00:25:38
And you know, I think what AI has helped me with, for instance, like I would ask it a couple of questions, but what it uncovered for me is a whole wealth of knowledge that I didn't even know to ask about. Do you, you know what I mean by that? Yeah. Um, so it really, it doesn't cause you to become lazy or dependent on it. What it does for me, at least, is it improves my critical thinking in ways I didn't have before. Right. So I think it's wonderful.

Hanh: 00:26:05
So now, as a leader in the field of data policy, I'm sure there are times when you had to navigate challenges to build trust, perhaps with somebody that didn't trust you, the policy, or the data that you were using, and you had to take an unpopular stance on an issue. So can you discuss a situation like that where you faced this type of challenge, and what did you do? How did you approach it?

Daniel: 00:26:30
Yeah. So there was, um, I think early on, especially when I was focused on, on data policy and data driven innovation. And, you know, at the time, this wasn't something that many people were thinking about. You know, this is kind of pre the data science boom and big data boom. Um, and whenever people would want to talk about data, they wanted to talk about privacy, and one of the important points that I was trying to emphasize is that we need to go beyond the privacy conversation in policy around data.

Daniel: 00:26:59
We need to be thinking about data, um, not only, yes, how do we protect data, but also, are we collecting enough data? Is there more data that we should be collecting? Is there better data that we can be collecting? Who is being left out of the data collection? Um, how are we making sure that the data is used well? And also, you know, really emphasizing this point that in many cases, there are trade offs between privacy and innovation. And that is something that, um, a

Daniel: 00:27:22
lot of people who were advocating for privacy did not want to acknowledge, did not want to, um, you know, uh, have as part of the debate, because nobody wants to be against innovation. They weren't necessarily against innovation. They were just for privacy. Um, and you know, I, I would freely admit that sometimes being pro innovation means there's trade offs with privacy, um, I'm kind of saying you might get less privacy. Um, but that was a very unpopular opinion because, you know, um,

Daniel: 00:27:52
you know, in some ways I was making the argument that effectively some groups were against innovation. Um, and so, you know, I, I think what's important there is that, um, in this debate, in this conversation about, you know, where policy should go on these technologies, we get really clear on what it is exactly that we want, because in many cases, um, you know, people were pro privacy, but they were also pro health care, right? You know, they were also pro education. Um, and so what we, I think, should be able to agree on is that, you

Daniel: 00:28:30
know, if the goal is, for example, to improve education, public education through data analytics, through better understanding of how students learn, um, that should be the goal. And we should focus very practically on how do we get there. And you know.

Hanh: 00:28:47
It's not one or the other, right? You don't have to choose. You can collaborate with all the key stakeholders to achieve a common goal. So it's another attitude shift, as opposed to something so absolute. Perhaps it's more of a paradigm shift, so that you can move towards a goal where you still have privacy and also achieve your health goals and

Hanh: 00:29:15
education goals and so forth.

Daniel: 00:29:18
And recognize that, at the end of the day, people have different values and make different choices, and we can't pick just one for everyone. You want to give people as much choice as possible. Some people, for example, when they have a rare disease, opt in to sharing their data as much as possible, because they want to see any progress that can be made in that area. People whose children have chronic childhood diseases,

Daniel: 00:29:43
they want to see that progress. They're willing to make those trade-offs. In other cases, there might be a non-serious medical condition that is very private, that somebody doesn't want to share, and that sharing looks different. So I think we have to recognize that giving people as much choice and opportunity as possible is really important, and there's not a one-size-fits-all solution. People are usually very unhappy when an advocacy group or

Daniel: 00:30:12
the government or a company says, this is the way you must handle data. Give people more choice, and we get more options for everyone.

Hanh: 00:30:21
Optionality, and it's personal, right? And what I hear is that nowadays, with the advancements of AI, you can do all that. You don't have to choose one or the other. You can achieve privacy and transparency and still achieve your health goals, whether you're a practitioner or a hospital. Now, in your work, I'm sure you collaborate with many stakeholders, like policymakers, industry leaders, and academics, to drive progress in data privacy.

Hanh: 00:30:58
So how do you foster these connections and facilitate productive conversations when, for instance, they have conflicting goals?

Daniel: 00:31:08
That's right. You know, most policy is a debate, right? A debate of facts, evidence, and persuasion. Sometimes it's a debate about values and goals. And I think the first step is figuring out what you're actually debating. Then you figure out, do we have a common

Daniel: 00:31:35
interest, and how can we work together in many of these areas? And that's why, even though there's always a lot of talk in the news about divided government and partisanship, when you look at so much of the legislation that comes out of Congress around tech policy and data policy, it is very bipartisan. We see stakeholders from all sides come together and say, yes, we want to use data to advance health care. Yes, we want to collect

Daniel: 00:32:01
better data around the census. Yes, we want better data about global warming and climate change, because it affects our coastal communities, and the coast affects everyone, red states and blue states. So I think there's so much work in this space that truly is bipartisan. By focusing on where data can have a positive impact, on how data can be used for good, there's a lot of interest

Daniel: 00:32:31
and collaboration across the board.

Hanh: 00:32:35
So true. Now, data-driven innovation has the potential to address some of society's most pressing challenges: health, education, government, like we discussed. So what are some of the most promising opportunities you see?

Daniel: 00:32:50
Yeah, I mean, education I think is one of the most important and one of the most exciting opportunities. We don't have personalized education in the United States, especially in the public school system. What I mean by that is, everyone learns at different rates. Everyone comes to education with different skills, abilities, and interests. I think back to when I was in school: you had a large classroom, you had one teacher, and

Daniel: 00:33:15
they were teaching to the middle, right? They were teaching to the average student in the class. If you were ahead, you were probably a little bored. If you were behind, you were probably a little confused. And those differences get bigger and bigger every year, and that's a problem. So with AI and data, I think we have so many opportunities to deliver the type of education each person needs. Somebody is going faster in math? Great.

Daniel: 00:33:45
Let them zip ahead. They will learn more, they will stay engaged, and they will enjoy school because they're being challenged and have these opportunities. If somebody's doing great in reading and language arts, allow them to accelerate in those areas. And in the areas where people need more work, that's where educators can spend more time doing one-on-one work and enrichment with them. And that's just at the individual level.

Daniel: 00:34:13
Then you think about the school level, and how we manage schools, and the need for better data to figure out, okay, are there problems going on at the school? Are there health issues at the school? Where are the buses? Just logistics that data can help with. And then you start thinking bigger picture: all these different school systems have various types of interventions, whether it's drug interventions, gang interventions,

Daniel: 00:34:39
or special STEM programs. The question is always, are those programs effective? And the only way to know is by looking over time, sometimes five to ten years, and seeing, okay, where have these students ended up in the workforce? How are they doing? Are they thriving in society? You have to have good data to do that. You have to have not only these longitudinal data systems about the schools, you also have to have

Daniel: 00:35:10
them about the workforce. So many states have started to build these, but they didn't build them to be interoperable. They didn't build them to really track this over time. That's one thing we're starting to think about, and I think it's a huge opportunity to really improve our workforce, our students, and how we learn.

Hanh: 00:35:30
I think the education system needs to have a dedicated course on AI integration. If you can pick up a phone, if you have a computer or a laptop, and you're integrating this into your day-to-day usage, then that school system needs a dedicated course teaching students how to integrate it and how to use it wisely, as opposed to seeing it as cheating. There's nothing wrong with growing, developing, opening your mind. And if you are ahead, that's great. But let's say you're behind; well, here

Hanh: 00:36:04
is a tool that you can utilize outside of the classroom and so forth. So the attitude ought to be to embrace it as an enhancement, an augmentation, as opposed to, you're so bad for using it, you must be cheating.

Daniel: 00:36:21
Exactly. And I just think about these tools. As you mentioned, the new release of ChatGPT is multimodal; it can see things. Think about a student who's struggling with how to solve a math problem. You can point your phone at it, and now you have a personal tutor that speaks to you, something students didn't have access to before. And this is across income levels. As long as people have access to a phone, that's just incredibly powerful.

Daniel: 00:36:45
But we have to figure out how we actually get this widely deployed, so students know how to use it and, as you said, are trained on the appropriate ways to use it. So it's augmenting their education and helping them learn better, not just serving as a cheat code for getting through school without actually learning. We want to make sure they're actually using it to get a better education. I think that possibility is certainly there.

Hanh: 00:37:09
And that's going to take a huge paradigm shift for administrators and superintendents: to create a dedicated course on AI and how it needs to be integrated into school and home life. They intersect, right?

Daniel: 00:37:24
That's right.

Hanh: 00:37:25
Because I still hear it. My kids are graduating from college and so forth, but I still hear from elementary, middle, and high school that it's still a bad thing. It's perceived to be cheating, so you kind of have to hide it.

Daniel: 00:37:40
That's right. And that needs to change, and then I think those perceptions will change. Some of it will be generational, because at some point students are going to be using this at home. They're going to be using this instead of Google.

Hanh: 00:37:53
Yes.

Daniel: 00:37:54
And so there's not going to be a way of taking it away from them.

Hanh: 00:37:57
So, data technology is constantly evolving. How do you stay informed, and how does the Center for Data Innovation remain at the forefront of these developments?

Daniel: 00:38:10
A lot of reading and a lot of talking to experts. I think one of the most important things in technology is that you actually understand, as I said, how these things work at a deep level. I remember when I was working on blockchain, for example, I had to go in and really look closely at, okay, how exactly are these algorithms working? What goes into them? The same with machine learning and AI. You have to study the algorithms.

Daniel: 00:38:34
It's not enough to know vaguely how it's working. You need to understand what's really happening under the hood, I think, to make effective policy. Not that everyone needs to do that, but for the work we do in policy, I think we have to have that strong connection. And I think Congress is getting better at this, too. A number of fellows have been placed through AAAS, the American Association for the Advancement of Science, which has brought in AI experts and placed them in congressional offices.

Daniel: 00:39:06
That way, as these policymakers are thinking about legislation, they have someone in house, an actual computer scientist, who can provide the expertise so that the policy they write actually maps to where the technology is.

Hanh: 00:39:24
Very true. So the Center for Data Innovation has produced numerous reports over the years. Can you share an example of one that you believe has had a particularly significant impact on shaping data policy?

Daniel: 00:39:41
Yeah, so one of our earlier reports was on what we call data poverty, or the data divide. This is a concept that I think has gotten a lot more recognition, especially in the age of AI, where people are concerned about bias. Basically, we made the argument that for many individuals and groups, the biggest problem is not that too much data is being collected about them, which would be the privacy concern, but that too little data is being collected about them. And for that reason, they might

Daniel: 00:40:08
be left behind as data-driven innovation advances. So, for example, you can think about health care, where certain populations traditionally aren't included in medical research; they're not well represented in clinical trials, for example. You can think about many areas where there are data gaps, where, for example, a wealthy neighborhood has air quality sensors and a poor neighborhood doesn't. So if you're in the wealthy neighborhood, you can

Daniel: 00:40:37
use an app to figure out whether or not you should go bicycling; if you're in the poor neighborhood, you can't. There are so many areas like that where these data divides exist. Conceptually, we've had this long tradition in the United States and globally of thinking about the digital divide, where people didn't have access to the Internet, computers, or digital literacy. And the argument we've made is

Daniel: 00:41:00
that there's also a data divide, and it's something we should recognize and address through policy to help close that gap. It's something that has increasingly gotten attention. We've seen a lot of work at the global level trying to collect data from all parts of the world, thinking about things like language gaps in data sets, and about how well represented different groups are in data sets. The Census Bureau has been rethinking, for example, its

Daniel: 00:41:28
different demographic categories. We've seen new thinking in the surveys the government does to make sure, for example, that in long-term care facilities they're asking good questions to capture the unique differences between older adults: not just, are you older, but what languages do you speak? What's your relationship status? What's your sexuality? Questions like that. So just thinking through those data questions is hugely important.

Daniel: 00:42:00
And so I think that report and some of the subsequent ones we've done on that topic have had a lot of impact on this conversation.

Hanh: 00:42:06
That's a great initiative. Now, data policy issues often transcend borders and require a global perspective. So how do you maintain this broader view while also addressing regional and local concerns?

Daniel: 00:42:23
Yeah, you're right. So much of data is global. One of the points we've made in a recent report is that with big data especially, with these larger and larger data sets, it's no longer possible to expect one organization, one company, or one country to collect all the data about a particular topic. That was really the old model of thinking. It used to be that if you wanted, say, the World Health Organization, maybe they were

Daniel: 00:42:50
going to collect all the data on malaria, right? We're moving to a world where there are going to be lots of different partners that have data, and what we need to be thinking about is how we build global data infrastructure, so that when data is collected in South Africa, Iran, Colombia, the United States, and Australia about the same topic, the same theme, that data is interoperable, in a similar format, and can be shared, and we have these

Daniel: 00:43:20
federated data networks. That's a policy problem, a technology problem, a data standards problem, a skills problem, and it's really hard to think about how you bring all that data governance together. But that's what we have to be thinking about: how data transcends borders, and that for it to be useful and valuable, it should transcend borders. We don't want data localized; we don't want it restricted in where it can be used. We want it to be global. So I think that's one of

Daniel: 00:43:46
the most important issues that we work on. We do a lot of work on cross-border data flows, thinking about data as part of trade policy as well. So there are lots of opportunities to really push policymakers to think about data in new ways.

Hanh: 00:44:02
Well, thank you so much. We covered so many points. We're at the tail end; is there anything else you would like to add?

Daniel: 00:44:09
You know, we covered a lot. I just want to say thank you for this conversation. I think it's so important that so much of this technology is available to everyone, because the benefits are something I'm very optimistic about, and I think they have great potential.

Hanh: 00:44:25
I echo that. Well, thank you for joining us in this insightful discussion on data-driven policy and innovation with AI. Today's talk has shed light on the crucial role that data and technology play in shaping our society and driving progress. Throughout our conversation, we explored the fascinating intersection of data, AI, and public policy, and we went into the challenges and opportunities that arise when leveraging these powerful tools for societal benefit.

Hanh: 00:45:01
Our discussion highlighted the need for ethical innovation in AI and data analytics. As we create and use these technologies, we must focus on being clear, responsible, and beneficial to people and society. Today's insights have shown us the amazing possibilities of data-driven innovation: we can use AI and data to solve huge problems, boost economic development, and enhance people's well-being in significant ways. So we hope that this conversation has motivated you to reflect

Hanh: 00:45:40
on the impact of data and AI on our future. I encourage you to keep learning about this exciting field and participating in discussions like this, so that we can advance, improve, and make good change for the future. Thank you for being a part of this enlightening discussion, and keep innovating and striving towards a better, more data-driven world. Thank you for your attention. Well, thank you.

Daniel: 00:46:13
That was fun. Thank you so much.

Hanh: 00:46:14
Yeah.

Daniel: 00:46:15
Thanks for having me.