
The woman who spoke out against Google on AI

Jun 26, 2024


Audio excerpt — Whittaker:

"Chairwoman Maloney, Ranking Member Jordan, and members of the committee, thank you for inviting me to speak today. My name is Meredith Whittaker and I’m the co-founder of the AI Now Institute.”

ASHLYNNE:

In 2020, Meredith Whittaker gave testimony to the US Congress about the threat of artificial intelligence.

Audio excerpt — Whittaker:

“Facial recognition poses serious dangers to our rights, liberties, and values, whether it's used by the state or private actors. The technology does not work as advertised. Research shows what tech companies…”

ASHLYNNE:

She'd turned her back on a career at Google and wanted to send the public a message.

Audio excerpt — Whittaker:

“Facial recognition relies on the mass collection of our biometric data. It allows government and private actors to persistently track where we go, what we do, and who we associate with.”

ASHLYNNE:

But she didn't turn her back on tech entirely.

Audio excerpt — Whittaker:

“If you care about the over-policing of communities of colour, or gender equity, or the constitutional right to due process and free association, then the secretive, unchecked deployment of flawed facial recognition systems is an issue you cannot ignore.”

[Theme Music Starts]

ASHLYNNE:

From Schwartz Media, I’m Ashlynne McGhee. This is 7am.

Meredith Whittaker is one of the world’s most influential people in AI, except she doesn’t work in it anymore.

She spent years deep in the bowels of Google, until a secretive defence contract changed her entire perspective.

She’s now the president of Signal, an encrypted messaging app used by journalists, whistleblowers, drug dealers, militants and really anyone who wants to keep their communications secure.

Privacy advocates love it, law enforcement hates it, and Meredith Whittaker is fighting governments desperate to regulate it.

Today, the president of the Signal Foundation, Meredith Whittaker, ahead of her public appearance at The Wheeler Centre in Melbourne, on the tech giants who rule the future and why she says Signal's not one of them.

That’s after the break.

[Theme Music Ends]

[Advertisement]

ASHLYNNE:

Meredith, a lot of people would describe the moment that we're in now, when it comes to artificial intelligence, as a really pivotal one. But I wanted to ask about you personally and what led you to begin researching and examining AI?

MEREDITH:

Yeah, well, it boils down to my having been in the tech industry now for almost 20 years, and, a little over ten years ago, beginning to see AI, or machine learning as we called it then, emerge as a hot new thing inside Google, where I worked. I started asking questions about what AI was and how it worked, and that led me to a more critical perspective on this set of technologies, and actually to understanding their deep, inextricable connection to the platform surveillance business model.

ASHLYNNE:

So talk to me a little bit more about that. What were you working on at Google at the time and how did you see it come up in your workplace?

MEREDITH:

Yeah, so I'd done a number of initiatives at Google by that time, and one of the big projects that I kind of cut my teeth on in tech was an effort called Measurement Lab. It taught me to ask whether we can trust data or not, whether it should inform our decision-making, or whether we need to be a little bit wary. So I was coming into this AI moment, around the early 2010s, with that type of scepticism. And of course, I saw that these technologies we were calling AI, these machine learning models, were being trained on, were being informed by, much less robust data. Often data that was making very socially sensitive claims about people and populations. Data that was created in service of selling ads or, you know, tracking clicks and preferences across browsers or search engines or what have you. That really animated my concern around AI.

There was a moment where I realised that there were some real high stakes to what I was seeing, you know, and potentially those stakes were beyond what I had understood before. This was 2014, when a research group from Harvard was presenting to me and some colleagues an initiative that aimed to create an AI system to predict genocide. And I remember sort of thinking through what they were doing and, you know, asking some pretty basic questions, right? Like, the term genocide is not clearly defined. It is a politically contested term. So I was asking questions around, okay, well, how do you define genocide? What type of data do you rely on to model predictors for genocide, right? You know, where are you getting this? How do you take responsibility for the fact that, if you deployed such a predictor, you may be putting your thumb on the scale? You may be, in fact, inflaming or otherwise shaping volatile political contexts outside of your scope of awareness. And I got no good answers to those questions.

ASHLYNNE:

So you stayed on at Google after that. Then what happened?

MEREDITH:

Yeah, so one night in the fall of 2017, someone sent me a Signal message saying, hey, I think you need to know this. I happened to have privileged access to the small team of people who were working on this secretive contract with the Defence Department to build machine vision and surveillance systems for the US drone war, which had been deemed illegal by every blue-chip human rights organisation. So I was extremely angry, and I was very worried, because this was Google, a company that had, you know, effectively been enabled by regulators and governments around the world to take on incredibly significant roles in our society, right? You no longer go to the library, you Google it, right? They are positioning themselves as the gateway to the world's knowledge and understanding. They are creating the platforms for human interpersonal communication. And so we had given this power to these companies on the basis of very clear promises to do the right thing, to not be evil, to not misuse this power. And this was crossing a red line, in my view.

Audio excerpt — News:

“Artificial intelligence, drones, warfare, and Google. It's a mixture that caused an uproar inside the tech giant, where the early motto was ‘don't be evil’. So what's behind Google's contract with the Department of Defence for a project called Maven?”

MEREDITH:

Google was now yoking their fortunes to the world's most lethal military, in the context of a drone war that was, again, deemed illegal, that was massacring wedding parties based on inaccurate data streams, and that was setting a precedent for a kind of technologically enabled empire that I found really dangerous and morally repugnant.

ASHLYNNE:

Is that why you left Google?

MEREDITH:

No. That's why I started organising at Google. That's why I started rallying the people around me to push back. That's why, you know, thousands of people along with me wrote letters, gave out literature, signed petitions, went to the press, spoke publicly against this contract, and ultimately forced Google to cancel it.

Audio excerpt — News:

“A letter to Google CEO Sundar Pichai signed by more than 3000 Google workers. Here's what it says, quote, ‘we believe Google should not be in the business of war. Therefore, we ask that Project Maven be cancelled and that Google…’”

MEREDITH:

So, you know, no, it's not why I left; it's why I started fighting in a much more concerted way, and started relying on techniques from the past, you know, building power among workers and stakeholders to push back on corporate decisions, because it became very clear that change from the inside doesn't happen simply by making a good point.

ASHLYNNE:

After the break, why Meredith Whittaker says harmful content is being weaponised against encrypted apps like Signal.

[Advertisement]

ASHLYNNE:

So, Meredith, you're now the president of the Signal Foundation. What was it about Signal that appealed? Why join?

MEREDITH:

Oh, I love Signal. I've been a Signal superfan since day one; I've been a friend and admirer of the Signal folks. I think there is a future in which Signal sustains and grows and continues the incredible momentum it's already generating, and that is a future where we can speak privately, where journalism is possible, where human rights work is possible, where dissent is possible, where activists and dissidents and militaries and CEOs and anyone with something important to say that could be weaponised against them by their adversaries or those in power can use Signal to truly communicate privately. And that's the future that I think we need, you know, particularly given so many gnarly issues in the world, particularly given the increasing prominence of authoritarians in governments. I think we must preserve the ability to communicate in a truly private way, because we cannot have all of our thoughts and our correspondence subject to mass surveillance and then weaponised by those in power.

ASHLYNNE:

Meredith, you've been in Australia now for a little over a week. There's a big push here for our government to improve regulation of the internet. What do you make of it?

MEREDITH:

You know, I'm learning and listening to a lot of folks in Australia. I think, you know, I was really heartened to see that the eSafety Commissioner released language that, you know, looks like a great first step in protecting encryption. I think Australia has a big role to play in these debates.

Audio excerpt — Anthony Albanese:

“We will be seeking cooperation wherever we can, but we need to take action and the online...”

ASHLYNNE:

There is one really gnarly issue, though, that Signal is grappling with at the moment, and that is the big conversation around how encrypted platforms, including Signal, are being used to share terror and child exploitation content. That must weigh on your mind.

MEREDITH:

Well, it weighs on my mind that people are weaponising such socially significant and often monstrous issues to attack the one technology we have that actually enables private communication. Look, this is a long-standing tension. The attack on encryption, the attack on the ability to communicate and transact privately over networked computation, has existed since the mid-70s, when public key encryption technologies were invented and security services in the US literally tried to stop publication of the paper that documented these systems. Then you go to the 90s, and these same tropes emerge over and over again. You have a 1994 op-ed in Wired by Stewart Baker, then general counsel of the NSA, that claims that these very rudimentary encrypted systems are the province of terrorists and paedophiles, and that no one beyond those demographics should really be concerned about privacy, because the rest of us, you know, don't have anything to hide. Now, the landscape of surveillance in the 70s, in the 90s, and today is vastly different. But somehow the arguments never change. Somehow it is always: the state must have access to every realm of human life, in service of rooting out the bad guys and supporting the good guys. But of course, that is not how it works in practice. And what we see today, in my view, is a renewed attack on the ability to truly communicate privately, based on emotional claims that are not supported by data. There is no evidence that encryption is a causal factor in child abuse or terrorism.

ASHLYNNE:

I'll just jump in there, Meredith, because I just want to explore that a little bit more. I guess what we're talking about is not whether it's a causal factor, but the fact that this type of content is shared on encrypted platforms. It does get shared on encrypted platforms; I think that's fairly obvious, isn't it? Do you accept that?

MEREDITH:

Well, the purpose of encryption is to provide provable privacy. We cannot look into the messages that you send on Signal. So I'm sure a number of things are shared on encrypted platforms, but there is no evidence that encrypted platforms produce more abuse. You know, there are things that we do to ensure that the harm surface on Signal is reduced; the primary one is that we are not a social networking platform. But again, that framing immediately implies that if platforms, if tech companies, just did the right thing, the deep sickness of child sexual abuse would somehow go away. And to me, that's really dangerous, because we've abstracted the issue of child sexual abuse into an online problem that a couple of engineers turning some knobs, and giving law enforcement in the US, and potentially authoritarian governments, extra access, will somehow solve. I don't think one person who showed up in Epstein's black book has been prosecuted. And I do need to say, it is incredibly dangerous. In the US right now there is a woman in prison, at this moment, because Facebook turned over messages between her and her daughter that documented their access to reproductive care, and their dealing with the aftermath, in the state of Nebraska, after the Dobbs decision.

Audio excerpt — News:

“We are learning more now about how Facebook turned over a mum and her teenage daughter's chats to police in Nebraska. Court documents show police served Facebook a warrant as part of an investigation into an alleged illegal abortion.”

MEREDITH:

We need to demand the kind of precision and rigour that is often difficult in the face of highly emotionally charged topics, but there's no way to get this right if we don't do that.

ASHLYNNE:

Finally, Meredith, there are AI evangelists and AI doomsayers in this world. What do you identify as?

MEREDITH:

I am an AI realist, which means I'm not clear that AI is what we should be interested in so much as the configurations of power that are going to use AI and the way that AI could be used to generally harm the most marginalised.

I'm not really a user of AI, putting aside that I don't really use GPT or these sorts of generative toys. I think in most cases, like most other people, I am more the subject of AI than a user of AI. And what I mean by that is, you know, we all walk down the street and a CCTV camera will pick up our face, and that CCTV camera may transmit that biometric data to a Clearview AI facial recognition database that then pulls up a profile of us, right? I walk into a bank, I might apply for a loan, and an AI system licensed from Microsoft on the back end may be involved in making a determination about whether I get that loan or not. But in none of those cases am I actually using AI. I am conscripted as an AI subject by the institutions and governments that have chosen to implement AI. And so I think we need to recognise that power dynamic when we think about using AI. This is generally a tool of those who have power, used to subject those who have less.

ASHLYNNE:

Thanks so much for your time, Meredith.

MEREDITH:

Thank you. It's great to be here.

[Advertisement]

[Theme Music Starts]

ASHLYNNE:

Also in the news today…

Julian Assange has walked free from Belmarsh Prison and will face court today on the island of Saipan, part of a US commonwealth in the Pacific Ocean.

Under a deal struck with the US Justice Department, Assange is expected to be sentenced by a judge to time already served, and then he will be free to return to Australia.

Prime Minister Anthony Albanese revealed Assange would be accompanied to court by Australia's high commissioner to the UK, but said he would not comment further until after a final judgement has been secured.

Audio excerpt — Anthony Albanese:

“We want him brought home to Australia. And we have engaged and advocated Australia's interests, using all appropriate channels to support a positive outcome. And I've done that, since very early on in my prime ministership. I will have more to say when these legal proceedings have concluded, which I hope will be very soon.”

ASHLYNNE:

That’s all from the team at 7am for today. My name is Ashlynne McGhee. Thanks very much for your company, see you again tomorrow.

[Theme Music Ends]

Meredith Whittaker turned her back on Google after raising concerns about the mass surveillance fuelling AI, but she didn't leave tech entirely.

The former AI whistleblower is now the president of Signal, a messaging app that keeps conversations encrypted – used by journalists, whistleblowers, drug dealers, militants and others who want to keep communications secure.

So why did she blow the whistle on Google? Is privacy the answer to AI? Or does privacy cause just as much harm as surveillance?

Today, president of the Signal Foundation Meredith Whittaker, ahead of her public appearance at The Wheeler Centre in Melbourne, on the tech giants who hold our future in their hands.

Guest: Meredith Whittaker, president of the Signal Foundation

7am is a daily show from Schwartz Media and The Saturday Paper.

It’s produced by Kara Jensen-Mackinnon, Cheyne Anderson and Zoltan Fesco.

Our senior producer is Chris Dengate. Our technical producer is Atticus Bastow.

Our editor is Scott Mitchell. Sarah McVeigh is our head of audio. Erik Jensen is our editor-in-chief.

Mixing by Travis Evans and Atticus Bastow.

Our theme music is by Ned Beckley and Josh Hogan of Envelope Audio.

