Frances Haugen: Can we trust tech? Transcript

20 May 2022, 17:00 BST 

 

Gina Neff: This is the latest event from the Minderoo Centre for Technology and Democracy. You're here for Frances Haugen: Can we trust tech? Before we get started, there are just a few housekeeping points to cover here at the Babbage theatre. We do not anticipate needing an emergency evacuation this evening, but if the alarm bell sounds all attendees should vacate the room immediately; there are no tests planned. The event is being recorded. By attending the event you are giving your consent to be included in the recording, which will be made available shortly after the event on the website of the Minderoo Centre for Technology and Democracy and the website of the Centre for Research in the Arts, Social Sciences and Humanities here at Cambridge. For those of you here in the room, please raise your hand during the discussion and a mic will be brought to you. We anticipate that the camera will be focused on the front, so you won't necessarily be filmed.

So I am Gina Neff. I'm the Executive Director of the Minderoo Centre for Technology and Democracy, and it's my great honour tonight to welcome Frances Haugen and Professor John Naughton to this discussion. First I'll introduce John. Professor John Naughton is Chair of the Advisory Board of MCTD. He's also director of the press fellowship programme at Wolfson College. By background a systems engineer, John is also a historian of the internet whose main research interests lie in the network's impact on society. Since 1987 he has written for a fine publication here in this country, the Observer.

Frances Haugen is an advocate for accountability and transparency in social media. Born in Iowa City, Iowa, Frances is the daughter of two professors (we welcome her back to the academy tonight) and grew up attending the Iowa caucuses with her parents, which instilled a strong sense of pride in democracy and responsibility for civic participation.

Frances holds a degree in electrical and computer engineering from Olin College and an MBA from Harvard. She's a specialist in algorithmic product management, having worked on ranking algorithms at Google, Pinterest, Yelp and, famously, Facebook. In 2019 she was recruited to Facebook to be the lead product manager on the civic misinformation team, which dealt with issues related to democracy and misinformation, and she later worked on counter-espionage. During her time at Facebook, Frances became increasingly alarmed at the choices the company makes, prioritising its profits over public safety and putting lives at risk. As a last resort, and at great personal risk, Frances made the courageous decision to blow the whistle on Facebook.

The initial reporting was done by the Wall Street Journal in what became known as the Facebook Files. Since going public, Frances has testified in front of the US Congress, the UK and EU Parliaments, and the French Senate and National Assembly, and has engaged with lawmakers internationally on how best to address the negative externalities of social media platforms.

Frances has filed a series of complaints with the US federal government related to Facebook, now named Meta, claiming the company has been misleading the public and investors on how it handles issues such as climate change, misinformation and hate speech, and on the impact of its services on the mental health of children and young adults. So, with no further ado, let's turn it over to Frances before our conversation.

Frances Haugen: Hello, thank you for inviting me. I really enjoy reaching out to college students and to the academic community, because I really believe the way we got to where we are today is that too few people were sitting at the table. We live in a world where huge algorithms have an outsize impact on society relative to the level of awareness even the public has of how they operate.

If we were talking about any other similarly powerful industry, let's imagine the auto industry, we could buy a car and independently crash it and determine whether or not it meets crash tests and standards, whether it's as safe for a child to ride in as it is for an adult or a senior citizen. If we were talking about an oil company, and there were children dying downwind from an oil derrick, we would never ask the oil company what they thought the problem was; we would never take at face value their suggestion of where we should focus our attention. If they said, 'Have you ever considered meteorology? The clouds move across the sky all the time; maybe the rain is the reason the kids are getting sick,' we would never take them at face value. We would come up with our own hypotheses. We would test the air, the soil, the water. We would draw our own conclusions, and then we would come to them and say: what do you say about this data?

What we're facing today is a really interesting social problem, because decisions are made in isolation by people who often have very little context on how the rest of the world works; because the different functions are so isolated, the people advancing these systems are often actively discouraged from considering what the side effects might be. You end up in a situation where you can make a change to an algorithm that affects what was then maybe two and a half billion people. Back in 2018, some engineers at Facebook said: hey, people aren't producing enough content anymore. We know that if we give them more engagement, they'll produce more content. And this sounds really intuitive: if we give people more little hits of dopamine, more comments, more likes, more reshares, it's amazing, people make more content. But no one stopped and asked: are we going to distribute different content if our measure of what is good is the ability to get a reaction?

And because there weren't enough people sitting at the table, there weren't enough different kinds of people sitting at the table, it took six months before researchers went into Europe and found that, on the left and on the right, political parties across the board said: we know you changed the algorithm, because it used to be that we could post the bread and butter of democracy, like a white paper on our agricultural policy. It's not like it got the most comments, let's be honest, it's not that exciting, but we could see from the data that people read it, people were consuming it. Now if we put out the exact same content, it's crickets. It goes nowhere. The only content that gets distributed is divisive, polarising, extreme content, because that gets a fight going in the comments, and the algorithm thinks that's good.

And this has really, really serious consequences. Right now, Facebook is the internet: Facebook and the internet are one and the same for at least a billion people in the world. For a majority of the world's languages, 80 or 90% of the content in that language is only available on Facebook. There are now many countries where, when you poll people on 'do you use the internet' and 'do you use Facebook', more people answer that they use Facebook than say they use the internet. Contemplate that; that's amazing. Combine that with the fact that Facebook has told us the only thing we can argue about is content moderation, censorship, and that censorship requires rebuilding those systems language by language by language, and that the places where Facebook is the internet, where people don't get to opt out, are often the most linguistically diverse.
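
To make the mechanism described here concrete, below is a minimal, hypothetical sketch of engagement-based ranking. The weights, the fields on `Post` and the example numbers are invented for illustration; they are not Facebook's actual model, only a demonstration of how "good = reactions" changes what wins.

```python
# Minimal sketch (not Facebook's actual code) of engagement-based ranking:
# if "good" is defined as predicted reactions, content that provokes fights
# outranks content that is merely read.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_likes: float      # hypothetical model outputs
    predicted_comments: float
    predicted_reshares: float
    predicted_reads: float

def engagement_score(p: Post) -> float:
    # Reward only visible reactions; quiet reading counts for nothing.
    return p.predicted_likes + 2 * p.predicted_comments + 3 * p.predicted_reshares

posts = [
    Post("Agricultural policy white paper", 5, 2, 1, 900),
    Post("Divisive, fight-starting post", 50, 400, 120, 300),
]

for p in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(p):8.1f}  {p.title}")
# The fight-starting post wins easily, even though far more people
# actually read the white paper.
```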

I care because we all have to fight for a world that is linguistically equitable. The internet touches us all, and the people paying the highest costs of these systems are the same people who get the least voice to push back. So I'm very excited to start, or continue, that conversation with you.

Gina Neff: Thank you. John, do you want to pick up first?

John Naughton: Yes, I do. First of all, thank you for coming. I should say to the audience that anybody who thinks whistleblowing is easy has never done it, and we have here somebody who has. Just following up from what you said: given the number of bad actors of various kinds who wish to exploit, for example, Facebook, do you think the company has anything like the capacity it needs to address that problem?

Frances Haugen: One of the things that made me most concerned when I worked at Facebook was that Facebook routinely encounters conflicts of interest where the employees don't even recognise that a conflict of interest has taken place, and this question of 'is there enough staffing on X problem?' is a really interesting implicit conflict, right? People will say, oh well, it's so hard to hire, we have as many people as we can put on it, and implicitly we're not asking: would you be able to hire more people if you spent more money on this problem? How much is it worth to actually do a good job on that problem? Facebook, as it stands today, does not invest enough in safety to adequately address the threats that come at it. Does it have the money to do so? Yes, it does. Does it make the choice not to invest in that way? That's also true.

John Naughton: And this is a company that proposes to spend $10 billion and put 10,000 engineers working on building an imaginary virtual world in the next 10 years?

Frances Haugen: Yeah, and if you'll notice, this was before the DSA had passed. They said they were going to hire those 10,000 engineers in Europe, you know, just to dangle the carrot.

John Naughton: DSA being the Digital Services Act?

Frances Haugen: The Digital Services Act: the most significant piece of internet regulation passed in the last 25 years. It came to political agreement just a few weeks ago, and I think Facebook was very concerned that Europe might act. And so, yeah, they came out and said: the future, the future. Oh yes, there are a lot of unpleasant things about social media today, but the future is this glorious world where we all live with headsets on our faces. And that's why we're going to spend $10 billion on video games, but only $5 billion on safety.

John Naughton: This is a bet on what the future of humanity is like. In other words, they're assuming that maybe 5 billion people might be content to live in a virtual reality world for a good deal of the time.

Frances Haugen: I think this is a great illustration of the danger of educating computer scientists in isolation, because the phrase 'the metaverse' came from a dystopian novel. When Neal Stephenson was describing the metaverse, it was not meant to be an aspirational concept. It was meant to be a symptom of a society in decay: the world had become so unequal, so unstable, so sad, so pessimistic that we would hide from it in a glorious virtual world. One of the few treats of being a whistleblower is that you get to talk to a lot of journalists, and no one in the world gossips as much as journalists gossip, and I've been told by multiple journalists that part of why Mark is so confident we're all going to spend all day in the metaverse is because he spends all day in the metaverse now. And I think, before we start seeing the consequences of making a substitute virtual life so attractive that people would want to spend all day with a headset on their faces, we should ask questions now about what that world looks like and what its consequences are, because we are beginning to go down those tracks.

Gina Neff: And to the critics who might say: OK, fine, but are these problems really Facebook problems? Surely the problems of misinformation are just bad actors out there. How do you counter those critics?

Frances Haugen: So Facebook likes to say: there have always been these bad things in the world. People have always said things that weren't true. People have always been hateful. There's always been ethnic violence. Kids have always been bullied. There have always been people with eating disorders. This is just a mirror, we're just a conduit, it's neutral. And the thing we all need to be aware of is that Facebook is not a mirror. It is an amplifier, and it is also an inducer. I'll give you a concrete example that's a little more grounded than misinformation, because when we talk about Russia and things like that it gets a little abstract. If you go into basically any kid's Instagram account in the United States now, all the recommended accounts will be accounts about their school, and they'll be things like confidential pages where you can submit gossip through a Google form and the owner of this anonymous account will post vicious gossip about some other child in your school. Or, and I learned this one from the principal of my high school, there are now fight clubs on Instagram. The number one thing he spends disciplinary time on, and this shocked me, I felt very old when he said this, is Instagram. I asked how that is even possible; it's an app. He said there is an anonymous account where kids go and pick fights with other kids and their friend will film it. So one of those two children did not consent to that fight, and they'll film it, and the kids never come forward because they're embarrassed, but this account is just fight after fight after fight with these kids broadcast out. And those accounts are being recommended as the top recommended accounts because they're high engagement. Facebook is not a mirror, it's an amplifier and an inducer: once you put up a platform for performativity, humans go and perform. And so in the case of misinformation, not all information gets distributed equally. Good speech doesn't get to counter bad speech, because misinformation gets to live in the entire space of ideas. I always like to say it's the French fries of information.

Truth is complicated, it's nuanced, and sometimes it's really unsatisfying. It's the kale of information. It's still delicious, just delicious in a different way. And when you have an algorithm that doesn't place any value on truth and treats misinformation and truth as equals, French fries will always win over kale.

John Naughton: Can I go back to the metaverse for a minute? You could view this switch, this amazing kind of strategic switch, as an attempt to shed the unpleasant skin of a Facebook that's mired in all kinds of difficulties. But could it also be, somehow, confirmation that Facebook as such is actually declining?

Frances Haugen: So I had the pleasure of meeting with a regulator earlier today, and they pointed out something I had not been aware of, which is that Facebook made an announcement a couple of weeks ago, Mark posted it on his personal Facebook page, that Facebook was going to make a shift to show you less content from your family and friends. And I was intrigued, because basically every time they've faced a crisis over the past five or six years, they've said: OK, we hear you, we've got these problems, we're going to show you more content from your family and friends. Because the reality is your family and friends aren't the problem. Facebook has run lots of experiments where you treat all content on Facebook as either content you consented to, your family and friends, pages you actually followed, groups you actually joined, or everything else: stuff your friend commented on that got inserted into your feed, stuff Facebook thought you might like, a group you got invited to but never joined. If you just give people more of the stuff they consented to, for free you get less hate speech, less violence, less nudity. So for Facebook to come out and say: you know that friends-and-family thing? No, no, no, we're worried you're missing some really good posts, so we're going to show you less of your family and friends.
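
The experiment described above splits the feed into "consented" and "injected" content. Here is a minimal, hypothetical sketch of that split; the source labels and the example feed are assumptions made up for illustration, not Facebook's actual schema.

```python
# Minimal sketch, not Facebook's data model: dividing a feed into content the
# user consented to (friends, followed pages, joined groups) versus
# everything that was injected for other reasons.

CONSENTED_SOURCES = {"friend", "followed_page", "joined_group"}  # assumed labels

def is_consented(item: dict) -> bool:
    return item["source"] in CONSENTED_SOURCES

feed = [
    {"id": 1, "source": "friend"},
    {"id": 2, "source": "friend_commented_on"},  # injected into the feed
    {"id": 3, "source": "suggested_group"},      # invited, never joined
    {"id": 4, "source": "followed_page"},
]

consented = [i for i in feed if is_consented(i)]
injected = [i for i in feed if not is_consented(i)]
print(len(consented), "consented items,", len(injected), "injected items")
# The point made in the talk: shifting the mix toward the consented bucket
# reportedly reduced hate speech, violence and nudity "for free".
```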

I think that's a sign that they must be facing huge problems. I'm guessing there's substantially less content from your family and friends than there was a year ago, and that's why they're doing this. I do think they're really scared that the future is moving on and they're having trouble adapting; they haven't been able to figure out what their role in society is, and people are leaving. So I think they're trying to take the cash while they still have it and build the next leg of the business.

John Naughton: Do you think that might be geographically uneven? In other words, that the decline is in the markets where they derive most of their revenue per user, but usage will continue in poorer markets? And if so, what would be the implications of that?

Frances Haugen: That is, I think, the question of our time. If I could make every policy person in the world listen to one question and one answer, it would be that question. Most people are unaware that Facebook makes about $55 per person in the United States and Canada. They make about $14.50 per person in Europe, about $4.50 per person in Asia, and $2.50 per person in the rest of the world. So the consequence of Facebook dying on the coasts in the United States, and there have been a couple of quarters now where it's been bad enough that they've had to disclose it publicly in their SEC filings, or declining in Europe, is that they will still be the internet for a billion people.

They will still be the internet for most African countries and for many parts of Southeast Asia, because in those places they went in and bought the right to be the internet. For enough users in those countries, they came in and said: your data is free if you use Facebook, but for anything else you're going to have to pay. And so businesses don't have their own webpages, there aren't independent news sources; everything is on Facebook. People may walk away in places where they have choices, but we're going to be leaving behind a billion people who are much more expensive users to support, because they're more linguistically diverse, and we're going to starve the beast, and there are not going to be the resources to build adequate safety systems in those places.

Gina Neff: Let's talk a bit about that systemic risk of you know Facebook equalling the internet for more than a billion people in the world right now today.

And a company that's done the math, right? Or the maths? I knew that would get a giggle, that's why I did it; I speak both languages. So they've done the calculation that those billion don't count?

Frances Haugen: I think we have to be so careful, because there's a difference between a mistruth and a lie. A lie means you intended to deceive. A mistruth can mean you were just ignorant or you were misinformed, right? I think it is a question of what Facebook genuinely believes. I'm rambling right now, but I have the title I want, they'll never let me have it, which is 'Love over Hate'. It's a maths joke. Facebook believes that the way they can tell whether they are doing a good job is to take all the good they make in the world and subtract the bad: 95% of what we do is good, 5% is bad, 95 minus 5 is 90, boom, we're that good. And I don't think we should be doing that maths. It needs to be good divided by bad, because that way it matters even when there's only a little bit of bad: if you double the amount of bad, you halve the overall amount of good.

Because in social networks, that is what happens. In the United States, 4% of the population got 80% of the COVID misinformation. The average user had a good experience on Facebook, but that 4% showed up at school board meetings screaming; that 4% made death threats against teachers.
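
A tiny worked example of the "love minus hate" versus "love over hate" point, with purely illustrative numbers: doubling the harm barely moves the difference, but it halves the ratio, which is exactly why the choice of metric matters.

```python
# Illustrative numbers only, not Facebook's figures.

good, bad = 95, 5

print(good - bad)   # 90   -> "95% of what we do is good, boom"
print(good / bad)   # 19.0

bad = 10            # double the amount of harm
print(good - bad)   # 85   -> the difference barely moves
print(good / bad)   # 9.5  -> the ratio halves, so the harm is visible
```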

You can't think about these things in isolation. I think Facebook looks at what they've done for that billion people and says: we brought them online. The last year I found data on, there were 300 million people in the world whose data was being paid for so that they could use Facebook. I think they see it as: we changed people's lives.

And because there are so few people at Facebook who even get to read the safety data, Facebook's response to my disclosures was not to say: oh, let's have a conversation about this. It was to lock down the safety information inside the company so that the only people who could read any of it were in safety.

And when you have that kind of segregation, they didn't want people to be able to confirm that I wasn't misconstruing these things. I think it's a thing that happens: it's not so much that those people don't matter to them, it's that they would rather tell themselves a story about the good that they've done.

Gina Neff: And that good comes with a lot of risks. Let's talk about the capacity they have for managing in 80-plus languages, in multiple countries, in multiple complexities and situations. What does that look like on the ground?

Frances Haugen: So I always chafe when they say things like 'we support 80 languages', because imagine you were buying any piece of enterprise software in the world: they would show you a chart with a list of features and which tier gets those features.

That's kind of what the safety systems are like at Facebook. They say they support 80 languages, but they have something like 160 safety systems for American English. They're tuned and optimised for American English.

Gina Neff: And when you say systems you mean AI systems?

Frances Haugen: Yes, machine learning AI systems that work on, say, violence and incitement, hate speech, self-mutilation, whatever it is. I talked to some Norwegian journalists, and I'm pretty sure there are none of those self-harm systems in Norwegian. But that's neither here nor there.

They say 'we support X language', but they might only support four or five of those 160 systems in that language, and they'll still say they support it. So the first thing is we need transparency: what exact safety systems exist, and in which languages? For their COVID classifier, in the initial round I think they had something like 14 languages, and for some of those languages they caught 4% of the COVID misinformation, by their own statistics. So the first thing, in terms of operational complexity, is that we need to talk about comprehensiveness. The secondary question is: who has which needs? I'm a big proponent of focusing on product safety, because there's a lot we can do to make these products safer that works in every language. But if we are going to talk about individual languages, then we have to start talking about things like: which moderators speak which languages? How many moderators are assigned to each language and each harm type? Where do those moderators sit? That sounds almost pedantic; should we really micromanage companies that much? Guess what: the Ukrainian-language moderators were sitting in Russia up until a year ago, I think maybe even last September.

Right, this was even reported internally, and people asked over and over again: 'Hey Facebook, geopolitical context here. We don't think that's a good idea.'

That's why we need transparency into these systems: tech needs to live in democracy's house. When we separate it out, we live with the consequences.

John Naughton: I was thinking, when you look back at the recent history of this industry, say the last 20 years, there have been two major whistleblowers. One is Ed Snowden and the other is you, and what's interesting is that, in a way, you both had the same approach. You strategically accumulated a trove of revelatory documents, you took them out, you left, but then you both strategically chose a powerful conventional media partner to make sure that the material you brought out actually made it into the public sphere. And I think in both cases that was really shrewd, because WikiLeaks discovered a long time ago that it's not enough just to publish this stuff; you have to have some way of getting it into public discourse, and in both cases you've done it. As I was watching the way your story developed, the thing I wanted to ask, given the chance to see you, was: which of the things that you revealed, and regard as important, have not had the public purchase that you would have liked or anticipated?

Frances Haugen: I think the biggest one is this. One thing I highly value about my education is that I had about the lightest-weight engineering degree you can get in the United States, so I got to take a lot of other classes, and one of the most important classes for my performance as a machine learning person was a biogeography class. Biogeography: what do living organisms and maps have to do with computer science? But it really taught me that there is no uniformity in the world.

The thing I think is most important is this idea of linguistic diversity. Linguistic equity is really, really important, machines don't understand it, and unless we talk about it, unless we demand transparency into it, we will have radically different outcomes. That didn't land at all in the United States. It landed really well here. Facebook spent 87% of its misinformation operational budget on English, and 57% of its hate speech budget on English; people in the United States generally didn't grasp the significance of that, they didn't emotionally connect to it. Really the only issue that landed there, I felt, was the teens stuff, the children.

So, did it land? I think Europe, with the Digital Services Act, really took the ball and ran with it, and I think that's because they're living with the consequences; it crystallised their lived experience. But I haven't yet been able to explain to the United States, in a way people connect to, that I really think 10 or 20 million lives are on the line from not caring about languages. Because the most fragile places in the world, the places where Facebook is the internet and you cannot choose to leave, are the most linguistically diverse, and their languages don't get supported.

Gina Neff: So, just before we open it up: what's the answer? I know you've got the answer, love over hate, to that equation. You've talked about the human backstop to the safety systems, about Facebook being able to open up and evaluate those safety systems, and how understaffed they are on the human side. What should we here in this room be advocating?

Frances Haugen: Totally. So I'm going to go way more abstract and then come back to something more concrete. I always say any conversation that begins and ends with me is doomed to fail, it's not going to work, but I want to step back and imagine a different skill set. What I'm about to advocate for, I want to first contextualise with literacy. If we roll back in really recent time, I'm talking about 1900, one of the things we discovered in World War One was that if you didn't live in a big city, I don't know about the UK, but I know this for Europe, if you didn't live in a big city in France, you were probably not literate beyond maybe 100 or 200 words. It sounds crazy, as recently as 100 or 120 years ago. We came to the conclusion that we needed to live in a society where everyone was literate, because you can't check power unless people have a baseline ability to interpret information in context. That's why things like public education really matter: if we want to check power, people have to be able to sit at the table, at least at a very basic level.

And when I left Facebook, I felt quite firmly, and I've talked to other people about this, that there were at most 300 or 400 people in the entire world who understood how the algorithms worked at places like Facebook and how they intersect with the product features, and at that point maybe 30 or 40 of them were left at Facebook. I've talked to people I consider on that list and they say, no, it's more like 200. And no world can function with systems that powerful and only 200 people who understand them.

And so we need to think about how we bring people to the table. I'm not saying everyone in the world has to be able to train an ML model or understand an ML model, but we need to live in a society where at least 10 or 15% of the population understands the significance of these issues at a level where they can talk about them, where they have at least a modest seat at the table.

And I feel really firmly about this because we're entering, we've already entered, an era where technology is going to move faster and faster and faster, and it's going to be more and more opaque to us. When cars were the transformative technology, anyone could buy one and take it apart. If you didn't like how the engine worked, if you didn't like leaded gas, you could say: hey, there's leaded gas, I can see the lead.

Now we live in a world where, unless it's presented on our screens, we don't get to see what's actually happening behind them, and we don't get to see what the representative impacts are. If we don't figure out modern ways of doing governance on these systems, we're in deep trouble.

Gina Neff: On that bright note. We've got time for questions from the audience. I've got two microphones. There is someone on the back row raising her hand.

Audience question: Hi, thank you for the talk so far. I'm quite interested in the fact that you joined Facebook quite late; I know there was a drop-off post Cambridge Analytica in terms of staff and recruitment issues.

They still are able to recruit a lot of people, and I suppose I was wondering what you thought you were getting into, what sort of made you change, and really maybe your thoughts on why staff are still staying there.

You know, because if people are dropping off in sort of developed countries, and yeah, just be interested to know that. Thank you.

Frances Haugen: So Facebook is full of kind, conscientious people who do think they make a difference. I joined in 2019 almost by accident. If you have more than five years' experience in Silicon Valley and you're a software engineer or data scientist, that kind of thing, you'll get contacted about four times a year: someone will reach out and say, 'Hey, there's this thing called Facebook, have you ever thought about working there?' I usually ignore those emails, and I kind of snarkily said the only thing I would work on is misinformation, and the recruiter came back with: oh, by the way, we have this civic misinformation job. And I totally slow-walked them, because I felt very, very conflicted. Like I said before, I knew even going in how few people work on the human side of these systems. I'm a product manager who also works on algorithms, I've worked on algorithms since 2007, and to give people context on why I say there are only 300 or 400 people in the world who really understand the human side and the algorithm side at the level we need for these conversations:

The first product manager at Google who actually stuck and did well over time started in 2006. The company was 10 years old before a real human-centric hat and an algorithm hat converged in one person.

This is a very young field. So they said: hey, we have this civic misinformation job, would you be interested in it? I had a very personal grievance with misinformation. I was sick enough in 2014 that I had to relearn to walk. Coeliacs, when you get that diagnosis, please take it seriously, gluten is real. I got malnourished enough that I got paralysed beneath my knees, and so I had to relearn to walk. I had an assistant I'd hired who helped me with a bunch of stuff, and what he started doing was taking me walking five days a week, and he told me I had to eat protein and all these different things. I really feel like I owe the life I live today, that I really enjoy today, to him.

In 2016, after Bernie lost the nomination, he went off the deep end. He kept going into darker and darker corners of the internet, and by the time the election came around we were exchanging 1,000-word emails on whether George Soros runs the world economy. Our conversation ended when he said: 'Do you even read your own citations? These are all from the mainstream media.'

It was such a loss for me, because he was a smart, college-educated, funny, compassionate human being, and even he couldn't resist misinformation on the internet, and I never wanted anyone else to feel the pain I felt losing him, because he was like a little brother to me. So they offered me this job and I slow-walked it, because on one side I felt I needed to do this job, and on the other I knew it would be a really unpleasant job. I ended up getting there, and within two weeks I was like: oh my God, I'm way in over my head, because I thought I was going to work on misinformation in the United States. I had all the same cognitive biases that every other Facebook employee did: I live in Silicon Valley, I'd only worked at tech companies; I hear 'misinformation', I hear '2019', and I think, oh, clearly we're prepping for the 2020 election, that's clearly what this job is. No one had ever clarified with me, in my entire recruitment process, that no, I was only going to work on misinformation in places without third-party fact checking. Which is most of the world, because most of that third-party fact-checking budget was spent on English, and mostly on the United States. They never disclosed any of those stats, for a reason. Within two weeks I realised this is not just any problem, this is a civilisational problem, because in world history we've only had periods of mass literacy a few times. The printing press was not actually the disruptive part; teaching people to read was disruptive. Forty or fifty years after, Martin Luther comes along and says we have to teach everyone to read, even women; everyone has to read the Bible, otherwise you don't go to heaven. And along with that we had misinformation. We had pamphlets on witches: do you know if your neighbour's a witch? Do you know what to do if they are a witch? Simple solution, rhymes with 'dire'.

And no one seemed to understand how serious it was. Fifty per cent of messages sent on Messenger in Myanmar, where the UN said there was a genocide caused by social media, were voice messages. People are becoming literate in order to use a system that was not designed for communities that have only just begun to process information in this way.

The internet was designed for people at places like Ivy League universities, right? And I was really shocked that we had something like two people working on digital literacy.

John Naughton: And it just follows that if some friend of yours came to you and said I've got a daughter, she's just got a PhD from Cambridge in machine learning. She's very bright. She's been offered a job at Facebook. What would you say? Would you say she should take it?

Frances Haugen: So I always say it is possible to be an ethical person and work at Facebook, and it's important for good people to work at Facebook, because, like I said before, these systems are very, very complicated.

We have won some rights in the Digital Services Act. We hopefully will pass the Online Safety Bill in some form and get some rights via that. We will need good people to have their eyes open inside these companies and learn how they work.

I personally think we should be funding public fellowships where people work in these companies for a few years.

Gina Neff: And when you say good people, is that policymakers?

Frances Haugen: The public needs to fund fellowships for people to go into these companies, because the only way we understand these systems is by working with them. And if someone's daughter had a PhD and was interested in doing it: one, you will make lots of money, and you should live well beneath your means, because that will give you the choice to follow your conscience. But we need people to go in, understand these systems and bring that understanding out, because these systems will shape where our civilisation goes, and if the only people working there have no interest in the public good, we will lose that opportunity.

Gina Neff: There's another question. I was going to take one on this side.

Audience question: Thank you, thank you again so much for being here and speaking. I was just wondering: I'm from the United States, and the underlying threat in a lot of Facebook communication is that if you regulate us, if you bring government in as a way to make our systems more accessible and make people understand what the algorithm is doing, it's going to become impossible to operate as a business. And that, in turn, is going to negatively impact the billion people who rely on Facebook for the internet. I was wondering if you could speak a little to the reality or the myth of that, and to what you see going forward as the relationship between not totally upending their business model and still being able to bring in regulation as a means of moderating?

Gina Neff: Will regulation ruin the business?

Frances Haugen: I think the word regulation is a huge umbrella that covers lots of different things, and I don't support regulation that says 'you are not allowed to have X type of content', because the reality is that AI, artificial intelligence, machine learning, is incredibly bad at saying, with high precision, this is hate speech, this is misinformation, this is of democratic importance. It's incredibly bad at doing that precisely, because language is so ambiguous and classification is such a continuum. If you do regulations like that, it will make it almost impossible to run a business. I totally agree on that.

If you're talking about things like saying you have to give researchers and civil society groups access to your data so they can ask independent questions, you have to publish the risks you know about and tell us how you will address them, and you have to give people real options in how they use your product, then you're still going to be profitable. You will probably be more successful in the long term.

I always like to say that people think I don't like Facebook. I'm actually trying to tell Facebook how to be successful in the long term, because right now they're optimising for the short term so much that they will fall off the edge. Remember, regulation in general is meant to help people move away from thinking only about the short term and do things that are more sustainable. So no, I don't think it will threaten the business. Remember, Facebook has enough money that they're doing $75 billion, nine-zeros billion, not British billion, I always get confused on these things, a huge amount of money, $75 billion worth of stock buybacks, while spending only $5 billion on safety. So even in a world where they spend $15 billion on safety, it's not like their business is unprofitable.

Gina Neff: Right here in the white jacket.

Audience question: A few of us in the room recently had a talk with someone who used to be a director at Facebook quite early on and is no longer at the company, and when he was questioned, sorry, I shouldn't have given away the gender, when they were questioned about the material that you leaked, they said this was part of a standard process: that there are very thoughtful people, as you've said already, who come up with these reports all the time and want to try to make things better, and then the reports go across many people's desks, so it wasn't that they were being ignored, but simply that there was so much going on as part of this normal business process. What would you say to this person?

Frances Haugen: That's a really interesting question.  

I'm guessing the comms team doesn't like that answer. The recurring thing that happened at Facebook was that there were very, very minimally staffed teams, even critical teams, and this is a real contrast with Pinterest. Pinterest struggles with hiring even more than Facebook, right? But Pinterest is very intentional about what they do and don't do.

Facebook, on the other hand, when they struggle with hiring, still try to do every possible thing under the sun, because they're so afraid that something will come from behind and eat their lunch. So I can completely believe that that happened with that report; it might even have gotten escalated a little way up the chain. But there's definitely an internal culture where, if you complain too much, you get marginalised. If you're too negative, you get marginalised. If you don't give people excessive, excessive amounts of benefit of the doubt, you get marginalised.

It's one of these things where I recently reread Havel's 'The Power of the Powerless', which talks about post-totalitarian systems. Totalitarian systems rule through force; post-totalitarian systems rule through ideology, with brute force sitting downstream, and ideology can heal around injuries. So if you start questioning the ideology, like the idea that Facebook isn't just a mirror, that it isn't neutral, or you ask whether we shouldn't spend more on hiring people so we wouldn't have to say 'oh, there's so much going on, we didn't have time', if you raise these questions, you will get healed around and pushed off to the side. So I truly believe he believes that earnestly, but that doesn't mean it's OK.

Gina Neff: Is this what you think is happening at other tech companies with their critical safety teams and critical researchers?

Frances Haugen: To be fair to Facebook, I think it faces different circumstances because of its own choices, right? Twitter never went and paid for the internet in large African countries. No other tech company opened itself up to some of these cans of worms; Facebook chose to open them. So part of it is that I don't think the others face the same problems.

And I don't think there's the same incongruity. I don't think there's any similarly successful company in Silicon Valley that is viewed as toxically and yet is able to keep going. Maybe Palantir? Palantir, though, does work in places where it adds legitimate social value. I don't know.

John Naughton: Could I just ask a question about him? Did your friend, the ex-director, ever read Andrew Bosworth's memo? I'm paraphrasing, but it's worth reading if you want to. It went to everybody, and the memo says (I'm paraphrasing now): "What we do is we connect people, period. And sometimes, by connecting people, things happen, including people dying. But that's neither here nor there. We connect people. End of story."

Frances Haugen: Well, it's again that question of love minus hate versus love divided by hate. Because the way he phrased it in that memo is that the most important thing we can do in the world is connect people. It's not even so much that it says sometimes people die; it says it doesn't matter that people die. Connecting people is so important, so transformational, that those deaths are collateral damage we should embrace, because connecting people is so important.

Audience question: Great, thank you so much for your talk, and thank you to the panel as well. I just have a quick question about Facebook's processes for diagnosing, identifying, flagging and then removing disinformation and misinformation. We've seen in recent years that Facebook, now Meta, has come under increasing pressure to enhance the transparency of its fact-checking processes and its disinformation teams, as I'm sure you know from your experience as well.

So I'm wondering if you could comment on how Facebook identifies, flags, monitors and then takes down disinformation and misinformation from an internal perspective, and then also externally: do they engage with civil society groups? Do they engage with researchers on the ground? And maybe you could answer this building on your earlier comments about the asymmetry in digital literacy and the linguistic diversity dimension of countering misinformation. So a bit of a long question there, I'm afraid.

Frances Haugen: So in the wake of the 2016 election, Facebook looked at this and said: OK, we should probably do something about viral misinformation. Targeted disinformation is about twice as common in terms of prevalence and impact, but it's a harder problem to solve, so they said: let's focus on viral misinformation, the things that show up in the news, things like Tide Pods, because every time it happens it increases the chance that someone will step in and do something.

So they made a very clear decision: nothing at Facebook is false until a third party says it is false, because 'we are not the arbiters of truth'. They went out and recruited a bunch of fact checkers through a standalone entity, something like the international fact-checkers' union; I don't remember the name.

It's an accreditation agency that accredits fact checkers, and Facebook sets a budget every year to hire journalists. They pay them something like $3,000 to $5,000, which is a lot of money, to fact check individual stories picked out of Facebook, and there's an internal tool where you used to be able to just see the most popular stories on Facebook. That was a problem, because it takes the fact checkers a couple of days to do the research, make the phone calls and do the things needed to write the fact check and say 'this is not true', and in the first three days a story is out there it gets 75% of the distribution it is ever going to get, so if you don't act until that far in, it's too late. So they added a second term, the predicted reach: what do we think this item is going to get? That gives a little lead time to try to dismantle it. The fact checkers are not told what to fact check; they can use whatever criteria they want. They fall into different categories: some speak different languages, though overwhelmingly they worked in English at the time I left the company, and some have specific focus areas, maybe medical misinformation, maybe climate misinformation. There are a couple of stakeholders where fact checkers are brought in to cover a specific area, and beyond that they let the journalists choose what they want to check. Once the fact check is written, they have a system that tries to find that exact link across Facebook, and they take it down.

If it's a match on something, say there's another article that is clearly just repeating the same thing but it's not the same page, they try to cluster those into a single cluster. And I think there are cases where, if they're not confident something is part of the same cluster but it looks likely, it gets demoted, but I'm not sure on that last step. I know sometimes they can demote things instead of taking them down, so I'm not sure exactly what the criteria are, but that's the general operational process.
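
Here is a rough, hypothetical sketch of the take-down / cluster / demote flow outlined above. The numeric similarity score, the thresholds and the example URLs are illustrative assumptions, and, as noted in the answer itself, the final demotion step is uncertain.

```python
# Hypothetical sketch of the described pipeline, not Facebook's actual logic:
# exact matches of a fact-checked link are removed, confident near-duplicates
# join the same cluster and are removed, and lower-confidence matches are
# demoted (reduced reach) rather than removed. Thresholds are made up.

FACT_CHECKED = "example.com/debunked-story"   # hypothetical debunked link

def handle_item(url: str, similarity: float) -> str:
    if url == FACT_CHECKED:
        return "remove"      # the exact link the fact checkers reviewed
    if similarity >= 0.9:    # confidently the same claim -> same cluster
        return "remove"
    if similarity >= 0.6:    # looks likely but not certain -> demote
        return "demote"
    return "leave"

for url, sim in [("example.com/debunked-story", 1.0),
                 ("mirror.net/same-story", 0.95),
                 ("blog.org/similar-claim", 0.7),
                 ("news.org/unrelated", 0.1)]:
    print(f"{url:30s} -> {handle_item(url, sim)}")
```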

Gina Neff: Let's take one more here in the middle.

Audience question: Hi, so one of the things we've talked about in the past, and it's a unique thing about Facebook, is that Mark Zuckerberg himself has a lot of control. In fact, he's basically the king. And he's spoken about how he wants to fix various things about Facebook, but in my view they don't necessarily get fixed. So I was wondering if you could talk a little about what you think is happening in his head, what his intentions are, whether they're genuine, and how he ends up not following through?

Frances Haugen: So I'm not a psychologist, so I can't diagnose, but he is such an interesting character. I've never met Mark, as Facebook likes to say; I've never sat in a C-suite meeting. But I do know that Mark missed out on a lot of really foundational human experiences that I think are important for being a good leader. He became CEO of Facebook at 19. He left college at about 19, probably after not having taken a lot of classes other than CS, and he got to be the boss from that point on. In addiction communities they talk about the idea that someone who starts drinking at 16 doesn't really emotionally mature beyond that point, because they use that crutch instead of facing the discomfort; whenever they face discomfort, they don't get a chance to grow.

I talked to someone about how Mark Zuckerberg had a really traumatic 2016. In April of 2016 he was in Africa, and they were literally holding parades for him through the city streets in places like Nigeria, saying he was the saviour of Africa, that he was bringing the internet, bringing people the future. The Arab Spring had happened a couple of years before, and we were all saying: oh my goodness, Facebook is going to topple the dictators around the world. He was riding high, one of the richest men in the world. And six or seven months later, we said 'he's ruined democracy', and I think he really, really struggles with that. I think the reason he's spending all day in the metaverse is, and let's have compassion here, hurt people hurt people, and the reason I refuse to demonise Mark is that until Mark gets to heal, he's going to keep hurting us. I think the reason he's running away into the metaverse is that when he goes into a café or a restaurant, I think people grimace at him. And that has to be hard. The part I find so sad about the whole thing is Nick Clegg; I'm willing to be critical about Nick Clegg.

Mark Zuckerberg had no clue when he was 19 what he was stealing from himself by setting out on the course he went on. He probably even today doesn't fully understand how he has suffered; he might not even be able to identify the pain he feels. But Nick Clegg is whispering in his ear as his global head of communications and policy, and he's telling Mark, we know this because Mark has gone on podcasts, 'People are just jealous of you. You're so successful, you're so rich, you've changed history. They're just so jealous; that's why they're so critical.' Or he's saying things like, 'You know what? You're going through a really big growth experience right now.' Mark has said this: 'I've really grown in the last few years. I've realised that sometimes you have to accept you're going to be unpopular if you stand up for what you believe in, and I'm a defender of freedom of speech.' And part of what is absurd about this, and this is how I can tell that Mark is being hurt by his own choices, is that the reason Vaclav Havel is so amazing is the way he talks about the power of the powerless: how the distinction between the powerful and the powerless is blurry, because within these systems we are both powerful and powerless; the ideology both oppresses us and is carried by us, and we are both the oppressors and the oppressed.

And in the case of Mark, he has solutions he could use today that reduce misinformation but don't touch content. Say Alice writes something and it lands in the feed of Bob, her friend. He reshares it, and it lands in the feed of Carol, a friend of a friend. She reshares it, and it ends up in Dan's feed. Dan and Alice don't know each other, but we don't treat that reshare any differently than if it had come from a friend. Dan doesn't know who Alice is; she could be a Russian agent.

If it was 100 hops down the chain, we'd still treat it exactly the same as if it came from her friend. But you could just say: hey, once it gets beyond friends of friends, you can still say whatever you want. It can be horribly hateful. It can be misogynistic. It can be violent. It can be (inaudible). You can do whatever you want, but you have to choose to do it. You can't just hit reshare anymore. We're going to value intentionality, we're going to value humans in the loop. You can copy and paste if you want to keep saying it, but you have to choose to do that. That has the same impact on misinformation as third-party fact checking, only no one chose which ideas were good or bad. If Mark is a defender of freedom of speech, if the reason he's suffering is because he stands up for freedom of speech, why doesn't he do that? And the reason is that it's going to cost him maybe 0.1 or 0.2% of profit. There are some countries in the world where 35% of what's in the news feed is reshares, and I'm guessing those are also probably the more fragile countries, because lower-literacy populations are usually more dependent on reshares. And so Mark is really, really isolated. He's surrounded by people who have made the choice; they know that if they tell him hard things, they'll get cut out of his life. That's not a strength of his, and we shouldn't laud him and say 'good job Mark, you cut out people who told you hard things', but we should be empathetic that he's partly there because we let him go there, right?
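
A minimal sketch of the reshare-friction idea described here, assuming a simple hop count from the original author: the two-hop (friends of friends) threshold is the one named in the talk, while the function name and structure are illustrative assumptions, not any platform's real implementation.

```python
# Sketch of the proposed friction: nothing is censored, but beyond two hops
# the one-click reshare button is replaced by copy-and-paste, putting a
# deliberate human choice back in the loop.

def one_click_reshare_allowed(hops_from_author: int) -> bool:
    """hops_from_author: 1 = friend, 2 = friend of friend, 3+ = beyond."""
    return hops_from_author <= 2

for hops in [1, 2, 3, 100]:
    if one_click_reshare_allowed(hops):
        print(f"{hops} hop(s): one-click reshare")
    else:
        print(f"{hops} hop(s): copy and paste if you still want to share it")
```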

He ended up there because no one dislodged him. We didn't apply greater pressure and say: you can't be both CEO and chairman. And so one thing we're asking the SEC for, as a possible remedy, is just to say he needs to sell some of his shares. You don't even need to take them from him; you just need to say you can't be CEO anymore.

John Naughton: Well, here's some good news. I made the comparison earlier between you and Ed Snowden, the two whistleblowers, and there were consequences for both of you. But in Ed's case he has some really dangerous adversaries. The good news is that you only have Nick Clegg.

Frances Haugen: It's actually interesting. I have open DMs on Twitter and I don't even get harassed on Twitter. It's miraculous.

Audience question: Thank you so much, and this kind of speaks to what you were saying about Nick Clegg and the policy side of things. I know Facebook doesn't spend enough resources on moderation, but they do spend a lot on lobbying. In the United States, it seems a lot of politicians are outwardly very against Facebook and say a lot of negative things about it, and yet some very big legislation that has been put forward seems to have no political future in terms of regulating Facebook. So I'm wondering if you could speak a bit, from an insider perspective, about the extent of the money they sink into the Washington office and how much corporate capture there is in the political space.

Frances Haugen: So the question was: they don't spend a lot on moderation, but they spend a huge amount on policy, comms and lobbying, and there are questions around political capture.

A lot of what I want to work on is around education. If we were going to teach political scientists a class on social media and algorithms, how would we do that in a way that is meaningful, a class you could also take if you're an English major or a political science major, where you don't have to do heavy maths? Because I do think there's this thing right now where DC is really, really, really dependent on Facebook telling it how to solve Facebook's problems. You can take zero classes in the world today on things like reshares two hops down the chain. There's a huge amount of dynamics in these systems where you can just down-modulate the 1% most extreme users. The top resharers reshare 30 times a day; the typical user reshares once every other month or something. Why should the reshares of someone who does 30 a day carry the same weight as those of someone who does it once a month?
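
A sketch of what "down-modulating" the most extreme resharers could look like: weight each user's reshares by how far their activity exceeds a typical rate. The weighting curve, the typical rate and the example values are illustrative assumptions, not any real platform's parameters.

```python
# Hypothetical down-weighting of heavy resharers: instead of counting every
# reshare equally, scale each user's reshares by how unusually active they are.

def reshare_weight(reshares_per_day: float, typical_per_day: float = 1 / 60) -> float:
    # A user resharing at the typical rate (roughly once every other month)
    # gets weight 1.0; someone resharing 30 times a day gets a tiny fraction.
    return min(1.0, typical_per_day / max(reshares_per_day, typical_per_day))

print(round(reshare_weight(1 / 60), 4))  # typical user -> 1.0
print(round(reshare_weight(1.0), 4))     # once a day   -> 0.0167
print(round(reshare_weight(30.0), 4))    # 30 a day     -> 0.0006
```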

We don't have the language, we don't have a public muscle of accountability, and we never got to build one because the information was always locked behind closed doors. So I think there is a path forward where we flood the channel on the policy side: every single member of the House of Representatives deserves to have an intern who could adequately inform them, because right now there are none, and Facebook goes in and poaches out of those Senate staffs anyone who starts to build a competency. I think that's actually the greatest danger on the regulatory capture side: we're forcing people to learn about these fields on the job, and Facebook knows that if you're willing to put in the legwork to work on this for even a year, they're going to come in offering two or three times your current salary just to get you to stop and come work for them. And as long as we are dependent on Facebook to tell us how to discuss Facebook, we're not going to move forward.

Gina Neff: Last wisdom for this audience, sitting here in Cambridge. What can we as individuals do?

Frances Haugen: Gotcha. So I think the one thing I can say tonight that applies anywhere is that we need to continue these conversations.

I'm a really big proponent of the idea that politics begins with conversations. There's this idea of political activity and pre-political activity, and most people only know the frame Facebook has put a huge amount of money into: that the only thing that matters is content moderation. They spent $120 million endowing the oversight board, which only gets to make decisions about censorship. So talk to people about the idea that we have lots of solutions that don't involve content, but they need transparency.

The Online Safety Bill is probably the single biggest place you can have an impact right now. We have to, have to, have to put transparency into the law on day one. Instead of saying Ofcom is going to research for two years whether or not to give data access, it should be that we spend three months after the law passes working out how to do data access, and how to certify which academics and which civil society groups get it. Because the reason we're dependent on asking Facebook 'Facebook, what should we do about you?' is that we can't ask questions independently.

We have to have data on day one. You have to contact people, you have to keep pushing on it. The second thing, which I didn't talk about much, is that I talked a lot about the global South today, and it's really easy to say: I really care about the global South, but it's so far away, I don't know if it really impacts me; it's really sad, it's tragic. The reality is that places like Ethiopia have already said: hey, we just had an ethnic violence incident caused by Facebook, or at least grossly worsened by Facebook. Facebook let us down. They didn't have language support for most of the languages in our country, they only added it at the last minute, and really haphazardly. This has been so disruptive that we're going to get rid of Facebook and build our own thing. And they literally said, this is quoted, I think in the Guardian, the people in charge in Ethiopia said: we think WeChat is the model. That's social media in China.

We are not aware of it yet, but we are about to face a new Cold War between the free internet and the authoritarian internet. Or, I should say, the libertarian internet and the authoritarian internet. The libertarian internet is leaving people behind, and the consequence is that places like Ethiopia are trying to build these things themselves. I think they're going to find it's either way more expensive or harder to do than they thought, or harder to maintain, and I think they're just going to turn to China and say: you've spent tonnes of money and time thinking about how to run a safe version of social media. And we are going to look up in ten years and realise we just lost Africa to an authoritarian internet.

And that impacts all of us. It really changes the course of human history, so we have to be having these conversations. We have to be pushing it forward. We have to talk about these things because we still have time to live in a different future, so thank you.

Gina Neff: Thank you, all of you. Thank you for being part of this conversation. We would very much appreciate it if you used the Eventbrite link to a short feedback survey to give us some feedback on this session. Details of our future events will be on our website, www.mctd.ac.uk. We are also going to have a briefing on June 16th on police use of facial recognition in the UK and whether or not police forces are meeting the minimum standards on ethics and safeguarding set out by algorithmic experts, whether they're able to meet those standards. That's our next report and event; more information will be on our website. Thank you again for joining us today, and thank you again, Frances.