The Washington Post
Democracy Dies in Darkness

Transcript: The Path Forward: Social Media

MS. DWOSKIN: Hello, and welcome to Washington Post Live. I’m Liza Dwoskin, The Post’s Silicon Valley correspondent. And I’m delighted to welcome Susan Wojcicki, the CEO of YouTube.

Thanks for being here, Susan.

MS. WOJCICKI: Sure, thank you so much for having me.

MS. DWOSKIN: Susan, I'm really excited for this conversation on many levels. First, it's the moment--you know, this is, like, a year of really seismic changes in our society. We're in the midst of a global pandemic, it's a consequential presidential election, and the country is protesting for racial justice. And all of these are issues where social media plays a huge role. And then, also personally, I feel like I have followed your journey as a reporter, from, like, almost four years ago to the current moment. Going into the last election, I think social media companies were really taken aback by the scale of Russian meddling and misinformation surging up on the platforms, including yours.

And today, when I report on a company--you know, we're still going to call you out for your problems, but it's a more mature company, it's a different company.

MS. WOJCICKI: Definitely. It's been a historic year by any measurement. And I don't think any of us could have foreseen how many changes and challenges we would all be grappling with.

And you know, since 2016, we've made tremendous changes at YouTube. So, you know, we have been really focused on what we call responsibility, and all the different products and policy and changes that we need to make as a result of that. And I think, you know, while there's still work to do and there will always be work to do, we've come a long way.

And you know, I'd be happy to talk about any questions you have on that front, but it's been really important for me. This is, of course, a very uncharted area for everybody. So, I just want to make sure, always, that as we're making decisions I'm thinking about things in terms of being on the right side of history. It might be hard right now, but how will we think about it in the future, and how can I make sure that YouTube is on the right side of history?

MS. DWOSKIN: That's the right compass. So, let's--I'd love to dive into the news.

You know, even though I think you guys have really gotten a lot better at stopping some of the abuse and misinformation, there are still some big stories that slipped through the cracks. Most recently, there was the video--a film called "Plandemic"--that, you know, shot up on YouTube. It was a video, as you're familiar, that alleged that Dr. Fauci had a role in spreading the virus, that masks actually spread coronavirus, and all sorts of misinformation.

So, how did that video become one of the top trending videos on YouTube?

MS. WOJCICKI: Yeah, so, we have definitely updated our policy many times since the COVID-19 crisis hit. I think we implemented more than ten different policies. And it definitely is a policy violation to say, for example, that the virus doesn't exist or that you can take any kind of unsubstantiated cures that have not been verified. And so, "Plandemic" was a video that was violative of our policies. We did remove it. But what happened is there were a large number of people who reuploaded that video, and they tried a whole variety of different techniques, whether it was speeding it up or slowing it down or putting it in a frame or changing it in a whole bunch of ways. And it just took some time for our systems to be able to catch all of that. We use a combination of people and machines, and we were able to ultimately bring all of those copies down, but there was a moment in time where there was just a challenge in our enforcement. But it was never an issue with policy. It always was a violation of our policies.

MS. DWOSKIN: Do you think it's fair, then, for journalists like me to say that, despite the company's best efforts--even with your best efforts and policies in place--you know, you can't stop the worst from coming? You can't stop this misinformation from reaching millions and millions of people?

MS. WOJCICKI: Well, I mean, I want to make sure we take responsibility, and we always know that we can do better. And with really every single issue we look and we do a postmortem and we say, "How could we have done better? What did we learn from this incident and how can we make sure that we--in the future, we make different changes?"

And so, you know, our systems are constantly getting better. We're constantly adjusting to whatever the last attack was. That's not going to be the next attack, because we will have already closed that loophole. But we do take responsibility. We want to make sure that we're providing--that we're doing the best we can. We want to make sure--especially in sensitive areas, whether it's news or medicine--that we're providing accurate information for our users.

MS. DWOSKIN: And what about--so, you recently talked about how YouTube will donate, I think, $100 million to black creators--to promote black creators on the site.

But right now, there is a lawsuit from a small group of black creators arguing that YouTube's algorithms systemically discriminate.

And recently, I was just checking the highest paid YouTube influencers and none of them are black. Why is that?

MS. WOJCICKI: Well, so, YouTube's always been a platform that is open and we've actually really prided ourselves on the fact that we have such a diverse group of creators. And I think if you look across our platform and you look across any sociodemographic group, you're going to see that YouTube is--has a well-represented set of creators.

And we did make the $100 million donation to be able to amplify black voices, because even though we are a platform that's open and have prided ourselves on the diversity that we have, we thought it was really important to be able to do even more. And so, that is a content commitment--a multiyear commitment to amplifying black voices on the platform.

And you know, I'm very excited that we're doing it. We actually did our first original with it just this Saturday, which, if you haven't seen, I think was just really insightful--"Bear Witness, Take Action." And we were able to bring together a large number of thought leaders across many different areas to reflect on what we're going through right now, some of the challenges and some of the systemic racial inequality.

And you know, the lawsuit is--that's a new one. We're certainly going to look into it and take it seriously and try to understand what concerns are there. But I would actually say, you know, that we actually have seen a lot of success among different black creators, like, whether you say Jackie Aina, who actually was--is a beauty creator who started by making sure that she was able to talk about products for people of color. And you know, she was one of the first.

Marques Brownlee, for example, is actually a tech reviewer, and he's, you know, probably our top, if not one of our very best, creators reviewing tech. He's come to all the Google events, and he did some really nice reflection just recently--reflecting on the color of his skin. And so, we've been really proud that we have so many creators. And we'll look into that lawsuit and make sure that we're responsive to whatever concerns there are.

MS. DWOSKIN: Yeah, are you able to examine whether your algorithms actually, for example, would favor creators of one race over another? Is that something that you're examining? Can you, even?

MS. WOJCICKI: Well, so, we don't have--I mean, right now, there's--it's not like our systems understand race or any different--any of those different demographics.

So, but on the other hand, like, what we do do is whenever we get allegations or concerns like that, we certainly can work with third parties or hear what are the questions or concerns that people have and look into that and try to understand.

And you know, machine learning fairness is a huge area of work just across the industry at Google and at YouTube. And so, we always want to make sure that our machines haven't by accident learned something that isn't what we intended, and if we ever find that it did then we will retrain our machines to make sure that they now have the right--that whatever that issue was has been removed from the training set of our machines.

MS. DWOSKIN: And what about, you know, still on the topic of race, I think one of the most salient criticisms of YouTube, and we've written about it, is that YouTube has sort of enabled this generation of far-right creators to go viral and get huge audiences and make tons of money on the platform, as well. You know, people who said that people of color were a separate species. I think we quoted in a story once activists talking about how YouTube became the online library for the far right.

So, thinking about that in light of the protests right now, I wanted to ask you a question that I hope all CEOs are asking themselves right now, and all individuals, which is, how does--how has your company contributed to systemic racism in society?

MS. WOJCICKI: Well, you know, that's a really important question and certainly a deep question. And you know, I think the recent events in the fight for racial equality have been, you know, very--it's been a historic time and it's been really important to me that we are doing everything we can to look at our systems and to take a hard look and say, how can we do better?

And so, you know, we recently just did a pledge. The pledge was the $100 million for content creation to make sure we were amplifying black voices, and a commitment that we would look at our policies and products. And again, we want to make sure that they're fair for everyone. But, you know, we're going to take a hard look, in particular with regard to the black community, given some of the concerns, and make sure that we're closing any gaps that are there.

I will say that we made a really huge change to our systems last year, where we updated our hate and harassment policies. So, under our hate policy, we would not enable anyone to use gender, race, religion, or sexual orientation as a means of discrimination or to promote violence, separation, or segregation. And so--anyway, that's just kind of a highlight of how that would work. So, it could be that you're referencing content that was on our platform that is now no longer there.


MS. WOJCICKI: We actually also released the number of videos that we've removed. And just in the last quarter, we removed over 100,000 videos that were violations of the hate policy.

MS. DWOSKIN: Yeah, you're absolutely right. I should have actually said that the specific creator I referenced has been banned from the platform.

And you know, we even wrote last year about how some of those sweeps that you did, because you've gotten more aggressive in response to criticism, even took down videos that--like, videos about Nazi Germany that were put up by educational groups and teachers.

MS. WOJCICKI: Mm-hmm. Yeah, so, you know, with every policy change, there's always going to be unintended consequences. And so, we work really hard to ensure that our policies are done in a way that are consistent and fair and that we do it in a way that doesn't have any of those unintended consequences.

But on the Nazi Germany videos--so, it used to be, for example, that we allowed videos of Nazi Germany and historical speeches with no context added--so, just the historical video.

And what happened--and you know, we realized that that could be subject to abuse in a variety of different ways. And so, we now still allow that content, but if--for educators, for non-profits, we make sure that there is actually contextual content associated with that video.

So, it can be before or afterwards or in the four cor--we talk about the four corners of the video, making sure that it's available for users to understand, because otherwise that content can be abused in ways that none of us intended.

MS. DWOSKIN: So, that's so--

MS. WOJCICKI: We tell them that and we educate them. They call us and we let them know, like, this is our new policy and how you can reupload the content, but there needs to be context available.

MS. DWOSKIN: I think it's so interesting that you bring that up. It raises something that I wanted to talk about with you, which is, okay, so, you guys are adding context to content. You're now putting out banners with authoritative news sources, like, for example, on a mass shooting or the George Floyd protests. I can literally get a banner--go to an information center at the top of YouTube where you tell me--you're curating and feeding me authoritative sources, much like on Google. And you're also taking a much more aggressive role in curating what types of creators should be allowed on the platform or not, based on your rules.

So, given that right now there's this big debate in Washington, where the Trump administration is moving to erode some of the legal immunities for tech platforms, on the argument that they're no longer the neutral pipes.

Do you think that--well, first of all, what do you think about that move by the Trump administration, and also, do you think that, in some ways, you guys already are a media company, that you're already not neutral, based on all the stuff that you're doing?

MS. WOJCICKI: So, yes--Section 230 is an incredibly important piece of legislation. And yes, I'm very concerned about changes that could be made to it, and the implications. It gives us protection in a few ways, the first of which is to be an open platform.

And if you look at YouTube, like, we don't have people who produce content. We are just a platform for other people who produce content and we're able to be open. We're able to allow all types of people to upload content and we think that gives incredible diversity.

But 230 also gives us protection when we take down content. And you know, running one of these platforms, I can tell you, you have to be able to manage your community; otherwise, your community gets hurt. So, you can't be allowing hate and harassment, violent extremism. You have to have child safety, and Section 230 enables us to have these protections.

And so, I would say when we make the policies, the policies are not on content. It's not like we're making content. We're just making policies about what we will allow on our platform. And when we make those policies, we go to a broad range of experts to try to understand what's the right way to be able to do that without any of the unintended consequences.

If there was--if there were guidance from the government about what was hate or what was harassment, we could adopt that and we could use that, but there isn't. And so, as a result, we're left having to go and talk to experts to figure out what that policy is. And if we didn't do that, I can assure you that the site would have a lot of real issues that would be really hurtful and we want to make sure that we're protecting our community.

MS. DWOSKIN: Do you think that there should be a regulatory body that literally says, "This is what hate speech is," so that Facebook can't make one decision and YouTube make another and Twitter make another?

MS. WOJCICKI: I mean, if there was more guidance about what the definitions were, that could be really helpful. So, Europe, for example, does have much stricter hate speech laws, and we were able to use some of that when we looked at our guidance.

But you know, right now there's not, in most countries, and not in the U.S. And so, as a result, we need to go out and we need to talk to experts and we need to figure that out. But you know, provided that there was guidance that was reasonable and everybody could agree, sure, we'd be happy to implement that. And whenever there is guidance, like, we do implement it.

MS. DWOSKIN: Wow. So, like, in some way you actually agree with the Trump administration, that there should be some government authority perhaps saying what is bias, what is hate speech, what kind of speech should be allowed online. I mean, that's effectively what the Trump administration is trying to do.

MS. WOJCICKI: Well, I'm not--I mean, I think--I'm not sure that I would say I necessarily agree. But what I would say is that, you know, when you run a platform, you do need to have--you do need to have clear policies. And there's always going to be debate about what those policies are. Half the people are going to say you didn't go far enough; the other half are going to say you went too far.

And certainly, whenever there are rules, like, we implement that. Like I said, with the hate policy that has been in Europe, like, we implemented that years ago with regard to how that was handled in Europe. So, you know, we want to be good citizens, we want to be good players. We want to protect our community, and YouTube is an incredibly valuable resource for so many people. So, you know, we'll continue to work with all different third parties and come up with those right policies, but we are not necessarily--we're not creating the content. We are just a platform that is coming up with the ways that we want to manage our community.

MS. DWOSKIN: I want to talk for a moment about President Trump. Recently, Twitter and Facebook took very different approaches to a piece of content from President Trump--a racially divisive tweet and a post on Facebook--where he said, in reference to the protests in Minnesota and the National Guard going in, "When the looting starts, the shooting starts," a phrase that dates back to the civil rights era, when the National Guard went in to protests at that time.

Now, I think you guys were a little bit saved by the bell, because he didn't post that on YouTube. But my question to you is, if he did, would you have taken it down?

MS. WOJCICKI: Well, so, first of all, it's really hard to always comment on a theoretical, like, what the video would have had or what it would have done.

But I will tell you that we, for elected officials--we hold them to the same standards that we hold all of our other users and uploaders. And so, if there is an elected official and they do post content that is hateful or promotes violence in some way, like, that is content and that would be the basis for removal of that content.

That said, whenever you have an elected official, there are always going to be reuploads of that content by news organizations. So, The Washington Post, I'm sure, would cover that and could have, like, a snippet of that statement. But because it's in context, we still would allow it, because there would be the context and there would be, potentially, like, the historical commentary associated with it.

And so, that content, in theory, it would still be there, but it would be surrounded with context. But yes, something like that could be a violation of our policies and if it is then we would remove it.

MS. DWOSKIN: I see. So, just to kind of pin you a little--push you a little bit further on that one, you're saying that you would have allowed the Trump post if there had been context around it, like, news articles explaining what it meant, versus if he just said--went on YouTube and posted a video and said, "I'm looking at what's happening in Minnesota. When the looting starts, the shooting starts."

MS. WOJCICKI: Well, what I'm saying is, just to be really clear, is that if a video was uploaded by a politician, any elected official, and it had something that crossed our line with regard to hate speech or inciting violence, like, we would remove that content, okay?

But you, Washington Post, would probably cover it and you may have a clip that we removed, but because you're providing commentary, we would not remove the Washington Post version of it. So, that would still be there and users would still have the information about what's happening in our society, but the actual clip coming directly from that politician would not be accessible anymore.

MS. DWOSKIN: So, you can't speculate a little about whether that Trump post would have broken the rules? I mean, I know you were following it closely.

MS. WOJCICKI: I'm sure--I mean, I just want to say, it's always really hard to speculate on something that is theoretical and in terms of what the video is or the context or how it was said.

So, I want to just emphasize again that we apply the same rules to politicians and if something violated or incited violence, we would remove it.

MS. DWOSKIN: Yeah, and that's very different from Facebook, which has an exception for politicians and political leaders--and YouTube did, too, actually, until recently. They've kind of walked away from that.

MS. WOJCICKI: We--I mean, I think--I mean, I'm trying to remember exactly when that policy was put in place, but it's been there for, you know, at least a couple of quarters.

MS. DWOSKIN: That's interesting. Was it a hard call for you?

MS. WOJCICKI: I mean, I think we wanted to speak with experts about it and really understand the implications of what that meant, but after thinking about it and, you know, we realized, like, you have to have the same standards for your politicians; otherwise, it becomes this double standard. So, you know, I think once we implemented that--and again, we've had it for a while. I want to say, like, definitely--I know last year we were talking about it. So, it definitely is the right decision for us.

MS. DWOSKIN: Have you removed any videos from President Trump? I know you removed several from the Brazilian President, Bolsonaro.

MS. WOJCICKI: Yes. Yes, we did--well, on Bolsonaro, ones that were COVID-related.

I think on President Trump, not that I'm aware of. There have been some ads that may have had different technical issues, and that's really common for our advertisers. So, we actually have a political database of all the videos that have been uploaded and how they were targeted. And so, every single time Trump has advertised on YouTube, you can see that. And I think, you know, there was some back-and-forth on some of the technical details, on some of the implementations, and that's really standard for advertisers.

MS. DWOSKIN: Okay, that's interesting. You answered one of my questions, which was I knew there were some Trump ads that had been removed, and it was for technical reasons. All right.

MS. WOJCICKI: Yeah, they were for technical reasons.

MS. DWOSKIN: Okay. And what about--let's go to the protests. I'm curious, what social media platform is your go-to for information about the protests--like, footage?

MS. WOJCICKI: Well, I mean, of course I'm going to say YouTube is a go-to platform. I think we [audio distortion] coverage across the board.

But you know, I also will say that different platforms are strong in different areas. And I think that's actually some of the beauty of having multiple platforms. So, I'll say some of the--Twitter is really focused on the real-time aspect of what's happening right now.

YouTube, I would say, has the benefit of having a lot of content that is both short and long form, but is really a lot of times a little more produced than some of the other mobile video content that we would see and it's enabled a lot more thoughtful discussions about what is happening.

And so, I think if you're looking for, like, what's happening in the moment or what's, like, a clip of someone at a protest, there are going to be other platforms where you may have a larger selection of videos. But if you really want to understand what is happening, if you really want to see the leading thinkers on what are their thoughts in terms of the fight for racial equality, what's happening with regard to George Floyd protests, then I think YouTube is a great place to go because we're going to have news and we're going to have a large number of commentary and YouTubers who are going to comment on that and really be able to reflect on what's happening right then and there.

MS. DWOSKIN: Yeah, it's a really good point. I was going to say, like, during this cycle, I've been using Twitter and TikTok for breaking news and footage and, like, actually seeing what's happening at the protests today. But if I wanted to go back and maybe see what happened at the protest, like, three days ago, I would be going to YouTube.

But I was wondering, do you think that that's a shift, because I'm trying to remember, like, in previous news--in breaking news events, like some of the ones I've covered, like the Christchurch shooting, I feel like YouTube was more of a go-to source. Is that fair or what's the difference there? Do you think something's shifting?

MS. WOJCICKI: I don't know that shifting--YouTube--I've always thought about YouTube as a library in the sense that we have a large collection of content and we're not just focused on what's happening right in the moment.

And so, you can actually go and you can understand, you know, any conflict and get content associated with that. So, you know, I think every platform has its benefits. We also have a lot of short-form content. We have a shorts product. People can upload mobile video to YouTube. We see mobile video that does really, really well on our platform.

But I do think that, because of--because we're not focused on what happened today--you know, other platforms will have, "This is the content of today," and then it disappears. It's harder to find the next day. You can find the content of what happened five days ago, ten days ago, a month ago on YouTube and that's all accessible and available.

And so, that's why I use the library as an analogy, and I think we've always been focused on that. We've always had good search; we've always hosted everything; it's always available; it doesn't disappear. And so, that's the way I would recommend using YouTube.

But you know, freshness and what's the current event and what's happening, it's also incredibly important that we have that and we have all the latest clips.

So, just so you know, with George Floyd, at one point we actually had over 200 million views of people watching videos with "George Floyd" in the name. So, just to give you a perspective, that's actually quite a lot of usage just there, looking to understand what was happening right then in the moment.

MS. DWOSKIN: Wow. And was that people uploading protest video in real-time or was that anything from, like, Washington Post summaries--so, plug The Post--but news summaries of what was happening, as well?

MS. WOJCICKI: They were views, basically, of any commentary that had the name "George Floyd" in it. So, it could be an upload, it could be a protest over George Floyd--you know, protesting the death of George Floyd--or it could be a Washington Post story, for example, just covering what happened with regard to George Floyd.

So, we have really significant volume across the board, but what I think is unique about us is the fact that we host and store and make everything available from any point in the past.

MS. DWOSKIN: That's fascinating that you see YouTube as a library more than a place for breaking news or real-time events. That's how you've always felt and that's--you're comfortable in that--is that kind of niche that you're comfortable--

MS. WOJCICKI: I mean, we do have breaking news--I mean, when you go to the library, you can actually read the newspaper, too, right? You can actually see everything. You can get all the books, you can get the magazines, you can get the newspaper. So, like, our goal is that we can cover everything.

And actually, with COVID, we actually had a breaking news shelf that's been there pretty much every day since COVID first really broke. We have that in 30 countries. And so, we do--and news, we've seen a 75-percent increase in COVID-related news from authoritative sources. So, we have seen a lot of news but we will also see people not just looking at the news. We'll see them looking for longer commentary and thoughts. And so, our average watch time of people on YouTube is often an hour or more. And so, people aren't coming just to say, like, what--you know, just read the headlines. What happened--show me something for three minutes. They actually really want to engage and understand the issues at a deeper level.

MS. DWOSKIN: That's fascinating. I want to--I keep saying that, but really, I'm learning a lot, which is cool, considering that I thought I knew a lot but there's--I want to ask you two last really quick questions, which is--


MS. DWOSKIN: --so, I recently saw the report that, you know, kids under 20 are spending almost as much time on TikTok as on YouTube. How much is TikTok a threat?

MS. WOJCICKI: Well, I think this first of all shows how dynamic our environment is. And--


MS. WOJCICKI: Yeah, and so, it definitely shows that, you know, here's a new player, coming out of China, that has a lot of adoption. And I mean, I see that, too. I see that all over--kids on TikTok. And in many ways, I think this just shows that we have a dynamic environment, that there's always room for innovation. And it is a competitor, but also--you know, we all learn from each other. And so, I think this is--digital video is a huge base. There are going to be new competitors. We see that across the board.

So, just this year, we saw a lot of new entrants--though they were more on the premium subscription side--whether it was Quibi or Disney or Apple or HBO Max, right? So, this is just a very competitive space, and I expect to continue to see more competitors.

MS. DWOSKIN: Yeah, that's very true. And speaking of the ground shifting really quickly, I want to ask you, lastly--you're a mom of five, unbelievable, and your kids are school-age. So, they're all going through this transformation where school in real life won't be as important in the next year, year-and-a-half.

Are you thinking differently about--obviously, tons of kids are on YouTube already--are you thinking differently about kind of strategy, the role that YouTube can play?

MS. WOJCICKI: Well, YouTube is a really incredible way to learn, and we did see homeschooling queries--the people searching for homeschooling--double since we had COVID-19. And you know, I'll just say personally, as a mom, I see it as a super-useful resource. Whenever my kids are asking me something in math or science or something I forgot, I just tell them to go to YouTube, and there's always, like, ten videos to pick from that are explaining that concept better than I ever could.

So, I do think YouTube is an incredible resource for learning. But to really learn, you need to do tests and you need to have a teacher and you need to do problem sets. And so, I really think about YouTube as being an important source but not--it needs to be married with other services that can help kids learn online. But it is pretty amazing the way kids are all learning online right now. And we're certainly going to double down and invest more in education and what we can do with YouTube to support all the schools. And we see that's--it's a big opportunity.

And like I said, we see ourselves as a library. So, the more that we can work with schools to be able to highlight the library that can be useful to them, I think it will benefit everybody.

MS. DWOSKIN: Well, we're past time, so I want to thank you so much for the conversation. I learned a lot. And I want to thank everyone for tuning in.

Also, want to plug that tomorrow we're doing a special event in Washington Post Live, commemorating the legacy of Juneteenth. That event will be with Lonnie Bunch, who is the founding Director of the Smithsonian National Museum of African American History and Culture.

You can find--you can register by going to

Again, thanks so much. My name is Liza Dwoskin, and I'm The Post's Silicon Valley correspondent.

MS. WOJCICKI: Thank you so much for having me.

MS. DWOSKIN: Thanks for coming. Thanks for watching.

[End recorded session]