Peter Warren Singer, “LikeWar: The Weaponization of Social Media”

My name is Dr. Travis Morris and I have the
honor and distinct privilege of being the Peace and War Center director and it is my
privilege also to welcome you to tonight’s Todd Lecture Series. Tonight, we’re going to have an engaging conversation
and a continuation from the panel that we know that you’ll be thrilled about. It’s also my distinct honor and privilege
to introduce our provost and dean of the faculty, Dr. Sandra Affenito. Let’s give her a round of applause. I know we have many guests in the room, and if you would afford me, ma’am, I’ll just read a bit of your bio to share with everyone. In this capacity she provides leadership
in program and curriculum development, which covers assessment, accreditation, strategic and financial planning, community partnerships, and the development, maintenance and growth of the university’s program infrastructure and capacity, which translates to a tremendous leadership role and responsibility here at our institution. Throughout her career she has pursued an academic agenda as a teacher, as a scholar, and as a biobehavioral research scientist. She holds a PhD in nutritional sciences from
the University of Connecticut and she completed post-doctoral training in biomedical and behavioral
sciences at Wesleyan University through the National Institutes of Health. She has written and coauthored 60 peer-reviewed
journal publications and has been invited as a speaker at multiple international and
national conferences relating to pediatric obesity, diabetes, women’s health, eating
behaviors and nutrient intake. Collectively, she has over 35 years of leadership and administrative experience in higher education, healthcare and the corporate sector, and all of these experiences and her leadership background provide her the ability to lead very effectively and efficiently in a complex environment with students, faculty and staff from US and international institutions. It’s our pleasure. Let’s welcome Dr. Affenito. Thank you, Dr. Morris. I got up quickly so that you would say less
but it really is all about our speaker this evening and it’s a distinct honor and privilege
to be here tonight to introduce our keynote. So welcome. Welcome everyone to Norwich University and
the Todd Lecture Series. Tonight’s keynote culminates the two-day Norwich University Military Writers’ Symposium, which has addressed the provocative theme of warfare in the 21st century’s future battlegrounds. For the past 48 hours, our active and inquisitive
learning community has engaged this topic in classrooms through writing, intensive workshops,
in panel discussions and through interdisciplinary conversations. I would like to thank the guests of the symposium who have spent the past two days with our students, faculty and staff: Paul Scharre, Benedetta Berti, Ian Brown, who is with us, and David Bellavia. I would also like to warmly welcome this evening
Dr. Robin Havers, president and CEO of the Pritzker Military Museum & Library in Chicago
whose enduring support for the Military Writers’ Symposium and the Colby Award makes
this amazing academic experience all possible. Additional thanks are due to the members of
the Todd Lecture Series committee as well as to the members of the Todd and Drew families,
joining us via the livestream this evening. The Todd Lecture Series is named in honor
of retired US Army Major General and Norwich President Emeritus, W. Russell Todd and his
late wife Carol. We are so grateful for their dedicated service to, and support of, Norwich University and the Northfield community. I would like to also recognize the Todds’
daughter and son-in-law, Ellen and John Drew as well as all of the Drew Foundation members
for their generous donation here for resources for the specific purpose of funding our Todd
Lecture Series. The Lecture Series will always be free, it
will always be open to the greater Vermont community and to the Norwich student body
and streamed live to Norwich students and alumni across the globe. It is now my distinct honor and true privilege
to introduce this evening’s speaker, Dr. Peter Warren Singer. Dr. Singer is a strategist and senior fellow
at the New America Foundation, the author of multiple award-winning books and a professor
of practice at Arizona State University. He has been named by the Smithsonian as one of the nation’s 100 leading innovators, by Defense News as one of the 100 most influential people in defense issues, and by Foreign Policy to their Top 100 Global Thinkers, and he serves as an official Mad Scientist for the US Army’s Training and Doctrine Command. So that’s 300 plus in total. Onalytica, a social media data analysis firm, has named Dr. Singer one of the 10 most influential voices in the world of cybersecurity and the 25th most influential voice in the field of robotics. Described by the Wall Street Journal as the
premier futurist in the national security environment, Dr. Singer is considered one
of the world’s leading experts on changes in 21st century warfare. He’s consulted for the US military, the Defense
Intelligence Agency, and the FBI. And he has advised a range of entertainment
programs for Warner Brothers, DreamWorks, Universal, HBO, Discovery, the History Channel, and the video game series Call of Duty. Dr. Singer served as coordinator of the Obama campaign’s defense policy task force and was named to the US military’s transformation
advisory group. He is an associate with the US Air Force China
Aerospace Studies Institute. Additionally, Dr. Singer has served as a member
of the state department’s advisory committee on international communications and information
policy, and as an advisor to IDS. Dr. Singer’s award-winning books include the
following, Corporate Warriors, The Rise of the-
Changed your personal lives: how we communicate with our family, how we don’t communicate with our family. Those of you in the room that have kids, or those of you that are kids, have had the situation of sitting at the table and everyone’s staring down at the screen. It’s changed education, from how we’re connecting with people that aren’t here right now to creating entirely new departments for cybersecurity,
and of course has changed warfare. Now along the way though, there’s been a couple
other changes that have happened that are important. The first one of these is in the hardware
side. All of those computers that were lashed together
into that original intergalactic computer network lacked what’s known as a sensor. They lacked the ability to gather information
about the world beyond the computer. Whether you’re talking about on the left there,
that’s the original server for ARPANET. It’s about the size of a refrigerator to the
one that that young man there, anyone recognize him? Mark Zuckerberg? Almost exactly 15 and a half years back. He’s sitting in his dorm room programming
what was originally called FaceMash then called the Facebook and then Facebook. All of them lack the ability to gather information
about the world beyond the computer. Now, pretty much every computer has over 25
sensors on it, some are pretty obvious, the camera taking imagery of the world beyond
the computer. Other ones line the background geo location,
where in the world is this computer? Therefore, where in the world is the user
of this computer? Our friends in the room, who knows cybersecurity
who also know there’s a thing called metadata that sort of sensor data put on top of the
messages themselves. There’s a second change though that’s exemplified
by the story of young Mark Zuckerberg there. It’s not just the ability to gather information
but to share it in fundamentally new ways. And this is sometimes known as Internet 2.0,
Web 2.0 or more popularly social media. And essentially what happened is it brought
together two types of historic communication revolutions. So if we look back in the history of communication,
new technologies come along and they would change either one of two things. Either they brought two people together in
the fundamentally new and better way, the Telegraph, the telephone, or they allowed
one person to reach many and a new better way. The printing press, radio, TV, broadcast. Now, what social media allows you to do is
combine those two together, you could simultaneously do one on one interaction but also broadcast
out to the world all in real time. And that’s not just the communication revolution,
that’s the very appeal of it. And there is a fascinating quote that’s been
said by three wildly different people. One is Donald Trump, a businessman turned
celebrity, turned commander in chief. The other is LeBron James, a teenager turned
basketball player, turned celebrity turned businessman. And the third is a young man Junaid Hussain
who was a teenager turned terrorist, turned celebrity, three wildly different people and
they all said that the reason they like and use social media is it’s like owning your
own newspaper and unpack what they mean by that, owning your own newspaper. You get to decide what news to collect, you
get to decide how to package it and what then to share with your audience. Now, that leads to something else, something
that we call LikeWar. The first is the experience of all of this. The internet may have begun as a way to connect
and communicate first among Pentagon supported scientists, then nerds, then the rest of us
so it was a communication space. Then, it became a marketplace, billions and then trillions of dollars. But as you started to put people there, and they then brought in their interactions, news, politics, it’s begun to feel a little bit like a war
zone and we all notice that. Every time there’s some event, be it a very
clear and obvious political event battle on election, on impeachment, or an event and
a rollout of a certain product, a new movie, a new music album to just what happened at
a party last weekend. We end up seeing battles back and forth, people
taking sides, battling over everything from what you should believe to what you should
do, to what’s the nature of the truth itself. And we all kind of feel those battles happening
as they play out on our Twitter feeds, Facebook, Instagram. But there’s something else that hasn’t just
begun to sort of feel like a war zone. Our research team spend about five years tracking
how people were using social media. Initially, we focused on in traditional war
zones. So places like Iraq and Ukraine but very quickly
we saw it had to morph into other areas. So if you are looking in Iraq at ISIS’s use
of it, there was battlefield use, but very quickly, the key issue was also how they were
motivating in coordinating terrorism outside of Iraq and Syria. If you’re looking at terrorism, you start
to look at other locales. Maybe, for example, looking in on Mexico drug
war and now we’ve moved into criminality, we’re looking at criminality. We also have to look at, for example, not
just how cartels are using it, but how street gangs in Los Angeles, in Chicago go back to
Ukraine. If we’re looking at Russian military use of
social media, it’s targeting the Ukrainian military, it’s also targeting Ukrainian politics
and elections. You start to see the same things reaching into Poland, Hungary, Brexit, the United States. Some of the things that they’re doing in these spaces, whether it’s politics or terrorism, are copycatting what celebrities are doing, so we looked at how celebrities are doing it. Or they’re just things that teenagers are doing naturally, whether they are a terrorist, a soldier, or just someone going to school. And so as we looked at all of these different
cases from around the world, from different categories, we discovered a second thing. It doesn’t just feel like a war zone, it’s
actually a new mode of conflict itself. If you think of cyber war as the hacking of
networks, LikeWar is its twin: it’s the hacking of people on the networks by driving ideas
viral through a mix of likes, shares and lies. And just like in cybersecurity, you might
see a diverse set of actors. So in cybersecurity you might have everything from, we’ve got people in this room who want to join Cyber Command and the US military, to bank robbers, to teenage hacktivists, all of them with
wildly different goals who end up using the very same tactics to breach a network. It’s the same thing in LikeWar. LikeWar is a space where, for example, ISIS’s top recruiter is copycatting the pop star Taylor Swift, or in turn Lady Gaga fans are copycatting Russian military intelligence. In one, the goal is to sabotage the US election; in the other, to sabotage a movie rollout. It may sound strange, but that’s where we’re at. It’s also just like cybersecurity in that what
happens on the network does not stay on the network. It can affect everything from battlefield
outcomes to the outcomes of elections, to public health. Going back to the public health side, a lot of you are dealing, if you’re in school, with the return of childhood diseases that we thought were done when I was a kid, and yet anti-vaccine conspiracy theories have helped bring them back. Or it might just be how we think about the truth itself. But what was also interesting about this is
not just the impact of it, but as we looked at all of these different episodes and uses
around the world, we kept seeing a consistent set of rules, just like in other realms, be it air warfare or naval warfare. There are tactics and strategies, there are doctrines that win out. There are rules that apply across the system. And it is important, I believe, for all of you to understand these rules in all the different roles that you play, whether it is as current or future leaders, your roles as citizens, to just your roles as good family members. The organizations and the individuals that get these rules are the ones who are winning out now, and the ones that don’t are the new losers of the game, not just online but beyond. So what are those rules? Rule number one: as The X-Files told us, the
truth is out there. In a world of mass collection and mass distribution,
it’s hard to keep things a secret. In fact, one CIA officer put it to us that all secrets now come with a half-life. And that gathering and sharing of information might happen deliberately: the average millennial will take over 26,000 selfies in their lifetime, and they will take them in the darndest of places. This is an Iraqi unit taking one in the middle
of the Battle of Mosul. We’ve got an aviator in the room; I know an Air Force guy that did what, in the US military, is called a combat selfie. And they did it actually overhead of the Battle
of Mosul. It’s not just those crazy kids, we have the
very first commander in chief to use social media before he became commander in chief,
not even Barack Obama. Donald Trump did not enter social media till
he was 61 years old. And yet he brought into that role over 40,000 bits of data on everything from what he was doing, what he was saying, to what he was thinking. Not just what he was overtly thinking: you can also mine this for psychological data, influences, tells. This is one of the keys behind the Cambridge
Analytica scandal. Now, that deliberate side can also be mirrored
by inadvertent background collection and sharing. So, anyone in the room wear exercise apps like Fitbit or something like that? Those are the healthy people. So essentially what happened is they were looking at a heat map of where in the world the users of these exercise apps were. And they noticed a little spike of light of
customers in East Africa. That’s interesting, that we’ve got customers there. And then they zoomed down and they went, “Oh, that’s really interesting, that our little cluster of customers are not in the city but in the middle of the desert of East Africa.” And then they zoomed down further, and, solely off of the heat maps from the exercise apps, what you see here is that they discovered a classified US military
facility and a CIA black site. What was happening is the guard force, and this is a mix of special operations and private military contractors, were doing their morning jogs around the perimeter of the base, revealing not just its location but pretty good targeting data. Similarly, the location of an aircraft carrier was revealed this way, in the near-perfect outline of someone doing their jog around it. Now, this sharing can affect everything from
tactical military operations. Compare the D-Day invasion, where the Allies were able to keep secret not just the location but the day of it, with literally over a million people and parts moving, versus the bin Laden raid, supposed to be the most secretive military operation of our lifetime. The only people that were supposed to know about it at the time were the actual operators, who weren’t formally told the confirmed target until after the helicopters took off, the operations center, and, in that famous picture from the White House, Obama and only his closest advisors in that very small room. That’s it. Except there was a Pakistani cafe owner who
was up late at night in Abbottabad, heard the helicopters come in, and did the now-natural thing. He went online and he complained, and his complaints
about the noise of the first helicopter, the second helicopter, now there’s an explosion, now there’s a second explosion, were essentially a live feed of what was supposed to be the most top secret operation. And at the end of it he said, “Oops, I guess
I’m the guy who live-tweeted the bin Laden raid.” Now, that took place in a country that had 6% internet connectivity at the time. Now, this is not just hitting military operations,
it affects political campaigns. I’m from Virginia. At this point in the 2008 presidential campaign, the front-runner for the Republican nomination was a fellow that a lot of you probably don’t remember. It was a senator from Virginia named George
Allen. George Allen had locked up most of the conservative wing of the Republican Party’s support. He was looking pretty good in polling in Iowa
and New Hampshire. If you were a betting person, you would’ve
said, “George Allen’s going to be the one to get the nomination,” and maybe he might win. Except George Allen goes to a rural state park in Virginia, and he is not ready for two technology changes: the combination of a digital camera and a new video sharing platform that had been created just a couple months earlier, inspired by Janet Jackson’s episode at the Super Bowl, called YouTube. George Allen, in this rural state park, says a racist term that he should not have. And what was known as his Macaca moment goes viral, and he never wins another election again. Now compare that, the candidate not ready for
a digital camera and going viral, to how every single candidate running for president right now is hoping for that moment of virality. In fact, many of them are bringing the camera
with them, whether it is into their home, to literally the dentist, to up on stage with
them because they want that magic moment of virality. Now, this leads to rule number two. The truth may be out there but it can be buried
underneath a sea of lies. And that is the essence of everything from
Russian information warfare to how it has hit domestic politics, to corporate crisis
management, to teenage life. The critical case study in this, and you can think of this as everything from the proof case to, for the historians in the room, a little bit like the Spanish Civil War before World War II, in that it was sort of the lesson that everyone else is looking at and learning from, is something that remains still controversial: the 2016 election. Now, what happened at the start was, you
might think of it as, a traditional cybersecurity event. You had a breach of an email network. I did interviews with everyone from the White House cybersecurity coordinator at the time to Hillary Clinton’s campaign manager. And we asked them, “What were you thinking when you heard the news about this breach?” And they actually said they weren’t all that surprised, and they weren’t all that worried. Why? Because it had happened every single previous
election that we’d had email. It had been bipartisan: both Obama and McCain
campaigns got hacked, both Obama and Romney campaigns got hacked. So when they got hacked, they didn’t think
it was that big a deal; they’d been expecting it. What they weren’t ready for, though, was not the theft of information but the LikeWar spread of information. It entailed everything from the false front Guccifer 2.0, which was presenting itself as the hacker, an individual Romanian, and we now know it was not Guccifer, it was Russian Military Intelligence, the GRU, to sites like Tennessee GOP there. It’s a little bit small for you to see, but
Tennessee GOP, their tagline is, “I love God. I love my country.” They joined in 2015, so just as the campaign is starting up. And TN GOP, it turns out, the country they loved was not America or Tennessee; they were instead a mid-20s Russian working at something known as the Internet Research Agency. Now, a couple of numbers that are fascinating about TN GOP there, and that’s on Twitter, accompanied by similar aligned efforts on Facebook, Instagram, and the like. At its high point TN GOP had over 170,000
followers. Anyone in the room with 170,000 followers? Please tweet out about this speech. But a couple of things about that 170,000: that’s 10 times the number of followers of the official, real Republican Party, 10 times, but that’s not the number that matters. Election Day 2016, TN GOP there was the seventh
most read account. Not the seventh most of the more than 3,000 documented Russian sock puppets (a sock puppet is when, like TN GOP there, you have a person behind it who’s presenting themselves as something else), not the seventh most of the more than 60,000 Russian bot accounts (bots are when you have an algorithm chirping away, driving overall web trends). The seventh most overall, more than pretty much every single media source, celebrity, politician, you name it. Why? Because TN GOP was being echoed out by people
with hundreds or, in the case of retired Lieutenant General Michael Flynn, hundreds of thousands of followers. Now, like any other military effort, there was not just one line of effort. It was accompanied by, for example, a thing playing out on Facebook. We now know that 146 million Americans, that’s half the United States population, were unknowingly exposed to Russian propaganda via their Facebook accounts in the final months of the election. Now, this is not just about driving web trends,
because social media is the opposite of Vegas. What happens there doesn’t stay there; it hits other spaces. The particular vector is professional journalists, everyone from a newspaper reporter to the producer of a local radio talk show. Over 90% of them use social media to determine what story to cover, what angle to take on the story, who to interview for the story, and, if it’s performing well, whether to revisit it
for another story, another episode. So it can be significant, but I don’t want you to think about this as just politics, or just American. We see LikeWar efforts going on in the top left there. Now, we were talking about this with some of the students at a visit before: that’s the Instagram feed of a Mexican drug cartel. On the right, just like in cybersecurity, where you may see targeting of military networks, you may see targeting of corporations. This is from an episode where an effort went after Nike to drive down their share price. That effort involved three lines of activity. One was real Americans really upset at Nike,
genuine actors. The second line of effort was what are popularly known as alt-right trolls, especially people who are hyper-partisan and use all sorts of fake accounts and misinformation to try and game the system. And then the third line of effort was our
friends the Russians jumping in. And it’s a great illustration of part of the Russian strategy: they didn’t create the controversy or the split, they just saw a fire to pour kerosene on top of. And what’s notable about all of this is it’s
like any other tactic or technology. If it works and it has low barriers to entry,
other people will copy it, and you will get what we think of in military terms as proliferation. And a fun example of that comes from Lady
Gaga. So anyone in here a Lady Gaga fan? All right, anyone in here see the movie A
Star is Born? Okay, so Lady Gaga fans are passionate and
in particular they felt when her first movie A Star is Born, came out that she had the
inherent right to have the number one movie in the Box Office and how dare any other movie
come out this same weekend. And so on the fan sites, they openly talked
about, we can do what the Russians did to the US election to the rival movies coming
out that same weekend pushed out in particular one by Sony called Venom. So they did a copycatting of what the Russians
did, everything from false front accounts to creating what you might think of as an
AstroTurf movement. AstroTurf is the parallel to a grassroots movement, which is bottom-up, real people; like AstroTurf, it’s fake, but it feels real. In this case, just like the Russians, they
identified what they saw as the core vulnerabilities in the American media ecosystem. They also figured out they should pose as people who were more greatly trusted. So the Lady Gaga movement, modeling itself after Russia’s effort, posed as concerned moms, who then particularly targeted local radio and TV as if there was a real protest movement about violence in movies, in particular this new movie Venom: that’s bad, don’t let your kids see it; instead, they ought to go see Lady Gaga’s movie. I love this because it shows kind of the proliferation. For the military ethicists in the room: one of the other things that the Lady Gaga fans had was an ethical discussion about weapons and tactics, and they concluded that it was ethical to copy Russian disinformation warfare, but in doing so, it was unethical to use the faces of the real people they were posing as, so they would use stock photos. Now, this also shows the pattern where, if
you’ve been tracking the news, it’s not just Lady Gaga fans who copycatted the Russians: as of just about two weeks ago, the Chinese did too, and they put together an almost mirror network to what the Russians had done. It had over a one-year history and was only revealed in the last couple of weeks. They deployed it posing as everything from TN-GOP-like conservative news sites to, in one case, a fake woman from Colorado, building up credibility and followers. And then, with the Hong Kong protests, they pivoted: from talking pro-Trump, or, in the Colorado one, talking about how great hiking is, they pivoted to “everything going on in Hong Kong is a CIA conspiracy.” So we’re seeing this proliferation hitting
lots of different sectors. This leads to rule number three: we are in a world where virality trumps veracity. In terms of power, and power defined as achieving your goals, that influence, it is more important that something go viral than that it be true. It doesn’t mean that the truth can’t go viral, the truth can, but the power comes from the virality. And the power lies not just in shaping online behavior
and belief but the real world. Again, shaping everything from who someone votes for, to whether they turn out to vote, to whether they join a protest movement, whether they join an extremist group, to what movie people go to, to belief in science, to the truth itself. I think a great illustration of this is the
episode that happened about a half year back, when a group of high school students from Covington, Kentucky visited Washington, DC. Now go back to rule number one: we have more data about this episode than the CIA could have dreamed of a decade ago. We have video of what those kids were doing from three hours before to when they got into an argument with a Native American. We have video from three different angles,
we’ve got all the data. But my guess is, if we did a survey of this room, 60% of you would say one thing happened, 40% of you would say something else happened, and you would cling to it like it’s the most important truth itself, because you’ve been shaped by whichever echo chamber that virality steered you into. And notice again, I didn’t say what happened; it’s rather how we’ve been shaped by all of this. Now, if this is the case, that virality is so crucial, we need to understand why things go viral. And whether it was Russian information warfare,
political ads, conspiracy theories like Pizzagate, or jokes like Pizza Rat, if you remember that, there were consistent attributes of virality. And I don’t have time to go through all of
them, but one I think is illustrative of this is the notion of planned authenticity. Now planned authenticity, that sounds like
a contradiction. How do you plan to be authentic? But arguably the von Clausewitz of this tactic is Taylor Swift. Now she, as a young teenager, wrote an essay about her strategy, and the strategy was to use this mechanism to achieve popularity and power within the music game. And she’s become the richest self-made woman under the age of 40. Taylor embraces those two technology revolutions that I talked about. She gathers information at scale and in real
time like never before, in terms of overall trends but simultaneously individual behavior. So she is able to know whether someone breaks up with their boyfriend, whether they pass their driver’s test, what shirt they chose to wear that day. And then, using that information, she will engage: she will congratulate them on passing the driver’s test, console them on breaking up with their boyfriend, say “I love your shirt.” She is acting like she is their friend, and in the new meaning of the term, she is their friend, but she’s doing it conscious of the fact that the whole world is watching. It’s the exact same tactic, and that’s the
handle of Junaid Hussain, who was ISIS’s top recruiter. And in his case he wasn’t trying to persuade
people to buy an album. He ended up persuading over 30,000 people
from 99 plus countries to travel to Iraq and Syria to join a group made up of people they’d
never met before. It was the exact opposite of how Al-Qaeda built out. Al-Qaeda, the term itself, translates as “the base.” It was a reference to the mountain camp in Afghanistan that you had to be known and trusted to get to. ISIS did the flip, and again, not just convincing
people to join but also inspiring acts of terrorism everywhere from Texas to Paris. Now, this leads to our next rule, we are seeing
a reordering of the art of the possible in pretty much every single realm. We are seeing winners that wouldn’t have been
able to win before and people that we thought or organizations that were dominant become
the new losers that’s happening in politics. Donald Trump himself says, “I would not have
won without social media.” Now it’s a little bit more of a complex story. It was really the campaign side as opposed
to what most people point to as his Twitter feed. But maybe you don’t believe Donald Trump. We interviewed instead one of the top political campaign strategists in Washington, DC, someone who
makes literally millions of dollars advising campaigns. And he walked us through all the metrics that they use to determine whether a candidate is likely to win, everything from how much campaign cash you have, to how many offices you open up in each county, et cetera, et cetera. And he said, “Donald Trump should not have won.” And he was not making a moral or ethical judgment; remember,
he’s a political campaign strategist. He was just saying by our metrics, he should
not have beaten not just Hillary, but his 19 other rivals for the Republican nomination. But he ended up by saying, “Of course he did
win.” And it’s because all of these old rules and
metrics that we used aren’t the way to win anymore. It’s the same phenomenon as I mentioned in the terrorism game. ISIS rises in part by inverting how you operate. But it also changed the way a battlefield operates. Instead of trying to keep its invasion of Iraq secret, like we did with D-Day or the bin Laden raid, they created a hashtag, and the hashtag was
literally #AllEyesOnISIS, because they wanted everyone watching, everyone from the Iraqi
soldiers on the opposite side of the battlefield as part of the story of how an invasion force
of 2,500 defeats a defending force of over 20,000 because of the fear pandemic spread
online. But they also wanted us watching, the United States, which had just weeks earlier said, “We’re not going back into Iraq for another generation.” ISIS goes viral, and polling shows that Americans become more afraid of terrorism after this than in the weeks after 9/11. ISIS had not achieved a major attack that
had killed thousands of Americans but we were more afraid of terrorism. And then that leads us to redeploy again into
Iraq and the like. Entertainment celebrities, if you’re under
the age of 30, you recognize these as some of the most influential people. If you’re over the age of 30, I have no idea
who they are. To business, I love this case, Burger King
is considered one of the best corporate brands at all of this. They’re taught in business schools as a model. And what you see here is Burger King engaged
in an online beef with Kanye West and McDonald’s as a way of selling beef. Everything I just said would have made no
sense 15 years ago and yet it is a best practice. But it’s not just about the new winners and
losers, it’s also about the new responsibilities of kind of the ultimate winners and kind of
losers. The powers, not just on the battlefield but
those that control the battlefield itself. Go back to the story of young Mark Zuckerberg,
15 years ago, he’s a teenager writing software. Facebook is originally about rating how hot your dormmates are. And it moves on to become a platform used
around the world, he becomes incredibly wealthy but he also becomes one of the most powerful
people not just in business, but in war and politics itself. Because with the simple change of his mind,
he can tilt the playing field one way or another, whether it’s disinformation campaigns targeting elections, hate and extremist groups, public health anti-vaccine conspiracy theories, or, as we saw in the episode in Myanmar, even mass killings, where ultimately 700,000 people were targeted through a campaign that was coordinated and motivated via social media platforms. And what you can see, and I love this picture
of Zuckerberg in front of Congress because it shows everyone’s uneasiness at this and
trying to figure out just what to do about it. Is it the responsibility of the company or
is it the responsibility of the law? And we’ve kind of not yet answered that and
this is a political issue that’s going to be with us for at least the next generation. But guess what, it’s just like warfare. There’s a back and forth, there’s continual
refinement and so you ain’t seen nothing yet. Two core challenges, the first, only half
the world is online; half the world is still to come. Think how much of a challenge we’ve had with all of this, with our over two centuries of experience of democracy, free media, et cetera; half the world won’t have that. So it’s going to be an interesting thing to
play out and particularly for those of you going into the military a lot of this will
be in the spaces that you might deploy into. But the second is the tactics, the technology
continues to advance. And so this is particularly the case with artificial intelligence, with what’s popularly known as deep fakes. This comes in two types. One is where you use AI to take something that’s real and manipulate it into something that’s fake. The top image is a still shot, and you can see it online. It’s a video of a speech that Barack Obama
never gave. The second type is where you create something out of nothing that’s completely fake, but it looks real. So this is what happened when they asked AI
to generate what human beings think celebrities should look like. You can see we’re obsessed with the Hemsworth
family. Now, what I think is notable about this, and it’s a pretty dated picture, is you can see little tells that he’s fake if you look close. He’s the ugly Hemsworth. He’s got his ears slightly off, his eyes are off. In the next version of it you basically say, “Hey, AI, fix that.” Or in her case, it’s an example of how the hair is rendered and the like. Like everything, it will be used for entertainment. In fact, it already is. There’s a new Will Smith movie coming out
where Will Smith stars opposite a deep fake of Will Smith; it’s called Gemini Man. But it’ll also be weaponized. And we already see examples of it targeting individuals, putting them into videos that they were never in. This is a political version. It’s not a deep fake, but it gives you a sense
of where we’re going. This young woman was one of the gun control
activists after the Parkland shooting in Florida. On the left, that’s the reality: she did a shoot where she tore up a bullseye target from a shooting range. The imagery was manipulated not just to make her look evil, but also to make it look like she was tearing up the Constitution. That falsehood was then sent viral by, go back to the example of Nike being targeted, the same kind of three lines of effort: one, gun rights activists; two, outright trolls and hyperpartisans; three, our friends the Russians. The result was that more people saw the falsehood of her tearing up the Constitution than the actual reality of what she did or the debunking of it. So in closing, we’ve got a lot of challenges
and it’s much like, say, 15 years back, when we started to wake up to cybersecurity and cyber war. We had a lot of challenges in the threats to networks. Akin to that parallel, and also to how this was handled in the field of public health, I think there are lessons to be learned from these other areas. The primary one is there’s no silver bullet
solution to this. As long as we have people, as long as we have
the internet, we will see this kind of interactivity. So instead it’s about risk managing it. And there’s no silver bullet, there’s no one
thing that we have to do. You break it down by government action, corporate
action, individual action. Just like in public health, no one would say,
“Oh, well, we have a Centers for Disease Control. I guess I don’t need to wash my hands.” Or similarly, no one would say in cybersecurity, “Well, I have good two-factor on my Gmail, that’s why we don’t need a US military cyber unit.” Right? We need layers of action. So real rapidly, the sort of things that we
need to undertake on the governmental side, we need something that many of you are taught,
we need a strategy. The Trump administration several months back released the very first update to the US cybersecurity strategy in 15 years. It’s a good advance. Neither the Bush nor the Obama administration had
updated the strategy. There’s only one problem, it had not a single
sentence about this entire problem set. And that’s in the wake of all the things that
have played out over the last couple of years. So without a strategy, what we have instead
is a lot of ad hoc action, there’s some cool things being done, say at NSA. But then there’s other areas where we’re not
seeing action. A particular issue is we really need whole
of government on this. So for example, since our election, over 30 other nations have seen their elections targeted. Similarly, the State Department hasn’t been bringing together all the democracies to work together on this. Or, we were talking about this earlier, there’s the education side. Digital literacy is something that we don’t have taught here. Or it might be legal action that’s needed to get ahead of problems like
deep fakes. Corporate side. The companies that run these networks have
to understand that they have a new kind of responsibility, they are no longer running
just communication spaces or marketplaces. They are running political and battle spaces. And that requires a very different way of looking at the world. And we’ve seen them coming to grips with this
over the last couple of years and literally days. And similarly, if instead you’re not a platform company but a regular company, you’ve got to think about this as another threat vector against you. And then finally, in closing, there’s our
role, like in cybersecurity, like in public health, it depends on knowledge and ethics. So the most vulnerable like in public health,
like in cybersecurity, are the ignorant. I don’t mean that in kind of an insulting way; I just mean people who don’t understand, who don’t know what’s going on. Over 60% of social media users not only can’t tell the difference between real and fake news, they can’t tell the difference between an ad and a news item. Over 60% of social media users don’t know
how the companies that they use make money so they don’t realize I’m literally the product. But it’s also about this sense of ethic. I teach my kids and they are taught in their
schools, cover your mouth when you cough, that does nothing to defend you. It’s all about you taking responsibility for
everyone else that you connect with. Now, think about the digital version of this,
we all have that friend, that co-worker, that relative who does the digital version of coughing
in your face. They spread conspiracy theory, something a
little bit extremist, fake or junk news. Maybe it’s us. And think about how we react to that as opposed to if they walked up and coughed in our face. Until we develop that ethic to guide our own personal behavior, but also push back to protect this ecosystem, we will continue to
be the targets and losers of LikeWar rather than building the kind of resilience as a
nation, as organizations, as individuals that we need to thrive in the 21st century. Thank you. Thank you very much, Peter. So we have some time for questions and if
you’d like to ask a question, we have two mics set up, so please come on down and we
can continue the conversation via your questions. Thank you very much for an excellent presentation,
Peter. Thank you. Good evening. My name’s Kelly Evans. I’m a junior in the Corps Cadets. I was curious, so we had the Todd Lecture
Earlier today and I don’t remember who, but one of the three of you mentioned that in
different countries, in like elementary, middle school, high school, they educate the
kids about internet use and what to look for. And I feel like our generation is a little
bit better about using social media and stuff like that because we’ve grown up around it
and we know it. Do you think that it would be possible for
the government to pair with some of the bigger social media sites to try and educate the
users that are older and barely know how to turn on their computers? Yeah, so great question. And a number of different things that come
out of that. The first is the issue of you hit it exactly,
no, this is not a young person only problem. Actually, the data shows that baby boomers
are seven times more likely to share false information online than any other generation. Essentially what they said video games would
do to youth, Facebook did to them. But there’s two issues here. One, we could say, “Oh, we’ll grant that. We’ll just wait till you all become four star generals and CEOs and presidents.” Well, the challenge is that still gives us a couple of decades in between of bad stuff, right? But the other part of it is, you are
not immune from it either, and one of the interesting things is they did, for example, studies that compared everything from PhD students to youth, et cetera. And they found that age wasn’t an inoculator against this, nor was conventional education. If you had a PhD, you were still just as likely to be taken in. It was rather about having digital literacy,
which is a particular kind of literacy. It’s not fact checking; direct fact checking does not work, all the data shows that, even though we keep defaulting to it as a response. Instead, it’s understanding, whether it’s these
rules that I laid out, to understanding that imagery can be manipulated, and giving people the chance to work with manipulating imagery so that they see that, to understanding the tactics that go after particular parts of your psychology, so you have that kind of awareness of it. So there’s a particular kind of training
set and as was said earlier today, there’s some nations that have these. And that’s important because we often, this
feels kind of scary, and then it often goes to, in the United States we have a First Amendment. Well, there are nations like Estonia, like Finland, that have vibrant democracies and yet have built up resilience against these
kinds of threats. The final thing that they have is something
that we really need to go after and I hope this is sort of part of the conversation here,
is I spoke with the Estonian ambassador about this, and he said one of the keys, not just the training side, is that we talk about it all the time. There’s no sort of fear of talking about it,
no denial around it. And again, think of the echo in public health. Some of the worst diseases that spread are the ones that people are afraid to talk about. And it’s the same phenomenon here, and unfortunately
we’ve not had that kind of open discussion about it. So, half the people in this room are Facebook users, and some of you are thinking, “Facebook is for my grandparents.” If you’re on Instagram, Facebook owns it, hate to break the news to you. Over half the people were exposed to it. How many of you have come to grips with that? Or in turn, major figures in politics and media? Statistically they had to be, and in particular we can point to episodes. How many of them have said, “Hey, I got taken in in 2016.” This is what I’m doing so that it doesn’t
happen again, as opposed to just not talking about it. But it’s also not just on kind of the body politic. It goes back to, as you brought it up, the
role of the companies themselves. Part of why most of you don’t know this is
for example, originally Facebook was in denial that it happened. In November 2016, Zuckerberg says, “It’s a pretty crazy idea that this happened.” Then they have to revise that and say, “Actually, it was 140…” But then, further, when the data is put
out, it’s not pushed to you, the user. It’s actually placed solely on the desktop version of Facebook, where of course most people’s interactions aren’t; you can’t get to it with this. And it’s put where you actively have to go hunt it out to find out if you were exposed or not. Think about a completely opposite realm, which
we’ve learned about in cybersecurity where if you’re an Instagram or Facebook user, Gmail,
whatnot, they now push out to you things like, “Hey, I know you have a password but it needs to be a strong password.” Or, “Hey, you should also have two-factor; you should have a cell phone number or something else on it.” They’re proactive in equipping you there; they’re not proactive in equipping you to defend yourself here. Imagine popups that say, “Hey, here’s how
people are sometimes taken in by conspiracy theories. Here’s what you can do to defend yourself.” So again, there are layers to this, from the education system to the corporate side, where we could go after this in a way that very
much preserves our rights. Great question. Thank you. Good morning, sir. Thank you for coming to Norwich University. The speech was really informative to an environmental science student. So I’m Jack Carson. What I wanted to ask was
that you mentioned that we see all this type of technological influence from major parties
like Russia, China and certainly other countries like even Germany or big countries in the
middle alliance. What I wanted to ask is that, do we see this
type of investment in programming by other countries like say Pakistan or even some of
the African nations? And if so, how long will it take for them to catch up? Because you mentioned only about half the
world’s population is online. Do we see the same investment being put in
by other countries, or is it really only confined to the major players? Yeah, great question. Before I jump into the direct answer, even the field that you’re interested in has been shaped by this. So for example, there was the recent UN meeting
and climate change was a big issue there. And there was an online campaign that was using all of these very same tactics to push climate change denialism, attacking certain figures who are climate scientists and the like. So it’s used in a negative way; the flip side
is, and I’ve worked with people in the field on this, you can use these very same tools of virality for good. They’re just like any other tool, a fork or
whatever. It can be used for good or bad. So we talked about negative episodes of virality. If you’re a Taylor Swift fan, you’d say, “Well,
this is positive,” but a different example would be ice bucket challenge, that was a
deliberate campaign that focused on using this virality to bring attention to the cause. Climate change, too much of the discussion
has fallen into sort of trying to be a fact-based discussion, where one side is pushing out scientific reports in technical terms, saying this is the facts of the matter, when we know in terms of virality it’s not just the pure facts; it’s, for example, those networks
of narrative around it. So far more important than the report was,
for instance, the image of that single polar bear floating on the ice going away, or the
story of the 16-year-old little girl from Sweden. So even in the debate around environmental
science, we’re going to have to utilize it, it’s already being utilized, it needs to be
utilized better. So your question though is about, “Hey, are
there other players out there?” And absolutely, this is a space that is very
low barriers to entry. So it’s pretty much everyone, small states. You mentioned Pakistan; this has shaped everything from Indian elections, to actually certain episodes of mass killing within India, to the dust-up between India and Pakistan, which had all these really strange attributes of LikeWar happening, where you had false news reports planted, et cetera, in their contention. You also have, and part of what aids this is a brewing industry, I did a previous book on private military contractors, kind of the corporate versions of mercenaries. We’re seeing the LikeWar side of it: for-hire
companies that will run these campaigns for you. So, a different small-nation version: democracy activists in the Middle East were being targeted; it was funded by a government, but the one running the operation was a for-hire company. Oh, by the way, don’t just think that hits
the other side of the world; we saw a similar episode hit at a local election level in California, essentially in a county election. The county board makeup was going to affect a multimillion-dollar hospital deal. And so a local millionaire, a real estate guy, hired a private military, sorry, a private influence operation company to try and shape the outcome of the county election. He spent a quarter million dollars in the hope of getting the $7 million. So we’re going to see these hit at all layers. And that again is why I go back to that notion
of you’re not going to be able to deter this kind of action. There’s just too much of it going on at too
many layers, it’s more about building resilience against it. And the great thing about resilience is that
it aids in whatever the attack is. If I’m being attacked by someone who is trying
to sell me a bad product or push disinformation from a political angle, the resilience works against both of them. And so it’s a better way of defending ourselves. Great question. Good evening, sir, and thank you for the awesome
lecture. Thank you. So I wanted to ask about LikeWar in reference
to developing nations. I understand that in developing nations communication
infrastructure isn’t a huge thing right now. So I wanted to know what a LikeWar would kind
of do to the development in a developing nation, it’s politics conflict in the region. And if it doesn’t, then what can we kind of
do? What can international communities try and
do to kind of prepare for when they take on the internet? Yeah, a great question, because while there has not been the build-out of infrastructure following the same pattern as in the United States, what you are seeing in much of the developing world is two phenomena. One is sort of skipping a generation and jumping right into the cell phone, the smartphone. So wherever you are in the world, actually, there is mass use of cell phones, and it goes back to that kind of ability to network,
to gather information, to share it. And this affects whether it’s local politics
or that episode of mass killings I mentioned in India, to you’re deploying as a soldier somewhere in the Middle East and someone with a cell phone is going to be there to out you. A while back, I broke the news to Special Operations Command that one of their supposedly secret operations in the Middle East was being talked about on Facebook. So you’ve got that kind of thing going on,
but your question is particularly about the politics of it. And it actually links back to the prior question. The case studies that show both the impact
of what you’re asking about, but also are like little battle labs for the 2020 US election, are the Philippines and Brazil, where we saw sort of an advancement of some of the tactics that were used in 2016. Just like any other war: kind of what works first, and then the other side builds a little bit of a counter, and the attacker moves to something else. And so we’ve seen certain usages in those
elections. Also in some of the smaller East European elections, and more recently in the EU, we were seeing slightly modulated tactics, similarities but kind of moving into new realms. And so this is where I go back to kind of
thinking about the transnational nature of it. The United States ought to be not just aiding these other nations in building resilience, but learning from both what hit them and the success cases of how they defended themselves. And ideally, we ought to be saying, “You hit this other democracy, it’s as if you hit us,” and hope that they’re doing the same. Actually, a great illustration of this, not a small country, but we’re not too far from the Canadian border: there is a fascinating comparison here. Canada has an upcoming election and has a clear and fairly cohesive national strategy to defend it. That’s bringing together everything from their
intelligence community and government to their media. They’re training their campaign beat reporters, not in what to report, it’s their democracy, they don’t tell them that, but they are equipping them by saying, “Here’s what we’ve seen political journalists be targeted with in other elections; you should know about it,” so they can move forward equipped. So they’ve got this kind of comprehensive strategy, everything from their intelligence community to engagement with media. Across the border in the United States, we don’t have that kind of comprehensive strategy. Good evening, Dr. Singer. Thank you for coming. My question is, do you see the United States
using LikeWar in the future to combat terrorism and other future threats and conflicts? So yes, and not just in the future, we’ve
already done it. So in the book that this is coming out of, one of the opening scenes is the ISIS takeover of Northern Iraq, ISIS kind of using all of these tactics, and everyone on the opposing side, the Iraqi army but also the US military, being thrown back and not knowing how to handle it. But there’s a long, and there’s a lot of military
historians in the room. There is a long history of the United States
not doing well in kind of the first battle but then rapidly learning and implementing. And the close of the book is looking at some
of the things that the US military has put into place since. And they range from the overt to the covert. And so the closing scene in the book is a
visit to Fort Polk, which is a military base with a long history of being where the US military experiments. It’s where we did the Louisiana Maneuvers back in the 1940 period that figured out how to use trucks and tanks instead of horses. And they’ve changed that training ground to
incorporate training for some of these techniques. So that’s the overt side. There’s also, go back to rule number one, no secrets, the truth is out there. One of the things that CENTCOM did is it had
a contract that was put out and said, “Hey, we’d like to buy software that would allow
one person to control at a minimum 10 social media accounts.” It sounds a little bit parallel to what the
Russians were doing. We actually did a lot of these sorts of similar things targeting ISIS, helping to drive down what they were attempting to push viral and pushing alternative messaging. Go back to the discussion of Zuckerberg. No one raised free speech issues around that; everyone in the universe, and we agree, said ISIS was bad. What’s become the challenge, and is continuing
to be an issue, and will be in our body politic, is what happens as there are more contested political identities. And you see the use of these tactics either
taking people offline or pushing alternative influences on them where maybe we don’t all
agree they’re bad. And the point where this has really kind of
exploded over is far right extremism. We’ve had more Americans actually killed by far right extremists than by ISIS itself, but it’s a contested political ideology inside the United States. So what you’re seeing is both our body politic,
but the companies themselves, trying to come to grips with it. No one had an issue with the intelligence community and FBI infiltrating ISIS networks with false accounts, and no one had a challenge with knocking ISIS accounts offline, making Junaid Hussain’s job harder. If you put the ISIS black flag up, the algorithm
would immediately knock you offline. But when it moved into post-Charlottesville, then the post-Poway and post-Pittsburgh killings, we started to see contestation around that. And one of the interesting things is that part of people’s anger at the companies themselves is either you didn’t do enough
or I don’t like that you did too much. So as an example, neo-Nazi imagery: it was okay to post it on Facebook and the like after Pittsburgh, after Poway, and they said, “Freedom of speech issues and the like; it’s not our role to adjudicate.” After the Christchurch mass killing, then they changed their mind and said, “Okay, now we’re no longer allowing it.” And so you had people mad at them before for
not doing enough, and then you had other people say, “Now, freedom of speech, you’re biased against me.” So this kind of contestation is going to be one of these political issues, but again, what’s happening is we’re mainly deferring to the companies themselves to decide for us what are essentially political questions. Thank you, Dr. Singer. I remember this summer when Dr. Morris and
I were in Slovenia and the rest of the Balkans, and we learned through our analysis that Slovenia as a country doesn’t really fear issues like this, or general cyber and information warfare issues. What can we say to a country like Slovenia
that just isn’t worried about this issue at all? Could these rules help them? I think I would point them to look at what has hit some larger, more powerful states than you and how tough a time they’ve had with it, and to look in your neighborhood at what has happened to nearby neighbors. So we’ve seen these campaigns wear down democracy, enhance Russian influence, and spread misinformation in Hungary and Macedonia and the like. So again, it’s the kind of thing where what happened to the US 2016 election had already happened in Ukraine in 2014 and then 2015; we just sort of had this “well, it couldn’t happen to us,” right? The second part of the challenge is, and this
again goes to some of the things that you all as students wrestle with all the time, is that there’s no one single field that owns this topic. It’s a topic; notice we’ve talked about
everything from national security to education policy to the role of business networks. And so one of the other challenges, I think, that we’ve had with this, and this is actually what motivated us in part to do
the book, is that the people that understood what Russia was doing in Ukraine were completely
disconnected from the campaign beat reporters. So one set of people were like, “No, this is the
game plan. It’s out in the open. No, they’ve been doing this for years.” And then another set of people are utterly surprised by it. Or in turn, the people that do counterterrorism or are interested in Middle East issues are fundamentally different from someone who’s wrestling with social media gang issues in Chicago, or vice versa. And there are insights that can be pulled
from these different fields. But too often, in both government policy and academic research, we try and break it down into these fields, and we have these silos, and so we don’t get shared insights and information. And that would be the other thing for your colleagues potentially in Slovenia: you’ve got to think of this as, there’s a certain military aspect of it, there’s a certain election aspect of it, and also your education people need to be involved in this discussion. That’s an absence that we have in our space too. Hi, good evening, sir. So to my understanding, in information warfare
and in Psy-Ops, we’ve conventionally used pamphlets, we’ve used posters, we’ve broadcasted
information. But my question for you is, would you recommend in information operations incorporating LikeWar as an information warfare capability to use on the battlefield? And would it be ethical? And if not, what would those concerns be? We’re already doing it, and you put your finger on one of the challenges, which is that in some ways we were behind the curve. We’re pushing out pamphlets when other people
are running YouTube videos, getting hundreds of thousands of hits. But it’s not just about the platform that
you’re on, you can be on these platforms and utterly fail because you have a process that’s
wrong, if you’re not understanding and embracing what’s happening. One of the other aspects of virality is that
it understands that every single message that’s pushed out, every single sort of weapon’s use, is also like an experiment. And so the operations that win, whether it’s a political campaign, or literally the BuzzFeed company, or ISIS operations, don’t send out one message; they actually flood the zone. But it’s not just the amount, it’s the process
of it. So there’s a great comparison. You mentioned Psy-Ops, and we were talking about the world of Special Operations Command. So, Special Operations Command, again, go back to the question about learning. At the early stage of this, they had a process
where essentially, and don’t hold me to this number, I’m literally going to say don’t hold me to this number, but let’s say roughly 10 mid-career, so major or lieutenant colonel level, white males, most of them non-Arabic speaking, would debate for multiple days using PowerPoint what the perfect counter-ISIS message is, which is of course targeting some 18-year-old from the region. And in that same time period, ISIS as an entity
is pushing out, it had 50 different official ISIS media channels online. It had roughly, again, loose numbers, 10,000 fighters with their own messages that they’re pushing out, to another, depending on how you cut it, 20,000 fanboys who are not official members; they’re not on the battlefield but they’re somewhere else chirping out the message. This mass amount of messages on lots of different
things would go out, and then one of them would hit and the whole network would pivot on top of it. For those of you that are interested in electoral politics, that process and misunderstanding is a near mirror of Hillary Clinton's campaign, which had, I believe the number was, 11 people in charge of her Twitter feed. And so 11 people would debate back and forth about what the one message would be, and you get the lowest common denominator. And the outcome, go back to that notion of
authenticity. Hillary Clinton is a real person, she did
not come across as real online. Burger King, I hate to tell you, there's not an actual Burger King, but Burger King comes across as real online, right? And so it's not just about being in these
spaces, it’s about how you operate in them and whether, again, you’re embracing these
rules or not. So this will be the last question, go ahead. Good evening, Dr. Singer. Thank you for the amazing speech and the valuable
insights it provided. Thank you. My question for you is, what do you think
the best approach is for reducing the spread of misinformation while also preventing the
induction of bias into social media? Great, great question to end on. So one of the issues that’s playing out in
this discourse over bias is what, if you're a sports fan, is called working the referees. So the funny way of kind of dealing with toxicity online is to say that literally the game itself is rigged against me. And you will see this happen; most often it goes back to that notion of ignorance. Politicians on both sides particularly do this, where they'll say, "Hey, my Twitter feed suggested…" A Democrat will say, "My Twitter feed suggested to me that I ought to follow Donald Trump, Ivanka Trump and Fox News. This shows it's biased against me." And in turn a Republican figure, a senator, did this, saying, "It suggested that I ought to follow the Huffington Post and Elizabeth Warren. It shows they're biased against me." No, it shows that you're ignorant, it shows
that you don't know how the algorithm on the site works, that it's an algorithm designed with the whole purpose of creating engagement. Why is it steering Donald Trump to your feed?
or wants to persuade you to love Donald Trump. It thinks you will react to it, you will click
more, you will share more, you will argue with it. And it's a company that makes money off of that engagement, or in turn it's steering you to the other side. So much of the discourse around the bias problem
is actually drawn out of this combination of, one, working the referees, trying to shape the environment so that you get greater control, potentially through the legislative process, and two, drawing upon ignorance. So those arguments get pushed and they work, but if we were ourselves more savvy users, it'd be a little bit the equivalent of, you'll recall, at one point you had a senator saying of the internet, "Well, it's like a bunch of tubes." And we said, "No." And the fact that you think it's like that means maybe you ought not to be the one deciding its future. And it's the same phenomenon; a lot of you
probably saw those first hearings with Mark Zuckerberg and many of the questions that
were asked of him, some of them on the bias question. Most of the legislative interaction with him was embarrassing. Now, my hope is we get that combination of
one, the individuals themselves go, "I can't let this happen. I've got to get smarter about this." And you're seeing that happen in more recent
hearings, and two, it's on us to be the ones that push back and say, "No, this is what I want. I am a voter, or I'm a user of your company. This kind of explanation doesn't work anymore. I actually understand the rules of the game, and this is what I would like you to do to better represent my interests, be it as a citizen or as a user, a customer of your company." So notice, again, that at no point did I say censor people, support one side, or have the government intervene in terms of censoring. You can still maintain freedom of speech and
yet have an internet ecosystem that doesn't reward the forces of toxicity, and those forces are disinformation, foreign influence campaigns, conspiracy theories, hate and extremism. It is not necessary that they find a hospitable environment. We can create an environment that makes it
harder for them to operate and easier for us to defend ourselves. But it all goes back to that notion of layers of government action, corporate action, and our own individual knowledge and sense of responsibility. So again, I really appreciate the opportunity. Great question, and also just to talk to all
of you about it. Thank you. Peter, thank you very much for your contributions
over the past 48 hours, and also Benedetta, Ian, David, and Paul. Your thoughts have been very impactful and significant for us, and will help us develop leaders and respond to some of the things that you addressed. So thank you very much. Have a good evening, everybody.
