Thursday, 29 October 2020

UNITED STATES POLLS: MICHIGAN AGAIN AT THE HEART OF MISCHIEF AS ELECTIONS DRAW NEAR

Michigan again a target of election disinformation

Ashley Nerbovig

Detroit Free Press

Supporters hold signs as Republican Presidential candidate Donald Trump speaks to a crowd during a rally at the Macomb Community College Sports & Expo Center in Warren on October 31, 2016.

Disinformation tactics used to mislead voters are continuing and evolving just days before the presidential election.

What role disinformation may play on Election Day in Michigan isn’t yet clear, but, in the four years since the last presidential election, questions still linger about how social media changes what happens in the voting booth. Some strategies from 2016 seem to be continuing, including targeting minority groups with messages meant to discourage them from casting a ballot.

After the 2016 presidential election in Michigan, proof of Russia’s deliberate attempt to mislead U.S. voters through a coordinated disinformation campaign on social media began to emerge. Michigan was one of nine states that two Russian agents visited in June 2014 as part of an intelligence-gathering mission, according to a Senate Intelligence Committee report on Russian interference in the 2016 election.

Misinformation and disinformation both describe false information, but misinformation is false information shared without the intent to cause harm. Disinformation is information someone deliberately creates or shares to inflict damage, such as telling people the wrong date of the election.

Young Mie Kim studied Russian interference in the 2016 presidential election and continues to monitor for Russian-linked accounts during the 2020 presidential election cycle. Kim is a professor at the University of Wisconsin-Madison, where she is part of a research project called Project DATA, or Digital Ad Tracking and Analysis. The project focuses on the 2020 election and tracks digital political ads to learn how parties, organizations and candidates target and speak to potential voters.

One example her team captured was an Instagram account called "Michigan_Black_Community." It posed as an African American group in Michigan focused on racial issues and included anti-Kamala Harris posts. Kim's team captured the account in September 2019, just before the primaries, and Instagram removed it in mid-October in a sweep to rid the platform of false accounts created by Russian actors.


However, despite teams such as Kim's trying to archive accounts and track the actions of foreign and domestic disinformation campaigns, it is difficult to get a clear picture of what is happening in the 2020 election, said Josh Tucker, a New York University professor of politics and co-director of NYU's Center for Social Media and Politics. That type of data isn’t being released in real time, Tucker said.

In the case of Facebook, data about what users saw in 2016 was never released, according to multiple sources the Free Press talked to.

There is more archived data about how human beings interact with politics than ever in the history of scholarship because of all this digital tracing, Tucker said. But it isn’t public.

“All this data that we need to try to understand how the political world is being transformed by social media is actually owned by the companies whose impact we’re trying to understand,” Tucker said. “And they have a say on who gets access to this and how you get access to this.”

Testimony before Congress from tech company executives has given the public some insight into how the platforms' news feeds are formed.

During a Senate hearing on Oct. 28, U.S. Sen. Gary Peters, D-Mich., asked Facebook CEO Mark Zuckerberg what his company was doing to prevent extremist groups from growing on the platform.

“We’ve taken a number of steps here including disqualifying groups from being included in our recommendation system at all if they routinely are being used to share misinformation or if they have content violations or a number of other criteria,” Zuckerberg said.

Democratic presidential candidate Hillary Clinton talks to a crowd of over 4,100 people inside Shed 3 at Eastern Market in Detroit, Michigan on November 4, 2016, four days before the United States presidential election.

Michigan on Russia’s radar

According to Special Counsel Robert Mueller's report on the investigation into Russian interference in the 2016 presidential election, President Donald Trump's 2016 campaign chairman, Paul Manafort, told a Russian intelligence officer about his plans to win the election, naming four key "battleground" states, based on testimony from Rick Gates, Manafort's former business partner.

That list included Michigan, as well as Wisconsin, Pennsylvania and Minnesota. Detroit was mentioned once, on page 34 of the report, in a Nov. 7, 2016, tweet in which a troll referred to the city: "Detroit residents speak out against the failed policies of Obama, Hillary & democrats."

Facebook told the Senate Intelligence Committee in 2017 that in the two years leading up to the 2016 election, ads bought by the Kremlin-linked Internet Research Agency reached as many as 126 million users. In January 2018, Twitter announced that approximately 1.4 million users had some type of interaction with an IRA-controlled account.

This doesn't tell the whole story, though, said Kim, the professor who found the "Michigan_Black_Community" Instagram account. Once an IRA ad was shared by another user, Facebook considered that share to be "organic." Facebook did not release data about where the IRA content went after it stopped being a paid advertisement and became something shared by other Facebook users.

When U.S. intelligence officials warned the House Intelligence Committee that Russia was again ramping up efforts to meddle in the 2020 election, Kim wrote a piece for the Brennan Center for Justice about how much more brazen Russian accounts became after 2016.

In addition to emphasizing wedge issues, such as race and gun control, Kim found the accounts run by the IRA appeared to be targeting battleground states, including Michigan, by putting the state directly into the account's user name.

These accounts also no longer appear to be buying ads; rather, they're now sharing domestic content. This blurs the line between foreign and domestic actors enough to sometimes evade the safety protocols Facebook and other platforms put into place.

Last week, Director of National Intelligence John Ratcliffe announced Iran was behind emails sent to people in Pennsylvania and Ohio threatening that they would be attacked if they went to the polls. The people behind the intimidation campaign used a domain that made the emails appear to come from the far-right authoritarian group Proud Boys. No officials have reported Michigan voters receiving similar emails.

In this screen grab from video provided by the 36th District Court in Detroit, Jacob Wohl, left, and Jack Burkman, shown in the center left photo, are seen during an arraignment being conducted over Zoom on Oct. 8, 2020, in Detroit. Burkman and Wohl were arraigned on voter intimidation charges, Michigan Attorney General Dana Nessel announced.

A new election playbook

It isn’t just foreign actors who are trying to confuse or mislead voters. Earlier this month, Jacob Wohl, a 22-year-old Los Angeles resident, and Jack Burkman, a 54-year-old resident of Arlington, Virginia, were charged with several felonies related to voter intimidation and election fraud. The pair, who pleaded not guilty, are accused of orchestrating a series of false robocalls telling Michigan voters that law enforcement and debt collectors would use the personal information provided by those who cast absentee ballots.

Partisan news sites have also sprung up, with innocuous names an unwitting reader could mistake for those of ordinary local newspapers. At least 34 such sites have names linked to Michigan areas.

Sites not always mentioned in conversations about social media have also come under scrutiny, such as Spotify, which on Oct. 19 removed four QAnon conspiracy podcasts after Media Matters for America noted the movement was being left to thrive on the platform, according to Business Insider.

The legacy of the 2016 Russian interference in U.S. elections is not that Russia created a blueprint for foreign interference, Tucker said. Rather, it is that Russia created a toolkit anyone can use to run coordinated campaigns online.

“Who's got the biggest stake in the outcome of elections?” Tucker asked. “It's the domestic actors in those countries."

Facebook CEO Mark Zuckerberg appears on a screen as he speaks remotely during a hearing before the Senate Commerce Committee on Capitol Hill on Oct. 28, 2020, in Washington. The committee summoned the CEOs of Twitter, Facebook and Google to testify during the hearing.

Platforms on guard

In 2016, the platforms appeared to have been caught unaware by both the rapid rise of political disinformation and the foreign influence campaigns, Tucker said.

They’ve had four years to prepare and have made some changes, Tucker said, including monitoring for fake accounts and deplatforming (removing) pages that spread false information. Both Twitter and Facebook have added voting information links to content related to the U.S. elections. Facebook restricted new political and social justice advertising starting Oct. 27. Ad campaigns submitted before the deadline continue to run, but changes to the content of those ads are limited, and new campaigns are not allowed until Nov. 5, after the election. The platform will host no political advertising on Nov. 4.

Twitter added a new prompt that pops up when users try to retweet a link they haven't recently opened. The prompt tells the user, "Headlines don't tell the full story," and gives them the option to read the story on Twitter before retweeting. The new feature was meant to "give you better context and to reduce the unintentional spread of misleading information on Twitter."

“Now, in the aftermath of these elections, will we see if they were more effective, I don’t know,” Tucker said. “And I don’t know about both of those things. I don’t know how much we’ll be able to see at the end of the day,  and I’m also not sure what we’ll learn.”

Facebook also announced in August that it would partner with independent researchers to study whether the platform changed the outcome of the 2020 election. Dartmouth government professor Brendan Nyhan is one of the researchers participating in that effort.

When Nyhan, who previously taught at the University of Michigan, studied how people's beliefs changed after exposure to misleading and false information, he found the articles didn't change people's minds but did appear to increase partisanship. More studies need to be done, he said, but he believes people should be more thoughtful about how they talk about disinformation.

"It doesn't mean this stuff doesn't matter, but we need to be more specific in thinking about how it might matter and for whom," Nyhah said. 

Fadi Quran is the disinformation lead for Avaaz, an online activist organization with more than 65 million members worldwide that focuses on issues such as climate change and disinformation. The social media platforms are the ones that can tell who has interacted with disinformation and how their behavior changed, and they don't do so, Quran said. He was skeptical that the research being done by Nyhan and others would yield the type of transparency needed to answer questions about the effects of disinformation.

"In many cases, Facebook promises, gets all the positive PR and then pulls back," Quran said. "We saw that with NYU recently."

Quran was referring to Facebook demanding an end to the collection of data by New York University researchers whose tool, AdObserver, shows who is being micro-targeted by political ads on the platform. Users install the AdObserver browser extension and allow it to capture the ads they are served on Facebook.

The AdObservatory is part of NYU's Online Political Transparency Project. In a Medium post about the features of the AdObserver tool, NYU explained how political campaigns can upload to Facebook a spreadsheet of people’s names, phone numbers or email addresses to target them with advertising.
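To illustrate the mechanics of this kind of list-based targeting, here is a minimal Python sketch of how such a contact list is typically prepared for upload. The file name and column name are hypothetical; the normalize-then-hash step reflects the general practice, since ad platforms such as Facebook expect identifiers like email addresses to be lowercased and SHA-256 hashed rather than sent as plain text.

    # Minimal illustrative sketch, not any campaign's actual code.
    # "contacts.csv" and its "email" column are hypothetical.
    import csv
    import hashlib

    def normalize_and_hash(email):
        # Lowercase and trim the address, then hash it, so the raw
        # email is never transmitted directly to the ad platform.
        return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

    with open("contacts.csv", newline="") as f:
        hashed = [normalize_and_hash(row["email"]) for row in csv.DictReader(f)]

    # "hashed" is what would be uploaded; the platform matches the hashes
    # against its own user records to build the targetable audience.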

For instance, the AdObservatory shows that in Michigan, Sen. Peters uploaded an email list from Anne Lewis Consulting LLC, a media strategy firm used by political action committees such as the AFL-CIO Committee on Political Education Treasury Fund. Peters also targeted people in Ann Arbor and Marysville.

Peters' opponent, Republican businessman John James, uploaded his own list to Facebook. He also targeted people with homes in Michigan and who liked Ben Shapiro, Glenn Beck, Rush Limbaugh and/or Tucker Carlson.

In total, Peters and his associated pages so far have spent about $1.4 million on Facebook advertising, compared with James, who has spent about $165,000, according to AdObserver's data. 

In 2016, foreign actors had access to the same tools Peters and James use now. 

Quran pointed to the billions spent on Facebook advertisements as one reason he doubts the notion that the platform doesn't have the power to change elections.

"The truth is if you're a marketer and you've used Facebook to sell your products, you do know how much influence the platform can have in terms of its reach on people," Quran said.

Even if disinformation campaigns are only 3% or 5% effective, Quran said, that could sway a swing state.
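A rough back-of-envelope calculation shows why. Trump carried Michigan in 2016 by 10,704 votes, the state's actual margin; the reach figure below is an assumption for illustration only.

    # Back-of-envelope sketch of Quran's point, not a measured result.
    margin_2016 = 10_704        # actual votes that decided Michigan in 2016
    assumed_reach = 1_000_000   # hypothetical number of voters reached
    effectiveness = 0.03        # assume 3% of those reached are swayed

    swayed = int(assumed_reach * effectiveness)   # 30,000 voters
    print(swayed, "swayed vs. a margin of", margin_2016)
    # 30,000 is nearly three times the margin that decided the state.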

Country divides

U.S. Rep. Elissa Slotkin, a Democrat from Holly, said that as acting assistant secretary of defense from 2015 to 2017 she saw what an effective tool disinformation can be for dividing countries. She watched Russia use disinformation campaigns in former Soviet states in Eastern Europe.

Russia’s strategy was to turn people within a country against one another, Slotkin said. Part of the reason disinformation is so effective is that it capitalizes on real issues, such as what to do about a global pandemic and how to address racial divides, she said. 

“What they’re doing is taking those differences and amplifying them and making people feel like they have to take sides,” Slotkin said. 

While speaking to a group of about 40 Michigan voters recently, Slotkin asked how many had lost a friend or family member over politics this year. Almost everyone in the group raised their hand, she said. People can’t stand the tension and they just want this election to be done, she said. 

A limited, but dangerous threat

While disinformation undermines some of the fundamental tenets of democracy, such as informed voting, Tucker emphasized how small a share of overall media Russian trolls accounted for in 2016.

“We always want to be careful about overstating the influence of a small number of tweets during the course of the election,” Tucker said. 

Disinformation might have affected some individuals in 2016, but the idea that it affected a large number of people seems highly doubtful, Tucker said. Overstating how much false information was circulating can leave people feeling helpless, when researchers know the problem wasn't so large that disinformation drowned out legitimate news in 2016, he said.

Three of Facebook's own security researchers wrote a report in April 2017 about information operations on the platform and said, "The reach of the content shared by false amplifiers was marginal compared to the overall volume of civic content shared during the US election."

Looking at all the content created in the run-up to 2016, Kim said, the idea that other content drowned out the disinformation campaigns makes sense. However, the strategy appeared to focus on minority groups in certain states, meaning those groups might still have been inundated with disinformation.

To report misinformation

In anticipation of another round of deceptive messaging in Michigan, Secretary of State Jocelyn Benson created an email address where voters can report misinformation to the Secretary of State's Office. Depending on the reach and content of the information, it can be corrected on the office's fact-check page or escalated to the Attorney General's Office. Attempts to misinform the public in 2016 included claims that mailed-in ballots wouldn't count, that certain polling places would be closed and that there were long lines to vote, Benson said.

“A lot of the nefarious things would say, ‘Election Day’s been moved,’” Benson said. 

Minority groups in swing states are often the primary targets of this sort of messaging, Kim said. Those voters, especially, should be wary as Election Day approaches, she said.
