It’s time to rethink how we fight election misinformation

Last month, all four major online social platforms — Meta, Twitter, YouTube, and TikTok — released their plans for combating misinformation and disinformation in the weeks leading up to the 2022 US midterms. 

Meta is rolling out voting alerts and real-time fact-checking in both English and Spanish, and, as it did in 2020, it will ban “new political, electoral and social issue ads” during the week leading up to the election. Twitter is focusing on “prebunks,” proactively fact-checking content in users’ feeds based on search terms and hashtags, and will have election-themed Explore pages. YouTube is rolling out information widgets on search pages for candidates. TikTok will continue to enforce its long-standing ban on political advertisements and will offer curated hashtags rich in verified information.

You’d be forgiven, though, if you can’t keep any of this straight in your head anymore, or can’t immediately parse what makes these plans any different from any other election year’s. Researchers and fact-checkers feel the same way.

“I don’t think any platform is in ‘good shape’”

“Unless Facebook has drastically changed the core functions and design of its platform, then I doubt any meaningful changes have happened and ‘policing misinformation’ is still piecemeal, whack-a-mole, and reactive,” Erin Gallagher, a disinformation researcher on the Technology and Social Change research team at the Shorenstein Center, told The Verge.

The world has changed, and the internet’s biggest platforms don’t seem to realize it. If polls are to be believed, 45 percent of Americans — and 70 percent of Republicans — believe some variation of the “Big Lie” that former President Donald Trump won the 2020 election. Candidates affiliated with those conspiracy theories are running in more than 25 states, and the theories themselves are more prevalent than ever. Even though the platforms nominally ban 2020 voter fraud allegations, they are still full of “Big Lie” content. In the battle between platform moderators and conspiracy theories, the conspiracy theories won.

November’s election will be the first time that many Americans will enter the voting booth since last year’s insurrection, which was planned and subsequently livestreamed on many of the same platforms now emphasizing their commitments to democracy. And it’s this new troubling political reality that isn’t reflected in the current policies of the four major platform companies. 

“I hate to be extremely pessimistic but I don’t think any platform is in ‘good shape,’” Gallagher said.

So what comes next? How do you moderate a platform after an insurrection? Are we doomed? The simple answer is that each platform is hitting the limits of its current approach.

The “war room” looked like a room full of computers

Katie Harbath, CEO of Anchor Change and former public policy director for Facebook, said very little of what she’s seen from platforms regarding the US midterms this year feels new. Harbath left Facebook in 2021 and said she’s particularly concerned that none of the Big Tech companies’ election policies mention coordinating across platforms on internet-wide conspiracy theories.

“How does this mis- and disinformation spread amongst all these different apps? How do they interplay with one another?” Harbath told The Verge. “We don’t have enough insight into that because nobody has the ability to really look cross-platform at how actors are going to be exploiting the different loopholes or vulnerabilities that each platform has to make up a sum of a whole.”

The idea of a misinformation war room — a specific place with specific staffers devoted entirely to banning and isolating misinformation — was pioneered by Facebook after the Cambridge Analytica scandal. The twin shocks of Donald Trump’s 2016 victory and the unexpected Brexit vote created a need for a way for internet platforms to show that they were actively safeguarding against those who wanted to manipulate the democratic process around the world. 

Ahead of the 2018 US midterm elections and the Brazilian presidential election, Meta (then Facebook) wanted to shift the narrative and invited journalists to visit a physical war room, which The Associated Press described as “a nerve center the social network has set up to combat fake accounts and bogus news stories ahead of upcoming elections.” From pictures, it looked like a room full of computers with a couple of clocks on the wall showing different time zones. The war room was shut down less than a month later, but it proved to be a great piece of PR for the company: it gave a sense that the company was on the case and made the often mundane task of maintaining a large website feel more personal.

Harbath said that the election war rooms were meant to centralize the company’s rapid response teams and often focused on fairly mundane issues like fixing bugs or quickly taking down attempts at voter suppression. She cited one example of war room content moderation from the 2018 midterms: Trump’s campaign ran an ad about undocumented migrants crossing the border; the company debated the ad internally and eventually decided to block it.

“No platform has been transparent about how much content even gets labeled”

“Let’s say I got a phone call from some presidential candidate’s team because their page had gone down,” she said. “I could immediately flag that for the people in the War Room to instantly triage it there. And then they had systems in place to make sure that they were routing things in the right perspective, stuff like that.”

Much of this triaging took place in public, with journalists and analysts flagging problematic content and moderators responding. In the 2020 election, the platforms finally cracked down on “Stop the Steal” content — more than two months after the results of the election were settled.

Corey Chambliss, a spokesperson for Meta, told The Verge that the company’s 2018 policy of working with “government, cybersecurity, and tech industry partners” during elections still holds for this year’s midterms. Chambliss would not specify which industry peers Meta communicates with but said that its “Election Operations Center” will be in effect ahead of Election Day this year.

In a report published this month about removing coordinated inauthentic activity originating in Russia and China, Facebook said, “To support further research into this and similar cross-internet activities, we are including a list of domains, petitions and Telegram channels that we have assessed to be connected to the operation. We look forward to further discoveries from the research community.”

“There are also just more platforms now.”

There are other reasons to be pessimistic. The current election responses focus on using artificial intelligence and filters to automatically flag misleading content and remove higher-level coordinated disinformation. But if you’re someone who spends 10 hours a day consuming QAnon content in a Facebook Group, you’re probably not going to see a fact-checking widget and suddenly deradicalize. And making things even more frustrating, according to Gallagher, is the fact that there aren’t any actual numbers on how many posts are flagged as misleading or false.

“As far as I know, no platform has been transparent about how much content even gets labeled or what the reach of that labeled content was, or how long did it take to put a label on it, or what was the reach before vs. after it was labeled,” she said.

Also, if you’re someone immersed in these digital alternate realities, in all likelihood, you’re not just using one platform to consume content and network with other users. You’re probably using several at once, none of which have a uniform set of standards and policies. 

“There are also just more platforms now,” said Gallagher, thinking of alternative social media platforms like Rumble, Gettr, Parler, Truth Social, etc. “And TikTok, which is wildly popular.”

The platforms themselves also function differently now. Social media is more than a place to make friends, share life updates, and connect with other communities. It has become a complex, interconnected web of platforms with different algorithms and different incentives, and these sites face more problems than any one company can handle.

“There was a big platform migration that happened both since 2020, and since January 6th.”

Karan Lala, a fellow at the Integrity Institute and a former member of Facebook’s civic integrity team, told The Verge that it’s useful now to focus on how different apps deliver content to users. He breaks them down into two categories: distribution-based and community-based.

“TikTok, apps like Instagram, those are distribution-based apps where the primary mechanism is users consuming content from other users,” Lala said. “Versus Facebook, which has community-based harms. Right?”

This first group of apps, which includes Instagram, TikTok, and others, can pose a major challenge during big news events such as an election. This year’s midterms won’t be the first “TikTok election” in the US in the literal sense, but it will be the first US election where TikTok, not Facebook, is the dominant cultural force in the country. Meta’s flagship platform reported that it lost users for the first time this year, and, per a recent report from TechCrunch, TikTok pushed Facebook’s app out of the Apple App Store’s top 10 this summer.

Brandi Geurkink, senior fellow at Mozilla, says TikTok also has the lowest transparency of any major platform. “It’s harder to scrutinize, from the outside, TikTok than it is some other platforms, even like Facebook — they have more in terms of transparency tools than TikTok,” Geurkink told The Verge.

Geurkink was part of the team at Mozilla that recently published “These Are ‘Not’ Political Ads,” a report that found TikTok’s ban on political ads is extremely easy to bypass and that the platform’s new tool that lets creators pay to promote their content has virtually no moderation, allowing users to easily amplify sponsored political content. TikTok did, however, announce an update this month that blocks politicians and political parties from using the platform’s monetization tools, such as gifting, tipping, and the platform’s Creator Fund. The Verge reached out to TikTok for comment.

“I think what we’ve advocated for, for a long time, is there to basically be external scrutiny into the platforms,” Geurkink said. “Which can be done by external researchers, and TikTok hasn’t really enabled that in terms of transparency. They’ve done a lot less than the other platforms.”

It’s not just a lack of transparency with regard to how the platforms moderate themselves that’s a problem, however. We don’t know how these platforms work together as a network. Though, thanks to Meta’s own Widely Viewed Content Reports, we do have some sense of how linked these different platforms are now. 

The most popular domain on Facebook during the second quarter of 2022 was YouTube.com, which accounted for nearly 170 million views; TikTok.com accounted for 108 million. That rather undercuts the notion that each platform can moderate its content independently. But it’s not just content coming from other big platforms like YouTube and TikTok that creates weird moderation gray areas for a site like Facebook.

“If people genuinely believe a false claim, all they’re going to think is that the social media company is trying to work against what they perceive to be the truth.”

Sara Aniano, a disinformation analyst at the Anti-Defamation League’s Center on Extremism, told The Verge that fringe right-wing websites like Rumble are becoming increasingly influential, with their content shared back onto mainstream platforms like Facebook.

“There was a big platform migration that happened both since 2020, and since January 6th,” Aniano said. “People figured out that they were getting censored and flagged with content warnings on mainstream social media platforms. And maybe they went to places like Telegram or Truth Social or Gab, where they could speak more freely, without consequence.”

Bad actors — the users who aren’t just blindly sharing content they think is true or don’t care enough to personally verify — know that larger mainstream platforms will suspend their accounts or put content warnings on their posts, so they’ve gotten better at moving from platform to platform. And when their posts are flagged as false or misleading, the labels themselves can shape how their followers react.

“If people genuinely believe a false claim, all they’re going to think is that the social media company is trying to work against what they perceive to be the truth,” she said. “And that is kind of the tragic reality of conspiracism, not just leading up to the election, but around everything, around medicine, around doctors, around education, and all the other industries that we’ve been seeing attacked over and over again.”

She said that this year’s Arizona primaries were a good example of how it all works together. A conspiracy theory claimed that the Arizona primary had been rigged via the pens Maricopa County election officials gave voters — a rehash of the #SharpieGate conspiracy theory that first went viral in 2020.

The hashtag #SharpieGate is currently hidden on Facebook, but that hasn’t stopped right-wing publishers from writing about it and sharing their articles on the platform. YouTube’s search results are largely free of conspiracy theory content, though the hashtag isn’t blocked there. It isn’t blocked on Twitter either. TikTok does block it, but users there are still making videos about the conspiracy.

Ivy Choi, YouTube’s policy communications manager, told The Verge that the platform is not blocking #SharpieGate content but is demoting it in search results. “When you look for ‘#Sharpiegate 2.0,’ YouTube systems are making sure authoritative content is at the top,” she said. “And making sure that borderline content is not recommended.”

“I mean, any attempt at mitigation and more stringent content moderation is a good thing,” Aniano said. “I would never say that it’s futile. However, I think it is necessary to acknowledge that the problem, as well as the distrust that has been created in the democratic system since 2020, is deeply systemic. It cannot be solved in a week, it can’t be solved in a year, it may take lifetimes to rebuild this trust.”

