August 12, 2022
Meta quietly eases election misinformation efforts as US midterm vote approaches: details

Facebook owner Meta is quietly easing some safeguards designed to thwart voting misinformation or foreign interference in US elections as the November midterm voting nears.

It’s a sharp departure from the social media giant’s multibillion-dollar efforts to improve the accuracy of posts about US elections and to regain the trust of lawmakers and the public after the outrage over revelations that the company had exploited people’s data and allowed falsehoods to overrun its site during the 2016 campaign.

The shift is raising alarm about Meta’s priorities and about how some could exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and rile up partisan extremists.

“They’re not talking about it,” said former Facebook policy director Katie Harbath, who is now CEO of tech and policy firm Anchor Change. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They back off, and we don’t know how that’s going to manifest itself on the platform over the medium term.”

Since last year, Meta has shut down an examination of how falsehoods are amplified in political ads on Facebook by indefinitely banning the researchers behind it from the site.

CrowdTangle, the online tool the company offered to hundreds of newsrooms and researchers so they could identify trending posts and misinformation on Facebook and Instagram, now goes down on some days.

Public communication about the company’s response to election misinformation has gone decidedly quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would stifle US election misinformation, prevent foreign adversaries from running ads or posts around voting, and quash divisive hate speech.

Top executives hosted question-and-answer sessions with journalists about the new policies. CEO Mark Zuckerberg wrote Facebook posts promising to remove false voting information and authored opinion articles calling for more regulation to tackle foreign interference in US elections via social media.

But this year Meta has released only a one-page document outlining its plans for the election, even though the potential threats to the vote are clear. Several Republican candidates are pushing false claims about the US election on social media. In addition, Russia and China have continued aggressive social media campaigns aimed at deepening political divisions among American audiences.

Meta says that elections remain a priority and that policies developed in recent years around election misinformation and foreign interference are now hard-wired into the company’s operations.

“With every election, we incorporate what we’ve learned into new processes and set up channels to share information with the government and our industry partners,” said Meta spokesman Tom Reynolds.

He declined to say how many employees would work full time on safeguarding US elections this year.

During the 2018 election cycle, the company offered tours and photos of its election response war room and provided head counts. But The New York Times reported that the number of Meta employees working on this year’s election has been cut from 300 to 60, a figure Meta disputes.

Reynolds said Meta will pull in hundreds of employees from 40 of the company’s other teams to monitor the upcoming vote, alongside an unspecified number of workers on its elections team.

The company is continuing a number of initiatives to limit election misinformation, such as a fact-checking program launched in 2016 that enlists the help of news outlets to verify the veracity of popular falsehoods spreading on Facebook and Instagram. The Associated Press is part of Meta’s fact-checking program.

This month, Meta also rolled out a new feature for political ads that allows the public to find details about how advertisers target people on Facebook and Instagram based on their interests.

Still, Meta has stifled other efforts to identify election misinformation on its sites.

It stopped revamping CrowdTangle, a website it introduced to newsrooms around the world that provided information about trending social media posts. Journalists, fact-checkers and researchers used the website to analyze Facebook content, including finding popular misinformation and who is responsible for it.

That tool is now “dying,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee this spring.

Silverman told the AP that CrowdTangle had been working on upgrades that would make it easier to search the text of internet memes, which are often used to spread half-truths and escape the oversight of fact-checkers.

“There’s no real shortage of ways you can organize this data to make it useful to different parts of the fact-checking community, newsrooms and wider civil society,” Silverman said.

Silverman said that not everyone at Meta agreed with that transparent approach. The company hasn’t rolled out any new updates or features to CrowdTangle in over a year, and the tool has experienced hourslong outages in recent months.

Meta also shut down efforts to investigate how misinformation is spread through political ads.

The company indefinitely revoked Facebook access for a pair of New York University researchers who it said had collected unauthorized data from the platform. NYU professor Laura Edelson said the move came hours after she shared plans with the company to investigate the spread of disinformation on the platform around the January 6, 2021, attack on the US Capitol, which is now the subject of a House investigation.

“When we took a closer look, we found that their systems were dangerous for a lot of their users,” Edelson said.

Privately, former and current Meta employees say that exposing those dangers around US elections has sparked public and political backlash for the company.

Republicans regularly accuse Facebook of unfairly censoring conservatives, some of whom have been removed for breaking the company’s rules. Meanwhile, Democrats regularly complain that the tech company isn’t doing enough to curb misinformation.

“It’s something that’s so politically fraught, they’re more inclined to shy away from it than jump in head first,” said Harbath, the former Facebook policy director. “They just see it as a big old pile of headaches.”

Meanwhile, the threat of regulation in the US no longer looms over the company, as lawmakers have failed to reach a consensus on what oversight the multibillion-dollar company should be subject to.

Freed from that threat, Meta leaders have devoted the company’s time, money and resources to a new project in recent months.

Zuckerberg dove into a massive rebranding and restructuring of Facebook last October, when he changed the company’s name to Meta Platforms. He plans to spend years and billions of dollars developing his social media platforms into a nascent virtual reality construct called the “metaverse,” something like the internet brought to life, rendered in 3D.

His public Facebook page posts now focus on product announcements, praise for artificial intelligence and photos of him enjoying life. News about election preparations is announced in company blog posts not written by him.

In a post last October, after a former Facebook employee leaked internal documents showing how the platform amplifies hate and misinformation, Zuckerberg defended the company. He also reminded his followers that he had pushed Congress to modernize the rules surrounding elections for the digital age.

“I know it’s frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and product,” he wrote on October 5. “But I believe that over the long term, if we keep trying to do what’s right and keep delivering experiences that improve people’s lives, it will be better for our community and our business.”

It was the last time he discussed the election work of the Menlo Park, California-based company in a public Facebook post.

