
Is Facebook really ready for the 2020 election?



Ever since Russian agents and other opportunists abused its platform in an attempt to manipulate the 2016 U.S. presidential election, Facebook has insisted, repeatedly, that it has learned its lesson and is no longer a conduit for disinformation, voter suppression and election disruption.

But it has been a long and challenging journey for the social network. External critics, as well as some of Facebook’s own employees, say the company’s efforts to revise its rules and strengthen its safeguards remain wholly insufficient for the task, despite the billions it has spent on the project. As for why, they point to the company’s persistent reluctance to act decisively for much of that time.

“Am I worried about the election? I’m terrified,” said Roger McNamee, a Silicon Valley venture capitalist and early Facebook investor turned vocal critic. “At the company’s current scale, it’s a clear and present danger to democracy and national security.”

The company’s rhetoric has certainly gotten an update. CEO Mark Zuckerberg now casually references possible outcomes that were unthinkable in 2016, including civil unrest and a contested election that Facebook could easily make worse, as challenges the platform now faces.

“This election will not be business as usual,” Zuckerberg wrote in a September Facebook post in which he highlighted Facebook’s efforts to encourage voting and remove disinformation from its service. “We all have a responsibility to protect our democracy.”

Yet for years, Facebook executives seemed to be caught by surprise each time their platform, created to connect the world, was used for malicious purposes. Zuckerberg has offered multiple apologies over the years, as if no one could have predicted that people would use Facebook to stream homicides and suicides, incite ethnic cleansing, promote false cancer treatments or attempt to steal elections.

While other platforms like Twitter and YouTube have also struggled to tackle misinformation and hateful content, Facebook stands out for its reach and scale and, compared to many other platforms, its slower response to the challenges identified in 2016.

In the aftermath of President Donald Trump’s election, Zuckerberg offered an extraordinarily tone-deaf dismissal of the idea that “fake news” spread on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” He walked back the comment a week later.

Since then, Facebook has issued a stream of mea culpas for its slowness in acting against threats during the 2016 election and has promised to do better. “I don’t think they’ve gotten any better at listening,” said David Kirkpatrick, author of a book on Facebook’s rise. “What has changed is that more people have told them they have to do something.”

The company hired external fact-checkers, added restrictions – then more restrictions – on political ads, and removed thousands of accounts, pages, and groups that it found engaging in coordinated “inauthentic behavior.” This is Facebook’s term for fake accounts and groups that maliciously target political discourse in countries ranging from Albania to Zimbabwe.

It has also begun adding warning labels to posts that contain incorrect voting information, and has at times taken steps to limit the circulation of misleading posts. In recent weeks the platform has also banned posts denying the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the conservative New York Post.

All of this unquestionably puts Facebook in a better position than it was in four years ago. But that doesn’t mean it is fully prepared. Despite toughened rules banning them, violent militias are still using the platform to organize. Recently, that included a thwarted plot to kidnap the governor of Michigan.

In the four years since the last election, Facebook’s earnings and user growth have soared. This year, analysts expect the company to post profits of $23.2 billion on revenue of $80 billion, according to FactSet. It now boasts 2.7 billion users worldwide, up from 1.8 billion at this point in 2016.

Facebook faces a series of government investigations into its size and market power, including an antitrust investigation by the U.S. Federal Trade Commission. An earlier FTC investigation hit Facebook with a $5 billion fine but did not require any additional changes.

“Their number 1 priority is growth, not harm reduction,” Kirkpatrick said. “And that’s unlikely to change.”

Part of the problem: Zuckerberg maintains an iron grip on the company, yet doesn’t take criticism of himself or his creation seriously, charges social media expert Jennifer Grygiel, a communications professor at Syracuse University. But the public knows what’s going on, Grygiel said. “They see COVID misinformation. They see how Donald Trump exploits it. They cannot fail to see it.”

Facebook insists it takes the challenge of disinformation seriously, especially when it comes to elections.

“The election has changed since 2016, and so has Facebook,” the company said in a statement laying out its election and voting policies. “We have more people and better technology to protect our platforms, and we have improved our content and enforcement policies.”

Grygiel says such statements are typical. “This company uses PR instead of an ethical business model,” Grygiel said.

Kirkpatrick notes that board members and executives who opposed the CEO – a group that includes the founders of Instagram and WhatsApp – have left the company.

“He is so certain that Facebook’s overall impact on the world is positive,” and that critics don’t give him enough credit for it, Kirkpatrick said of Zuckerberg. As a result, the Facebook CEO isn’t inclined to take constructive feedback. “He doesn’t have to do anything he doesn’t want to do. He’s out of control,” Kirkpatrick said.

The federal government has so far left Facebook to its own devices, a lack of accountability that has only emboldened the company, according to U.S. Rep. Pramila Jayapal, a Washington Democrat who grilled Zuckerberg during a Capitol Hill hearing in July.

Warning labels are of limited value if the platform’s underlying algorithms are designed to push polarizing material to users, she said. “I think Facebook has done some things that indicate it understands its role. But it was, in my opinion, too little, too late.”

