To Facebook executives in Washington, the post did not appear to violate the company’s policies, which allow leaders to post about the use of force by the government if the message is intended to warn the public, but it came right up to the line. Company officials had already contacted the White House earlier in the day with an urgent appeal to change the language of the post or simply delete it, the people said.
In the end, Trump posted again, saying his comments were meant as a warning after all. Zuckerberg then went online to explain his rationale for keeping the post up, noting that Trump’s follow-up had framed the original remarks as a warning.
The frantic push and pull was just the latest episode in a five-year struggle by Facebook to adapt to Trump’s boundary-pushing style. The president has not changed his rhetoric since he was a candidate, but the company has repeatedly changed its policies and products in ways that have helped it survive his presidency.
Facebook has scaled back its efforts against false and misleading news, adopted a policy that explicitly allows politicians to lie, and even altered its news feed algorithm to neutralize claims that it was biased against conservative publishers, according to more than a dozen former and current employees and previously unreported documents obtained by The Washington Post. One of the documents shows the pattern began in 2015, when Trump, then a candidate, posted a video calling for a ban on Muslims entering the United States. Facebook executives declined to remove it, setting in motion an exception for political speech.
The concessions to Trump have helped transform the global information battlefield. They paved the way for a growing list of digitally savvy politicians to repeatedly push disinformation and incendiary political language to billions of people. They have complicated the public’s understanding of major events such as the pandemic and the protest movement, and contributed to polarization.
And as Trump rose to power, fear of his anger pushed Facebook into more deferential behavior toward its growing number of right-leaning users, tilting the balance of the news people see on the network, according to the current and former employees.
Facebook is also facing a slow-burning morale crisis, with more than 5,000 employees denouncing the company’s decision to leave up Trump’s post saying, “when the looting starts, the shooting starts.”
Bowing to this pressure on Friday, Zuckerberg announced a series of new policies aimed at better policing content on the site. These include labels on posts that violate hate-speech or other policies, including posts from political leaders.
But the company said the post wouldn’t qualify.
As the United States heads into another presidential election while facing a pandemic and civil unrest, the latitude given to Trump could offer him a potential advantage. In recent months, he has used Facebook and other platforms to spread misleading information about coronavirus cures, electoral fraud and demonstrators’ motivations, often blaming a left-wing movement for violence without citing evidence.
It also puts Facebook in increasing conflict with its Silicon Valley counterparts. Twitter has labeled several of the president’s tweets as offensive and misleading, and the social media platform Snapchat has reduced the reach of the president’s account.
“The value of being in favor with people in power outweighs almost every other concern for Facebook,” said David Thiel, a Facebook security engineer who resigned in March after his colleagues refused to remove a post he believed constituted “dehumanizing speech” by Brazil’s president.
Facebook argues that incendiary populist language predates social media. Nick Clegg, Facebook’s vice president of global affairs and communications, said in a statement that populism was not invented in Silicon Valley, pointing to centuries of political history before social media companies existed.
“From the Arab Spring to local candidates challenging established incumbents, social media has also helped open up politics, not favored one side over the other,” Clegg added. “Studies have shown that the drivers of populism are complex and cannot be reduced to the use of social media. In fact, political polarization has fallen in many countries with high Internet use.”
Facebook declined to make Zuckerberg available for an interview, though the company stressed that Zuckerberg spoke out against Trump’s ban on Muslim immigration when it went into effect. The White House declined to comment.
Zuckerberg often talks about making choices that stand the test of time, preserving the values of Facebook and its WhatsApp and Instagram subsidiaries for all of its nearly 3 billion monthly users for years to come, even when such decisions are unpopular or controversial.
At times, however, he has wanted to take a different approach with Trump.
Laying the foundations
Before the 2016 election, the company largely saw its role in politics as courting political leaders to buy ads and share their views, according to people familiar with the company’s thinking.
But that began to change in 2015, as Trump’s candidacy picked up speed. In December of that year, he posted a video in which he said he wanted to ban all Muslims from entering the United States. The video went viral on Facebook and was an early indication of the tone of his campaign.
Outrage over the video led to a companywide town hall, where employees denounced it as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they believed the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. Zuckerberg said in those meetings that he was personally disgusted by the video and wanted it removed, the people said. Some of these details have been previously reported.
During one of the meetings, Monika Bickert, Facebook’s vice president for policy, drafted a document about the video and shared it with leaders including Sheryl Sandberg, Zuckerberg’s top deputy and the company’s chief operating officer, and vice president of global public policy Joel Kaplan, the company’s most prominent Republican.
The document, which was obtained by The Post and has not been previously reported, weighed four options. They included removing the post for hate speech violations, making a one-time exception for it, creating a broad exemption for political speech, and even weakening the company’s community guidelines for everyone, allowing comments such as “Blacks aren’t allowed” and “Get gays out of San Francisco.”
Facebook spokesman Tucker Bounds said the latter option was never seriously considered.
The document also listed the possible “PR risks” of each option. For example, lowering the standards across the board would raise questions such as “would Facebook give a platform to Hitler?” Bickert wrote. A carve-out for political speech, on the other hand, risked opening the door to even more hateful comments.
In the end, Zuckerberg was talked out of his desire to remove the post, in part by Kaplan, according to the people. Instead, executives created an allowance under which newsworthy political speech would be taken into account when deciding whether posts violated community guidelines.
That allowance was not formally written into the policies, though it informed ad hoc decision-making about political speech for the next several years, according to the people. When a formal newsworthiness policy was announced in October 2016, in a blog post by Kaplan, the company did not discuss Trump’s role in shaping it.
In an interview, Bickert said the company ultimately made the call to keep the Muslim ban video up because executives interpreted Trump’s comments to mean the then-candidate was not talking about all Muslims, but rather advocating a policy position on immigration as part of a newsworthy political debate. She said she did not recall the document laying out the options.
Bounds added that the newsworthiness policy was adopted in 2016 after content reviewers removed a photo of a naked girl fleeing a napalm attack during the Vietnam War. “Our goal was to recognize the public value of preserving content that in other contexts would not be allowed,” Bounds said. “In the case of elected officials, it also ensures they can be held accountable for their words,” so that people can judge for themselves.
In the spring of 2016, Zuckerberg was also talked out of his desire to write a post specifically condemning Trump for his calls to build a wall between the United States and Mexico, after advisers in Washington warned it might look like taking sides, according to Dex Torricke-Barton, one of Zuckerberg’s former speechwriters.
The carve-out for political speech laid the groundwork for how the company would handle not only Trump but populist leaders around the world who have posted content that tests those boundaries, such as Rodrigo Duterte in the Philippines, Jair Bolsonaro in Brazil and Narendra Modi in India.
“Though [Facebook] has cracked down on disinformation, the most problematic influencers are politicians,” said Claire Wardle, U.S. director of First Draft, an organization dedicated to fighting disinformation that has a fact-checking partnership with Facebook. “You can do all the fact-checking in the world, but these influencers have a disproportionate impact.”
Trump presented a unique challenge, she added. “Until then, nobody had contemplated a president who would say those things.”
Protecting the right
After the election, it became clear that Russia had used social media to sow disinformation. Facebook soon became a frequent target of the president’s anger. He tweeted that the social media giant was “anti-Trump” and was trying to undermine his victory.

At the same time, GOP leaders intensified their criticism that platforms like Facebook and Twitter, with their liberal-leaning leadership ranks, were trying to limit the reach of right-leaning voices.
“There is no credible research to support Trump’s claim that social media platforms suppress conservative content, but he still managed to get them to revise their rules for him,” said former Facebook spokesman Nu Wexler, who left the company in 2018.
As Facebook struggled to deal with foreign interference and misinformation, its executives in the nation’s capital argued that caution and deference were needed to survive the new political environment, according to three people familiar with the company’s thinking.
In December 2016, Facebook’s security engineers presented the findings of a broad internal investigation, known as Project P, to top executives, examining how false and misleading news had spread so virally during the election. When the security team flagged dozens of pages that had peddled fake news, senior Washington executives, including Kaplan, objected to shutting them down immediately, arguing that doing so would disproportionately affect conservatives, according to people familiar with the company’s thinking. Eventually, the company shut down far fewer pages than originally proposed as it began developing a policy to address the problem.
A year later, Facebook considered overhauling its scrolling news feed, the homepage screen most users see when they open the site. As part of a change meant to help limit misinformation, it modified its news feed algorithm to prioritize posts from friends and family over those from publishers.
In meetings about the change, Kaplan asked whether the revamped algorithm would hurt right-leaning publishers more than others, according to three people familiar with the company’s thinking who spoke on the condition of anonymity for fear of retribution. When the data showed it would (conservative-leaning outlets were pushing more content that violated its policies, the company had found), he successfully pushed for changes to make the new algorithm’s impact what he considered more balanced, the people said.
Isolated and divided
With the 2020 election on the horizon, Facebook and Zuckerberg’s hands-off approach to political speech was leaving him increasingly isolated in Silicon Valley.
That summer, the company’s leaders gathered to revisit the newsworthiness exemption, which until then had been applied case by case, with the most controversial calls made by Zuckerberg. Internally, some were unclear about how far that latitude extended, according to two people.
Clegg, the company’s new head of global affairs and communications and a former British deputy prime minister, announced the results of that meeting in a speech in Washington in September 2019. Except for speech that causes real-world violence or harm, Facebook would allow politicians to express themselves on social media virtually unchecked. Facebook’s network of independent fact-checkers, established as a key part of the company’s response to disinformation, would not assess their claims, and the community guidelines would not apply to politicians.
Facebook did not want to be an arbiter of truth in political debate, he said, echoing Zuckerberg’s long-standing position.
The speech angered some employees, prompting more than 250 of them to sign a petition disagreeing with the decision, which they believed would give politicians a pass.
A former executive, Yael Eisenstat, who had worked on the integrity of the company’s political advertising, wrote in The Post that the controversy was “the biggest test of whether [Facebook] will ever truly put society and democracy ahead of profit and ideology.”
She said she routinely saw how the company’s integrity efforts were undermined by “the few voices who ultimately decided the company’s overall direction.”
Meanwhile, in October, as Facebook faced mounting regulatory and political problems, Zuckerberg and his wife, Priscilla Chan, went to the White House for a private dinner with Trump, part of the chief executive’s effort to cultivate personal relationships in Washington.
As the pandemic and civil unrest dominated the first half of this year, Trump continued to turn to social media platforms to spread disinformation. He touted the unproven drug hydroxychloroquine as a possible coronavirus cure and claimed without evidence that the leftist antifa movement was behind the violence at the George Floyd protests.
Meanwhile, Facebook employees began challenging the company’s decisions.
Two months before Trump’s “looting, shooting” post, Brazil’s president posted about the country’s indigenous population, saying, “The Indians are undoubtedly changing. They are increasingly becoming human beings just like us.”
Thiel, the security engineer, and other employees argued internally that the post violated the company’s guidelines against “dehumanizing speech.” They pointed to Zuckerberg’s own words in testimony before Congress in October, in which he said that dehumanizing speech “is the first step toward inciting violence.” In internal correspondence, Thiel was told the post did not qualify as racism, and may even have been a positive reference to integration.
In May, after years of internal debate, Twitter chose to go in the opposite direction. It marked two of Trump’s misleading tweets about mail-in ballots with fact-checking labels.
Trump responded two days later with an executive order that could hurt social media companies by stripping a key legal exemption that limits their liability for content posted on their sites.
The next day, Trump tweeted about the Minnesota protests. Twitter quickly labeled the tweet as violating its rules against glorifying violence, and Snapchat stopped promoting Trump’s account the following week. YouTube told The Post that it holds politicians to the same standards as everyone else.
Facebook, by contrast, chose to negotiate with the White House, asking for the post to be deleted or changed, the people said. Axios first reported the call, which Bounds confirmed to The Post.
As employees raged on internal message boards and externally on Twitter, Zuckerberg told workers that Facebook’s policies might change again in light of Trump’s post. The company had rules allowing discussion of “the use of force by the state,” he said, but they were vague and did not account for the possibility that such statements could signal harm. Bickert’s team scheduled a series of policy meetings for the coming weeks.
In June, Facebook removed a series of Trump campaign ads featuring Nazi symbolism, after an initial internal assessment found that the ads did not violate the company’s policies, according to documents viewed by The Post. In meetings, senior executives argued that not removing them would be perceived as too deferential to the president, according to a person familiar with the discussions.
Last week, an advertiser boycott picked up momentum. Hershey, Verizon, Unilever, Coca-Cola and others said they would temporarily pause their advertising.
On Friday, Zuckerberg told employees at a live-streamed town hall that he was changing the company’s policy to label problematic but newsworthy content that violates the company’s policies, as Twitter does, a significant concession to the growing wave of criticism. He also stated in more explicit language that the company would remove posts from politicians that incite violence or suppress voting. Civil rights leaders, however, said his changes did not go far enough.
“There are no exceptions for politicians in any of the policies I am announcing today,” said Zuckerberg.