Whistle-Blower to Accuse Facebook of Contributing to January 6 Violence, Memo Says

“We will continue to face scrutiny – some of it fair and some of it unfair,” he said in the memo. “But we should also continue to hold our heads up high.”

Here is Mr. Clegg’s full memo:


You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest it has provoked. This Sunday night, the former employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the segment is likely to assert that we contribute to polarization in the United States, and suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6 in the Capitol.

I know some of you – especially those of you in the US – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

Facebook and Polarization

People are understandably anxious about the divisions in society and looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

The rise of polarization has been the subject of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.

Specifically, we expect the segment to allege that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we have continued to refine and improve it over time, as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you are more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.

Elections and Democracy

There is perhaps no other topic that we have been more vocal about as a company than our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts – identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 220,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called “break glass” measures – and spoke publicly about them – before and after Election Day to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading.

These measures were not without trade-offs – they are blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.

We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them in place for a longer period, through February this year, and others, like not recommending civic, political or new groups, we have decided to retain permanently.

Fighting Hate Groups and Other Dangerous Organizations

I want to be absolutely clear: we work to limit, not expand, hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or groups that incite violence, to organize or conspire on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

We have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies on terrorism and more than 16 million pieces of content violating our policies on organized hate, and we continue to remove praise, support and representation of these groups. Between August last year and January 12 this year, we designated nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

This work will never be finished. There will always be new threats and new problems to address, both in the US and around the world. That’s why we remain vigilant and alert – and will always have to.

That is also why the suggestion sometimes made that the violent insurrection on January 6 would not have occurred if it were not for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and with those in politics and elsewhere who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance last week’s German election – without such violence. We are committed to sharing with law enforcement the material on our services related to these terrible events. But reducing the complex reasons for polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That is why we do the sort of research that has been the subject of these stories in the first place. And we will continue to look for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.
