WASHINGTON – Today, U.S. Sens. Mark R. Warner (D-VA), Mazie K. Hirono (D-HI) and Bob Menendez (D-NJ) pressed Facebook regarding its failure to prevent the propagation of white supremacist groups online and its role in providing such groups with the organizational infrastructure and reach needed to expand. In a letter to CEO Mark Zuckerberg, the Senators criticized Facebook for being unable or unwilling to enforce its own Community Standards and purge white supremacist and other violent extremist content from the site. They also called on Zuckerberg to answer a series of questions regarding Facebook’s policies and procedures against hate speech, violence, white supremacy and the amplification of extremist content.
“The United States is going through a long-overdue examination of the systemic racism prevalent in our society. Americans of all races, ages, and backgrounds have bravely taken to the streets to demand equal justice for all,” wrote the Senators. “While Facebook has attempted to publicly align itself with this movement, its failure to address the hate spreading on its platform reveals significant gaps between Facebook’s professed commitment to racial justice and the company’s actions and business interests.”
Citing findings from the Tech Transparency Project and numerous other investigative reports, the Senators highlighted ways in which right-wing extremist groups have used Facebook as a recruitment and organizational tool. They also underscored Facebook’s own contributions to the public safety problem, which include auto-generating pages for white supremacist organizations, promoting white supremacist pages and even directing users who visit these groups to other extremist or far-right content.
“This evidence stands in marked contrast to Facebook’s professed commitment to combat extremism by redirecting users who search for terms associated with white supremacy or hate groups to the page for ‘Life After Hate,’ an organization that promotes tolerance,” the Senators added. “The Tech Transparency Project found that Facebook directed users to the ‘Life After Hate’ page in only six percent of the searches for white supremacist organizations.”
Sens. Warner, Hirono and Menendez also cited a number of instances where online radicalization facilitated by Facebook led to real-life consequences, such as when three members of a “boogaloo” group on Facebook plotted to bring Molotov cocktails to a Black Lives Matter protest, or when Air Force Staff Sergeant Steven Carrillo used Facebook to talk about committing violent acts and to meet the individual who eventually drove his getaway van after Carrillo shot and killed a federal security officer.
In the letter, the Senators requested answers to the following questions by July 10th:
- Does Facebook affirm its policy against hate speech and will it seriously enforce this policy?
- What procedures has Facebook put in place to identify and remove hate speech from its platform? To what degree do these procedures differ with respect to public Facebook pages and private groups?
- Does Facebook affirm its policy against violence and incitement and will it seriously enforce this policy?
- What procedures has Facebook put in place to identify and remove violence and incitement from its platform? To what degree do these procedures differ with respect to public Facebook pages and private groups?
- Does Facebook affirm its commitment to ban “praise, support and representation of white nationalism and white separatism on Facebook and Instagram” as detailed in the company’s March 27, 2019 post and will it seriously enforce this commitment?
- What steps has Facebook implemented since announcing this policy to remove “praise, support and representation of white nationalism and white separatism on Facebook and Instagram?”
- Please provide our offices with any Facebook internal research concerning the platform’s amplification of extremist groups.
- How often are you personally briefed on the status of domestic extremist and white supremacist groups on Facebook and the platform’s efforts to address these groups?
- Who is the senior-most Facebook official responsible for addressing white supremacist groups’ activity on Facebook and which Facebook executive does this employee report directly to?
- What role did Vice President of Global Public Policy Joel Kaplan play in Facebook’s decision to shut down and de-prioritize internal efforts to contain extremist and hyperpolarizing activity on Facebook?
- What role did Mr. Kaplan play in the participation of the Daily Caller, an outlet with longstanding ties to white nationalist groups, in Facebook’s fact-checking program?
- When violent extremist groups actively and openly use a platform’s tools to coordinate violence, should federal law continue to protect the platform from civil liability for its role in facilitating that activity?
A copy of the letter is available here and below.
Dear Mr. Zuckerberg:
We write to express our serious concerns about Facebook’s lack of action to prevent white supremacist groups from using the platform as a recruitment and organizational tool. The United States is going through a long-overdue examination of the systemic racism prevalent in our society. Americans of all races, ages, and backgrounds have bravely taken to the streets to demand equal justice for all. While Facebook has attempted to publicly align itself with this movement,[1] its failure to address the hate spreading on its platform reveals significant gaps between Facebook’s professed commitment to racial justice and the company’s actions and business interests.
On April 22, a full month before Americans started recent protests for racial justice, the Tech Transparency Project issued a report detailing the ways right-wing extremist groups were using Facebook to plan a militant uprising in the United States in response to stay-at-home orders issued to cope with the coronavirus pandemic.[2] The organization’s research uncovered “125 Facebook groups devoted to the ‘boogaloo,’” a term with ties to white supremacist movements used to describe a coming civil war.[3] Many of the groups’ posts were explicit in their calls for violence, including discussions of “tactical strategies, combat medicine, and various types of weapons, including how to develop explosives and the merits of using flame throwers.”[4] The groups experienced unchecked growth in the months leading up to the report and remained on Facebook at least as of early June,[5] despite Facebook’s prior claims that it was “studying trends around [boogaloo] and related terms on Facebook and Instagram” and that it “do[es]n’t allow speech used to incite hate or violence, and will remove any content that violates our policies.”[6]
A subsequent report issued on May 21 provided further detail regarding the extent of Facebook’s white-supremacist problem—and Facebook’s lack of attention to this public safety problem.[7] The Tech Transparency Project found that 113 of the 221 white supremacist organizations designated as hate groups by the Southern Poverty Law Center and the Anti-Defamation League—a staggering 51%—have a presence on Facebook.[8] Many of the organizations’ pages were actually auto-generated by Facebook after a Facebook user identified a white supremacist or neo-Nazi organization as his or her employer.[9] Perhaps more troubling, Facebook actively promoted these and other white supremacist sites. According to the Tech Transparency Project, “Facebook’s ‘Related Pages’ feature often directed users visiting white supremacist Pages to other extremist or far-right content, raising concerns that the platform is contributing to radicalization.”[10]
The Tech Transparency Project report echoes similar findings by the Southern Poverty Law Center (which has also tracked how these groups spread dangerous misinformation about COVID-19 on Facebook),[11] along with several investigative news reports.[12] One investigative report even concluded that Facebook served as a key tool for right-wing militia groups to recruit police officers to their movements.[13] Facebook is hardly a passive actor in this context: a recent exposé by The Wall Street Journal revealed that Facebook’s own researchers had found that “64% of all extremist group joins are due to our recommendation tools.”[14] The report concluded that Facebook senior executives shut down efforts to reform the platform’s tendency to amplify hyperpolarized and extremist content after Vice President of Global Public Policy Joel Kaplan deemed the efforts “paternalistic.”[15]
This evidence stands in marked contrast to Facebook’s professed commitment to combat extremism by redirecting users who search for terms associated with white supremacy or hate groups to the page for “Life After Hate,” an organization that promotes tolerance.[16] The Tech Transparency Project found that Facebook directed users to the “Life After Hate” page in only six percent of the searches for white supremacist organizations.[17]
Unfortunately, the online radicalization facilitated by Facebook can lead to deadly consequences. On June 16, federal authorities charged Air Force Staff Sergeant Steven Carrillo with the June shooting death of a federal security officer outside a courthouse in Oakland.[18] Authorities also charged the driver of the getaway van, Robert Alvin Justus.[19] Justus and Carrillo had met on Facebook.[20] According to the criminal complaint against Carrillo, a search of Carrillo’s Facebook account revealed not only communications with Justus, but instances where Carrillo expressed his intention to commit violent acts.[21]
In another instance in early June, federal authorities arrested three men on charges that they planned to bring Molotov cocktails to a Black Lives Matter protest.[22] All three were members of a boogaloo group on Facebook.[23] According to the Tech Transparency Project, one of the men arrested was a member of two private boogaloo groups identified in Tech Transparency Project’s April 22 report.[24] Following reporting in the Huffington Post and other media outlets, a Facebook representative told Huffington Post on April 23, “[w]e’ve removed groups and Pages who’ve used [boogaloo] and related terms for violating our policies.”[25] Yet, according to the complaint, the three men used a different online group—a Nevada boogaloo Facebook group—to facilitate organizing a planned attack on the march.[26]
The prevalence of white supremacist and other extremist content on Facebook—and the ways in which these groups have been able to use the platform as organizing infrastructure—is unacceptable. Facebook’s Community Standards expressly state: “We do not allow hate speech on Facebook.”[27] In a March 27, 2019 post, Facebook made clear that this prohibition “has always included white supremacy.”[28] At that same time, Facebook expanded its prohibition to include “praise, support and representation of white nationalism and white separatism.”[29] And the Community Standards purport to prohibit “organizations and individuals that proclaim a violent mission,” including “organized hate” groups.[30]
In light of these clear policies—and others against “Violence and Incitement” and “Dangerous Individuals and Organizations”—we are concerned Facebook is unable (or unwilling) to enforce its own Community Standards[31] and rid itself of white supremacist and other extremist content.
We request that you answer the following questions by July 10, 2020:
- Does Facebook affirm its policy against hate speech and will it seriously enforce this policy?[32]
- What procedures has Facebook put in place to identify and remove hate speech from its platform? To what degree do these procedures differ with respect to public Facebook pages and private groups?
- Does Facebook affirm its policy against violence and incitement and will it seriously enforce this policy?[33]
- What procedures has Facebook put in place to identify and remove violence and incitement from its platform? To what degree do these procedures differ with respect to public Facebook pages and private groups?
- Does Facebook affirm its commitment to ban “praise, support and representation of white nationalism and white separatism on Facebook and Instagram” as detailed in the company’s March 27, 2019 post[34] and will it seriously enforce this commitment?
- What steps has Facebook implemented since announcing this policy to remove “praise, support and representation of white nationalism and white separatism on Facebook and Instagram?”
- Please provide our offices with any Facebook internal research concerning the platform’s amplification of extremist groups.
- How often are you personally briefed on the status of domestic extremist and white supremacist groups on Facebook and the platform’s efforts to address these groups?
- Who is the senior-most Facebook official responsible for addressing white supremacist groups’ activity on Facebook and which Facebook executive does this employee report directly to?
- What role did Vice President of Global Public Policy Joel Kaplan play in Facebook’s decision to shut down and de-prioritize internal efforts to contain extremist and hyperpolarizing activity on Facebook?[35]
- What role did Mr. Kaplan play in the participation of the Daily Caller, an outlet with longstanding ties to white nationalist groups,[36] in Facebook’s fact-checking program?
- When violent extremist groups actively and openly use a platform’s tools to coordinate violence, should federal law continue to protect the platform from civil liability for its role in facilitating that activity?
Thank you in advance for your attention to this critical matter.
Sincerely,
###