Representative Malinowski Highlights the Danger of Social Media Platforms’ Amplification of Extremist Content

September 24, 2020
Press Release

(Washington, DC) Today, Representative Tom Malinowski submitted a statement to the House Committee on Energy and Commerce’s Subcommittee on Consumer Protection and Commerce as part of its hearing on “Mainstreaming Extremism: Social Media’s Role in Radicalizing America.” His statement, which can be found here, describes how social media platforms’ algorithmic amplification of harmful, conspiratorial content threatens Americans’ safety and the integrity of our democratic system.

Representative Malinowski has been an outspoken critic of large social media platforms and the role that their recommendation engines play in spreading white supremacist, anti-Semitic content to their users. Last month, Representative Malinowski introduced a bipartisan resolution to condemn QAnon, which the FBI has warned is likely motivating some domestic extremists to engage in violence, and to urge all Americans, regardless of partisan affiliation, to seek information from authoritative sources. In September, he hosted a public forum with leading experts to discuss the proliferation of conspiracy theories and misinformation online, and how to guard against them. Last year, Representative Malinowski led the successful effort to restore funding to the U.S. Department of Homeland Security to prevent domestic extremism, including violent acts of anti-Semitism (funding the Trump administration cut in 2017).

Read a portion of the statement below, or in its entirety here.

“The algorithmic amplification of hateful, divisive, and conspiratorial content threatens the safety of Americans and the integrity of our democratic system. Despite the industry’s repeated claims that it is addressing the problem with the urgency it demands, its recommendation tools continue to push harmful, radicalizing content to users. I have seen it first-hand. Earlier this summer, my office ran a simple, 10-minute experiment in which we enabled a VPN, downloaded a new browser, and created a Facebook account to see if – and how quickly – white supremacist content would be recommended to the account after it joined a small handful of groups related to Alex Jones, QAnon, and other alt-right, conspiracy-oriented topics. The recommendations came immediately, including for groups or pages named “Proud to be a White American,” “Alt-Right Memes for Alt-White Teens,” “White genocide,” “George Soros the Cockroach King,” “Proud White Man,” and “The Extinction of the White Race.”

Facebook has known about the dangers of its recommendation engine for years. An internal company presentation from 2018 found that “64% of all extremist group joins are due to our recommendation tools…[o]ur recommendation systems grow the problem.” The presentation further noted that “[o]ur algorithms exploit the human brain’s attraction to divisiveness,” something that early Facebook employees have admitted was intentional and central to the core design of the product. Facebook executives reportedly blocked efforts at the time to address the issues raised in the internal presentation.

When Facebook does take some sort of action, it is often too little, too late. In August 2020, when Facebook removed several hundred groups and pages tied to QAnon, millions of Americans had already been exposed to the dangerous theories it promotes; the FBI had already sounded the alarm about how it likely motivates domestic extremists to commit violent acts; and the conspiracy had already undermined trust in America’s democratic institutions and deepened our nation’s political polarization.”