E&C Leaders Urge YouTube to Reform its Extremist Content Policies and Strengthen Enforcement Efforts
Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chairman Mike Doyle (D-PA), and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Google CEO Sundar Pichai today expressing deep concern over the growing threat of extremist content propagating on YouTube and raising questions about the company’s policies and enforcement efforts to combat extremists and protect viewers.
Videos uploaded to YouTube by extremists have proliferated in recent years, with steep increases in the number of channels created, videos posted, and users watching, liking, and commenting on such content. The company’s algorithm-driven video recommendations, which are designed to maintain user engagement by suggesting tailored videos to watch next, may create dangerous ‘rabbit hole’ effects, guiding users from more innocuous or alternative content toward fringe channels and videos.
“By making these recommendations, YouTube is effectively driving its users to more dangerous, extreme videos and channels,” the three Committee chairs wrote to Pichai.
Despite some efforts to curb extremism on its platform, a recent study by the Anti-Defamation League found that YouTube still amplifies extremist and alternative content.
“Ultimately, we believe that YouTube bears a moral responsibility for the content it amplifies on its platform and should disrupt the pipeline of extremism driving users to more fringe videos and channels,” the Committee leaders continued in their letter. “YouTube should make meaningful reforms to its policies and strengthen enforcement efforts to eradicate dangerous alternative and extremist content on its platform. Gaining users, maintaining engagement, and generating more advertising revenue cannot come at the expense of our national security.”
As part of their inquiry, the Committee leaders requested written responses to a series of questions by March 17, including:
- Describe how YouTube defines and classifies violative extremist content, borderline content, and authoritative and trusted content.
- Describe all policy changes YouTube implemented to stop the spread of extremism on its platform and when such changes were made.
- Since November 1, 2020, how many videos and channels has YouTube removed from its platform for violating its policies related to extremism?
- How many videos and channels containing content YouTube considers borderline are on the platform?
- How much advertising revenue has been generated by content creators whose videos YouTube removed for violating its extremism policies?
- Provide the total number of videos that YouTube has flagged as containing borderline content since November 1, 2020.
Read the full letter here.