A bipartisan pair of senators wants to hold social media giants accountable for pushing content that radicalizes Americans.
Senators John Curtis (R-UT) and Mark Kelly (D-AZ) introduced the Algorithm Accountability Act on Wednesday, which seeks to amend section 230 of the Communications Decency Act to impose a duty of care on social media platforms. That duty, the duo says, would require platforms to abandon algorithms that push harmful content.
"Too many families have been hurt by social media algorithms designed with one goal: make money by getting people hooked," Kelly said in a joint press release. "Over and over again, these companies refuse to take responsibility when their platforms contribute to violence, crime, or self-harm. We're going to change that."
If it were to pass - and that's a big if - the bill would amend section 230, which was added to the books in 1996 and has served as a shield for online publishers against liability for content their users post. Section 230 was enacted to prevent companies from being held liable for material posted in comments sections, for instance, but it has since grown into an essential shield for companies whose platforms are built around user-generated content, including giants like Google's YouTube and Meta's Facebook and Instagram.
The proposed law would carve an exception into section 230's protections, allowing companies to be held liable if a court found that their recommendation algorithms pushed content that radicalized an individual, leading to bodily injury or death, in a way that "a reasonable person would see as foreseeable and attributable to the algorithm."
The bill also proposes to invalidate pre-dispute arbitration agreements and joint action waivers included in terms of use for social media platforms.
"What began as a commonsense protection for a fledgling industry has grown into a blanket immunity shield for some of the most powerful companies on the planet," Curtis said.
Section 230 has been a bugbear for elected officials on both sides of the aisle for years. Both the Biden administration and the first Trump administration tried to change or eliminate the law in order to hold social media companies accountable for content their users post, but every attempt so far has stalled.
Initial attempts sought to repeal the provision entirely; more recent bills, such as the SAFE TECH Act, introduced multiple times since 2020, and the Algorithm Accountability Act, instead seek to narrow its protections.
Even Curtis and Kelly's bill has a direct antecedent: the Protecting Americans from Dangerous Algorithms Act, which was introduced in 2021 but died in committee. Several other bills aiming to change section 230 were introduced in 2025 alone, and none have done much more than add to the to-be-shredded piles in Congressional offices.
Meanwhile, the tech companies shielded by 230 are only growing more powerful. ®
Source: The Register