The ongoing debate surrounding Section 230, the legal provision that shields internet platforms from liability for user-generated content, took center stage at a recent Senate Commerce Committee hearing. This time, the hearing was marked by two significant themes: a surge in legal challenges to the law’s breadth and an intensified bipartisan concern over potential government censorship.
During the session, Senator Brian Schatz of Hawaii made a pointed remark about the law, stating, “Section 230 is not one of the Ten Commandments. The notion that modifying it would obliterate internet freedom is absurd.” This sentiment is echoed in a new legislative proposal from Senators Dick Durbin and Lindsey Graham, seeking to phase out Section 230 completely as it reaches its 30-year milestone. Meanwhile, other legislative efforts aim to refine and limit the law’s reach.
Fundamentally, Section 230 serves as a safeguard for social media networks, online forums, and digital comment sections, ensuring they are not held accountable for user-posted content and granting them the ability to moderate content without fear of repercussions. While it underpins numerous online services, critics argue that its broad protections are outdated, particularly as they apply to today’s tech giants. The hearing largely focused on two pressing issues: the impact of social media on children and claims of bias against conservative viewpoints.
Adding to the complexity of the debate is a recent trial in Los Angeles. Jurors are deliberating whether platforms like Instagram and YouTube are culpable for harm due to their design choices, which may not be covered by Section 230’s protections. Matthew Bergman, representing the Social Media Victims Law Center, brought this issue before the committee. Behind him sat grieving parents holding pictures of their children, who they claim suffered due to online dangers.
Bergman argued against a full repeal of Section 230 but urged Congress to clarify that the law should not shield platforms from accountability for their design decisions. Some lawmakers questioned whether new legislation was necessary for families like Bergman’s clients, or if the issue could be resolved under the current legal framework. Bergman warned that leaving the matter solely to the courts could result in further tragedies, stating, “more kids are going to die” if action is not taken.
Another throughline of the hearing was a broad awareness of the dangers of government censorship and the potential to chill speech online, including through coercive threats, or jawboning. Schatz praised the leadership of committee chair Ted Cruz (R-TX), who has attacked the Biden administration for jawboning but has also criticized Federal Communications Commission Chair Brendan Carr's threats to broadcasters and proposed legislation to address censorship. Schatz himself said he was concerned by the Biden administration's approach to disinformation about the covid-19 pandemic, which included urging social media companies to remove posts spreading it. "It's no longer theoretical that the door swings both ways in Washington, and this is going to bite us all in the butt and we have to fix it," he said.
Cruz disagrees with colleagues who want to repeal Section 230 entirely, believing it would incentivize tech platforms “to engage in more censorship to protect themselves from litigation.” Still, he said, “we should consider whether reform of Section 230 is needed to encourage more speech online and stop Big Tech censorship.”
Tensions flared when Sen. Eric Schmitt (R-MO) squared off with one witness, Stanford Law School director of platform regulation Daphne Keller. In his prior role as attorney general of Missouri, Schmitt unsuccessfully sued the Biden administration for its alleged pressuring of social media companies over covid and election disinformation. Schmitt took aim at Keller over her connection to Stanford, whose Internet Observatory was effectively dismantled after facing persistent attacks from the right over its work to identify election misinformation.
Earlier in the hearing, Keller said she “didn’t love” the pressure exhibited by Biden administration officials, but that the suit had failed to turn up evidence the government caused platforms to remove posts. Keller said that resulted in a “problematic” Supreme Court ruling that will make it harder for “the real victims of jawboning in the future” to get into court. But she also said the current administration, including actions from Carr, has helped usher in an era of “jawboning that is unprecedented in my lifetime.”
Keller disputed, however, that the Stanford Internet Observatory had a “role with the Biden administration” to flag content that “didn’t line up with the Biden administration’s view,” as Schmitt claimed. She said her colleagues were “exercising their First Amendment rights to go talk to the government and say what they thought should happen.”
When Keller added she wasn’t involved in the conversations between her colleagues and the Biden administration, Schmitt retorted, “You can read all about it in Missouri v. Biden, the lawsuit that went to the Supreme Court.”
“The one you lost?” Keller shot back. (Schmitt clarified it was sent back to the lower court.)
Some witnesses proposed alternatives to removing or changing Section 230. Knight First Amendment Institute policy director Nadine Farid Johnson suggested passing privacy protections, adding interoperability requirements for social networks, and expanding researchers’ access to platforms, saying this could keep companies from using personal data to hook users and offer more insight into how the platforms work.
The hearing briefly dealt with the new regulatory questions raised by Silicon Valley’s latest focus: generative AI. Americans for Responsible Innovation President Brad Carson said Section 230 should not protect AI outputs, and warned against preempting AI laws that could rein in a fast-growing industry — criticizing a policy supported by some Republicans, including Cruz. Cruz also brought up the Take It Down Act, a law that requires platforms to remove reported nonconsensual intimate images, whether real or AI-generated, as an example of “targeted legislation” that avoids amending Section 230.
No matter how much pressure Congress puts on platforms to add guardrails, though, Cruz acknowledged kids will look for ways around them. After taking away his then-14-year-old daughter’s phone as a punishment, he recalled during the hearing, his wife got an email from Verizon “that didn’t make any sense.” The daughter soon confessed to removing the SIM card from her phone before handing it over, and using it in a burner phone. “I was both annoyed and really proud at the same time,” Cruz said. “It does show just how completely outmatched parents are with trying to keep up with teenagers with these issues.”