Global Current News

Court: Social platforms not liable for Buffalo shooting

by Juliane C.
August 9, 2025
in Cloud & Infrastructure

Credits: REUTERS/Dado Ruvic/Illustration/File Photo

The racist massacre carried out in Buffalo in 2022 raised important questions about the role of social media in spreading hate and extremist content that can incite violence. A New York appeals court ruling has brought that debate back into focus in a case involving some of the world’s largest technology companies.

Court clears social media platforms of liability for Buffalo attack

Several social media companies should not be held liable for helping an avowed white supremacist who killed 10 Black people in 2022 at a Buffalo, New York grocery store, a divided New York state appeals court ruled on Friday. Reversing a lower court ruling, the state Appellate Division in Rochester said defendants including Meta Platforms’ (META.O) Facebook and Instagram, Google’s (GOOGL.O) YouTube, and Reddit were entitled to immunity under a federal law that protects online platforms from liability over user content.

The case arose from Payton Gendron’s racially motivated mass shooting at Tops Friendly Markets on May 14, 2022. Relatives and representatives of victims, as well as store employees and customers who witnessed the attack, claimed the defendants’ platforms were defective because they were designed to addict and radicalize users like Gendron.

Major platforms invoked immunity provided under federal law

Lawyers for the plaintiffs did not immediately respond to requests for comment. Other defendants included Alphabet, Amazon.com (AMZN.O), Discord, 4chan, Snap (SNAP.N) and Twitch, all of which Gendron used, the mid-level state appeals court said.

The victims’ families’ central argument was that the platforms’ algorithms amplified hateful and white supremacist content, creating an environment that encouraged the attacker’s radicalization. The plaintiffs argued that this structure was designed to maximize engagement, even if it meant promoting extremist content.

Majority sees a risk to the internet in expanding liability

Writing for a 3-2 majority, Justice Stephen Lindley said holding social media companies liable would undermine the intent behind Section 230 of the federal Communications Decency Act, which is to promote the development of, and competition on, the internet while keeping government interference to a minimum. While condemning Gendron’s conduct and “the vile content that motivated him to assassinate Black people simply because of the color of their skin,” Lindley said a liability finding would “result in the end of the Internet as we know it.”

“Because social media companies that sort and display content would be subject to liability for every untruthful statement made on their platforms, the Internet would over time devolve into mere message boards,” he wrote.

The decision further highlights the tension between demands for online content regulation and the freedom of platforms to operate without government interference. Justice Lindley emphasized that imposing legal liability for every post could have devastating consequences for the functioning of the internet, in addition to restricting freedom of expression.

Justices Tracey Bannister and Henry Nowak dissented, saying the defendants force-fed targeted content to keep users engaged, be it videos about cooking or puppies, or white nationalist vitriol.

“Such conduct does not maintain the robust nature of Internet communication or preserve the vibrant and competitive free market that presently exists for the Internet contemplated by the protections of immunity,” the judges wrote.

Shooter convicted, but legal debate persists

Gendron pleaded guilty to state charges including murder and terrorism motivated by hate, and was sentenced in February 2023 to life in prison without parole. He faces related federal charges that could lead to the death penalty. Questioning of potential jurors in that case is scheduled to begin in August 2026, court records show.

The debate over platform liability may never reach a full consensus. While courts uphold Section 230’s legal protection for companies, pressure for change is growing from lawmakers and victims’ families, who question how far business models can remain immune from liability for such tragedies.

GCN.com/Reuters

© 2025 by Global Current News
