A Year of Content Moderation and Section 230

Matthew Feeney and Will Duffield

Many Americans will remember 2020 as the year of the COVID-19 pandemic and the Trump v. Biden presidential election campaigns. As COVID-19 spread and the election drew closer, millions of Americans took to online platforms to express their opinions, theories, and stories and to seek information. Platforms were put in the unenviable position of developing content moderation policies for both the pandemic and the election season, trying to halt the spread of potentially life-threatening medical misinformation and political conspiracy theories.

These efforts made "Big Tech" content moderation one of the most discussed legislative issues of the year. President Trump and Joe Biden both called for the repeal of Section 230 of the Communications Decency Act – the most cited law in content moderation debates – and lawmakers from both sides of the aisle, as well as regulators at federal agencies, released Section 230 proposals. This post provides an analysis of what 2020 can teach lawmakers, policy professionals, and regulators about the future of content moderation.

Section 230 of the Communications Decency Act is a critical but widely misunderstood law. Passed in 1996, Section 230 includes two key provisions, the so-called "sword" and "shield" of the law. The shield states that interactive computer services are not – with few exceptions – the publisher of content users upload to the interactive computer service. Th...