
Department of Justice releases list of potential Section 230 reforms

Reforms proposed by the Department of Justice on Wednesday seek to fine-tune and limit Section 230 immunity for online platforms that do not remove malicious content from their websites.

The Department of Justice has released a list of critical areas of Section 230 that it has deemed "ripe for reform." The action follows about six months of work by the department and comes shortly after President Donald Trump's executive order intended to sidestep online publication protections.

The reforms proposed on Wednesday afternoon by the DoJ do not change Section 230. However, the proposed alterations make it clear that a platform, such as Twitter or Facebook, must police its community or risk losing the immunity generally granted under the provision.

Section 230 of the Communications Decency Act of 1996, as it stands, prevents online venues from being held liable for content posted by their users. However, Section 230 also broadly requires that these platforms self-police, though until now the stipulation has been laxly enforced.

The first significant proposed reform seeks to remove immunity from platforms that allow malicious content to propagate on their sites. This includes content that facilitates child abuse, terrorism, and cyber-stalking, along with a "Bad Samaritan" carve-out for platforms that purposefully facilitate such material. If a website were to allow malicious content to run unchecked, it would lose the protections granted under Section 230, leaving it vulnerable to costly fines and lawsuits.

The second reform area sets out to increase the government's ability to "protect citizens from harmful and illicit content." If implemented, a platform brought under federal investigation would not be granted Section 230 immunity if it had not taken appropriate measures to moderate such behavior by its users.

The third reform removes Section 230 protections in antitrust claims. The intent of the alteration is to encourage competition by preventing major platforms from invoking Section 230 in antitrust cases, where doing so would discourage competitors from raising legitimate concerns about anticompetitive behavior.

In a fourth focus of reform, the Department of Justice seeks to "promote open discourse and greater transparency" by clarifying the text and original purpose of Section 230. The department intends to replace some of the vague terminology with more precise wording.

The rationale for the terminology refinement, as presented by the DoJ, is that the changes would "reduce online content harmful to children while limiting a platform's ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it 'objectionable.'" In practice, it appears that Section 230 protections would still apply as long as online venues moderate content in accordance with federal guidelines and any more restrictive, published rules enforced by the forum.

The Department of Justice also says in Wednesday's guidance that moderation of user content does not confer "publisher" status for that content on the venue. This seems contrary to what the President is seeking in his executive order, which explicitly stated that publisher status should apply to venues that moderate or fact-check content.

There is already a bipartisan investigation into whether Section 230, which was written in the earliest days of the internet, is still adequate or relevant. That investigation is broadly concerned with whether Section 230 facilitates online child abuse, but the Department of Justice's plans are believed to be more specific.

Neither the President nor the Department of Justice can unilaterally change Section 230. Any modifications to the law would need to be approved by both the House and Senate and signed into law by the President. At present, it isn't clear how much bipartisan support the Department of Justice's proposals have.