Brendan Carr, currently the senior Republican commissioner at the Federal Communications Commission (FCC), has been chosen by Trump as the next FCC Chairman. Carr, an FCC commissioner since Trump appointed him in 2017, wrote the Project 2025 chapter on the FCC.
While FCC commissioners require Senate confirmation, the chairman is designated by the President, and a sitting commissioner needs no additional Senate confirmation to take the role. Since Carr is already a commissioner, he is guaranteed to become FCC Chairman unless Trump changes his mind about the pick.
Even if Carr had not written a 13-page chapter laying out his plans for the FCC, he has been unusually outspoken for a commissioner – including on topics over which the FCC has no authority. He accused China of allowing COVID-19 to spread, attacked the World Health Organization’s response to the pandemic, called on the Federal Trade Commission to crack down on tech companies like Facebook and Twitter/X, and accused Representative Schiff of running a “secret and partisan surveillance machine.”
Still, his Project 2025 chapter offers the most detailed view of his plans for the FCC in the next four years, and there’s one consistent theme in those proposals: deregulation.
Of course, there’s always an exception that proves the rule, and in this case, it’s about content moderation.
In contrast to his other plans for the FCC, Carr is clear that he wants the federal government to have more control over content moderation decisions made by internet-based companies, especially social media giants like Facebook, Twitter/X, and YouTube. One of the central planks of his FCC reform agenda concerns Section 230 immunity, which Congress created nearly 30 years ago.
The issue at hand was a 1995 case in New York state courts, Stratton Oakmont, Inc. v. Prodigy Services Co. The court ruled that Prodigy, an online service that hosted bulletin boards, was liable for libel after an anonymous user posted defamatory statements on one of its forums – precisely because Prodigy had engaged in content moderation. By moderating content at all, Prodigy stopped being a simple “distributor” and became a “publisher.”
In very simple terms, a distributor is an organization that passes material along without examining its contents – think of a newsstand or a bookstore. As long as it does not know that the material breaks the law, it isn’t liable. A publisher, in contrast, does review the content it publishes or allows to be shown, so it is responsible for making sure every comment or post stays within the law.
In response, Prodigy and other companies said they would simply stop moderating any content posted to their websites. Reviewing every post for libel or other unlawful material would be prohibitively expensive, so the only options were to shut down completely or to allow any kind of unlawful content without restraint.
Congress responded by passing the Communications Decency Act (CDA) as part of the Telecommunications Act of 1996. While the Supreme Court struck down much of the CDA in Reno v. ACLU (1997), Section 230 remained in effect, granting internet companies immunity from most lawsuits over comments or posts on their sites, as well as lawsuits over their content moderation decisions. The reasoning was simple: if we wanted unlawful content removed from the internet, then websites needed to be able to remove it without becoming liable publishers in the process.
The Electronic Frontier Foundation (EFF), a nonprofit organization that defends “civil liberties in the digital world,” explained how the law works in several posts back in 2020, when Trump issued an executive order attempting to limit that immunity. Section 230(c)(1), they say, “shields from liability all traditional publication decisions related to content created by others, including editing, and decisions to publish or not publish.”
In other words, subsection (c)(1) says companies cannot be sued because of what someone else posted on their website.
Section 230(c)(2), on the other hand, protects companies “from legal challenges brought by users when platforms decide to edit or to not publish material they deem to be obscene or otherwise objectionable” as long as the decision is made “in good faith,” according to the EFF.
So subsection (c)(2) says companies can’t be sued because of what they decide to remove or what they won’t allow on their sites, with the caveat that their moderation has to be done in good faith.
Of course, this only provides protection from civil liability, not from federal criminal law. A website dedicated to helping people plan robberies, for example, could still be subject to criminal investigation and penalties.
What Carr hopes to do is change those (c)(2) protections from a separate, additional shield against liability into a limiting clause. In his reading, subsection (c)(2) spells out the conditions under which a site keeps – or loses – its protections entirely.
In that formulation, companies would be forced to prove to the government that their moderation practices are in good faith. If the government decides a company hasn’t been acting in good faith, it would lose all protection from lawsuits, leaving it with the impossible choice Prodigy faced in 1995: close down, abandon all moderation efforts, or independently vet every post for its potential to generate a lawsuit.
It is even more difficult to independently check every comment today than it was in 1995. At the time, Prodigy argued that it received an impossible 60,000 comments per day. In 2023, Twitter/X averaged 500 million tweets per day, or roughly 60,000 tweets every 10 seconds.
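For the curious, the arithmetic behind that comparison is easy to check. Here is a quick back-of-the-envelope sketch using only the two figures quoted above (the numbers are the article’s, not fresh data):

```python
# Back-of-the-envelope scale comparison, using the figures quoted above.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

prodigy_posts_per_day = 60_000      # Prodigy's claimed 1995 volume
x_tweets_per_day = 500_000_000      # Twitter/X's reported 2023 average

# Twitter/X's average posting rate per second
tweets_per_second = x_tweets_per_day / SECONDS_PER_DAY
print(f"~{tweets_per_second:,.0f} tweets per second")  # ~5,787

# How long Twitter/X takes to accumulate Prodigy's entire daily volume
seconds_to_match = prodigy_posts_per_day / tweets_per_second
print(f"~{seconds_to_match:.0f} seconds to match Prodigy's day")  # ~10
```

In other words, the volume Prodigy called impossible to review in a day arrives at Twitter/X roughly every 10 seconds.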
On the other hand, Twitter/X has seen a consistent drop in users ever since Elon Musk took over the company. NBC News spoke to people who had quit using the site, all of whom listed the increase in “bots, partisan advertisements and harassment” as primary issues – problems that started growing after Musk dismantled many of the company’s existing moderation systems. Once a seemingly unbeatable behemoth of social media, “X has lost an average of 14% of its users monthly,” reports Fortune.
Fortune also credited the decline to Musk’s decision to cut down “the hundreds of people that once worked on content moderation for Twitter to just a handful of contract employees.”
If Twitter/X is struggling after merely reducing its moderation capacity, other sites would have no chance of staying solvent after giving up moderation entirely.
In that case, companies really only have two options: close down or do what the government tells them to.
EFF notes that for extremely large companies there may be a third option: drag out the court battles indefinitely. That route, however, is open only to companies like Meta and Google, which have the financial and legal resources to lock the U.S. government in an endless loop of lawsuits and appeals. Smaller social media companies would be crushed by legal fees before they ever got a chance to challenge the dominance of Meta’s Facebook or Google’s YouTube.
Studies have shown that ending or weakening Section 230 protections would strengthen big tech companies, not weaken them. EFF points out that Zuckerberg went before Congress to testify in favor of weakening Section 230 and that large corporations have historically supported efforts to reduce Section 230 protections, likely because they know it would further cement their market dominance.
This is the real common thread in Carr’s plans for the FCC: Strengthening the power of the largest tech companies – as long as they do what Trump wants.
While his Project 2025 chapter names “reining in Big Tech” as the top priority for the FCC, nearly every suggestion empowers Big Tech companies instead.
Carr also wants the federal government to undercut state and local regulations on cell tower construction, while cutting back on the federal environmental and historic preservation rules that govern new towers. On page 854, he says the FCC should impose “limits on the fees that local and state governments can charge for reviewing those wireline applications and time restrictions on the government’s decision-making process,” and that the FCC should work with the Bureau of Land Management and the U.S. Forest Service to “address the delays that continue to persist when it comes to building Internet infrastructure on federal lands.”
A few pages later, Carr says the “FCC should engage in a serious top-to-bottom review of its regulations and take steps to rescind any that are overly cumbersome or outdated,” with the goal of “creating a market-friendly regulatory environment” for cable, broadband, and satellite Internet providers.
A more concerning proposal would give special treatment to companies owned by Trump allies like Elon Musk, who has been given a position in the brand-new Department of Government Efficiency, or DOGE – a reference to Dogecoin, a cryptocurrency Musk has promoted over the years. Jeff Bezos, who made the unprecedented decision to block the Washington Post’s endorsement of Kamala Harris this year, also receives a nod from Carr.
For example, Carr specifically names Elon Musk’s Starlink and Bezos’ Project Kuiper for deregulation. “One of the most significant technological developments of the past few years has been the emergence of a new generation of low-earth orbit satellites like StarLink and Kuiper,” Carr writes on page 855. “The FCC should expedite its work to support this new technology by acting more quickly in its review and approval of applications to launch new satellites.”
This isn’t the first time Carr has spoken out to support Musk. After Twitter accepted his $44 billion acquisition offer, groups called on the FCC to block the sale. Carr took the rare step of responding with an individual press release, saying the FCC had “no authority to block Elon Musk’s purchase of Twitter, and to suggest otherwise is absurd.” He now argues that the FCC does have authority to regulate social media companies, even in the granular details of their content moderation policies.
If Carr manages to reshape Section 230 to fit his own imaginings, we should expect crony capitalism to ramp up in social media. It is hard to imagine the Trump administration going after Elon Musk’s moderation practices at Twitter/X. While Musk turned Twitter/X into exactly the sort of partisan platform Carr claims to detest, he did so to benefit Donald Trump, going so far as to appear as a Trump surrogate at campaign events and to host a live interview with Trump on Twitter/X.
Instead, Trump’s administration would likely target websites it sees as less friendly – places like Meta’s Instagram, Threads, and Facebook, or the upstart social media site Bluesky, which seems to have collected many of the progressive users who abandoned Twitter/X.
Ultimately, Brendan Carr’s plan to “rein in Big Tech” at the FCC is more about ensuring private companies show fealty to Donald Trump than it is about protecting the data, privacy, and livelihoods of people in America.