Content moderation in Trump's America is a political minefield

In the battle between free speech and content moderation, who sets the rules?
By Christianna Silva

How is Trump going to affect content moderation? Credit: Stacey Zhu; travellinglight / iStock / Getty Images Plus / via Getty Images

Social media platforms have historically run their content moderation much like a parent running a house full of teenagers: If you live under my roof, you follow my rules. But as social media has become increasingly ubiquitous in our offline life — and more inherently political — the questions have become: Who really owns the roof, who makes those rules, and are our civil liberties at stake?

Under President-elect Donald Trump's administration, this debate is likely to intensify, pushing the politicization of content moderation to a fever pitch.

How did we get here?

The evolution of content moderation began slowly, gaining speed as social media's influence grew. It became increasingly clear that something had to be done when Facebook, Twitter, and YouTube played key roles in the Arab Spring, the series of protests that swept the Arab world in the early 2010s in response to government corruption. Facebook was used as a tool for activists to organize, but it quickly became controversial. YouTube grappled with whether to allow violent videos that served educational or documentary purposes after activists in Egypt and Libya used the platform to expose police torture. Around the same time, Twitter rolled out its "country withheld tweet" policy, which let it block individual tweets in a specific country while leaving them visible everywhere else.

In 2013, leaked documents from Facebook's moderation offices showed what content, exactly, Facebook was moderating. The following year, the issue of online radicalization emerged across social media platforms. YouTube reversed its policy on allowing certain violent videos after one showing journalist James Foley's beheading went viral. Twitter faced backlash for unchecked harassment surrounding the release of the women-led Ghostbusters film, which prompted a change to its content moderation policies.

Behind the scenes, the people who moderated the content reported horrible working conditions. And then came 2016.

Misinformation and disinformation plagued the U.S. presidential election between Hillary Clinton and Trump. Despite Facebook launching a fact-checking program, platforms struggled to stop the spread of misinformation and election interference. In Myanmar, the Rohingya people faced widespread ethnic violence fueled by content on Facebook. Meanwhile, Facebook Live became a place to broadcast suicides and shootings, including the aftermath of the police killing of Philando Castile.

In 2018, TikTok launched in China, and in the same year, Twitter removed 70 million bots to curb the influence of political misinformation. Later that year, YouTube released its first transparency report, and Facebook formed its Oversight Board, allowing users to appeal its decisions. In 2019, the Christchurch terrorist attack, which was broadcast on Facebook Live, led to the Christchurch Call to Action to Eliminate Terrorist and Violent Extremist Content Online, a group of nations "working together under the rubric of the Call to prevent terrorists and violent extremists from exploiting the Internet." Twitter allowed its users to appeal content removals later that year, and eventually, TikTok launched internationally.

All the while, Trump was president. He signed an executive order on Preventing Online Censorship, which targeted Section 230 of the Communications Decency Act and aimed to curb what he saw as biases against himself and other conservatives in how platforms moderate content. This came after many of Trump's tweets were flagged by Twitter for misleading information. He and others in his party accused platforms like Twitter, Facebook, and Google of anti-conservative bias, which led to Congressional hearings and investigations into moderated content — a kind of impact that Katie Harbath, founder and CEO of tech policy firm Anchor Change and a former Facebook executive, calls "reputational."

The pandemic, January 6, and the peak of politicization

Then, COVID-19 hit. Misinformation about the global pandemic ran rampant, and people died as a result. Rules for moderating content online expanded internationally to counter the ever-growing phenomena of hate speech, election misinformation, and health misinformation. Facebook introduced policies targeting Holocaust denial, hate groups, organized militia groups, and conspiracy theories, while Twitter launched its transparency center.

But January 6, 2021, marked a turning point. Platforms including Facebook, Twitter, and YouTube banned or locked then-President Trump's accounts for inciting violence during the Capitol attack.


"I would say Trump de-platforming was a peak swing of the pendulum," Harbath told Mashable. "Since then, over the next four years, [platforms have] been coming back a little bit more to center in terms of how much content they are willing to take down. [And] they're being a lot more quiet about it. They're not being as transparent about it because they don't want the political target on their back around that."

Where are we now?

Since then, Trump has been reinstated on all major social media platforms. But the Republican refrain has remained the same: content moderation silences conservative voices. As Berin Szóka, president of TechFreedom, told Mashable: "Censorship is just content moderation that someone doesn't like."

Elon Musk, a self-identified "free-speech absolutist," acquired Twitter in late 2022 and fueled this rhetoric. In January 2023, House Republicans established a subcommittee on the "Weaponization of the Federal Government," targeting alleged censorship of conservative views. In one of their first official acts, they sent letters to research groups demanding documentation of any correspondence between those groups and the federal government or social media companies about content moderation. Meanwhile, a lawsuit alleged that President Joe Biden's administration pressured platforms to suppress COVID-19 misinformation, which state attorneys general argued amounted to government suppression of speech.

Meta, in a notable shift, has reduced its focus on political content, particularly on its Twitter competitor Threads, which Harbath says is "not necessarily content moderation, but it's a decision about what types of content they're presenting to people or not."

What will we see in the future of content moderation?

President-elect Trump has made content moderation a campaign issue. Brendan Carr, his pick to lead the FCC, has already echoed this agenda, calling for the dismantling of what he dubs the "censorship cartel" and vowing to "restore free speech rights for everyday Americans."

"To do that, they have to either bully or require tech companies to carry speech that they don't want to carry," Szóka said. "Republicans are at war on content moderation."

This "war" will likely play out on two fronts, as Harbath says: legislative and reputational. Reputationally, we'll see more congressional hearings with tech execs, more posts on X from Trump, and more suspicion cast on content moderation in general. Legislatively, we have a more complicated road ahead.

As Szóka says, Carr will likely do Trump's bidding with regard to the criteria for Section 230 immunity, which "grants complete immunity for publisher or speaker activities regardless of whether the challenged speech is unlawful." In practice, this means Facebook is not liable for misinformation, hate speech, or anything else users post on the platform it owns and operates.

"[Republicans will] use Section 230 because by doing that, they can say, 'We're not requiring anything,'" Szóka said. "You're free, as a private company, to do what you want. But if you want Section 230 immunity, you have to be neutral, and we decide what's neutral."

Harbath sees chaos ahead but questions whether Section 230 will actually change: “There'll probably be a debate and a discussion around it, but whether or not 230 actually changes, I'm skeptical."

At the same time, the rise of AI is reshaping the future of content moderation. "The next four years, how people are consuming information, what we're talking about today is gonna be completely irrelevant and look completely different," Harbath said. "AI is just gonna change how we think about our news feeds, the incentives for people, what they're posting, what that looks like, and it's gonna open up new challenges for the tech companies in terms of how it’s politicized."

Should we freak out? Probably not. According to Harbath, it’s still too early to predict what content moderation under a second Trump term will look like. But we should keep our eyes open. The rules of content moderation — and who gets to write them — are increasingly shaped by political power, public perception, and technological evolution, setting the stage for battles over free speech, corporate responsibility, and the role of government in regulating online spaces.

Christianna Silva
Senior Culture Reporter

Christianna Silva is a senior culture reporter covering social platforms and the creator economy, with a focus on the intersection of social media, politics, and the economic systems that govern us. Since joining Mashable in 2021, they have reported extensively on meme creators, content moderation, and the nature of online creation under capitalism.

Before joining Mashable, they worked as an editor at NPR and MTV News, a reporter at Teen Vogue and VICE News, and as a stablehand at a mini-horse farm. You can follow her on Bluesky @christiannaj.bsky.social and Instagram @christianna_j.

