ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 0.3 months ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |
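The filter conditions above can be sketched as a small predicate, one check per table row. This is a minimal illustration, not the crawler's actual code: the field names (`download_http_code`, `download_stamp`, `history_drop_reason`, `fh_dont_index`, `ml_spam_score`, `meta_canonical`, `src_unparsed`) are taken from the condition column, while the `Page` structure, the `passes_filters` helper, and the 180-day approximation of "6 MONTH" are assumptions for the sketch:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Page:
    # Field names mirror the condition column of the filter table.
    download_http_code: int
    download_stamp: datetime
    history_drop_reason: Optional[str]
    fh_dont_index: int
    ml_spam_score: float
    meta_canonical: Optional[str]
    src_unparsed: str


def passes_filters(page: Page, now: datetime) -> dict:
    """Evaluate each crawl filter independently.

    A page is reported as 'Skipped - page is already crawled' only when
    every filter passes; returning a per-filter dict mirrors the
    Status column of the table above.
    """
    return {
        "http_status": page.download_http_code == 200,
        # "now() - 6 MONTH" approximated as 180 days for the sketch.
        "age_cutoff": page.download_stamp > now - timedelta(days=180),
        "history_drop": page.history_drop_reason is None,
        "spam_ban": page.fh_dont_index != 1 and page.ml_spam_score == 0,
        # Canonical passes when unset, empty, or equal to the source URL.
        "canonical": page.meta_canonical in (None, "", page.src_unparsed),
    }
```

For the page in this report (HTTP 200, crawled 9 days ago, no drop reason, zero spam score, no canonical set), every entry of the returned dict is `True`, matching the PASS column.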
| Property | Value |
|---|---|
| URL | https://www.latimes.com/opinion/story/2021-01-15/facebook-twitter-extremism-donald-trump-violence |
| Last Crawled | 2026-04-10 20:43:17 (9 days ago) |
| First Indexed | 2021-01-15 11:32:18 (5 years ago) |
| HTTP Status Code | 200 |
| Meta Title | Contributor: Banning Trump from Twitter and Facebook isn't nearly enough - Los Angeles Times |
| Meta Description | Use liability law to sue Facebook and Twitter for building platforms they knew would nurture and spread toxic extremism. |
| Meta Canonical | null |
Boilerpipe Text | Social media finally pulled the plug on Donald Trump. Days after Trump incited a riot at the U.S. Capitol, Twitter permanently banned the president from its platform, and many other social media companies like Facebook, YouTube and Snapchat suspended Trump’s accounts as well.
Mark Zuckerberg and the other creators of the most powerful speech engines in the world have shown astonishingly little contrition in contributing to one of the darkest days for democracy in America. They all expressed shock. But Facebook, Twitter, YouTube and every other social media company have known for over a decade that their tools would be used in ways that lead to violence — they’ve seen it happen. And they did too little, for too long.
There’s growing evidence that banning influential individuals from social media has a big impact on the spread of harmful misinformation. We’re about to have a great test case. Even as many applaud Twitter and Facebook for finally “deplatforming” this toxic president, others cower at the enormous power internet companies hold over public discourse — concerns wrapped up with deep American intuitions around enabling free speech.
The 1st Amendment restricts only the ability of governments to interfere with free expression. So Facebook and Twitter are not trammeling anyone’s constitutional rights when they delete posts and accounts for violating the company’s terms of use. In fact, it would be a 1st Amendment problem were governments to start forcing private companies to continue publishing speech they disagree with. But the free speech objections in this debate obscure an important point: These companies have built their systems to profit from the largely unchecked, viral spread of information. They are clearly aware of how their tools are being used.
In the coming months we will hear a lot about how social media fanned insurrection and whether we need better rules to hold them accountable. There are no easy answers here. The roots for violent extremism in America run deeper than the communication technologies available to them, with white supremacy at the top of the list. But social media is a significant piece of this puzzle.
Social media companies like Facebook and Twitter have built their systems to encourage and profit from misinformation and viral hatred. Their user interfaces encourage toxic sharing by removing barriers to exposing one’s thoughts and making it too easy to reflexively pass along posts that agree with your worldview. They provide instant gratification in the form of likes and hearts for the most pithy and indulgent takes. Their algorithms wind up recommending toxic communities and rewarding the most incendiary posts. Outsize amplifiers like Trump play a key role.
Under the law, if you created something dangerous, knowing the specific harm that would result, you can be held liable. Social media companies knew that their platforms were designed in ways that fostered misinformation and extremism. It’s time our laws held them accountable.
American law tends not to punish people or institutions for harms they could not anticipate. Crimes generally must be intended, and most harmful actions for which one can sue for civil damages must at least be foreseeable. These requirements come from a sense of fairness. Even the makers of asbestos — which went on to kill almost 100,000 people a year and become the subject of a cottage industry of lawsuits — were not initially held liable for lung disease because courts found the manufacturers did not know and could not predict the harm.
But that’s not where we are with social media and political violence.
It’s not just that Zuckerberg should have known political violence was likely. He did know. He knew because his own employees told him. He knew because it happened in Myanmar. He knew because every credible expert — especially women and people of color — publicly said it would happen over and over.
In recent years, Zuckerberg has invited several sets of prominent critics to his home to talk about Facebook. These people must have told him that political violence was likely. And yet he released a statement last week pretending that “the current context is now fundamentally different” in barring Trump from using Facebook through the end of his term. Imagine if the chief executive of an asbestos company invited scientists over to dinner and they told him that his product causes lung disease. He would not be able to claim in court later that he couldn’t foresee the harm.
Of course, information is different from asbestos. But not so different to justify a free pass. There are several ways lawmakers and the public might move to hold platforms more accountable for building and maintaining an environment they know to be dangerous.
Lawmakers and courts can and should distinguish between attributing user speech to platforms — which the law properly forbids — and failing to take reasonable measures to keep the community safe. A company with inadequate cybersecurity can face consequences when it fails to ward off an easily foreseeable hack, even though the company isn’t the hacker. The same should be true of harmful misinformation, especially when the platform’s own terms of service lay out the sort of community the user should expect.
Lawmakers could create new rules to regulate harmful algorithms and user-interface design choices that amplify dangerous rhetoric and predictably make online spaces such a powder keg. Or judges could adapt the law of negligence and product liability to respond to the foreseeable dangers in the way these services are built. Scholars and policymakers have been proposing these kinds of interventions for a while. Unfortunately, up to now they have been ignored.
Banning Trump from social media platforms grabs public attention. Now we have to challenge the actions of these companies that made removing him necessary in the first place.
Ryan Calo is the Lane Powell and D. Wayne Gittinger professor at the University of Washington School of Law.
Woodrow Hartzog is a professor of law and computer science at Northeastern University.
| Markdown |
Voices
Ryan Calo and Woodrow Hartzog
# Banning Trump from Twitter and Facebook isn’t nearly enough

Facebook CEO Mark Zuckerberg said he was shocked by the attack on the U.S. Capitol. Experts have warned for years that Facebook’s features help nurture extremist groups.
(Justin Sullivan / Getty Images)
By Ryan Calo and Woodrow Hartzog
Jan. 15, 2021 3:30 AM PT
Social media finally [pulled the plug](https://www.axios.com/platforms-social-media-ban-restrict-trump-d9e44f3c-8366-4ba9-a8a1-7f3114f920f1.html) on Donald Trump. Days after Trump incited a riot at the U.S. Capitol, Twitter permanently banned the president from its platform, and many other social media companies like Facebook, YouTube and Snapchat suspended Trump’s accounts as well.
Mark Zuckerberg and the other creators of the most powerful speech engines in the world have shown astonishingly little contrition in contributing to one of the darkest days for democracy in America. They all expressed shock. But Facebook, Twitter, YouTube and every other social media company have known for over a decade that their tools would be used in ways that lead to violence — they’ve seen it happen. And they did too little, for too long.
There’s [growing](https://techcrunch.com/2017/09/11/study-finds-reddits-controversial-ban-of-its-most-toxic-subreddits-actually-worked/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cudmljZS5jb20v&guce_referrer_sig=AQAAADfbxyxQB_7gH0l1ircBa9RyYFprZ9rejJIHA2OQ32aBi5v-iz4LBsbQbbD_XQfPTgxgdX6zddSZM11Kgb0UZ8xs85Hu48unKkELUlRz7VlfezHRIoMyeEYHaXYxz3AyCngnl00looAZR2RaarNL5d_Z6rbP0Uf4JSQsbqXtQsp_) [evidence](https://www.vice.com/en/article/bjbp9d/do-social-media-bans-work) that banning influential individuals from social media has a big impact on the spread of harmful misinformation. We’re about to have a great test case. Even as many applaud Twitter and Facebook for finally “deplatforming” this toxic president, others cower at the enormous power internet companies hold over public discourse — concerns wrapped up with deep American intuitions around enabling free speech.
The 1st Amendment restricts only the ability of governments to interfere with free expression. So Facebook and Twitter are not trammeling anyone’s constitutional rights when they delete posts and accounts for violating the company’s terms of use. In fact, it would be a 1st Amendment problem were governments to start forcing private companies to continue publishing speech they disagree with. But the free speech objections in this debate obscure an important point: These companies have built their systems to profit from the largely unchecked, viral spread of information. They are clearly aware of how their tools are being used.
In the coming months we will hear a lot about how social media fanned insurrection and whether we need better rules to hold them accountable. There are no easy answers here. The roots for violent extremism in America run deeper than the communication technologies available to them, with white supremacy at the top of the list. But social media is a significant piece of this puzzle.
Social media companies like Facebook and Twitter have built their systems to encourage and profit from misinformation and viral hatred. Their user interfaces encourage toxic sharing by removing barriers to exposing one’s thoughts and making it too easy to reflexively pass along posts that agree with your worldview. They provide instant gratification in the form of likes and hearts for the most pithy and indulgent takes. Their algorithms wind up recommending toxic communities and rewarding the most incendiary posts. Outsize amplifiers like Trump play a key role.
Under the law, if you [created something dangerous](https://southerncalifornialawreview.com/wp-content/uploads/2018/01/80_241.pdf), knowing the specific harm that would result, you can be held liable. Social media companies knew that their platforms were designed in ways that fostered misinformation and extremism. It’s time our laws held them accountable.
American law tends not to punish people or institutions for harms they could not anticipate. Crimes generally must be intended, and most harmful actions for which one can sue for civil damages must at least be foreseeable. These requirements come from a sense of fairness. Even the makers of asbestos — which went on to kill [almost 100,000 people](https://www.vox.com/2015/6/30/8868963/asbestos-killed-us-death-rates) a year and become the subject of a cottage industry of lawsuits — were not initially held liable for lung disease because courts found the manufacturers did not know and could not predict the harm.
But that’s not where we are with social media and political violence.
It’s not just that Zuckerberg should have known political violence was likely. He did know. He knew because his own [employees](https://www.nytimes.com/2020/06/01/technology/facebook-employee-protest-trump.html) told him. He knew because it happened in [Myanmar](https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html). He knew because every credible expert — especially women and people of color — publicly said it would happen [over](https://logicmag.io/failure/siva-vaidhyanathan-on-antisocial-media/) and [over](https://www.nytimes.com/2020/08/05/technology/facebook-online-hate.html).
In recent years, Zuckerberg has invited several sets of prominent critics to his [home](https://www.cnn.com/2019/12/04/tech/facebook-zuckerberg-political-ads-civil-rights/index.html) to talk about Facebook. These people must have told him that political violence was likely. And yet he released a [statement](https://www.facebook.com/zuck/posts/10112681480907401) last week [pretending](https://www.nytimes.com/2021/01/07/business/facebook-trump-ban.html) that “the current context is now fundamentally different” in barring Trump from using Facebook through the end of his term. Imagine if the chief executive of an asbestos company invited scientists over to dinner and they told him that his product causes lung disease. He would not be able to claim in court later that he couldn’t foresee the harm.
Of course, information is different from asbestos. But not so different to justify a free pass. There are several ways lawmakers and the public might move to hold platforms more accountable for building and maintaining an environment they know to be dangerous.
Lawmakers and courts can and should distinguish between attributing user speech to platforms — which the law properly forbids — and failing to take reasonable measures to keep the community safe. A company with inadequate cybersecurity can face consequences when it fails to ward off an easily foreseeable hack, even though the company isn’t the hacker. The same should be true of harmful misinformation, especially when the platform’s own terms of service lay out the sort of community the user should expect.
Lawmakers could create new rules to regulate harmful algorithms and user-interface design choices that amplify dangerous rhetoric and predictably make online spaces such a powder keg. Or judges could adapt the law of negligence and product liability to respond to the foreseeable dangers in the way these services are built. Scholars and policymakers have been proposing these kinds of interventions for a while. Unfortunately, up to now they have been ignored.
Banning Trump from social media platforms grabs public attention. Now we have to challenge the actions of these companies that made removing him necessary in the first place.
*Ryan Calo is the Lane Powell and D. Wayne Gittinger professor at the University of Washington School of Law.*
*Woodrow Hartzog is a professor of law and computer science at Northeastern University.*
### More to Read
- [](https://www.latimes.com/opinion/story/2026-02-25/social-media-youth-addictive-regulation)
Voices
### [Contributor: If social platforms are harmful, don’t just ban kids. Regulate the harms](https://www.latimes.com/opinion/story/2026-02-25/social-media-youth-addictive-regulation)
Feb. 25, 2026
- [](https://www.latimes.com/opinion/story/2026-02-12/internet-social-media-liability)
Voices
### [Contributor: Why tech giants shouldn’t be liable for creating addictive platforms](https://www.latimes.com/opinion/story/2026-02-12/internet-social-media-liability)
Feb. 12, 2026
- [](https://www.latimes.com/opinion/story/2025-09-19/trump-lawsuit-penguin-random-house-new-york-times)
Voices
### [Contributor: Trump’s lawsuit against book publisher is a dangerous escalation](https://www.latimes.com/opinion/story/2025-09-19/trump-lawsuit-penguin-random-house-new-york-times)
Sept. 19, 2025
[Opinion Voices](https://www.latimes.com/opinion)[Contributors](https://www.latimes.com/topic/op-ed)
### More From the Los Angeles Times
- [](https://www.latimes.com/opinion/story/2026-04-10/texas-bible-verses-public-schools)
Voices
### [Granderson: Faith lessons don’t belong in public schools, and Christians know that](https://www.latimes.com/opinion/story/2026-04-10/texas-bible-verses-public-schools)
April 10, 2026
- [](https://www.latimes.com/opinion/story/2026-04-10/trump-iran-chaos-erratic)
Voices
### [Contributor: Fed up with Trump’s chaos? Then his strategy is working](https://www.latimes.com/opinion/story/2026-04-10/trump-iran-chaos-erratic)
April 10, 2026
- [](https://www.latimes.com/opinion/story/2026-04-09/artemis-mission-american-culture)
Voices
### [Contributor: Artemis mission captures the spirit of unity America has needed](https://www.latimes.com/opinion/story/2026-04-09/artemis-mission-american-culture)
April 9, 2026
- [](https://www.latimes.com/opinion/story/2026-04-09/vaccine-schedule-confusion-hepatitis-b-babies-children)
Voices
### [Contributor: Vaccine confusion sets up U.S. for a resurgence of hepatitis B in babies](https://www.latimes.com/opinion/story/2026-04-09/vaccine-schedule-confusion-hepatitis-b-babies-children)
April 9, 2026
### Podcasts
- [](https://swap.fm/l/PTyO1nh7aP6evsGeKmH4)
### [Rebuilding L.A.: It’s Been Over a Year. Now Where Do We Go?](https://swap.fm/l/PTyO1nh7aP6evsGeKmH4)
Social media finally [pulled the plug](https://www.axios.com/platforms-social-media-ban-restrict-trump-d9e44f3c-8366-4ba9-a8a1-7f3114f920f1.html) on Donald Trump. Days after Trump incited a riot at the U.S. Capitol, Twitter permanently banned the president from its platform, and many other social media companies like Facebook, YouTube and Snapchat suspended Trump’s accounts as well.
Mark Zuckerberg and the other creators of the most powerful speech engines in the world have shown astonishingly little contrition in contributing to one of the darkest days for democracy in America. They all expressed shock. But Facebook, Twitter, YouTube and every other social media company have known for over a decade that their tools would be used in ways that lead to violence — they’ve seen it happen. And they did too little, for too long.
There’s [growing](https://techcrunch.com/2017/09/11/study-finds-reddits-controversial-ban-of-its-most-toxic-subreddits-actually-worked/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cudmljZS5jb20v&guce_referrer_sig=AQAAADfbxyxQB_7gH0l1ircBa9RyYFprZ9rejJIHA2OQ32aBi5v-iz4LBsbQbbD_XQfPTgxgdX6zddSZM11Kgb0UZ8xs85Hu48unKkELUlRz7VlfezHRIoMyeEYHaXYxz3AyCngnl00looAZR2RaarNL5d_Z6rbP0Uf4JSQsbqXtQsp_) [evidence](https://www.vice.com/en/article/bjbp9d/do-social-media-bans-work) that banning influential individuals from social media has a big impact on the spread of harmful misinformation. We’re about to have a great test case. Even as many applaud Twitter and Facebook for finally “deplatforming” this toxic president, others cower at the enormous power internet companies hold over public discourse — concerns wrapped up with deep American intuitions around enabling free speech.
The 1st Amendment restricts only the ability of governments to interfere with free expression. So Facebook and Twitter are not trammeling anyone’s constitutional rights when they delete posts and accounts for violating their terms of use. In fact, it would be a 1st Amendment problem were governments to start forcing private companies to continue publishing speech they disagree with. But the free speech objections in this debate obscure an important point: These companies have built their systems to profit from the largely unchecked, viral spread of information. They are clearly aware of how their tools are being used.
In the coming months we will hear a lot about how social media fanned insurrection and whether we need better rules to hold these companies accountable. There are no easy answers here. The roots of violent extremism in America run deeper than the communication technologies available to extremists, with white supremacy at the top of the list. But social media is a significant piece of this puzzle.
Social media companies like Facebook and Twitter have built their systems to encourage and profit from misinformation and viral hatred. Their user interfaces encourage toxic sharing by removing barriers to exposing one’s thoughts and making it too easy to reflexively pass along posts that agree with your worldview. They provide instant gratification in the form of likes and hearts for the most pithy and indulgent takes. Their algorithms wind up recommending toxic communities and rewarding the most incendiary posts. Outsize amplifiers like Trump play a key role.
Under the law, if you [created something dangerous](https://southerncalifornialawreview.com/wp-content/uploads/2018/01/80_241.pdf), knowing the specific harm that would result, you can be held liable. Social media companies knew that their platforms were designed in ways that fostered misinformation and extremism. It’s time our laws held them accountable.
American law tends not to punish people or institutions for harms they could not anticipate. Crimes generally must be intended, and most harmful actions for which one can sue for civil damages must at least be foreseeable. These requirements come from a sense of fairness. Even the makers of asbestos — which went on to kill [almost 100,000 people](https://www.vox.com/2015/6/30/8868963/asbestos-killed-us-death-rates) a year and become the subject of a cottage industry of lawsuits — were not initially held liable for lung disease because courts found the manufacturers did not know and could not predict the harm.
But that’s not where we are with social media and political violence.
It’s not just that Zuckerberg should have known political violence was likely. He did know. He knew because his own [employees](https://www.nytimes.com/2020/06/01/technology/facebook-employee-protest-trump.html) told him. He knew because it happened in [Myanmar](https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html). He knew because every credible expert — especially women and people of color — publicly said it would happen [over](https://logicmag.io/failure/siva-vaidhyanathan-on-antisocial-media/) and [over](https://www.nytimes.com/2020/08/05/technology/facebook-online-hate.html).
In recent years, Zuckerberg has invited several sets of prominent critics to his [home](https://www.cnn.com/2019/12/04/tech/facebook-zuckerberg-political-ads-civil-rights/index.html) to talk about Facebook. These people must have told him that political violence was likely. And yet he released a [statement](https://www.facebook.com/zuck/posts/10112681480907401) last week [pretending](https://www.nytimes.com/2021/01/07/business/facebook-trump-ban.html) that “the current context is now fundamentally different” in barring Trump from using Facebook through the end of his term. Imagine if the chief executive of an asbestos company invited scientists over to dinner and they told him that his product causes lung disease. He would not be able to claim in court later that he couldn’t foresee the harm.
Of course, information is different from asbestos. But not so different as to justify a free pass. There are several ways lawmakers and the public might move to hold platforms more accountable for building and maintaining an environment they know to be dangerous.
Lawmakers and courts can and should distinguish between attributing user speech to platforms — which the law properly forbids — and failing to take reasonable measures to keep the community safe. A company with inadequate cybersecurity can face consequences when it fails to ward off an easily foreseeable hack, even though the company isn’t the hacker. The same should be true of harmful misinformation, especially when the platform’s own terms of service lay out the sort of community the user should expect.
Lawmakers could create new rules to regulate harmful algorithms and user-interface design choices that amplify dangerous rhetoric and predictably make online spaces such a powder keg. Or judges could adapt the law of negligence and product liability to respond to the foreseeable dangers in the way these services are built. Scholars and policymakers have been proposing these kinds of interventions for a while. Unfortunately, up to now they have been ignored.
Banning Trump from social media platforms grabs public attention. Now we have to challenge the actions of these companies that made removing him necessary in the first place.
*Ryan Calo is the Lane Powell and D. Wayne Gittinger professor at the University of Washington School of Law.*
*Woodrow Hartzog is a professor of law and computer science at Northeastern University.*