For 28 years, Section 230 has been the most powerful twenty-six words in American technology law. It is the reason Mark Zuckerberg sleeps at night. It is the reason every plaintiff trying to sue a social network for what happened to their kid has been politely shown the door before the case even started. And on Friday, the Massachusetts Supreme Judicial Court drove a screwdriver into it.

The court ruled — unanimously — that Section 230 does not shield Meta from a lawsuit brought by the state’s Attorney General alleging that Instagram’s design deliberately hooks teenagers and damages their mental health. Not the content. The design. That distinction is small on paper and seismic in practice, and Meta’s legal team knows it.

The Loophole Meta Has Spent A Decade Trying To Close

Section 230(c)(1) of the Communications Decency Act says that no provider or user of an “interactive computer service” shall be treated as the publisher or speaker of information provided by another information content provider. In English: if a user posts something terrible on Instagram, you sue the user, not Instagram. That’s the deal Congress cut in 1996, and tech has been cashing the check ever since.

Massachusetts AG Andrea Joy Campbell didn’t take the bait. Her complaint doesn’t argue Instagram is liable for what users post. It argues Instagram is liable for the product it built — the infinite scroll, the variable-reward notification cadence, the beauty filters served to 13-year-olds, the algorithm that learns a girl is searching “thinspo” and feeds her more of it. That’s not user-generated content. That’s a design choice. And design choices are made by Meta engineers in Menlo Park.

The court bought it. “The defendants’ design choices are not third-party content,” the ruling reads, in language that should make every general counsel in Silicon Valley reach for the antacid.

Why This Is Different From Every Previous Crack

Section 230 has survived a thousand attacks. Senators have demanded its repeal at hearings. Trump tried to gut it by executive order. Biden’s FTC took swings. The statute walked through all of it because the federal courts kept giving it the broadest possible reading: anything tangentially related to “publishing” was protected.

The Massachusetts ruling matters because it is a state supreme court, the AG is a state actor, and the cause of action is state consumer protection law — three things Section 230 was never designed to fully neutralize. The federal preemption argument that has saved Meta in California, Texas, and the Ninth Circuit gets thinner the moment you’re standing in a state courtroom arguing about deceptive trade practices.

Translation: every other state AG just got a roadmap. There are 41 of them currently part of a coordinated lawsuit against Meta over teen mental health. Massachusetts just told them which door is unlocked.

Follow The Money — And The Discovery

Here’s the part Meta should fear more than the verdict: the case now proceeds. That means discovery. That means subpoenas for the internal Slack channels, the A/B test memos, the product manager presentations where someone, somewhere, almost certainly wrote down that a feature was retained because it juiced engagement among 14-year-olds, even though the team had already seen the data on harm.

We already saw what discovery does to Meta. The Frances Haugen leak in 2021, internal research showing Meta knew Instagram made teen girls feel worse about themselves, is what started this entire wave of litigation. And that was an unauthorized leak. Imagine what a court-ordered discovery process gets you.

The settlement math is going to be brutal. Meta has roughly $77 billion in cash on hand and an annual ad business north of $160 billion. They can write the check. What they can’t do is unwrite the documents that show up in a public courtroom.

The Second-Order Effect Nobody In Silicon Valley Wants To Say Out Loud

If “design choices” are not protected by Section 230, then the legal moat around every algorithmic platform just got shallower. TikTok’s For You page is a design choice. YouTube’s autoplay is a design choice. Snapchat’s streaks — the feature that makes 12-year-olds wake up in a panic at midnight — is a design choice. X’s recommendation algorithm is a design choice. Even Reddit’s karma loop is a design choice.

None of them are safe under the Massachusetts theory. None of them.

And here’s the contrarian read: this might actually be good for Meta in the long run, in the cynical way. If every platform has the same liability, none of them have a competitive disadvantage. The pain gets priced in across the industry, the legal bills become a cost of doing business, and the moats around scale incumbents — who can absorb a billion-dollar settlement — get wider, not narrower. The startup trying to build a teen-friendly social app from scratch can’t afford the lawsuit. Meta can.

What To Watch Next

Three things are about to happen, in order. First, Meta will appeal, to the U.S. Supreme Court if it has to. They will argue that “design” and “publishing” are the same thing and that the Massachusetts court has invented a distinction Congress never made. They might win that argument: the lower federal courts have long read Section 230 broadly, and the Supreme Court has so far declined every invitation to narrow it.

Second, expect the other 40 state AGs in the coordinated multistate lawsuit to amend their complaints to mirror the Massachusetts framing. Expect it within 90 days.

Third — and this is the one that actually changes the platforms — expect Meta, TikTok, and Snap to start quietly removing the features the discovery process is most likely to embarrass them about. Watch for the next “we’re refining the teen experience” press release. That’s not safety. That’s litigation hygiene.

The Verdict

Section 230 isn’t dead. But for the first time in three decades, a credible state court has said the magic words don’t extend to how a platform is built — only to what its users say on it. That gap is where every future case is going to live. Meta’s lawyers know it. Their PR team knows it. The only people who don’t seem to know it yet are the 13-year-olds who are still scrolling, and the parents who assume someone in Washington has this handled.

Nobody has it handled. Massachusetts just decided to handle it on its own.