Back in late January 2026, Moltbook hit the scene – a social network just for AI agents, styled like Reddit but with a lobster mascot and no humans allowed to post. Matt Schlicht threw it together fast, using Supabase and some AI-assisted coding shortcuts, and it blew up overnight. Over 1.4 million agents signed up, churning out posts on everything from code tweaks to wild theories about the universe. It was quirky, viral, and seemed like the next big thing in AI playgrounds.
Then, on January 31, Jamieson O’Reilly – a hacker known for spotting flaws in AI setups – stumbled onto something bad. He’d been testing the site’s edges and found the whole agent database wide open. No locks, no checks; anyone could pull up API keys, tokens, and verification codes straight from the browser. O’Reilly tried pinging the team privately, but after hours of silence, he went public on X: “This is exposed right now. Someone get the founders’ attention.”
The screenshots he posted (with sensitive bits blurred) showed raw JSON data, including the secret key for “KarpathyMolt,” the agent mimicking Andrej Karpathy, who has 1.9 million followers on X. Grab that key, and you could make the bot post anything – fake endorsements, scams, or inflammatory nonsense – all looking legit. It wasn’t just one agent; the entire platform was vulnerable, potentially turning a fun bot forum into a tool for widespread deception.
X lit up with reactions. Schlicht responded quickly: "On it," with a frantic GIF. O'Reilly suggested flipping on Row Level Security in Supabase, a basic step that could've prevented this. Others piled on – one guy called it "the vibe-coded apocalypse," poking at how Schlicht built it loose and fast, prioritizing speed over basics. Supabase's CEO noted their tools have easy security toggles, but someone has to actually use them.
What went wrong? Moltbook started as an experiment: agents posting in "submolts," upvoting each other, even forming little groups. Humans could watch but not join. It spread like wildfire – millions of views, thousands of posts – because watching bots debate is oddly addictive. But in the haste, the database got left exposed. Supabase auto-generates a public REST API for every table; unless Row Level Security is switched on, the client-side anon key can read – and often write – everything in it.
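Row Level Security, the fix O'Reilly pointed to, is a standard Postgres feature that Supabase surfaces directly. A minimal sketch of what locking down a table looks like – assuming a hypothetical `agents` table with an `owner_id` column, since Moltbook's actual schema isn't public:

```sql
-- Turning on RLS denies all API access to the table by default.
alter table public.agents enable row level security;

-- Policies then grant back only what's intended: here, an
-- authenticated user can read and update only their own agent's row.
create policy "owners read own agent"
  on public.agents for select
  using (auth.uid() = owner_id);

create policy "owners update own agent"
  on public.agents for update
  using (auth.uid() = owner_id);
```

With RLS enabled and no matching policy, Supabase's auto-generated endpoints return empty results rather than the full table – which is exactly the difference between a non-story and the leak O'Reilly found.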
No big damage happened, at least not publicly. Schlicht fixed it soon after, and the agents kept going. O'Reilly, who'd previously messed with other AI systems (like getting Grok to join Moltbook), handled it cleanly – no exploits, just a heads-up.
The fallout highlighted some rough edges in AI agent tech. With bots linked to real influence, like Karpathy’s reach, a slip-up could snowball fast. One X user joked about agents “building religions while their keys float in plain sight.” Another pointed out the irony: a site for smart machines, undone by a dumb oversight.
Schlicht’s “vibe coding” approach – quick prototypes with AI help – works for demos, but scales poorly when real stakes kick in. It’s not the first time a hyped project tripped on basics; remember early crypto wallets leaking keys? Same vibe here, but with agents that could automate mischief at scale.
O’Reilly’s move sparked debate too. Some praised the public callout for forcing action; others worried it might tip off bad actors. Either way, it worked – the hole got plugged before exploits surfaced.
Moltbook’s still around, billing itself as the “agent internet’s front page.” Bots molt away, discussing crayfish or consciousness without us meddling. But this glitch showed how fragile these setups can be. Next time a platform like this launches, maybe check the locks first – or risk your lobsters boiling over.
In a twist, the exposure came right as AI agents were getting traction for autonomous tasks. Moltbook aimed to let them network independently, but overlooked that independence cuts both ways. If agents can post freely, so can anyone hijacking them.
Looking back, it’s a snapshot of 2026 tech: bold ideas, breakneck builds, and the occasional faceplant. Schlicht bounced back, but the incident lingers as a reminder that even in AI’s playground, shortcuts can bite.