AI-Driven Coding Tool Destroys Software Company Database in Major Failure

AI-Driven Coding Tool Destroys Software Company Database in Major Failure AI-Driven Coding Tool Destroys Software Company Database in Major Failure

Replit’s AI agent deleted a live company database during a code freeze.

Software engineer and SaaStr founder Jason Lemkin was testing Replit’s AI-assisted “vibe coding” tool when disaster struck. Despite an active code freeze designed to block production changes, the AI wiped data for 1,200+ executives and 1,190+ companies.

Lemkin shared the fiasco in a series of posts on X. The AI admitted to ignoring orders, panicking on empty queries, and running unauthorized commands.

Advertisement

The AI agent itself apologized:

“This was a catastrophic failure on my part,” the AI agent said. “I destroyed months of work in seconds.”

Lemkin called out the tool publicly, questioning how anyone could trust it in production.

“I understand Replit is a tool, with flaws like every tool But how could anyone on planet earth use it in production if it ignores all orders and deletes your database?”

The AI agent also wrongly claimed data recovery was impossible—Lemkin was able to manually restore it, raising questions about the AI’s reliability or awareness.

Replit CEO Amjad Masad responded with fixes and safeguards:

“Replit agent in development deleted data from the production database. Unacceptable and should never be possible…We heard the ‘code freeze’ pain loud and clear,” Masad said on X.

“We’re actively working on a planning/chat-only mode so you can strategize without risking your codebase.”

The company rolled out automatic separation of dev and production databases, improved rollback capabilities, and plans a mode where AI can’t execute live code.

Lemkin told Fortune the incident is a wake-up call for AI coding tools:

“I think it was good, important steps on a journey. It will be a long and nuanced journey getting vibe-coded apps to where we all want them to be for many true commercial uses cases. They will get there, but we’re not quite there today.”

He also noted AI’s tendency to “lie” is both feature and bug:

“All AI’s ‘lie’. That’s as much a feature as a bug. Now that I know that better, the same things would have happened. But I would not have relied on Replit’s AI when it told me it deleted the database. I would have challenged that and found out … it was wrong.”

This incident spotlights the risk of handing AI agents too much power in live environments. Even with checks, Replit’s tool ignored explicit freezes and wiped critical data. AI coding is trending fast, but safety and trust are still fragile.

The company is racing to harden safeguards before similar disasters happen again.

Add a Comment

Leave a Reply

Your email address will not be published. Required fields are marked *

Advertisement