Production Database Deletions
This cluster focuses on anecdotes and discussions about engineers accidentally deleting, truncating, or corrupting production databases through mistakes like working in the wrong environment, missing backups, or direct prod access. Comments reference real incidents such as GitLab's 2017 database loss and emphasize prevention strategies.
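Most of the prevention strategies raised in these comments boil down to making destructive statements against production hard to run by accident. A minimal sketch of one such guard, assuming a host-naming convention and a confirmation flow that are illustrative, not any particular team's tooling:

    import re
    import sys

    # Illustrative assumptions: prod hosts contain "prod" or "live", and
    # destructive statements start with DROP/TRUNCATE/DELETE.
    PROD_HOST_PATTERN = re.compile(r"(prod|live)", re.IGNORECASE)
    DESTRUCTIVE_SQL = re.compile(r"^\s*(DROP|TRUNCATE|DELETE)\b", re.IGNORECASE)

    def guard(host: str, sql: str) -> None:
        """Abort unless a destructive statement against prod is explicitly confirmed."""
        if PROD_HOST_PATTERN.search(host) and DESTRUCTIVE_SQL.match(sql):
            answer = input(f"Destructive statement against {host!r}. Type the host name to proceed: ")
            if answer != host:
                sys.exit("Aborted: confirmation did not match host name.")

    if __name__ == "__main__":
        guard("db-prod-01.internal", "DROP TABLE users;")

Requiring the operator to retype the host name, rather than just hitting "y", is the detail that catches wrong-terminal-window mistakes like the ones described below.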
Activity Over Time: [chart]
Top Contributors: [list]
Keywords: [list]
Sample Comments
Oversights happen: https://thenewstack.io/junior-dev-deleted-production-databas...
This is how I felt when a GitLab employee deleted the production database by running the command in the wrong terminal window.
Dude, fix your DBMS implementation before you start losing people's data. Or switch to something vetted like SQLite.
Dropping a production DB would generally be an accident - this seems more like poor judgment.
The monumental fuck-up was cancelling the MySQL backups and having all engineers work directly with the production database; what you did was INEVITABLE.
Like when they dropped their database and took a few days to recover?
This reminds me of the story about the intern who wiped the DB with a single command, except this is worse.
Aaaand you killed their database.
Early in my career (software engineer) I was debugging an issue in our application that was causing some funky data to be written to the DB (it turned out to be double encoding of UTF-8 strings). While validating the issue locally, I decided the best course of action was to drop the table and recreate it. I did just that, except I still saw the issue. Well, it turns out I was connected not to my local DB, but to the production DB. My manager was very understanding, and walked me through...
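The "double encoding of UTF-8 strings" bug in that story is a classic: UTF-8 bytes get misread under a single-byte encoding and re-encoded. A minimal reproduction, with a made-up sample string:

    # UTF-8 bytes misread as Latin-1, then re-encoded: classic mojibake.
    original = "café"
    utf8_bytes = original.encode("utf-8")      # b'caf\xc3\xa9'
    misread = utf8_bytes.decode("latin-1")     # 'cafÃ©' -- wrong decode step
    double_encoded = misread.encode("utf-8")   # b'caf\xc3\x83\xc2\xa9'
    print(double_encoded.decode("utf-8"))      # prints 'cafÃ©', the "funky data"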
My bet is data loss. Some engineer accidentally deleted a MySQL table, and they found out their backups weren't working.
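The failure mode in that last comment, backups that silently stopped working, is usually caught only by actually restoring them. A rough sketch of a periodic restore check, assuming a nightly mysqldump and a scratch database; the paths, database name, and sanity-check table are all illustrative:

    import gzip
    import subprocess

    # A backup you have never restored is not a backup: load the nightly
    # dump into a scratch database and fail loudly if it looks empty.
    DUMP_PATH = "/backups/nightly.sql.gz"   # assumed dump location
    SCRATCH_DB = "restore_check"            # assumed pre-created scratch DB

    with gzip.open(DUMP_PATH, "rb") as dump:
        subprocess.run(["mysql", SCRATCH_DB], stdin=dump, check=True)

    result = subprocess.run(
        ["mysql", "-N", "-e", "SELECT COUNT(*) FROM users", SCRATCH_DB],
        capture_output=True, text=True, check=True,
    )
    assert int(result.stdout.strip()) > 0, "Restore produced an empty users table"

Run on a schedule, a check like this turns "our backups are not working" from a post-incident discovery into a routine alert.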