Postgres Testing Practices
Discussions center on best practices for testing applications against real PostgreSQL databases rather than mocks or alternatives like SQLite, focusing on speed optimizations like Docker, testcontainers, and template databases while maintaining production parity.
Sample Comments
Depends on how fast you need them to be. I've had success using Docker Postgres, set up the same as production, where each test either creates a connection to its own test-specific database, or is written to run inside a transaction that cleans up after itself on a shared database. In my experience, slower tests that use a real DB are almost always a net positive for dev time.
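The shared-database variant described above can be sketched as a small context manager. This is a minimal sketch, assuming a psycopg-style connection object with an `execute` method; `rollback_after` is an illustrative name, not a library API:

```python
from contextlib import contextmanager

@contextmanager
def rollback_after(conn):
    """Run a test inside a transaction that is always rolled back,
    so every test sees the shared database in its original state."""
    conn.execute("BEGIN")
    try:
        yield conn
    finally:
        # Runs whether the test passed or raised, so no cleanup
        # code is needed in the tests themselves.
        conn.execute("ROLLBACK")
```

A test fixture would open one connection per test, wrap the test body in `rollback_after(conn)`, and let the rollback discard any rows the test inserted. The main caveat is that the code under test must not issue its own COMMIT.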
Well, you're doing it wrong. It's easy and fast to run tests against real postgres (takes a fraction of a second to spin up the DB for your tests if you do it right).
I hate this practice that some companies have of testing their SQL database usage... by using another database. For Postgres, if you are using the JVM, you can use the amazing Testcontainers library to test reasonably fast, with a Postgres database that you can set up to be the same version as you use in production. Any database vendor should either support having their database used quickly in tests like these, or at least release a simplified in-memory version of their database so people can…
Use the same database for test and production. Nowadays, it is really easy to install PostgreSQL or MySQL locally or in your test environment. SQLite is great, but not for replacing PostgreSQL or MySQL during tests.
Hi Weston, congratulations on launching. You've done a great job of explaining what the project is and how it works. I'm not in the ecosystem you've built this for so I can't comment too much on the project itself, but nice work communicating it. EDIT: after reading through the linked docs a bit more, I have to say that they are quite confusing. I'm not sure what the difference is between Ratifier, Kulvert, and SQL Simulator. I'd recommend finding a potential user…
Honestly, just use a real db in testing. Purity is all well and good, but not at the expense of more bugs caught.
OK, that goal makes sense, thanks for explaining. For what it's worth I'm pretty sure you can do this with Postgres, tmpfs, and template databases — see my project, pgtestdb [0]. I haven't done a formal perf measurement, but each test would get a fresh and migrated database in about 20ms on my local machine. The setup described runs Postgres in a container for convenience, but you could probably also just run a Postgres binary and store the data on a memory-backed filesystem. [0]
I would just create a per-test-file database (and delete it the next time the test runs). The overhead is very small (compared to booting Postgres) and it works exactly like production because it is exactly production. In general, I am not comfortable using an "almost equivalent" database for tests. Every database engine has its quirks, and if you don't know what they are, your program that passes its tests only "almost" works in production, which is annoying.
It isn't clear if you're already doing this, but Postgres has a 'create database from template' feature. You can initialize your template database with migrations once for a whole test suite, then clone a new database at the start of each test. It's quite fast. I run almost all of my tests this way. I agree, forcing your app into the lowest common denominator of portable SQL is crippling. JSONB columns in particular are extremely useful in Postgres.
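The template-clone pattern this comment describes can be sketched with two small helpers. The connection object and names here are illustrative; in real code `admin_conn` would be an autocommit psycopg connection, since `CREATE DATABASE` cannot run inside a transaction block:

```python
def clone_template(admin_conn, template, new_db):
    """Create a fresh database from a pre-migrated template.

    admin_conn: any connection-like object with an execute() method,
    operating in autocommit mode.
    """
    # Identifiers are double-quoted; names come from the test
    # harness, not from user input.
    admin_conn.execute(f'CREATE DATABASE "{new_db}" TEMPLATE "{template}"')

def drop_db(admin_conn, name):
    """Remove a per-test database once the test is done."""
    admin_conn.execute(f'DROP DATABASE IF EXISTS "{name}"')
```

A test harness would run migrations once against the template database, then call `clone_template` at the start of each test and `drop_db` in teardown. Cloning copies files rather than replaying migrations, which is why it is so much faster.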
PostgreSQL will be close enough to an in-memory database if you turn off synchronous_commit. For most test suites I would suspect much more time will be spent planning the queries (which can be improved with prepared statements) than on disk IO, if you just reduce the durability requirements of your test database.
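For reference, the relaxed-durability setup this comment suggests usually amounts to a few postgresql.conf overrides. These are real Postgres settings, but the values shown are only appropriate for a throwaway test cluster whose data is disposable:

```
# postgresql.conf overrides for a disposable test cluster ONLY
synchronous_commit = off   # commit returns without waiting for WAL flush
fsync = off                # never force writes to disk; unsafe for real data
full_page_writes = off     # safe here only because the cluster is throwaway
```

With these in place, a crash can corrupt the cluster, which is fine when every test run starts from a fresh template or container anyway.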