We have a PostgreSQL database server that stores a bunch of product catalogs. It weighs in at around 2 TB.

The vast majority of the time, this data is just sitting there on SSD-provisioned storage, waiting to be accessed by a batch indexing job.

This costs something on the order of $0.20 per GB per month.

At 2 TB, that works out to roughly $400 per month, most of it for data that is almost never read.
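To put a number on it, here is a back-of-envelope sketch. The SSD rate is the figure above; the GCS rate is an assumed cold-storage-class rate for comparison, not a quote, so check current pricing:

```python
SIZE_GB = 2 * 1024   # ~2 TB
SSD_RATE = 0.20      # $/GB/month, provisioned SSD (figure from the post)
GCS_RATE = 0.004     # $/GB/month -- assumed cold-storage rate, verify against current pricing

ssd_monthly = SIZE_GB * SSD_RATE
gcs_monthly = SIZE_GB * GCS_RATE
print(f"SSD: ${ssd_monthly:,.2f}/mo  vs  GCS: ${gcs_monthly:,.2f}/mo")
```

Even with generous assumptions about the object-storage rate, the gap is around two orders of magnitude.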

So I wrote a script to dump all the catalogs to Google Cloud Storage, one by one.
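A minimal sketch of what such a script might look like, assuming one database per catalog and a hypothetical bucket name (`my-catalog-archive`). It streams each `pg_dump` straight into `gsutil cp -`, which reads the object's contents from stdin, so a multi-gigabyte dump never has to fit on local disk:

```python
import subprocess

BUCKET = "my-catalog-archive"  # hypothetical bucket name

def archive_commands(db: str, bucket: str) -> tuple[list[str], list[str]]:
    """Build the pg_dump and upload commands for one catalog database."""
    dump_cmd = ["pg_dump", "--format=custom", db]
    # "gsutil cp -" reads the object's contents from stdin.
    upload_cmd = ["gsutil", "cp", "-", f"gs://{bucket}/{db}.dump"]
    return dump_cmd, upload_cmd

def archive(db: str, bucket: str = BUCKET) -> None:
    dump_cmd, upload_cmd = archive_commands(db, bucket)
    # Pipe pg_dump's stdout into gsutil so the dump is never
    # written to the instance's local disk.
    dump = subprocess.Popen(dump_cmd, stdout=subprocess.PIPE)
    upload = subprocess.run(upload_cmd, stdin=dump.stdout)
    dump.stdout.close()
    if dump.wait() != 0 or upload.returncode != 0:
        raise RuntimeError(f"archiving {db} failed")

# Usage (requires pg_dump and gsutil on PATH, and PG* env vars
# pointing at the source server):
#   for db in ["catalog_a", "catalog_b"]:
#       archive(db)
```

The `--format=custom` flag matters here: it produces a compressed archive that `pg_restore` can later restore selectively, rather than a plain SQL text dump.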

When we need one of the catalogs, we spin up a fresh PostgreSQL instance, restore just that database for the duration of the job, and then delete the instance.
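The restore side can be sketched the same way, assuming the temporary instance is reachable via the usual `PG*` environment variables. `pg_restore --create` issues the `CREATE DATABASE` itself, so we connect to the maintenance database `postgres` and stream the custom-format dump from `gsutil cat` (custom-format archives restore fine from stdin as long as you don't ask for parallel jobs):

```python
import subprocess

def restore_commands(db: str, bucket: str) -> tuple[list[str], list[str]]:
    """Build the download and pg_restore commands for one archived catalog."""
    download_cmd = ["gsutil", "cat", f"gs://{bucket}/{db}.dump"]
    # --create re-creates the target database, so we connect to the
    # maintenance database "postgres" rather than the target.
    restore_cmd = ["pg_restore", "--create", "--dbname=postgres"]
    return download_cmd, restore_cmd

def restore(db: str, bucket: str = "my-catalog-archive") -> None:
    # "my-catalog-archive" is the hypothetical bucket from the dump sketch.
    download_cmd, restore_cmd = restore_commands(db, bucket)
    download = subprocess.Popen(download_cmd, stdout=subprocess.PIPE)
    result = subprocess.run(restore_cmd, stdin=download.stdout)
    download.stdout.close()
    if download.wait() != 0 or result.returncode != 0:
        raise RuntimeError(f"restoring {db} failed")

# Usage, once the temporary PostgreSQL instance is up:
#   restore("catalog_a")
#   ... run the indexing job, then delete the instance ...
```

Streaming the download into the restore mirrors the dump path: the archive never needs to land on the temporary instance's disk before being loaded.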