Pantheon Community

Tell me about your off-site backups

Pantheon docs note “Backups created on Pantheon are stored offsite on Google Cloud Storage instances, however a full-fledged backup solution is strongly recommended for retention.”

Curious what others are doing for their off-site backup retention and if it’s automated/integrated (whether via Bash script/cron or something else).


I have a project in Google Cloud with a VM (configured with Terminus to access my Pantheon sites) and a storage bucket. A cron job pulls down Pantheon's production database backup every day at around 5am and puts it in a folder on the VM. Then I run a find command to delete any backups older than 10 days, and finally gsutil rsync to sync the folder with the storage bucket. The advantage of this being in Google is that it's the same data center as Pantheon, so (I assume) you get a break on transfer costs/speed.

Here’s my cron:

/home/jason/vendor/bin/terminus backup:get &lt;site&gt;.&lt;env&gt; --element=db --to=/mnt/pdisk/backups/drupal/$(/bin/date +\%Y-\%m-\%d).sql.gz
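In crontab form, scheduled for 5am daily, the entry looks roughly like this (SITE.ENV is a placeholder for the actual site and environment, and the paths are from my setup):

```shell
# m h dom mon dow  command — run every day at 05:00
0 5 * * * /home/jason/vendor/bin/terminus backup:get SITE.ENV --element=db --to=/mnt/pdisk/backups/drupal/$(/bin/date +\%Y-\%m-\%d).sql.gz
```

(The `%` signs have to be escaped with backslashes inside crontab, since cron treats an unescaped `%` as a newline.)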

Then here’s my find command:

find /mnt/pdisk/backups/drupal -type f -ctime +10 -delete
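If you want to sanity-check the retention rule before pointing it at real backups, here’s a quick throwaway demo with fake files in a temp directory (it uses `-mtime`, modification time, which is easier to backdate for testing than `-ctime`):

```shell
# Throwaway demo of the 10-day retention rule — nothing here touches real backups.
demo_dir=$(mktemp -d)
# Backdate one file's modification time by 15 days; leave the other fresh.
touch -d "15 days ago" "$demo_dir/old-backup.sql.gz"
touch "$demo_dir/fresh-backup.sql.gz"
# Same pruning logic as above, keyed on modification time.
find "$demo_dir" -type f -mtime +10 -delete
ls "$demo_dir"   # only fresh-backup.sql.gz is left
```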

Then here’s my rsync command:

gsutil -m rsync -r /mnt/pdisk/backups/drupal/ gs://[BUCKETNAME]/
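Putting the three steps together, the whole nightly job can be sketched as one function that cron calls with a single entry — this is a sketch, not my exact script, and the site name, folder, and bucket are placeholders. One thing to watch: without `gsutil rsync -d`, files pruned locally are never removed from the bucket, so the bucket retains every backup indefinitely.

```shell
# Sketch of the whole nightly job as a single function.
# The site.env, backup dir, and bucket name passed in are placeholders.
nightly_backup() {
  local site_env="$1" backup_dir="$2" bucket="$3"
  # 1. Pull down the latest production database backup.
  terminus backup:get "$site_env" --element=db \
    --to="$backup_dir/$(date +%Y-%m-%d).sql.gz"
  # 2. Prune local copies older than 10 days (by modification time).
  find "$backup_dir" -type f -mtime +10 -delete
  # 3. Mirror the folder to the storage bucket.
  #    (add -d if you want bucket copies pruned to match the local folder)
  gsutil -m rsync -r "$backup_dir/" "$bucket/"
}

# Example invocation (placeholder names):
#   nightly_backup mysite.live /mnt/pdisk/backups/drupal gs://my-backup-bucket
```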

Glad you asked because I see I need to add a few more sites to our setup! 🙂


Thanks Jason. I appreciate the recipe!

So it seems that programmatically pulling backups off-site always requires a machine (VM or laptop) running Terminus, either to run the commands that create and fetch the backup file, or just to get the URL of the latest backup.

I’m curious if others have recipes that don’t involve maintaining a VM/machine and a Terminus installation.

You might be able to set something up in CircleCI with a scheduled workflow to do this?
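Untested sketch, but the scheduled-workflow piece in `.circleci/config.yml` would look something like this — the image, SITE.ENV, and branch name are placeholders/guesses, and `TERMINUS_TOKEN` would be set as a project environment variable:

```yaml
version: 2.1
jobs:
  backup:
    docker:
      - image: cimg/php:8.2   # any image with PHP/Composer should do
    steps:
      # Assumes Composer's global bin dir ends up on PATH in this image.
      - run: composer global require pantheon-systems/terminus
      - run: terminus auth:login --machine-token="$TERMINUS_TOKEN"
      - run: terminus backup:create SITE.ENV --element=db
workflows:
  nightly:
    triggers:
      - schedule:
          cron: "0 5 * * *"   # daily at 05:00 UTC
          filters:
            branches:
              only: main
    jobs:
      - backup
```

Every run spins up a throwaway container, so there’s no VM or Terminus version to maintain — you’d still need the fetch/upload steps from Jason’s recipe to actually move the file off Pantheon.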