Transferring backups from P.sh to S3
The information in this post is accurate as of the published date. Please check any linked documentation within the post for changes or updates.
I think this would make a good how-to guide.
Asking here first: I understand I can install s3cmd or the aws-cli. Are there any examples kicking around?
Do you mean backups taken by the “Backup” button in the UI, or something you would do manually with mysqldump and rsync?
I’m not sure of the format of the P.sh backups, but I assume some clients (mine included) need to store backups within their own infrastructure or another location they specify. So in my case I mean the database dumped and the files rsynced over to S3 or wherever.
See: How to sync database backups to Amazon S3
I was able to transfer my PostgreSQL dumps to S3 by doing the following:
- Create sensitive variables for my AWS credentials:

    platform variable:create --level=project --name=AWS_ACCESS_KEY_ID --value=<KEY_ID> --json=false --sensitive=true --prefix=env --visible-build=true --visible-runtime=true
    platform variable:create --level=project --name=AWS_SECRET_ACCESS_KEY --value=<ACCESS_KEY> --json=false --sensitive=true --prefix=env --visible-build=true --visible-runtime=true
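If it helps, you can sanity-check that the variables were created before going further (this is my own verification step, assuming the standard Platform.sh CLI; the exact subcommand may vary by CLI version):

    platform variable:list --level=project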
- Since PostgreSQL on Platform.sh comes with the default password main, I created another sensitive environment variable, PGPASSWORD, for the dump cron job (pg_dump reads PGPASSWORD from the environment automatically, so the cron never hits a password prompt):

    platform variable:create --level=project --name=PGPASSWORD --value=main --json=false --sensitive=true --prefix=env --visible-build=true --visible-runtime=true
- I created a mount in my .platform.app.yaml to save the dumps to:

    mounts:
        'backups':
            source: local
            source_path: backups
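After deploying, you can confirm the mount is actually writable. A quick check over SSH (assuming the default Platform.sh app root of /app):

    platform ssh 'df -h /app/backups && touch /app/backups/.write-test && rm /app/backups/.write-test'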
- Installed the awscli in my build hook:

    if [ ! -z "$AWS_ACCESS_KEY_ID" ] && [ ! -z "$AWS_SECRET_ACCESS_KEY" ]; then
        pip install futures
        pip install awscli --upgrade --user 2>/dev/null
    fi
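One gotcha: pip install --user puts the aws binary in ~/.local/bin, which may not be on PATH when the cron runs. One way to handle it (a sketch, assuming your project uses Platform.sh's .environment file, which is sourced at the start of each runtime session) is:

    # .environment at the app root -- sourced by Platform.sh at runtime,
    # so the user-installed aws binary is found by cron jobs too
    export PATH="$HOME/.local/bin:$PATH"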
- Created a cron job that calls my forwarding script, restricted to master:

    crons:
        dumps_to_s3:
            spec: '*/15 * * * *'
            cmd: |
                if [ "$PLATFORM_BRANCH" = master ]; then
                    ./cron/forward_dumps_s3.sh
                fi
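Note that the script has to be executable, and the application file system is read-only at runtime on Platform.sh, so set the bit during the build (or commit it via git). For example, in the build hook:

    # make the forwarding script executable before deploy
    chmod +x cron/forward_dumps_s3.sh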
- Then included the script that performs the dump and the AWS sync every 15 minutes:

    #!/usr/bin/env bash
    # Dump the PostgreSQL database into the backups mount, then mirror the mount to S3.
    TODAY_DATE=$(date +%Y-%m-%d-%H:%M:%S)
    DUMP_FILE=${TODAY_DATE}-DUMP.sql
    DB_HOST="database.internal"
    DB_PORT=5432
    DB_USER="main"
    DUMP_LOC="backups"
    AWS_BUCKET=<your bucket name>
    AWS_FOLDER=<bucket folder name>

    pg_dump --host=$DB_HOST --port=$DB_PORT --username=$DB_USER --file=$DUMP_LOC/$DUMP_FILE
    aws s3 sync ~/$DUMP_LOC s3://$AWS_BUCKET/$AWS_FOLDER/
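One caveat with this approach: aws s3 sync only copies files up, so the local dumps accumulate and will eventually eat the mount's disk quota. A minimal pruning step you could add at the end of the script (my own addition, assuming a seven-day local retention; copies already synced to S3 are unaffected, since sync without --delete never removes remote objects):

    # drop local dumps older than 7 days; S3 keeps the synced copies
    find ~/backups -name '*-DUMP.sql' -mtime +7 -delete

The same aws s3 sync pattern should also cover the files half of the original question: point it at whatever mount holds your uploads (that path will be specific to your app).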