EC2 backup strategy on S3 – Ubuntu

6 August 2011 by Jinesh Parekh

It is recommended to back up only the database dump and the application files to S3. You do not want to back up the entire server image, as that is expensive in storage space and therefore in price. Below are step-by-step instructions on how you can do that:

Setup s3sync

  1. Login to your EC2 instance
  2. cd ~/
  3. wget http://s3.amazonaws.com/ServEdge_pub/s3sync/s3sync.tar.gz
  4. tar xzvf s3sync.tar.gz; cd s3sync
  5. mkdir certs; cd certs
  6. wget http://mirbsd.mirsolutions.de/cvs.cgi/~checkout~/src/etc/ssl.certs.shar
  7. sh ssl.certs.shar
  8. cd ..
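
s3sync and its helper s3cmd.rb are Ruby scripts, so Ruby must be installed before they will run. A quick check on Ubuntu (the package name here is an assumption; any Ruby install available on your instance will do):

    # print the Ruby version, or install it if missing
    ruby -v || sudo apt-get install -y ruby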

Edit s3config.rb

  1. vi s3config.rb
    Replace the confpath line with the one below; the change is adding "./" at the front so s3sync also looks for its config file in its own directory (a quick way to verify the edit follows this list):
    confpath = ["./", "#{ENV['S3CONF']}", "#{ENV['HOME']}/.s3conf", "/etc/s3conf"]
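
To confirm the change without reopening the editor, something like this works:

    # the confpath array should now start with "./"
    grep -n confpath s3config.rb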

Locate your s3 credentials

  1. Login to http://aws.amazon.com/
  2. Click on Account
  3. Click on Security Credentials
  4. Locate the Access Credentials section and click on the Access Keys tab
  5. Your Access Key ID is listed there, along with a Show button to reveal the Secret Access Key

Configure s3sync with s3 credentials

  1. cp s3config.yml.example s3config.yml
  2. vi s3config.yml and set the values below with your own keys (a quick credentials check follows this list)
    aws_access_key_id: 11111111111111111111111
    aws_secret_access_key: 222222222222222222222
    ssl_cert_dir: /home/user/s3sync/certs
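
The tarball also ships with s3cmd.rb, which reads the same configuration, so listing your buckets is an easy way to confirm the keys work (listbuckets is the command name in the s3sync distribution; adjust if your version differs):

    cd ~/s3sync
    # prints the buckets in your account if the credentials and certs are set up correctly
    ruby s3cmd.rb listbuckets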

Create a bucket in your s3 account

  1. Log into https://console.aws.amazon.com/s3/
  2. Create a bucket; the backup script below uses redmine_archives as the bucket name (or create it from the command line as shown after this list)
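
The bucket can also be created from the instance itself with the bundled s3cmd.rb (createbucket is the command name in the s3sync distribution; adjust if your version differs):

    cd ~/s3sync
    # create the bucket the backup script below uploads into
    ruby s3cmd.rb createbucket redmine_archives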

Create your shell script in your ec2 instance

The sample below backs up a Rails application called Redmine. It backs up the database and the uploaded attachments, which live in the <app_root>/files directory (here /usr/local/apps/redmine/files).

  1. cd ~/s3sync
  2. mkdir s3backup
  3. vi s3backup.sh and use the script below, adjusting the paths, database credentials and bucket name for your setup
    #!/bin/bash
    cd ~/
    BUCKET=redmine_archives

    DBNAME=redmine
    DBPWD=admin
    DBUSER=root
    NOW=$(date +_%b_%d_%y)
    echo 'compressing /usr/local/apps/redmine/files'
    tar czf files$NOW.tar.gz /usr/local/apps/redmine/files
    mv files$NOW.tar.gz s3sync/s3backup
    cd s3sync/s3backup
    echo "creating database dump for the $DBNAME database"
    touch $DBNAME.backup$NOW.sql.gz
    mysqldump -u $DBUSER -p$DBPWD $DBNAME | gzip -9 > $DBNAME.backup$NOW.sql.gz
    echo 'creating a compressed file for the application files and the database dump'
    tar czf server_backup$NOW.tar.gz $DBNAME.backup$NOW.sql.gz files$NOW.tar.gz
    rm -f $DBNAME.backup$NOW.sql.gz files$NOW.tar.gz
    cd ..
    echo 'uploading to s3'
    ruby s3sync.rb -r --ssl s3backup/ $BUCKET:
    echo 'cleaning up files created by this operation'
    cd ~/s3sync/s3backup
    rm -f ~/s3sync/s3backup/*
  4. sudo chmod +x s3backup.sh

NOTE: the upload command is ruby s3sync.rb -r --ssl s3backup/ $BUCKET: with a double dash before ssl. Copying the script straight from the browser can turn the double dash into a single dash, so retype it if the upload fails.

See it in action

  1. cd ~/s3sync
  2. ./s3backup.sh
  3. Log into your S3 console and confirm the tar file was created (or list the bucket from the command line as shown below).
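
If you would rather not open the web console, the bundled s3cmd.rb can list the bucket instead (list is the command name in the s3sync distribution; adjust if your version differs):

    cd ~/s3sync
    # show the archives uploaded so far
    ruby s3cmd.rb list redmine_archives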

Automate

The crontab entry below runs the backup once a day at midnight; a variant that logs its output follows the list.

  1. crontab -e
  2. 0 0 * * * /root/s3sync/s3backup.sh
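
It also helps to capture the script's output so a failed upload is easy to spot; the log path below is only an example:

    # same schedule, but append stdout and stderr to a log file
    0 0 * * * /root/s3sync/s3backup.sh >> /var/log/s3backup.log 2>&1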
