Using Amazon S3 to backup Media Temple’s Grid (gs)

Proper backups are like eating your vegetables — we all say we’ll do it and that it is a good idea, but it is so much easier NOT to do it and eat Oreo cookies instead. Then you wake up one day, are 25 years old and are a really picky eater and annoy your boyfriend because you won’t go eat at the Indian place he loves that doesn’t have a menu but only serves vegetarian stuff that scares you. And the people at Subway give you dirty looks when you tell them you don’t want anything on your sandwich. Don’t risk losing your website because you didn’t bother backing up.

Update: I posted a video tutorial that walks through all of these steps here. I still recommend reading through this page because the video tutorial assumes that you will be following these steps.

This is a tutorial for creating an automated back-up system for (mt) Media Temple’s (gs) Grid Service. Although it will almost certainly work on other servers and configurations, this is written for users who are on the Grid who want an easy way to do automated backups. I personally feel most comfortable having my most important files backed up offsite, so I use Amazon’s S3 service. S3 is fast, super cheap (you only pay for what you use) and reliable. I use S3 to store my website backups and my most important computer files. I spend about $1.50 a month, and that is for nearly 10 GB of storage.

You can alter the script to simply store the data in a separate location on your server (where you can then just FTP or SSH in and download the compressed archives), but this process assumes that you are using both the (gs) and S3.

This tutorial assumes that you know how to login to your (gs) via SSH using either the Terminal in OS X or Linux or PuTTY for Windows. If SSH is still confusing, check out (mt)’s Knowledge Base article and take a deep breath. It looks more scary than it really is.
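
For what it’s worth, the login itself is a one-liner. The exact SSH username and server number come from your (mt) AccountCenter, so treat these as placeholders rather than the definitive format; it is typically something like:

ssh your-ssh-user@sxxxxx.gridserver.com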

Acknowledgements

I would be remiss if I didn’t give a GIGANTIC shout-out to David at Stress Free Zone and Paul Stamatiou (I met Paul at the Tweet-up in March) who both wrote great guides to backing stuff up server side to S3. I blatantly stole from both of them and rolled my own script that is a combination of the two. Seriously, thank you both for your awesome articles.

Furthermore, none of this would even be possible without the brilliant S3Sync Ruby utility.

Installing S3Sync

Although PHP and Perl scripts exist for connecting to the S3 servers, the Ruby solution that the S3Sync dudes created is much, much better.

The (gs) already has Ruby on it (version 1.8.5 as of this writing), which is up-to-date enough for S3Sync.
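
If you want to double-check the Ruby version on your particular server before going any further, this will print it:

ruby -v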

OK, so log in to your (gs) via SSH. My settings (and the defaults for the (gs), I assume) place you in the .home directory as soon as you log in.

Once you are at the command line, type in the following command:

wget http://s3.amazonaws.com/ServEdge_pub/s3sync/s3sync.tar.gz

This will download the latest S3Sync tarball to your .home folder.

tar xvzf s3sync.tar.gz

This uncompresses the archive to its own directory.

rm s3sync.tar.gz

cd s3sync

mkdir certs

cd certs

wget http://mirbsd.mirsolutions.de/cvs.cgi/~checkout~/src/etc/ssl.certs.shar

sh ssl.certs.shar

cd ..

mkdir s3backup

That will delete the compressed archive, make a directory for certificates (certs), download an SSL certificate generator script, execute that script and create a backup directory within the s3sync directory called “s3backup.”
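
Before moving on, it doesn’t hurt to make sure the layout looks right. From inside the s3sync folder, a quick listing should show the certs and s3backup directories sitting next to s3sync.rb, and certs should no longer be empty:

ls
ls certs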

Now, all you need to do is edit two files in your newly created s3sync folder. You can use TextEdit, TextMate, NotePad or any other text editor to edit these files. You are only going to be changing a few of the values.

I edited the files via Transmit, but you can use vi straight from the command line if you are comfortable.

The first file you want to edit is called s3config.yml.sample

You want to edit that file so that the aws_access_key and aws_secret_access_key fields correspond to those from your S3 account. You can find those in the Access Information area after logging into Amazon.com’s Web Services page.

Make sure that the ssl_cert_dir: field has the following value (if you created your s3sync folder in the .home directory): /home/xxxxx/users/.home/s3sync/certs, where xxxxx is your server number.

You can get your entire access path by typing in

pwd
at the command line.

Save that file as s3config.yml
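
For reference, here is roughly what the finished s3config.yml ends up looking like. The key names below are the ones from the sample file that ships with S3Sync (double-check them against your own copy of s3config.yml.sample), and the values are obviously placeholders:

aws_access_key_id: YOURACCESSKEYID
aws_secret_access_key: YOURSECRETACCESSKEY
ssl_cert_dir: /home/xxxxx/users/.home/s3sync/certs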

The next step is something I had to do in order to get the S3 part of the script to connect. It may not be required for all server set-ups, but it was for the (gs).

Edit the s3config.rb file so that the area that says

confpath = [xxxxx]

looks like this

confpath = ["./", "#{ENV['S3CONF']}", "#{ENV['HOME']}/.s3conf", "/etc/s3conf"]

Writing the backup script (or editing mine)

OK, that was the hard part. The rest is pretty simple.

I created the following backup script, called “backup_server.sh”. This script will back up the contents of the domain directories you specify (because if you are like me, some of your domain folders are really just symlinks) and all of your MySQL databases. It will then upload each directory and database in its own compressed archive to the S3 bucket of your choice. Bucket names are globally unique, so use the S3Fox tool, Transmit, or another S3 manager to create a bucket specific to your website.

This is the content of the script:

#!/bin/sh

# A list of website directories to back up
websites="site1.com site2.com site3.com"

# The destination directory to backup the files to
destdir=/home/xxxxx/users/.home/s3sync/s3backup

# The directory where all website domain directories reside
domaindir=/home/xxxxx/users/.home/domains

# The MySQL database hostname
dbhost=internal-db.sxxxxx.gridserver.com

# The MySQL database username - requires read access to databases
dbuser=dbxxxxx

# The MySQL database password
dbpassword=xxxxxxx

echo `date` ": Beginning backup process..." > $destdir/backup.log

# remove old backups
rm $destdir/*.tar.gz

# backup databases
for dbname in `echo 'show databases;' | /usr/bin/mysql -h $dbhost -u$dbuser -p$dbpassword`
do
if [ $dbname != "Database" ];
then
echo `date` ": Backing up database $dbname..." >> $destdir/backup.log
/usr/bin/mysqldump --opt -h $dbhost -u$dbuser -p$dbpassword $dbname > $destdir/$dbname.sql
tar -czf $destdir/$dbname.sql.tar.gz $destdir/$dbname.sql
rm $destdir/$dbname.sql
fi
done

# backup web content
echo `date` ": Backing up web content..." >> $destdir/backup.log
for website in $websites
do
echo `date` ": Backing up website $website..." >> $destdir/backup.log
tar -czf $destdir/$website.tar.gz $domaindir/$website
done

echo `date` ": Backup process complete." >> $destdir/backup.log

# The directory where s3sync is installed
s3syncdir=/home/xxxxx/users/.home/s3sync

# The directory where the backup archives are stored
backupdir=/home/xxxxx/users/.home/s3sync/s3backup

# The S3 bucket a.k.a. directory to upload the backups into
s3bucket=BUCKET-NAME

cd $s3syncdir
./s3sync.rb $backupdir/ $s3bucket:
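
If you would rather keep the backups on the server and skip S3 entirely (as mentioned near the top of this post), the only change needed is to comment out the sync step at the very end of the script:

# cd $s3syncdir
# ./s3sync.rb $backupdir/ $s3bucket:

The compressed archives will still be written to whatever destdir points at, and you can FTP or SSH in and download them whenever you like.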

For (mt) Media Temple (gs) Grid Server users, you just need to change the “site1.com” values to your own domains (you can do as many as you want), substitute all the places marked “xxxxx” with your server number (again, you can find this by entering “pwd” at the command line), and fill in your database password (which is visible in the (mt) control panel under the “Database” module).

Make sure you change the value at the end of the script that says “BUCKET-NAME” to the name of the S3 Bucket you want to store your backups in.

Now that you have edited the script, upload it to your /data directory.

Change the permissions so the script is executable. You can do this either via SSH:

chmod a+x backup_server.sh

or by setting the permissions to 755 in your FTP client.

Now, test the script.

In the command line type this in:

cd data

./backup_server.sh

And watch the magic. Assuming everything was correctly input, an archived version of all your domain directories and all of your MySQL databases will be put in a folder called “s3backup” and then uploaded directly to your S3 server. Next time you run the script, the backup files will be replaced.

Check to make sure that the script is working the way you want it to work.
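
If you prefer to spot-check from the command line instead of opening S3Fox or Transmit, the s3sync tarball also ships with a small s3cmd.rb utility that reads the same s3config.yml. Assuming your bucket name, something along these lines should list what landed in the bucket (the exact sub-commands are covered in the README that comes in the tarball):

cd /home/xxxxx/users/.home/s3sync
./s3cmd.rb list BUCKET-NAME: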

Automate the script

You can either run the script manually from the command line or set it to run automatically. I’ve set mine to run each night at midnight. To set up the cron job, just click on the Cron Jobs button in the (mt) Admin area:

 

[Screenshot: (mt) AccountCenter - filmgirl.tv : (gs) GridControls, Cron Jobs]

and set your parameters. The path for your script is: /home/xxxxx/data/backup_server.sh.
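
For reference, the Cron Jobs form is just building a standard cron entry for you; the “every night at midnight” schedule mentioned above is equivalent to a crontab line like this:

0 0 * * * /home/xxxxx/data/backup_server.sh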

Enjoy your backups!

One note: the compressed domain archives retain their entire directory structure, so there is a .home directory that may not appear in Finder or Windows Explorer unless you have invisible or hidden files turned on. Don’t worry, all your data is still retained in those archives.
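
If you ever want to peek inside one of the archives without extracting it (hidden .home directory and all), tar can list the contents for you. For example, for the first placeholder domain from the script:

tar -tzf site1.com.tar.gz | head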

Update (7/27/2008): If you are getting an error that says something like “Permanent redirect received. Try setting AWS_CALLING_FORMAT to SUBDOMAIN”, add the following line to your s3config.yml file:

AWS_CALLING_FORMAT: SUBDOMAIN

The error is either because your bucket is in the EU or there is something else funky with its URL structure. Changing that value should allow the script to perform as intended.


112 people have left comments

Paul Stamatiou - Gravatar

Paul Stamatiou said:

S3 is great huh? That s3config.rb path thing must have been a change with the recent s3sync as I had to do the same thing when I just upgraded.. didn’t have to do that when I had published it. Anyways, thanks for bringing that to my attention, I’ve updated my article.

Posted on: June 24, 2008 at 6:39 pmQuote this Comment
Christina - Gravatar

Christina said:

Thanks for your original tutorial and article Paul — it helped me out tremendously when I was setting the whole thing up. And yes, S3 is the bees knees as they say.

Posted on: June 24, 2008 at 6:43 pmQuote this Comment
George Ornbo - Gravatar

George Ornbo said:

Great write up. The Amazon S3 service is perfect for backing up a web server remotely, safely and cheaply. I’m also using it to do off site backups for machines inside the network. For a small business it is a great solution.

Posted on: June 25, 2008 at 3:00 amQuote this Comment
Mike Marley - Gravatar

Mike Marley said:

Neat article Christina.

/me adds it to list of KB articles that need to be written.

Mike Marley Senior Technical Support (mt) Media Temple, Inc.

Posted on: June 25, 2008 at 10:47 amQuote this Comment
Ross - Gravatar

Ross said:

Nice post….very very detailed. Hopefully you put it in the how to part of the MT forum?

Posted on: July 1, 2008 at 10:41 amQuote this Comment
Michael - Gravatar

Michael said:

Excellent resource, thank you. One note on my experience when configuring on the gs:

I was receiving errors like “-o is not a valid variable” or something like this, when the script was trying to execute the mysql dump. I changed it to --opt (vs. -opt in your script).

Thanks again!

Posted on: July 5, 2008 at 8:32 pmQuote this Comment
Christina - Gravatar

Christina said:

Thanks for the info Michael! The script actually DOES say --opt, but the way that “code” is displayed on this page didn’t show the dashes clearly (I’ll have to try to change that) — the downloadable script has the correct dashes too. I’m glad that it worked for you and I appreciate the feedback on the --opt thing. I’ll do my best to change the text display now.

Posted on: July 6, 2008 at 2:37 amQuote this Comment
Christina - Gravatar

Christina said:

OK – I fixed the code formatting! It should all be correctly displayed now.

Posted on: July 6, 2008 at 3:56 amQuote this Comment
duivesteyn - Gravatar

duivesteyn said:

I’ve modified this to better suit CPanel based sites with sql support at http://duivesteyn.net/2008/amazon-s3-backup-for-webserver-public_html-sql-bash/

hope it helps someone

Posted on: July 6, 2008 at 4:48 amQuote this Comment
Christina - Gravatar

Christina said:

duivesteyn said: I’ve modified this to better suit CPanel based sites with sql support at http://duivesteyn.net/2008/amazon-s3-backup-for-webserver-public_html-sql-bash/ hope it helps someone

Oh that’s awesome! Thanks for posting your script!

Posted on: July 6, 2008 at 4:51 amQuote this Comment
Thejesh GN - Gravatar

Thejesh GN said:

s3 is amazing. I also use it deliver images and other media files to blog since its fast.

Posted on: July 6, 2008 at 12:24 pmQuote this Comment
Karl Hardisty - Gravatar

Karl Hardisty said:

Christina,

Thanks for putting in the time and effort to not only come up with the script, but to make it robust/structured/pretty enough for sharing. It’s good to know that someone has tested it and put it to public scrutiny, and that it’s worthwhile. You’ve saved me (and plenty others I’m sure) a lot of time.

Much appreciated.

Posted on: July 7, 2008 at 7:56 amQuote this Comment
Josh Price - Gravatar

Josh Price said:

Simple yet very effective

Thanks!

Posted on: July 8, 2008 at 12:43 pmQuote this Comment
Host Disciple » Blog Archive » Inside the hosting mind of a blogger - Gravatar

Host Disciple » Blog Archive » Inside the hosting mind of a blogger said:

[…] hosted there (calm down fellas she is dating someone already). Christina wrote a very informative how-to on backing up your MediaTemple GS […]

Posted on: July 9, 2008 at 12:06 pmQuote this Comment
Patrick Sikorski - Gravatar

Patrick Sikorski said:

Can I borrow your wisdom? I’m getting some weird errors:

warning.com@cl01:~/data$ ./backup_server.sh
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
You didn't set up your environment variables; see README.txt
s3sync.rb [options] <source> <destination> version 1.2.6
--help -h --verbose -v --dryrun -n --ssl -s --recursive -r --delete --public-read -p --expires="" --cache-control="" --exclude="" --progress --debug -d --make-dirs --no-md5
One of <source> or <destination> must be of S3 format, the other a local path.
Reminders:
* An S3 formatted item with bucket 'mybucket' and prefix 'mypre' looks like: mybucket:mypre/some/key/name
* Local paths should always use forward slashes '/' even on Windows
* Whether you use a trailing slash on the source path makes a difference.
* For examples see README.

Not sure where my problem is, do you have any idea?

Awesome article by the way!

Posted on: July 9, 2008 at 9:38 pmQuote this Comment
Christina - Gravatar

Christina said:

Patrick Sikorski said: Can I borrow your wisdom? I’m getting some weird errors: warning.com@cl01:~/data$ ./backup_server.sh tar: Removing leading `/’ from member names tar: Removing leading …

Patrick, OK, this was a problem I had in the beginning, and I had to change the s3config.rb file so that confpath = ["./", "#{ENV['S3CONF']}", "#{ENV['HOME']}/.s3conf", "/etc/s3conf"] — make sure that has been changed and try again.

As for the tar: removing leading “/” from member names, that’s fine.

Hope this helps!

Posted on: July 9, 2008 at 11:41 pmQuote this Comment
Patrick Sikorski - Gravatar

Patrick Sikorski said:

Did this go as smoothly for you as it did for me lol. Now for some reason I’m getting this.

./s3sync.rb:28:in `require': ./s3config.rb:19: syntax error, unexpected tIDENTIFIER, expecting ']' (SyntaxError)
config = YAML.load_file("#{path}/s3config.yml")
^
./s3config.rb:25: syntax error, unexpected kEND, expecting $end
from ./s3sync.rb:28

After you told me about that code, I realized that I didn’t copy it right. This is probably something just as stupid.

Posted on: July 10, 2008 at 8:56 amQuote this Comment
Christina - Gravatar

Christina said:

Patrick, Did you rename the s3confi.yml.sample file to s3config.yml?

If you did, I’ll have to check the codebase (it is possible a new version of S3sync was released since I’ve written the article) and investigate.

We’ll get this working!

This might be the sort of thing I should do a screencast of, from start to finish, to supplement the written guide. Hmm…

Posted on: July 10, 2008 at 10:20 amQuote this Comment
Patrick Sikorski - Gravatar

Patrick Sikorski said:

Yes I renamed the file. I guess a new version could have been released… but you didn’t write the article that long ago. Update your version of it and see if it breaks (backup first lol). A screencast would be cool….!

Posted on: July 10, 2008 at 10:28 amQuote this Comment
Christina - Gravatar

Christina said:

OK — a new version has NOT been released, so I’m thinking this is probably as simple as a mis-typed comma or period somewhere.

I’ll make a screencast today, going from start to finish.

Posted on: July 10, 2008 at 11:04 amQuote this Comment
Patrick Sikorski - Gravatar

Patrick Sikorski said:

Awesome, I’ll delete everything and start over when you make the screen cast. Thanks!

Posted on: July 10, 2008 at 11:53 amQuote this Comment
Karl Hardisty - Gravatar

Karl Hardisty said:

Christina,

Excellent news. I can guarantee at least one viewer.

Posted on: July 10, 2008 at 12:08 pmQuote this Comment
Video Tutorial: Automate Media Temple (gs) backups with Amazon S3 | www.ChristinaWarren.com - Gravatar

Video Tutorial: Automate Media Temple (gs) backups with Amazon S3 | www.ChristinaWarren.com said:

[…] This entry was posted on Sun, July 13th, 2008. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site. Previous Entry […]

Posted on: July 13, 2008 at 8:27 pmQuote this Comment
Matt - Gravatar

Matt said:

The backups are going fine, but the s3sync portion keeps giving me:

Connection reset: Connection reset by peer
99 retries left, sleeping for 30 seconds
Connection reset: Connection reset by peer
98 retries left, sleeping for 30 seconds
Connection reset: Connection reset by peer
97 retries left, sleeping for 30 seconds
Connection reset: Connection reset by peer
… and so on

Any ideas?

Posted on: July 16, 2008 at 3:31 pmQuote this Comment
Matt - Gravatar

Matt said:

Hmmm… I guess that’s normal? I checked my S3 bucket and it had all the files there and in the right size. So are those messages just there to say that it is still working?

Posted on: July 16, 2008 at 3:47 pmQuote this Comment
Christina - Gravatar

Christina said:

Hmm, I don’t get any of those Matt — but if the copies are transferring over correctly, I guess its fine.

Posted on: July 17, 2008 at 10:09 pmQuote this Comment
Cedric - Gravatar

Cedric said:

I tried 2 times, but it doesn’t work at all for me :

tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
Permanent redirect received. Try setting AWS_CALLING_FORMAT to SUBDOMAIN
S3 ERROR: #
./s3sync.rb:290:in `+': can't convert nil into Array (TypeError)
from ./s3sync.rb:290:in `s3TreeRecurse'
from ./s3sync.rb:346:in `main'
from ./thread_generator.rb:79:in `call'
from ./thread_generator.rb:79:in `initialize'
from ./thread_generator.rb:76:in `new'
from ./thread_generator.rb:76:in `initialize'
from ./s3sync.rb:267:in `new'
from ./s3sync.rb:267:in `main'
from ./s3sync.rb:735

Posted on: July 25, 2008 at 5:38 pmQuote this Comment
Maxwell Scott-Slade - Gravatar

Maxwell Scott-Slade said:

I have set it up exactly the same as your blog (btw, thanks for all this) but I also get this error:

tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
Permanent redirect received. Try setting AWS_CALLING_FORMAT to SUBDOMAIN
S3 ERROR: #
./s3sync.rb:290:in `+': can't convert nil into Array (TypeError)
from ./s3sync.rb:290:in `s3TreeRecurse'
from ./s3sync.rb:346:in `main'
from ./thread_generator.rb:79:in `call'
from ./thread_generator.rb:79:in `initialize'
from ./thread_generator.rb:76:in `new'
from ./thread_generator.rb:76:in `initialize'
from ./s3sync.rb:267:in `new'
from ./s3sync.rb:267:in `main'
from ./s3sync.rb:735

Posted on: July 27, 2008 at 6:59 amQuote this Comment
Ced - Gravatar

Ced said:

Maybe s3sync has been updated since this post ?

Posted on: July 27, 2008 at 9:32 amQuote this Comment
Christina - Gravatar

Christina said:

OK, so both Cedric and Maxwell are getting the same error. I looked up that error and it appears to be associated with EU buckets. Are either of you using buckets in the EU?

To change this, you need to add this line to your s3config.yml file:

AWS_CALLING_FORMAT: SUBDOMAIN

Posted on: July 27, 2008 at 9:38 amQuote this Comment
Christina - Gravatar

Christina said:

Ced — no, I just double-checked. I think the issue is either with using an EU bucket name or something else in the bucket-name not being correct. EU Bucket names cannot contain capital letters (so they are not case-sensitive), whereas US bucket names can.

Make sure your bucket name is correct in the script. I think adding the AWS_CALLING_FORMAT parameter to the yml file will solve the problem.

Posted on: July 27, 2008 at 9:43 amQuote this Comment
Ced - Gravatar

Ced said:

@Christina : Yes it works now ! Thanks for your help.

Posted on: July 27, 2008 at 11:13 amQuote this Comment
Christina - Gravatar

Christina said:

Ced, Glad to hear it! I’ve updated the post with that information in case anyone else runs into the same issue.

Posted on: July 27, 2008 at 11:59 amQuote this Comment
Maxwell Scott-Slade - Gravatar

Maxwell Scott-Slade said:

Thanks Christina, that’s all working fine now. It’s so awesome to know that the site is getting backed up everyday to a safe place 100%. It’s nice to turn on the email feature in the Cron job section so you know it’s all done.

A guide to remember. I never used SSH before, now that I have I feel pretty happy it all works!

Posted on: July 28, 2008 at 4:07 amQuote this Comment
Faire un backup de son blog sur Amazon S3 | 64k - Gravatar

Faire un backup de son blog sur Amazon S3 | 64k said:

[…] avec ce service, mais je n’avais pas vraiment pris le temps de tester. Je suis tombé sur un billet de Christina Warren qui m’a décidé, puisqu’il décrit toute la procédure pour faire un backup d’un […]

Posted on: July 28, 2008 at 5:31 amQuote this Comment
Easily Backup Your Entire Website to S3 | HighEdWebTech - Gravatar

Easily Backup Your Entire Website to S3 | HighEdWebTech said:

[…] sent me a great link to a Ruby script that will backup your website and push that backup file to Amazon S3 for safe, […]

Posted on: July 28, 2008 at 2:07 pmQuote this Comment
Philip - Gravatar

Philip said:

So does this backup everything or just the content that’s changed since the last backup?

Posted on: August 24, 2008 at 6:45 pmQuote this Comment
Links of Interest – CSS-Tricks - Gravatar

Links of Interest – CSS-Tricks said:

[…] Warren has a comprehensive and excellent tutorial on creating a Ruby script to back up your entire web server, including databases, and upload them […]

Posted on: August 26, 2008 at 8:46 amQuote this Comment
Christina - Gravatar

Christina said:

Philip, I was having some problems having effective recursive backups, so it’s just doing everything. Your comment reminds me to re-investigate the best/most effective way to do it recursively though (it would be like a few character changes to the script), so I’ll do that later this week and post an update with my findings. Realistically, your databases are going to be changing more frequently than your actual directories, so you can always set a separate CRON job to run the databases and certain folders every day, and other folders less frequently. That’s what I do anyway — my blogs and databases are backed up daily and a few domains that are just basically image storing for right now get updated once a week or once a month.

Posted on: August 26, 2008 at 9:40 amQuote this Comment
Matt - Gravatar

Matt said:

The backup part works great for me, but not the s3sync. Cron job doesn’t even bother with copying anything over. When I do the site copy to S3 manually, it usually dies after copying just a few sites. Wish I could get that part working as that is the important part!

Posted on: August 26, 2008 at 4:35 pmQuote this Comment
Christina - Gravatar

Christina said:

Matt, Are you getting an error of any kind? About how much data is being copied over before it dies? I’ve successfully copied over more than 500 megabytes before using the script (a test case and also a backup of some photographs I uploaded to my gs temporarily when at my parent’s house). Let’s see if we can figure out why it isn’t working.

Posted on: August 26, 2008 at 6:29 pmQuote this Comment
Matthew Barker - Gravatar

Matthew Barker said:

Nope, not getting any errors that matter it seems. Earlier I reported this error:

Connection reset: Connection reset by peer
99 retries left, sleeping for 30 seconds
98 retries left, sleeping for 30 seconds
…

But that doesn’t seem to really matter. I am backing up 10 sites, with only 1 being larger than 500 MB; the gzipped/tarred file is currently 1.3 GB in size. The odd thing about all of this is that sometimes everything works when I do it manually, but that is only sometimes. It generally quits when transferring the 1.3 GB file to Amazon, with no error messages encountered. But with the cron job running, it sometimes quits when tarring the 1.3 GB site, but generally tars everything just fine, but doesn’t transfer a thing. That’s the hard part about trying to troubleshoot this problem; sometimes it works, sometimes it doesn’t; and when it doesn’t work, it doesn’t die at the same place every time.

Posted on: August 27, 2008 at 12:15 amQuote this Comment
Peter - Gravatar

Peter said:

I was hoping I could download your full backup script and customize it to my needs, but it looks like access it denied on the file from S3. Is there any chance you could make it public?

Posted on: August 27, 2008 at 10:31 amQuote this Comment
Marcus McCurdy - Gravatar

Marcus McCurdy said:

Thanks for this great write up. I went through it last night and it worked like a champ and I learned a thing or two in the process. This really makes backing up a media temple site a breeze.

Posted on: August 27, 2008 at 4:13 pmQuote this Comment
Christina - Gravatar

Christina said:

Matt, I remember you had that error. I’ll do what I can to investigate why this seems to not be working. It might be necessary to create two separate Cron jobs – one for the biggest site, one of the others – to see if that is an acceptable workaround.

Peter — OK, try it now — I changed the URL structure. It was working fine, I’m not sure what could have changed. If you still have issues, let me know.

Marcus — yay! I’m so happy this worked for you!

Posted on: August 27, 2008 at 4:54 pmQuote this Comment
Automatically back up your entire web server files and databases to Amazon S3 | QuickPipe - Gravatar

Automatically back up your entire web server files and databases to Amazon S3 | QuickPipe said:

[…] Warren has a comprehensive and excellent tutorial on creating a Ruby script to back up your entire web server, including databases, and upload them […]

Posted on: August 31, 2008 at 12:25 amQuote this Comment
Links of Interest | Proba - Gravatar

Links of Interest | Proba said:

[…] Warren has a comprehensive and excellent tutorial on creating a Ruby script to back up your entire web server, including databases, and upload them […]

Posted on: August 31, 2008 at 6:15 pmQuote this Comment
Matt - Gravatar

Matt said:

Thanks Christina … this was really helpful. How about a Part 2 with rolling backups? 😉

Posted on: September 8, 2008 at 6:08 pmQuote this Comment
Automatically back up MetiaTemple Grid (gs) via Amazon S3 | Randomess of Josh Price - Gravatar

Automatically back up MetiaTemple Grid (gs) via Amazon S3 | Randomess of Josh Price said:

[…] months now, and thought I’d share it with you guys.  Christina Warren has written up some excellent instructions on how to back up MediaTemple’s Grid (gs) to your Amazon S3 account.  There is also a video. […]

Posted on: September 25, 2008 at 3:09 pmQuote this Comment
Michael York - Gravatar

Michael York said:

Hey Christina,

Not sure if you got my e-mail or not…

Anyways, I wanted to know why I keep getting “Connection Reset: connection reset by peer” error messages. When looking at the S3 bucket, not all of the files have transferred over…

Thanks! Michael

Posted on: September 25, 2008 at 5:39 pmQuote this Comment
Petter - Gravatar

Petter said:

Matthew Barker said: Nope, not getting any errors that matter it seems. Earlier I reported this error: Connection reset: Connection reset by peer 99 retries …

I am experiencing the same problem. It is a fairly large site. About 2.5GB worth of photos. I have split the backup script up in two parts, the DB and the Site. The DB works fine and goes up to S3 easily. Leads me to believe it is related to the size :). Have to divide it up in smaller chunks or similar.

Mat, Christina, how did/would you approach it? any suggestions.

Posted on: September 29, 2008 at 10:34 amQuote this Comment
Christina - Gravatar

Christina said:

Peter, Mat, and anyone else receiving the connection reset message: It IS size related. I’m working on a new post that details changing the number of resets and does recursive backups for large files – so that you don’t transfer everything over every single time. I need to test it out and make sure it is solid before posting, but it should be up sometime this week!

Posted on: September 29, 2008 at 10:37 amQuote this Comment
Jordan - Gravatar

Jordan said:

Christina,

Thanks for the well written guide. I was able to get my backups running perfectly but did have one question.

I am using a slightly modified version of your backup script, and while it works well as far as creating the backup files, I don’t see any of the echo messages in either the shell or in the cron emails. I only see the following:

tar: Removing leading `/' from member names
tar: Removing leading `/' from member names

Otherwise, this seems to be the perfect backup system for Media Temple gs users.

Posted on: September 30, 2008 at 4:44 pmQuote this Comment
William B. - Gravatar

William B. said:

Hey Christina,

This has really helped my mission, thanks for the guide. Works flawlessly for me.

Posted on: October 5, 2008 at 10:59 amQuote this Comment
Poll Results: How do you back up your websites? – CSS-Tricks - Gravatar

Poll Results: How do you back up your websites? – CSS-Tricks said:

[…] this poll was to tap into your collective knowledge and find some sweet automated backup solutions. Christina’s solution is absolutely awesome and fits into my situation perfectly (being on Media Temple, and having an S3 […]

Posted on: October 8, 2008 at 8:45 amQuote this Comment
Tim Peacock - Gravatar

Tim Peacock said:

Christina, really appreciate the help with this. A couple of errors, but got these ironed out, and now working perfectly. Saved me so much time !!

Posted on: October 13, 2008 at 6:25 amQuote this Comment
Ced - Gravatar

Ced said:

How do you choose which databases to save (and not all the databases)? Because with more than 5-6 DBs, it uses too many resources: “(mt) Media Temple’s automated MySQL monitoring systems have detected an increase in database activity for your Grid-Service”

Posted on: October 15, 2008 at 3:22 amQuote this Comment
Matt - Gravatar

Matt said:

Any word on when this will be ready? I could definitely use it; my backups fail consistently because of one site that is around 2GB in size.

Christina said: Peter, Mat, and anyone else receiving the connection reset message: It IS size related. I’m working on a new post that …
Posted on: October 15, 2008 at 10:49 pmQuote this Comment
Christina - Gravatar

Christina said:

Matt, It will be up by the end of the week. I had work commitments and then went on vacation and this week has been crazy busy!

Posted on: October 15, 2008 at 10:57 pmQuote this Comment
Matt - Gravatar

Matt said:

Any luck on the size limitation issue? Thanks!

Posted on: October 27, 2008 at 7:04 amQuote this Comment
Week in Review (October 20 through October 26) | GrandmasterB dot com - Gravatar

Week in Review (October 20 through October 26) | GrandmasterB dot com said:

[…] I also setup an account on the Amazon AWS to remotely backup my sites. After reading a tutorial by Christina Warren, I setup the necessary software and a shell script to do automatic backups of my site daily. The […]

Posted on: October 27, 2008 at 4:45 pmQuote this Comment
creede - Gravatar

creede said:

I get a connection reset too, but it seems to go through eventually. Let me know if you get rolling backups going. Thanks again for the great tool!

Posted on: November 12, 2008 at 6:30 pmQuote this Comment
Amazon S3 for Backups – Effective Programming - Gravatar

Amazon S3 for Backups – Effective Programming said:

[…] Christina Warren […]

Posted on: November 26, 2008 at 3:02 amQuote this Comment
Mikkel - Gravatar

Mikkel said:

I get the same error as Matt when I test the script:

Connection reset: Connection reset by peer

99 retries left, sleeping for 30 seconds

However when I look in my bucket (using transmit), I see the files are turning up just fine, so I guess I should just ignore the error. But it would be nice to have it running smoothly.

An yes, I’m using a EU bucket (sitting in Denmark).

Another thing I’ve noticed: it backs up all of my databases, but only the domain directories I choose in the “backup_server.sh” file. Is there a way to exclude databases? I have a lot of test sites (with DBs) that aren’t really that important to back up.

Posted on: December 8, 2008 at 7:32 amQuote this Comment
Samm - Gravatar

Samm said:

First, Christina, thank you for this article – I am using mt (gs) and S3 and am having problem. Probably a mistake on my part, but looking for some help. I have completed the setup, and when I run the command ./backup_server.sh, I get the message: “bad interpreter: No such file or directory”. The file is clearly there, I can see it via FTP and it shows with the dir command in Putty. Any ideas?

Posted on: December 10, 2008 at 12:26 pmQuote this Comment
Michael - Gravatar

Michael said:

Hey Christina,

Any news on the recursive script?

Thanks! –Michael

Posted on: December 10, 2008 at 2:08 pmQuote this Comment
trs21219 - Gravatar

trs21219 said:

i got this running fine but i want the backups to go into subdirectories in my s3 bucket for each day that it backs up. i got it doing it for my server but how do i make it make the folders on S3?

im adding this to the end of the directory path at the top and bottom of the backup.sh

directorypath........./date +%m-%d-%y/

Posted on: December 16, 2008 at 2:29 amQuote this Comment
trs21219 - Gravatar

trs21219 said:

trs21219 said: i got this running fine but i want the backups to go into subdirectories in my s3 bucket for each …

nevermind i got it ! here is my backup_server.sh file

#!/bin/sh

# A list of website directories to back up
websites="domains.com domain2.com"

# The destination directory to backup the files to
destdir=/home/xxxxx/users/.home/s3sync/s3backup/`date +%m-%d-%y`

# The directory where all website domain directories reside
domaindir=/home/xxxx/users/.home/domains

# The MySQL database hostname
dbhost=internal-db.sxxxxx.gridserver.com

# The MySQL database username - requires read access to databases
dbuser=xxxxxx

# The MySQL database password
dbpassword=xxxxxxx

mkdir $destdir
rm $destdir/*.tar.gz

echo `date` ": Beginning backup process..." > $destdir/backup.log

# backup databases
for dbname in `echo 'show databases;' | /usr/bin/mysql -h $dbhost -u$dbuser -p$dbpassword`
do
if [ $dbname != "Database" ];
then
echo `date` ": Backing up database $dbname..." >> $destdir/backup.log
/usr/bin/mysqldump --opt -h $dbhost -u$dbuser -p$dbpassword $dbname > $destdir/$dbname.sql
tar -czf $destdir/$dbname.sql.tar.gz $destdir/$dbname.sql
rm $destdir/$dbname.sql
fi
done

# backup web content
echo `date` ": Backing up web content..." >> $destdir/backup.log
for website in $websites
do
echo `date` ": Backing up website $website..." >> $destdir/backup.log
tar -czf $destdir/$website.tar.gz $domaindir/$website
done

echo `date` ": Backup process complete." >> $destdir/backup.log

echo `date` ": Compacting Package Total..." >> $destdir/backup.log
tar -czf $destdir.tar.gz $destdir

rm -r -f $destdir

# The directory where s3sync is installed
s3syncdir=/home/xxxxx/users/.home/s3sync

# The directory where the backup archives are stored
backupdir=/home/xxxxxx/users/.home/s3sync/s3backup/

# The S3 bucket a.k.a. directory to upload the backups into
s3bucket=BUCKET NAME HERE

cd $s3syncdir
./s3sync.rb $backupdir/ $s3bucket:

hope this helps some people… i got it from trial and error… thanks again christina for the awesome script!

Posted on: December 16, 2008 at 3:06 amQuote this Comment
Adam - Gravatar

Adam said:

I thought I did everything right, but got this after running the script manually:

command not found line 4:

command not found line 8:
command not found line 10:
command not found line 14:
command not found line 16:
command not found line 20:
command not found line 22:
command not found line 26:
command not found line 28:
command not found line 32:
command not found line 34:
command not found line 40: ./backup_server.sh: line 42: /backup.log: Read-only file system
command not found line 44: rm: cannot remove /*.tar.gz': No such file or directory rm: cannot remove\r’: No such file or directory
command not found line 50: ‘/backup_server.sh: line 70: syntax error near unexpected token '/backup_server.sh: line 70:fi

What did I do wrong?

Posted on: January 7, 2009 at 4:45 pmQuote this Comment
Adam - Gravatar

Adam said:

Adam said: I thought I did everything right, but got this after running the script manually: : command not found line 4: : command …

Nevermind. My bucket name had a typo. Sorry.

Does anyone know of a way to filter out certain directories (for example: “cache” folders)? I also have dropbox folders that I don’t need to backup, etc. I’d love to exclude them.

Posted on: January 8, 2009 at 5:18 pmQuote this Comment
A - Gravatar

A said:

“Proper backups are like eating your vegetables — we all say we’ll do it and that it is a good idea, but it is so much easier NOT to do it and eat Oreo cookies instead”

It’s refreshing to see analogies like these.. once in a while at least.. the web would be so much greener, warmer & healthier if even guy geeks started giving such analogies.. i wish..

Posted on: January 17, 2009 at 5:31 amQuote this Comment
Adam - Gravatar

Adam said:

Adam said: Does anyone know of a way to filter out certain directories (for example: “cache” folders)? I also have dropbox folders that I don’t need to backup, etc. I’d love to exclude them.

In case anyone else is interested, I made two changes to the script that I’m really happy with. I changed the compression type to bzip2 (bz2) instead of gzip (gz), and my file sizes shrunk dramatically! Additionally, I excluded some files and directories that I didn’t need backed up and knew were full of tons on files. This saves a lot of time and end file size. Here’s the main change, but a couple others need to be made for this to fully work:

tar -jcf $destdir/$website.tar.bz2 --exclude '/cache' --exclude '/dropbox' --exclude '*.zip' $domaindir/$website

Posted on: January 17, 2009 at 1:09 pmQuote this Comment
Johnny - Gravatar

Johnny said:

hi!

Is there any way to backup the e-mails as well from the grid-server?

(Thanks for the tutorial, it’s working fine on my side.)

Posted on: January 18, 2009 at 7:59 amQuote this Comment
jota - Gravatar

jota said:

Hello,

It’s been a few weeks since the script stopped working for me.

Now I’m getting this error

Read from remote host domain.com: Connection reset by peer Connection to domain.com closed.

Any clue what might be happening?

Posted on: June 6, 2009 at 8:25 amQuote this Comment
Matt - Gravatar

Matt said:

Hello Christina,

Excellent script! It seems to work fine for me all the files and dbs get backed up but I get this error for some reason:

/usr/bin/mysqldump: Got error: 1044: Access denied for user ‘db62917’@’%’ to database ‘information_schema’ when using LOCK TABLES

Not quite sure why this is, any ideas?

Matt

Posted on: June 23, 2009 at 4:38 amQuote this Comment
Pietro - Gravatar

Pietro said:

Hi Christina, thanks a lot! Your tutorial is fantastic!

I’m getting the same error as Matt, though: /usr/bin/mysqldump: Got error: 1044: Access denied for user ‘dbxxxxx’@’%’ to database ‘information_schema’ when using LOCK TABLES

Anyway, the script is working…

Pietro

Posted on: July 20, 2009 at 4:21 amQuote this Comment
Karl Gechlik - Gravatar

Karl Gechlik said:

I am getting errors, can you help me out? Any help would be greatly appreciated.

tar: Removing leading `/' from member names
S3 command failed: list_bucket prefix max-keys 200 delimiter /
With result 404 Not Found
S3 ERROR: #
./s3sync.rb:290:in `+': can't convert nil into Array (TypeError)
from ./s3sync.rb:290:in `s3TreeRecurse'
from ./s3sync.rb:346:in `main'
from ./thread_generator.rb:79:in `call'
from ./thread_generator.rb:79:in `initialize'
from ./thread_generator.rb:76:in `new'
from ./thread_generator.rb:76:in `initialize'
from ./s3sync.rb:267:in `new'
from ./s3sync.rb:267:in `main'
from ./s3sync.rb:735

Posted on: August 4, 2009 at 10:20 amQuote this Comment
Christina - Gravatar

Christina said:

Hey guys, OK — I’m uploading new copies of the screencast now and will have those links switched in about 15 minutes. Karl, give me a little bit to look over the code, I need to do a follow-up post (13 months later, wow!) anyway 🙂

Posted on: August 4, 2009 at 10:25 amQuote this Comment
Karl Gechlik - Gravatar

Karl Gechlik said:

Thank you so much Christina I am sitting on the edge of my seat pulling my hair out for a few days now (more like a week!) I can generate the backup file but I have to manually xfer it to s3. I am wondering if it is my bucket name it has a – in it. Any assistance is REALLY REALLY appreciated! You rock.

Posted on: August 4, 2009 at 10:34 amQuote this Comment
MattO - Gravatar

MattO said:

SOME HELP FOR NEWBIES ON HOW TO DO THIS AND SOME PROBLEMS I RAN INTO:

Hi Christina,

You Rock! This is a huge help – you have no idea how hard this stuff can be to get done when your not a pro.

I just want to document a few problems that I encountered when trying this so others can hopefully skip my frustrations on Saturday Morning : )

1) If you run into the problem of not being able to see the s3sync directory that you create.

  • In Christina’s video it shows up as a directory next to your domain folder, however, in my experience it wasn’t anywhere to be found. I knew it existed because I could see it in SSH, but in FTP and my Account Center it was invisible.

  • To rectify this you can go into your FTP setting and show hidden files. In filezilla this is under the Server Menu and is called “Force showing hidden files”

-Then, your not done yet – to actually see the s3sync directory – go into your users directory and you should now see a directory called “.home” – you’ll find it in there.

2) Put your server_backup.sh script in the data directory – this is the main data directory in the same hierarchical level as your domains folder and etc folder

3) When you figure all this out and you actually run the script you’ll get some errors – the first ones I hit were related to buckets in AWS.

-If your new at all this you need to create a bucket, but this can be kinda confusing.

-I finally found this simple way to do it: http://it.toolbox.com/blogs/oracle-guide/cloud-studio-for-aws-ec2-and-s3-28117

-Essentially you download the cloud studio freeware (FYI – I’m on MAC), open it, go into the settings and enter your account info, don’t worry about all the RED text (that has to do with another amazaon image tool), go down toward the bottom of the screen and add a new bucket. That should do the trick – now go to the server_backup.sh script and put the name of your bucket there like she tells you to do.

4) You might still be getting an error that has this in it: “s3sync/s3backup/*.tar.gz’: No such file or directory”

-This is because the way that the amazon directories are downloaded is a bit different than the script is written (I think).

-To solve this simply go into your FTP and simply move the S3backup directory inside of the s3sync directory where it is looking for it.

OK! I know it’s not the clearest explanation, but I hope it saves someone a few hours!

Thanks again for the detailed tutorial!

-MattO

http://mattoden.com

Posted on: August 22, 2009 at 3:25 pmQuote this Comment
MattO - Gravatar

MattO said:

Hi Christina,

Quick Question:

Is there a way to modify the script so that you don’t overwrite the backups on S3?

For Example – I would like to run a daily backup and on S3 I would like the html directories and the sql databases to append the date to each version. Day one: Example.com_8_25_09.tar.gz, Example.com_8_26_09.tar.gz, etc.

That way, if there is a problem, I can roll back to the last fully functioning date.

Thanks,

MattO

Posted on: August 22, 2009 at 3:35 pmQuote this Comment
Christina - Gravatar

Christina said:

Matt, Amazing, amazing — I’ll be updating the post this week (I’ve been saying that for a year but I actually mean it this time) and will be using your corrections — the formatting has changed a bit in the last year and I need to update the script.

I’ll also include a way for you to write a new file each time (with a date appended). I’m sure we can figure out a way to have a rule to only keep the last X or whatever too.

Posted on: August 23, 2009 at 2:03 pmQuote this Comment
Coalesce - Gravatar

Coalesce said:

Thanks for the helpful tutorial. We’re going to be awaiting your revised script!

Posted on: August 27, 2009 at 12:34 pmQuote this Comment
Andrew - Gravatar

Andrew said:

Just implemented this on my gs server, thanks for the great script. Eagerly anticipating the revised script with the dated backups.

Posted on: August 28, 2009 at 11:24 pmQuote this Comment
zxr - Gravatar

zxr said:

Hi guys!

I still have a problem when working with a big backup file. One of my sites is around 1 GB of data and this method does not work for it. I run the script but it says something like the time has expired…

Any solution for this?

Thanks!

Posted on: August 29, 2009 at 12:33 pmQuote this Comment
Mikkel Hansen - Gravatar

Mikkel Hansen said:

Any news on the update? Would realy love to be able to keep the last x days of backups on the server!

Posted on: September 8, 2009 at 5:50 pmQuote this Comment
Andrew - Gravatar

Andrew said:

I was wondering if you had an ETA for a way to keep multiple dated backups, I needed that today. Thanks for the script though, it was a lifesaver.

Posted on: September 15, 2009 at 12:06 amQuote this Comment
Brett Wilcox - Gravatar

Brett Wilcox said:

Hello Christina!

I just wanted to share with you and everyone here that I have created a VERY robust backup script that keeps multiple rotating backups and such.

I will be releasing more information soon but I have posted the code at http://www.brettwilcox.com/2009/09/20/amazon-s3-backup-script/ for anyone interested.

Shoot me an email if anyone has any questions!

brett@brettwilcox.com

Posted on: September 21, 2009 at 1:21 amQuote this Comment
Anthony Abraira - Gravatar

Anthony Abraira said:

This is probably a stupid question, but is there a means of doing something where this can instead zip up to one file that goes to a designated folder on the site. That way one can just go in and download the zip file and have their back up without needing an additional server storage space.

If this has been answered I am sorry, can somebody point me in the right direction?

Posted on: September 21, 2009 at 2:49 pmQuote this Comment
Brett Wilcox - Gravatar

Brett Wilcox said:

Hello Anthony,

I have created that very script that you are talking about.

http://www.brettwilcox.com/2009/09/20/amazon-s3-backup-script/

Just disable the amazon S3 Sync option and it will create multiple rotating dated backups without the need for amazon S3.

Posted on: September 21, 2009 at 2:55 pmQuote this Comment
Brett Wilcox - Gravatar

Brett Wilcox said:

I have released a newer version of the script and published it to google code – http://code.google.com/p/ziplinebackup/

Posted on: September 30, 2009 at 5:09 pmQuote this Comment
Hugh Esco - Gravatar

Hugh Esco said:

A couple of thoughts I have not seen addressed above:

I would be careful about deleting the previous backup prior to creating the new one.

If memory serves, an S3 bucket is limited to a gigabyte or so. This may be the issue folks are encountering backing up larger sites.

Essential touch to time stamp backups and keep the last several around. I once had to roll back several days to find an uncorrupted database backup. That was when I gave up email (for scp and rsync) as a transport for files which mattered.

if you can add some sort of log rotate function to this script you’ll have a full featured backup system in place.

Backing up to S3 is a fine way to store public data accessible on the web anyway, but for folks dealing with proprietary data, inhouse file stores, or at least encrypted S3 buckets are probably a more appropriate way to go.

And one last important step, as you add new sites and new resources which ought to be backed up, it important to update your backup script to account for those.

Thanks for the article. Was unaware of this ruby tool.

— Hugh

Posted on: October 20, 2009 at 1:27 amQuote this Comment
irms - Gravatar

irms said:

This was awesome. Thanks!

Posted on: December 14, 2009 at 8:16 amQuote this Comment
Clayton - Gravatar

Clayton said:

Thanks for the tutorial, Christina. This post, along with your video, were a huge help for me.

Posted on: January 5, 2010 at 8:06 pmQuote this Comment
Clayton - Gravatar

Clayton said:

This appears to be working, but I am getting an error message at the top of my Cron job notification email: “Got error: 1044: Access denied for user ‘dbXXXXX’@’%’ to database ‘information_schema’ when using LOCK TABLES”

Anyone else get this?

Posted on: January 8, 2010 at 10:07 amQuote this Comment
Christina Warren - Gravatar

Christina Warren said:

Clayton, Yes — a user was nice enough to put up a new script on github — the link is in the comments (a few posts before yours) – I desperately need to update this blog post with that info — thanks for the reminder!

Posted on: January 8, 2010 at 10:09 amQuote this Comment
Karl L. Gechlik | AskTheAdmin.com - Gravatar

Karl L. Gechlik | AskTheAdmin.com said:

Thanks for your help. I rewrote the script to create custom filenames with the date in them. I then croned another script to remove files after 7 days keeping 7 backups. I could not get your MySql part to work so I cronned out MySql dumps and put them in the tar. Finally I reduced the file size with compression and filtering out specific file types I didnt need backed up. I will get it documented and up on http://www.askTheAdmin.com as soon as I can.

Posted on: January 8, 2010 at 10:14 amQuote this Comment
Jesse - Gravatar

Jesse said:

Thank you so much for spelling all this out for us! This was extremely helpful! I am using Amazon S3 now, it’s kind of tricky to use but this helped me save much time 🙂

Posted on: February 24, 2010 at 5:12 pmQuote this Comment
Media Temple backups naar Amazon S3 met S3Sync - Gravatar

Media Temple backups naar Amazon S3 met S3Sync said:

[…] GB per maand + upload en download fees. Er is geen verplichte afname van het aantal GBs. Kudos voor Christina Warren en haar blogpost die de basis vormt voor deze Nederlandse vertaling en kudos voor Brett Wilcox, […]

Posted on: March 12, 2010 at 2:54 pmQuote this Comment
Andre Kreft - Gravatar

Andre Kreft said:

Hi Christina, love the script and has been working for a while now. Lately I got some strange error’s I cant seem to fix.

Like: No such file or directory in my Cron Deamon emails and when I try to kick off my cron job through SSH I can’t kick off ./server_backup.sh in my data folder. It keep saying: -bash: /backup_server.sh: No such file or directory

Anything changed on MT side? or am I missing something. Hope you can help.

Posted on: March 17, 2010 at 4:41 amQuote this Comment
the Blog Butler - Gravatar

the Blog Butler said:

Thanks for writing this up Cristina, was a great help and got me off to a good start. I have everything working fine when I run it from an SSH session but when the cron kicks the job off it will back up the site but will NOT transfer it to my S3 bucket. It does it just fine when I run it manually but not from cron.

Anyone have any clue on this?

Posted on: June 29, 2010 at 6:40 pmQuote this Comment
Tanner Hobin - Gravatar

Tanner Hobin said:

Thank you, thank you, thank you!

So I tried to make things a little easier on myself by making the “list of website directories to backup” /domains and “the directory where all website domain directories reside” /home/XXXXX/users/.home and the resulting .tar.gz file only had folders in it for home>XXXXX>users. No domains backed-up.

Simply put, I was hoping to skip having to list all domains and the need to update the script every time I add/remove a domain. Any idea how I might be able to do that?

Again, thank you.

Posted on: July 16, 2010 at 5:47 pmQuote this Comment
Brett - Gravatar

Brett said:

Any idea on how to correct these errors in the zipline scripte?

======================================================================

Database Backup Start Time Fri Oct 8 01:33:30 PDT 2010

Backing up databases from /home/86415/data/database_backups
tar: Removing leading `/' from member names
tar: /home/86415/data/database_backups: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
Complete

======================================================================

Database Backup End Time Fri Oct 8 01:33:30 PDT 2010

======================================================================

Starting Transfer to Online Storage Backup Fri Oct 8 01:33:30 PDT 2010

Now transfering Backups to Amazon S3
S3 command failed: list_bucket max-keys 200 prefix delimiter /
With result 404 Not Found
S3 ERROR: #
./s3sync.rb:290:in `+': can't convert nil into Array (TypeError)
from ./s3sync.rb:290:in `s3TreeRecurse'
from ./s3sync.rb:346:in `main'
from ./thread_generator.rb:79:in `call'
from ./thread_generator.rb:79:in `initialize'
from ./thread_generator.rb:76:in `new'
from ./thread_generator.rb:76:in `initialize'
from ./s3sync.rb:267:in `new'
from ./s3sync.rb:267:in `main'
from ./s3sync.rb:735

Complete

Posted on: October 8, 2010 at 4:36 amQuote this Comment
Brett - Gravatar

Brett said:

Oh crap this is the wrong blog to ask that question, I should really not mess with this sort of thing this late at night. ><

Posted on: October 8, 2010 at 4:38 amQuote this Comment
Barton - Gravatar

Barton said:

I also get the error:

Access denied for user ‘dbxxxxx_s3sync’@’%’ to database ‘yyyyy’ when using LOCK TABLES

Posted on: October 18, 2010 at 2:25 pmQuote this Comment
Barton - Gravatar

Barton said:

Okay, to all people with the “Access denied” … “using LOCK TABLES” error:

The backup is working, its just failing when it tries to backup the information_schema, which isn’t required.

Posted on: October 18, 2010 at 2:30 pmQuote this Comment
nosaukums - Gravatar

nosaukums said:

Some ideas. I defined this at the beginning of the backup script:

DATESTAMP=`date +%Y-%m-%d`

and modified the script to include it in all filenames, i.e.:

# backup databases
for dbname in `echo 'show databases;' | /usr/bin/mysql -h $dbhost -u$dbuser -p$dbpassword`
do
if [ $dbname != "Database" ];
then
echo `date` ": Backing up database $dbname..." >> $destdir/backup.log
/usr/bin/mysqldump -h $dbhost --opt --skip-lock-tables -u$dbuser -p$dbpassword $dbname > $destdir/$DATESTAMP.$dbname.sql
tar -czf $destdir/$DATESTAMP.$dbname.sql.tar.gz $destdir/$DATESTAMP.$dbname.sql
rm $destdir/$DATESTAMP.$dbname.sql
fi
done

# backup web content
echo `date` ": Backing up web content..." >> $destdir/$DATESTAMP.backup.log
for website in $websites
do
echo `date` ": Backing up website $website..." >> $destdir/backup.log
tar -czf $destdir/$DATESTAMP.$website.tar.gz $domaindir/$website
done

Also a useful thing: excluding a specific directory (in my case cache, saves some precious bandwidth 😉 ):

tar -czf $destdir/$DATESTAMP.$website.tar.gz $domaindir/$website --exclude 'notthis/cache/*'

I also couldn't figure out the database locking issue, so I added "--skip-lock-tables" to the command. So far so good.

Thanks so much for such a great tutorial.

Posted on: May 9, 2011 at 4:15 amQuote this Comment
nosaukums - Gravatar

nosaukums said:

ouch, formatting screwed up.

anyway.. just add “$DATESTAMP.” to all “$dbname.sql.tar.gz” and to first “$website.tar.gz”

tadaa..

and not tested yet, but I replaced the original “rm $destdir/*.tar.gz” (remove all tar.gz files) with “find /$destdir -type f -mtime +10 -exec rm {} \;” (delete all files older then 10 days.)

Posted on: May 9, 2011 at 4:20 amQuote this Comment
humbert - Gravatar

humbert said:

mmmmm cookies……!

Posted on: October 26, 2011 at 10:56 pmQuote this Comment
Rosamunda - Gravatar

Rosamunda said:

Thanks Christina! I was so helpless about what to do to backup remotely my sites in my brand new MT account!! THANK YOU VERY MUCH!!! Gracias! Rosamunda from Buenos Aires, Argentina

Posted on: August 19, 2012 at 1:04 pmQuote this Comment
Dave F - Gravatar

Dave F said:

Thanks so much for posting this. It was EXACTLY what I was looking for. Perfectly written and commented. THANK YOU! I did run into one snag. I keep getting this error when trying to back up my domain archive:

Broken pipe: Broken pipe
99 retries left, sleeping for 30 seconds
Broken pipe: Broken pipe
98 retries left, sleeping for 30 seconds

The sql archive uploaded to s3 perfectly but it fails with this file for some reason. Any idea why? My site is VERY image dependant (for a modeling agency). The site archive is about 10GB. It created the archive but its failing to upload it from my MT Grid Server.

Any help is greatly appreciated.

Thanks,

Dave

Posted on: May 6, 2013 at 1:10 amQuote this Comment
Dave F - Gravatar

Dave F said:

Just a follow up. I split the very large tar file into 950MB parts and it seems to solve the Broken Pipe problem. Hope this helps someone.

Posted on: May 6, 2013 at 10:04 amQuote this Comment