Random Thoughts

Tech, words and musings from an Englishman in Seattle

Server Meltdown Part Six - It's Alive!

Now that the shiny new Linux system is up and running, it was relatively easy to bring it online as my home server, replacing all the functionality of my now-dead system.

This post will deal with everything except the installation of Samba (which provides Windows Domain Controller services) - those details will be in an upcoming post.

First up, some basic stuff. I need to be able to log into the box from another system as it's going to run headless and I want the monitor that it's currently using back on my Vista box.

Basic Setup

While logged in as root on the console, add my user account (created during setup) to /etc/sudoers using visudo.
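
The line I add looks something like this (substitute your own username); visudo checks the syntax before it saves:

steve   ALL=(ALL) ALL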

Next, edit /etc/apt/sources.list, removing the cdrom entries as all further package installs will be using the net and I don't want apt-get complaining that it can't access the cdrom drive.
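
The entries in question look something like this (the exact label depends on your install media):

deb cdrom:[Debian GNU/Linux 4.0 _Etch_ - Official DVD]/ etch main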

Make sure the system is up to date:

apt-get update
apt-get upgrade
apt-get dist-upgrade

Now, I want to be able to perform the rest of the setup remotely, so install ssh and friends in order to ssh into the system.

apt-get install ssh

Joy! Now I can log in and perform the rest of the installation remotely rather than at the console.

Network services

The clock needs to be set right, so:

apt-get install ntpdate
ntpdate time.windows.com

Yes, I used Microsoft's time server - it's the only one I can remember off the top of my head!

Next, I need to get the network time service (NTP) running on the machine. It will be providing time services to all the other machines on the network and periodically setting its own clock against upstream time servers.

apt-get install ntp ntp-doc

You'll need to edit /etc/ntp.conf and then run /etc/init.d/ntp restart to get it to notice the changes. Note that pretty much everything I talk about here has either a config file in /etc or its own directory of config files, also in /etc. They're pretty self-explanatory - just take a look at the config files themselves and the related documentation. Everything also has a script in /etc/init.d to control its operation.
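
For reference, the heart of /etc/ntp.conf is just a few server lines; this sketch uses the public NTP pool and a placeholder subnet for the LAN:

# Upstream servers to set our own clock against
server 0.pool.ntp.org iburst
server 1.pool.ntp.org iburst
server 2.pool.ntp.org iburst

# Let machines on the local subnet (a placeholder here) query this server
restrict 192.168.1.0 mask 255.255.255.0 nomodify notrap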

For this reinstall, I diff'd my backed-up config files against the newly installed ones to make sure there wasn't anything new I needed to be aware of, then copied my old files over and restarted the service.

Next up, bind - the DNS server. I have a local DNS domain in my house that all the clients have an entry in; the Linux box serves up that domain and caches external lookups, so the only nameserver the client machines need to know about is this box.

apt-get install bind9 bind9-doc

Bind is probably the hardest thing to configure. Handily, I had all my backups (yay, me!). I'll probably write up a dedicated post on it at some point, though one thing did bite me a little: if you're restoring your configuration files from backup and get an auth error when trying to restart or reload the server, just killall named and start it up fresh, as the authorization key in /etc/bind/rndc.key probably changed when you copied across the old data.
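
To give a flavor of what's involved, a master zone declaration in /etc/bind/named.conf.local looks something like this - the domain name and file name here are placeholders, not my actual setup:

zone "home.lan" {
    type master;
    file "/etc/bind/db.home.lan";
};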

At this point, edit /etc/resolv.conf and point the nameserver line at localhost (127.0.0.1) so that client binaries on the system itself use your shiny new nameserver.
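
The resulting /etc/resolv.conf is tiny - something like this, with the search domain being whatever your local domain is:

search home.lan
nameserver 127.0.0.1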

Next up, DHCP. This is a little service that client machines use to get an IP address. In a home environment this is normally handled by your wireless or broadband router, but I prefer to have the server do it, as other useful information, such as name server and time server addresses, is also passed to the client. Configuration is fairly simple - check out the documentation.

apt-get install dhcp3-server
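
For example, a minimal /etc/dhcp3/dhcpd.conf for a single home subnet looks something like this - all the addresses and the domain are placeholders for my actual network:

subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;        # pool of addresses to hand out
  option routers 192.168.1.1;               # the broadband router
  option domain-name "home.lan";            # the local DNS domain
  option domain-name-servers 192.168.1.2;   # this server (bind)
  option ntp-servers 192.168.1.2;           # this server (ntp)
}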

Sweet! The base services are now all configured. At this point it's probably a good idea to reboot the server to make sure all these services come up nice and cleanly.

A Few Other Things That I Do

I like to be able to mount drives from other machines on the Linux box. For example, my Infrant NAS exports a “backup” share that the server backs itself up to. I use autofs for this.

apt-get install autofs

Edit /etc/auto.master and uncomment the line for auto.net. The backup share is now available at /net/blob/c/backup. FYI, 'Blob' is the name of my Infrant NAS box…
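
The relevant line in /etc/auto.master ships commented out; uncommented, it reads:

/net    /etc/auto.net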

After that, it was just a matter of reinstalling my crontabs from backup, and now this blog and a few other things are automatically backed up to the NAS. Cool. Safety is back…
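
A typical entry looks something like this (the paths are placeholders) - a nightly rsync of the blog's files over to the automounted NAS share:

# Nightly at 3am: mirror the blog over to the NAS backup share
0 3 * * * rsync -a /var/www/blog/ /net/blob/c/backup/blog/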

Another thing to mention is that I use Amazon S3 to back up my photos and videos. The scripts that do that are written in Ruby, so that also needs to be installed.

apt-get install ruby rubygems

I need rubygems installed as it brings with it the openssl ruby package.

The last thing (modulo Samba) that I need is dynamic DNS updating. I use dyndns.org so that I can have a friendly DNS name to connect to the server when I'm not at home. The Linux box handles updating the DynDNS database with whatever IP address Verizon happens to be giving me at the time of the update. I use inadyn to accomplish this.

apt-get install inadyn

Unfortunately, inadyn doesn't come with any form of script to get it started, or any useful documentation whatsoever. So I just copied an existing script in /etc/init.d and got it going with a few minor modifications. Let me know if you're interested in a copy.
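
For the curious, a bare-bones version looks something like the following - this is a sketch of the kind of script I mean rather than my exact copy, and it assumes inadyn picks up its settings (username, password, alias) from /etc/inadyn.conf:

#!/bin/sh
# Minimal /etc/init.d/inadyn - start and stop the dynamic DNS updater.

DAEMON=/usr/sbin/inadyn
NAME=inadyn

case "$1" in
  start)
    echo "Starting $NAME"
    start-stop-daemon --start --exec $DAEMON -- --background
    ;;
  stop)
    echo "Stopping $NAME"
    start-stop-daemon --stop --name $NAME
    ;;
  restart)
    $0 stop
    $0 start
    ;;
  *)
    echo "Usage: $0 {start|stop|restart}"
    exit 1
    ;;
esac

exit 0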

All in all, the entire process took about an hour to get everything set up once the base Linux system was successfully installed.

Next up - Samba!

Persistent Storage for Amazon EC2 Services

Talk about serendipity…

I was just talking about using EC2 and S3 for backups, and along comes a solution that actually makes it easier to build.

Without persistent storage, it would be hard to persist the backed-up data to S3 as you'd have to do the file-to-S3 mapping (I think) by yourself. With this feature, you effectively create a blob of S3 storage and use it as a drive in your EC2 machine instance.

Nice.

Actually, that's how I thought it worked all along, but was weirded out earlier today when I found out that it didn't.

Well, as of 9pm tonight it does.

Serendipity.

Backups Moved To Amazon S3

Previously, I'd been backing up all the irreplaceable household data (movies, photos, etc…) to my colo'd server using Scott Ludwig's most excellent Link-Backup script.

Unfortunately, I've been running out of space on that server and am thinking about decommissioning it, so as the first step of the decommissioning I need to move my offsite backups somewhere else. The obvious place appears to be Amazon's S3 service, as the price of storage is very cheap - $0.15 per gigabyte per month. There are also transfer charges, but in the general case my traffic is upload-only, so I'm ignoring those. I have about 100GB to store, which works out at $15 per month.

Very reasonable.

The next thing to figure out is “how do I perform the backup?” I initially thought that the easiest thing to do would be to continue to use Link-Backup. All I'd need to do is bring up an EC2 instance every time I need to back up and then use Link-Backup as I had been doing previously.

The big problem with this was that bringing up an instance is not trivial. Relatively simple, but not trivial. Also, the EC2 tools require Java 1.5 to run, and that's not available on PPC Debian Linux. Sigh.

I then did some searching around and found Jeremy Zawodny's excellent blog post on the subject, which provides a nice roundup of the available solutions. I decided to go with the Ruby-based s3sync.

Setup was pretty simple. I first created an AWS account, collected the required data and configured s3sync. I then created some scripts as suggested by John Eberly and I was off to the races!
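
The backup script boils down to something like this - the keys, bucket name, and paths are all placeholders:

#!/bin/bash
# Push the local photo tree up to S3; only changed files are transferred.
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key

cd /path/to/s3sync
ruby s3sync.rb -r --ssl --delete /home/steve/photos/ mybucket:photos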

I ran some tests to make sure I could:

  1. Backup a small directory that had some hierarchy in it.
  2. Restore that directory.
  3. Make some small changes to the local directory and make sure that only the changes were synced.
  4. Pull back a single file using the included s3cmd script.

It all worked perfectly, so now I have ~70GB of photos syncing up to S3.

Sweet!

Next steps:

  1. Add everything else (home movies, etc…) to the backup.
  2. Rebuild my little webapp using Google's AppEngine.
  3. Move my blogs somewhere else (maybe Media Temple - any opinions? I need to run Movable Type and MySQL).
  4. Decommission server.
  5. Use saved cash to buy more toys.

Sounds like a plan.

© 2001 to present, Steve Lacey.