Blogging on Ansible

A good friend of mine recently voiced the opinion that the day will soon be upon us when the traditional SysAdmin will no longer be relevant and all operations activities will be some variety of DevOps.

Now, I’ve been managing my own infrastructure for some years, but every package has been manually installed and every configuration file modified by hand, resulting in snowflake servers which are nigh-on impossible to recreate exactly and have occasionally cost me sleep. I’ve known for a long time that I could be doing better.

Configuration Mismanagement

In recent years all the clients I’ve contracted with have used either Chef or Puppet to automate the management of their infrastructure.

The funny thing has been that the last couple have both used Puppet in conjunction with some bespoke, cobbled-together system to upload the modules, manifests etc. to the target machines and run Puppet from there, bypassing the usual master/agent setup. Seeing this once was unusual, but seeing it twice in a row struck me as very odd indeed and made me think that there was perhaps something overcomplicated about the “usual” approach.

Ansible Simplicity

With Ansible there is no client/server model: all you need is a repository of playbooks and SSH access to an inventory of machines, and the rest is as simple as running ansible-playbook. Very straightforward.
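As a sketch of how little is involved (the hostnames, file names and task here are illustrative, not taken from my actual setup), a playbook is just YAML describing the desired state of the machines, applied with a single command:

```yaml
# site.yml -- a minimal playbook (illustrative)
---
- hosts: webservers
  tasks:
    - name: ensure nginx is installed
      apt: name=nginx state=present

# applied with: ansible-playbook -i hosts site.yml
# where "hosts" is an inventory file listing the target machines
```

No agents to install, no certificates to sign: if you can SSH to the box, you can manage it.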

Something I’d wanted to do for some time is migrate my own modest infrastructure away from a single overloaded, handcrafted VPS and to get everything under configuration management in the process.

I’d seen a few examples of Ansible and it was one of the buzzwords being bandied about at the time, so it seemed like a good opportunity to get my hands dirty and approach my own environment as I would a client’s, albeit with cheap OpenVZ containers instead of EC2 instances.

Migrate All The Things

One of the things to be migrated was this blog. It had been running on a woefully out-of-date version of WordPress and using the same theme for the past half decade or more, so getting both of those updated along with the infrastructure was a welcome bonus.

Whilst I was shaking things up I also took the opportunity to move the blog from its own domain, sickbiscuit.com, to a subdomain of my personal site, blog.stevenwilkin.com, in an effort to simplify my web presence.

In terms of servers, this blog now runs on one I use to host PHP apps; another handles redirects from the old domain along with other static sites; and a third stores the nightly backups produced. All three have been provisioned with Ansible.

I found using YAML for the playbooks to be a bit unusual to begin with but on the whole Ansible has been easy to get up and running with and so far seems simpler than the alternatives.

Next Steps

Before I can decommission the old VPS completely I still have to migrate my mail server along with a few other apps, which will require a bit of effort. When that’s out of the way I may, for the sake of completeness, set up monitoring with Nagios, and I’ve a strange hankering to start running my own DNS servers. I know.

Time will tell what exactly I’ll do but with Ansible provisioning new machines will be a breeze.

Installing RMagick on Debian Lenny

Just a quick reminder to myself on how I installed RMagick on a Debian 5.0.4 “lenny” VPS. All commands to be run as root.

Build ImageMagick from source

curl -O ftp://ftp.imagemagick.org/pub/ImageMagick/ImageMagick.tar.gz
tar xvzf ImageMagick.tar.gz -C /usr/src/
cd /usr/src/ImageMagick-6.6.6-6/
./configure
make
make install

Install the RMagick Gem

gem install rmagick

Bingo!

As an interesting aside, ImageMagick handles a lot of image formats via delegate libraries. I previously installed RMagick on a CentOS box and had to separately install TrueType fonts which were necessary for the project in question. These had to be installed before ImageMagick and were accessible through yum and the xorg-x11-fonts-truetype package.

Migrating a WordPress database between domains

Occasionally I find myself developing a WordPress theme which will then require moving from development into production, or otherwise having to move a blog between domains.

Apart from the transfer of the files, including plugins, theme and core WordPress installation, there is only one slight gotcha: the database. I’ve never been quite sure why, but WordPress stores the URL of the blog in its database, not once, but twice. For the blog to be migrated the database needs a slight tweak.

As always, create a backup before doing anything. This can be done with phpMyAdmin or using the mysqldump command like so:

$ mysqldump -uUSER -pPASSWORD DATABASE > /path/to/backup.sql

If the database is to be hosted on a different machine you can then import this dump using whichever method you’re comfortable with. Finally, run the following query on the database, substituting your own domain name:

UPDATE
	wp_options
SET
	option_value = 'http://DOMAIN.TLD'
WHERE
	option_name IN ('siteurl', 'home');

That’s it!

MySQL database backup with remote storage

Prevent a disaster

After reading Jeff Atwood’s backup failure last month I decided to finally get around to doing something I’d been intending to do “one of these days” but had in actual fact been putting off for years.

Here are the steps I took to ensure the databases on my webserver are backed up every night, with copies of the dumps stored remotely.

On the remote storage machine

Generate an ssh key pair with an empty password and put the public key on the remote server. This will give our script access to the server without requiring you to enter a password each time:

$ ssh-keygen -t rsa -f /home/steve/code/db_backup/id_rsa
$ scp /home/steve/code/db_backup/id_rsa.pub REMOTEHOST:
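As an aside, the key pair can also be generated without any interactive prompts, which is handy when scripting this step. A sketch using a throwaway directory (the path is illustrative; -N '' supplies the empty passphrase):

```shell
# Generate a passwordless RSA key pair non-interactively;
# -q silences output, -N '' sets an empty passphrase.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$KEYDIR/id_rsa"
ls "$KEYDIR"    # id_rsa  id_rsa.pub
```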

This script will fetch all the backups, logging in as the rsync user and using the private key just generated. It’s located at /home/steve/code/db_backup/sync_backups.sh:

#!/usr/bin/env bash
rsync -e "ssh -l rsync -i /home/steve/code/db_backup/id_rsa" -avz REMOTEHOST:mysql/ /data/primary/backup/mysql/

Have this happen automatically daily at 12:20am:

$ crontab -l
# m h  dom mon dow   command
20	0	*	*	*	/home/steve/code/db_backup/sync_backups.sh
$

On the machine to be backed up

Create a new user and allow ssh access with the previously generated key:

# adduser rsync
# mkdir ~rsync/.ssh
# mv ~steve/id_rsa.pub ~rsync/.ssh/authorized_keys
# chown -R rsync:rsync ~rsync/.ssh
# chmod 700 ~rsync/.ssh
# chmod 400 ~rsync/.ssh/authorized_keys
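Note that sshd’s StrictModes setting (on by default) will silently refuse the key if the .ssh directory or authorized_keys is group- or world-writable, or owned by the wrong user. A quick sanity check of the expected modes, sketched against a throwaway directory standing in for ~rsync/.ssh:

```shell
# Stand-in for ~rsync/.ssh: the directory should be 700 and
# authorized_keys readable only by its owner (400).
SSHDIR=$(mktemp -d)
chmod 700 "$SSHDIR"
touch "$SSHDIR/authorized_keys"
chmod 400 "$SSHDIR/authorized_keys"
stat -c '%a' "$SSHDIR" "$SSHDIR/authorized_keys"    # 700 then 400
```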

This script will dump all available databases and is located at /root/bin/backup_databases.sh:

#!/usr/bin/env bash

# dump all available databases
# SJW

AUTH='-uroot -pROOTPASSWORD'
DBS=$(mysql $AUTH --skip-column-names -e 'SHOW DATABASES;')
BACKUPS='/home/rsync/mysql/'

for DB in $DBS
do
	# information_schema is virtual and not worth dumping
	[ "$DB" = 'information_schema' ] && continue
	mysqldump $AUTH "$DB" > "$BACKUPS$(date +%Y%m%d%H%M)_$DB.sql"
done

# delete backups older than 5 days
find "$BACKUPS" -mtime +5 -type f -delete
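The 5-day retention step at the end of the script can be tried out safely against a throwaway directory before being let loose on real dumps. A sketch (file names are illustrative, and find’s -delete is used for brevity):

```shell
# Simulate the retention rule: files older than 5 days go,
# fresh ones stay.
BACKUPS=$(mktemp -d)
touch -d '7 days ago' "$BACKUPS/200001010000_old.sql"
touch "$BACKUPS/$(date +%Y%m%d%H%M)_fresh.sql"
find "$BACKUPS" -mtime +5 -type f -delete
ls "$BACKUPS"    # only the fresh dump remains
```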

Have the script run nightly at 12:10am via cron:

# crontab -l
# m h  dom mon dow   command
10  0 * * * /root/bin/backup_databases.sh
#

Closing thoughts

This approach is relatively straightforward: everything happens automatically, and it could easily be extended to cover mailboxes, source code repositories, uploaded content etc. However, for mission-critical databases master-slave replication may be more appropriate. For further reading you may enjoy JWZ’s thoughts on backups.

# shutdown -h now

I’ve just shut down the beige box that was home to this blog for just over a year and a half, until I started renting a VPS.

I had intended to shut down this machine since the start of the year but never got around to it; after a techie chat on IM with Dave Dripps earlier in the day I decided to just “pull the finger out” and do the needful.

I archived the contents of /var/www/htdocs and my home directory for safekeeping, copying them over to my file server, and dumped all the MySQL databases I still hadn’t migrated.

Self-hosting a site was a great learning opportunity, but not something a business could be built upon. Comparing the energy cost of running a machine 24×7 against renting a VPS priced in American Dollars, the choice wasn’t hard to make; it’s just a shame it took me half a year to finally flick the switch.

Goodnight substance!

Building the brand – stevenwilkin.com gets a facelift

I’ve just released the latest iteration of my “professional” site, stevenwilkin.com.

There’s the possibility the designers I’m working with will think my design-fu is weak, but the site badly needed something as it has barely been put to use in the years I’ve owned it.

I don’t know how the Steven Wilkin web experience will evolve, but the words of Eric S. Raymond are ringing true when he mentions not hiding behind a hacker-style alias, so I may put more emphasis on this domain in the future.

The future of the web, and my own contribution to it, is looking exciting these days, so who knows what will transpire; of course I invite you all along for the ride :)

Updating WordPress via Subversion: it works!

The last time a new version was released I decided to update my WordPress installation using Subversion, the idea being that this would make future updates easier.

Well, the good news is that this technique works :)

All it took was 3 simple steps:

  1. $ cd /var/www/sickbiscuit.com/blog
  2. $ svn switch http://svn.automattic.com/wordpress/tags/2.5/ .
  3. launch wp-admin/upgrade.php via web-browser

To be safe I backed up the database prior to the update and so far everything seems good, job’s a good ‘un!

Updating WordPress via Subversion

I read a few months back that Stuart Langridge was using Subversion to keep his WordPress up-to-date and I thought: “that’s clever” and didn’t do anything about it.

Today I was talking to Matt and he mentioned updating one of his WordPress installations, and I noticed I was due an update myself. I downloaded the latest release and was having a quick skim through the upgrade procedure to make sure I wasn’t forgetting anything when I spotted a link to the Subversion update instructions… I’m off work sick today and have the time, so I decided to give it a go.

I backed up my database and checked out the latest stable version:

steve@decaf:~$ cd /var/www/sickbiscuit.com
steve@decaf:/var/www/sickbiscuit.com$ svn co http://svn.automattic.com/wordpress/tags/2.3.3/

I copied over my database config file, theme, plugins and uploads and ran the wp-admin/upgrade.php script via my browser. The final act was to modify the symlink pointing to the WordPress directory and it worked first time. Profit!

Hopefully whenever I need to update in the future all I’ll have to do is use the following:

svn sw http://svn.automattic.com/wordpress/tags/NEW_VERSION/

followed by running the database upgrade script again. Quick and painless.

Site migration

I’ve just finished migrating sickbiscuit.com from my home development machine to my new VPS.

DNS records have been updated and decaf is now handling mail and web traffic for the domain, although the only thing I’ve copied over is this blog.

Hopefully this will give me the motivation needed to spruce things up a bit as the last iteration of sickbiscuit.com looked like it was designed by a programmer ;)

ISP style mail server on Debian VPS

Last month I decided to invest in a VPS from VPSLink.

I had been considering this for a while, especially after my experience using an Ubuntu VPS with Infurious, and after two power failures within as many weeks due to building work near my home, my hand was forced. No more hosting on a Linux box at the end of a DSL connection for me!

I opted for a Xen-based VPS running Debian Etch. I've really come to love APT-based distros after running Kubuntu on my desktop before I was endowed with a Mac, and with the relative ease of setting up all the Infurious services on Ubuntu, I decided to go upstream. It's a far cry from my past experiences with Slackware :)

My first priority was getting my LAMP stack up and running and I spent my free time over the past few days following this excellent tutorial.

Just like many other things in the FLOSS world: you get the instructions, follow those instructions and it Just Works. This instance was no different: all I really had to do was copy and paste commands & configuration settings, and I probably spent more time doing background reading, testing each part as I went along and keeping track of all the changes I made on my personal wiki.

The result is I can now host email accounts for as many domains as I wish, provide access to those accounts over IMAPS and perform server-side virus scanning and spam filtering.

I've said it before and I'll say it again: I love free software!