Ansible for Server Provisioning

Download the src (github).

Blog Post Series

The first thing we are going to cover is Ansible. This is not a “Welcome to Ansible” post. This will be specifically about how I used Ansible to deploy my blog using Docker. If Ansible is new to you and you would like some introductory material, you can watch some videos supplied by Ansible and/or read this short but thorough walk-through of Ansible.

This post became much longer than I had anticipated. There was going to be a section covering the changes to WordPress that make it get its configuration from the environment. That will now be in the next post. This means that you may have issues if you deploy a brand new install of WordPress following this post alone. It is a chicken-and-egg situation: Ansible sets up Docker, but the changes to WordPress rely on the Docker configuration, which is not set up yet, so either way steps would be missing. I feel that Ansible is the better starting point.

Why Ansible

Ansible is just one of many tools in the provisioning space. The other major ones are Chef, Puppet, and SaltStack. This is not a comparison of all these technologies; I have chosen Ansible, and here is why. Continue reading “Ansible for Server Provisioning”

Demo video for Ansible and docker

I have created a short video where I demonstrate using Ansible and Docker together. I create a new droplet on DigitalOcean and stand up a working copy of my blog in about 7 minutes. The best part is that it is completely reproducible. I will cover everything in the video in depth over the next few weeks. Continue reading “Demo video for Ansible and docker”

Moved my blog using Ansible and Docker

This blog has had a long and generally boring history. It started off on an old computer in my basement in late 2011. I then needed to upgrade Ubuntu, which led me to move the site into the cloud on Amazon EC2. It was a standard LAMP server running Ubuntu 12.04. That was three years ago, and I needed to upgrade to the next Ubuntu LTS. To move to the new server, I created an Ansible playbook to set up the server and Docker to run the site. I will be writing this up over the next few weeks.

In addition to moving to Docker, I will also cover WordPress performance. Previously I did not worry about performance much. WordPress makes it easy to run a blog, but you must do all the optimizations yourself. I did not spend much time optimizing, as I had other projects and this was, and still is, a personal side project.

A Workflow Story

Bad workflows suck.

This post is going to be a little different from my other posts. It is going to be a partially fictional story. I say partially because all of the events I describe have happened to me in one form or another.

You open Gmail to check your email as Aptana is opening. Another Amazon email. You only searched once for some info on a TV that your friend posted about buying on Facebook. Now you have a list of the top-selling LED TVs. You feel spammed, intruded upon. Although you did click on two of the TVs. Each was at least 5 inches bigger than his.

An email catches your eye: an error report from your app. You had just added a Facebook login. Pushed it out to the prod server yesterday. It just blew up while someone was signing up.

Continue reading “A Workflow Story”

Getting Back to Blogging

I have not posted anything here in a while. My plan is to create a new post every month. I previously attempted this and was successful for a little while in 2012.

The last two months I have been really busy. The first thing that took a lot of my time was my primary job. I work for a hospital, and we have just finished a huge project where we replaced the core application all the physicians and nurses use. I was working ten- to twelve-hour days for a little while.

The other project was a small responsive mobile site. I am planning on adding some posts about my experiences here. I wrote the site in MVC 4, a .NET-based framework. It made sense to use it based on the architecture of the organization. If you have never tried out MVC 4 or used Entity Framework and LINQ, you should. All of these technologies are a joy to work with.


I have not forgotten about Python and Django. I have a few posts in the works and you should see them here shortly. I do want to give a shout out to the book Two Scoops of Django. I am still pretty new to Django and this book really helped me. It has quite a few best practices that I immediately started to implement.

Amazon S3 Bucket Permissions and s3cmd

I recently moved this website to run in the cloud. It was previously running on an old computer in my basement. I had been planning to play with EC2 machines, and I also needed to update my Ubuntu install since it was on 10.04. One more mark in the “move the server to the cloud” column was that I switched ISPs. AT&T was my ISP for the last 3 years, but Comcast wooed me with a deal. I was experiencing some intermittent slowness and connection drops with AT&T, so I decided to switch over to Comcast.

One thing that I did not know when I switched was that Comcast does not allow any hosting. I noticed a few days after switching that my site was not up. At first I thought it might have been the router, as I replaced that when I changed ISPs. After a little research I discovered that Comcast blocks all incoming ports. All of this led me to the epiphany that now was the perfect time to move the site onto EC2.

Most of Amazon’s AWS documentation is terse and not definitive in any way.

Amazon S3 and s3cmd

I am not really going to go into how to set up and run an EC2 instance. There is a lot of information on the internet about how to do this. Another reason is that EC2 instance management does not have any definitive answers, as each situation will require different solutions. I will note that a simple setup (one server) can be done completely with the web interface. Most tutorials will have you use the EC2 command line tools.

I will talk about S3 and backing up your data with s3cmd. Backing up your data is similar to flossing: you are always told you need to do it, but it is very easy to skip. It also does not affect you until you have your teeth cleaned or your server goes down. I did not have this problem, as I had good backups that made moving my server to the cloud very easy.

The old server was using Dropbox as a poor man’s cloud sync. I did not run it as a service; I would just start it manually and let it run in the background. This was not a huge issue, as I only rebooted the server once or twice a year. I decided to change that to syncing to S3.

Backup Scripts

Linux is really easy to back up. Most configs are text files, so all you have to do is tar them up. All that I back up is:

  • sql dump
  • tar of the folder that held the website
  • tar of etc
  • a list of installed packages (dpkg --get-selections)

Cron these up and you have a disaster ready backup plan.
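As a sketch, the four bullet points above fit in one small script. To keep it runnable anywhere, this version backs up a stand-in directory; on a real server you would point SITE_DIR at your web root and uncomment the mysqldump, /etc, and dpkg lines (and run it from a root cron job). The paths and database name are placeholders, not my actual setup.

```shell
#!/bin/sh
# Nightly backup sketch -- paths and the database name are illustrative.
set -e
SITE_DIR=$(mktemp -d)                   # stand-in for your web root, e.g. /var/www
echo "hello" > "$SITE_DIR/index.html"   # demo content so the tar has something in it
BACKUP_DIR=$(mktemp -d)                 # in practice a fixed path like /var/backups
DATE=$(date +%Y%m%d)

# mysqldump your_db > "$BACKUP_DIR/db-$DATE.sql"            # sql dump
tar -czf "$BACKUP_DIR/site-$DATE.tar.gz" -C "$SITE_DIR" .   # website folder
# tar -czf "$BACKUP_DIR/etc-$DATE.tar.gz" -C / etc          # /etc (needs root)
# dpkg --get-selections > "$BACKUP_DIR/packages-$DATE.txt"  # installed packages
ls "$BACKUP_DIR"
```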


s3cmd is a tool that allows you to get and put data on S3. Ubuntu 12.04 has it in the repository, so installation is easy. Configuring it is a little more difficult. Well, I should say that configuring S3 with the correct permissions is more difficult.
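On Ubuntu 12.04 the install and initial setup look roughly like this (the configure step prompts interactively for the Access Key ID and secret, which we create next):

```shell
# Install s3cmd from the Ubuntu repositories
sudo apt-get install s3cmd

# Interactive setup: prompts for your access key and secret
# and writes them to ~/.s3cfg
s3cmd --configure
```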

You can easily just use your main Amazon AWS user and give it full permissions, but you should have a separate user. The first thing is to create a user in IAM (AWS Identity and Access Management). Make sure you copy the Access Key ID and secret; if you do not copy this, you cannot get the secret again. You can create a new one, but each user can only have two sets of credentials (you can delete old ones).

This is where things get difficult. There is no documentation that clearly states what permissions you should add. Here are the permissions I finally went with that allow s3cmd to work:

  "Statement": [
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::[your bucket]/*"]
       "Effect": "Allow",
       "Action": "s3:ListAllMyBuckets",
       "Resource": "*",
       "Condition": {}

Please note that these are attached to the user. There are two statements here. The first one allows the user to do everything inside of that bucket. The other allows the user to list all the buckets.

s3cmd will now work. You tell it to put your files on S3: s3cmd put file_name s3://[bucket name]/file_name. That’s it. My current backups only cost me around $0.10 a month. There’s really no reason not to be backing up your server to the cloud.
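Wired together with cron, the whole plan might look like this crontab fragment (the script path, schedule, and bucket name are placeholders for illustration):

```shell
# /etc/cron.d/site-backup -- illustrative fragment, adjust paths and bucket to your setup
# 02:30 nightly: run the backup script, then push the results to S3
30 2 * * * root /usr/local/bin/backup.sh && s3cmd put /var/backups/*.tar.gz s3://your-bucket/backups/
```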

New Posts coming soon

I have fallen behind on writing posts through Christmas and the new year. I plan on making more posts and GitHub repos. I can tell by the traffic and the starring/forking of my repos that these posts at least help a few people out there. Now that I have stated my excuses (a saying I have heard is that excuses only make you feel better), it is time to lay out a road map of my next few posts. This will also force me to actually make good on these statements.

Some things I will get to:

  • Django – I have found a new framework that I like. Unfortunately that means that I will most likely not be doing many more Zend Framework posts.
  • Facebook – Because I have been using Django I had to write some code to access Facebook.
  • WebRTC – Otherwise known as webcam access through JavaScript. I will have a post about how to use it.
  • Amazon Web Services – I recently moved my website to be hosted on EC2. I will make a post or two about that.

You can see that even though I have not been making posts, I have been busy.

To finish off the post here is a photo of my dog Gizmo: