Launched in 2006, Amazon Web Services (AWS) ‘owns’ the global market for cloud Infrastructure as a Service (IaaS). In my last post I described our first steps with SaaS: Google Apps, Instructure Canvas, etc. In 2012–2013 we began the fun process of experimenting with moving some of our on-premises services to the cloud with AWS.

Our first experiment with AWS was for the 2012 launch of our community platform, ‘Spaces’. We began on the Heroku Platform as a Service (PaaS) – with files stored in Amazon’s Simple Storage Service (S3). S3 holds the files that community members upload: PDF documents, images, etc. The cost is astoundingly low – a whole 3.3 cents per GB per month! (Sydney pricing at 5/3/2015).
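To put that price in perspective, here is a back-of-the-envelope calculation at the quoted rate. Note this is storage only – a real bill also includes request and data-transfer charges:

```python
# Back-of-the-envelope S3 storage cost at the quoted Sydney rate.
# Storage only: real bills also add request and data-transfer charges.

PRICE_PER_GB_MONTH = 0.033  # USD, as quoted (5/3/2015)

def monthly_storage_cost(gigabytes):
    """Storage-only cost per month in USD."""
    return gigabytes * PRICE_PER_GB_MONTH

# e.g. a terabyte of uploaded documents and images:
print(round(monthly_storage_cost(1000), 2))  # → 33.0
```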

In those days AWS had yet to open Australian data centres. Our files were stored in the US alongside the Heroku service (itself hosted on AWS).

The AWS Storage Gateway

2012 brought with it a few exciting moves for AWS + Newington, with two key developments:

  1. Amazon opened a Sydney data centre and
  2. They launched the Storage Gateway service.

The Storage Gateway is really incredible:

  • We provision a 32TB volume in AWS (private address space)
  • That volume gets presented as an iSCSI target
  • On campus we deploy a VM image that runs the gateway in either ‘gateway-cached’ or ‘gateway-stored’ mode, connecting to the remote volume and presenting it as a CIFS share.
  • We just pay for what we use.
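For the curious, the on-campus end of that sequence is standard iSCSI initiator plumbing. A minimal sketch, with the commands collected as data so the order is explicit – the gateway address, target IQN, server and share names below are all hypothetical placeholders, not our actual configuration:

```python
# Client-side steps for attaching a Storage Gateway volume, collected as
# data so the sequence is explicit. All addresses, IQNs, hostnames and
# paths below are hypothetical placeholders, not our real configuration.

GATEWAY_IP = "10.0.0.50"  # gateway VM in private address space (assumed)
TARGET_IQN = "iqn.1997-05.com.amazon:sgw-example-volume"  # illustrative IQN

STEPS = [
    # 1. Discover the iSCSI target the volume is presented as
    f"iscsiadm -m discovery -t sendtargets -p {GATEWAY_IP}:3260",
    # 2. Log in so the volume appears as a local block device
    f"iscsiadm -m node -T {TARGET_IQN} -p {GATEWAY_IP}:3260 --login",
    # 3. Once formatted and shared (e.g. via Samba), clients mount it as CIFS
    "mount -t cifs //fileserver/learning-media /mnt/learning-media",
]

if __name__ == "__main__":
    for n, cmd in enumerate(STEPS, 1):
        print(n, cmd)
```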

Our initial use case was to create a ‘learning media’ drive – a CIFS share larger than the spare SAN storage we had available, suited to the ‘write once; seldom read’ pattern we often have with campus videos and project work.

Outgrowing Heroku

Platform as a Service with Heroku was amazingly easy for our Spaces app (Ruby on Rails + PostgreSQL). What we rapidly discovered, however, was that it was getting very expensive to keep scaling. Meanwhile we had growing expertise with Amazon’s tools, and in 2013 we steadily transitioned the service to Amazon. I’d love to do a full write-up, but in lieu of that, the quick summary of the service is:

  • Rails app + web server on the Elastic Compute Cloud (EC2) – initially a single m1.large instance
  • Redis for caching and job queue on ElastiCache
  • Postgres with automatic failover and snapshots on Relational Database Service (RDS)
  • Load balancing and other magic from the Elastic Load Balancer (ELB)
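The Redis layer mostly does cache-aside duty in front of Postgres. Here is that pattern sketched with a plain dict standing in for Redis – the function, record shape and key names are illustrative, not taken from the actual app:

```python
# Cache-aside pattern: the role ElastiCache (Redis) plays in front of RDS.
# A plain dict stands in for Redis; names and record shape are illustrative.

cache = {}    # stand-in for the Redis instance
db_reads = 0  # counts how often we fall through to the database

def fetch_space(space_id):
    """Return a community 'space' record, caching it after the first read."""
    global db_reads
    key = f"space:{space_id}"
    if key in cache:                 # cache hit: no database round-trip
        return cache[key]
    db_reads += 1                    # cache miss: query the database...
    record = {"id": space_id, "name": f"Space {space_id}"}  # stand-in query
    cache[key] = record              # ...and populate the cache (Redis SET)
    return record

fetch_space(1)
fetch_space(1)
print(db_reads)  # → 1  (second call served from cache)
```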

By late 2013, however, it was clear that our work with AWS had the potential to be far more than a side project. With the right strategy we could realistically speak of a time when the College would run few if any critical services on site. It was time to get down and dirty and do some planning.