Part of our modus operandi at AWS is to help customers lower the cost of operating their infrastructure. When we achieve better efficiencies in our own data centers, we pass those savings on to our customers. That can come in the form of price reductions, or through a variety of payment models.
With Amazon EC2, for example, you can request compute resources as and when you need them, and pay a fixed on-demand price per hour. For more predictable workloads, you can pay a small up-front price for reserved capacity. This allows us to plan our own operations more efficiently, so we can pass those savings back to you in the form of a reduced hourly price. However, even with reserved capacity and on-demand requests combined, there can still be unused capacity in the AWS data centers.
Rather than let this capacity go to waste, we offer it to customers at a variable price on the EC2 Spot Market. You can name your own price for this underutilized capacity by placing bids, and achieve significant savings compared to on-demand or reserved capacity. Spot Instances let you run time-insensitive tasks at a lower cost, or scale up your infrastructure, adding more bang per unit cost.
To illustrate how customers are using Spot Instances, I want to introduce you to some who have seen steep cost reductions by integrating spot into their workflows. Today's willing volunteer is ecommerce data legend, TellApart.
TellApart reduced costs by 75%
"At TellApart, our goal is to build a big data platform that enables retailers to unlock the power of their customer data", said Julie Black, Director of Engineering at TellApart. "Every day we’re analyzing massive amounts of shopping data from hundreds of millions of users and building complex machine learning algorithms that empower retailers to identify their highest quality customers and deliver them perfectly targeted marketing."
So how do they use spot instances?
"At the core of our system is our data processing pipeline. It serves as the backbone of our machine learning system and produces thousands of business metrics consumed daily by our clients", continues Julie, "To make this all work, we use Amazon Elastic MapReduce to bring up a Hadoop cluster that can batch process log data generated by our distributed fleet of machines."
"We use Spot Instances for all Hadoop Task Nodes for the flexibility and cost effectiveness we require. Through simple bidding optimizations on the Spot market, we have been able to cut our data processing costs more than 75%".
Using Spot Instances with Hadoop and Amazon Elastic MapReduce is straightforward: you can specify your bid price right in the AWS console when spinning up a new cluster, in just a few clicks. Easy money.
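The same pattern TellApart describes can be expressed programmatically. Below is a minimal sketch of the instance-group layout for such a cluster: on-demand master and core nodes, plus spot task nodes with a bid price. The instance types, counts, and bid price are illustrative assumptions, not TellApart's actual configuration.

```python
# A sketch of an Elastic MapReduce instance-group configuration with
# spot task nodes. All types, counts, and prices below are assumed
# example values, not real recommendations.
instance_groups = [
    # Master and core nodes run on-demand, so the cluster and its HDFS
    # data survive spot-price fluctuations.
    {"Name": "Master", "InstanceRole": "MASTER", "InstanceType": "m1.large",
     "InstanceCount": 1, "Market": "ON_DEMAND"},
    {"Name": "Core", "InstanceRole": "CORE", "InstanceType": "m1.large",
     "InstanceCount": 4, "Market": "ON_DEMAND"},
    # Task nodes only run map/reduce work and store no HDFS data, so
    # losing them to a spot-price spike only slows the job down.
    {"Name": "Task", "InstanceRole": "TASK", "InstanceType": "m1.large",
     "InstanceCount": 16, "Market": "SPOT", "BidPrice": "0.08"},
]

# Only the task group bids on the spot market.
spot_groups = [g for g in instance_groups if g["Market"] == "SPOT"]
print(spot_groups[0]["BidPrice"])
```

Because task nodes hold no HDFS data, they are the natural place to use spot capacity: if instances are reclaimed, Hadoop simply reschedules the lost tasks elsewhere.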
The Amazon EC2 Spotathon, 2012
If you'd like to shine a spotlight on what you do (or are planning to do) with the spot market, you'll be interested in the very first Spotathon. There's a grand prize of $2500 of AWS credit for the winner who demonstrates the most innovative, low-cost use of Spot Instances in their application architecture.
That's around 43,000 compute hours of the new m3.xlarge instances at the current spot price.
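As a back-of-the-envelope check of that figure, the sketch below divides the prize credit by an assumed m3.xlarge spot price of $0.058/hour, inferred from the numbers above rather than quoted from the pricing page:

```python
# Rough conversion of prize credit into spot compute hours.
# The spot price here is an assumption for illustration; actual
# spot prices vary over time and by Availability Zone.
prize_credit = 2500.00       # Spotathon grand prize, in USD
spot_price = 0.058           # assumed m3.xlarge spot price, USD per hour

compute_hours = prize_credit / spot_price
print(round(compute_hours))  # → 43103, i.e. around 43,000 hours
```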
You can register for the Spotathon today. Looking forward to seeing how you're using spot.