

Comments on "Adding the Export to AWS Import/Export":


This is great - it closes the loop.

How about exporting a snapshot of an EBS volume?

Rich Bruklis

I have a quick question about the Export feature for Disaster Recovery. How does one retrieve data?

If I have to send you a tape cartridge or hard drive or other media, do you copy it back to that device and send it overnight?

If that is the method, then suppose it is five years later: that media might be totally obsolete (e.g., a tape cartridge format that has since been upgraded or completely retired).

Just curious. I am thinking about DR in terms of RTO and RPO.


Colin Percival

When will you "close the loop" and integrate AWS Export with the retail side of Amazon, so that I can order a drive and have it arrive filled with my data rather than needing to ship it back and forth three times? For anyone who wants their data quickly -- especially once you add support for international users -- this could be a major benefit.


One obvious use would be to keep offsite backups of data stored in S3.

And of course disaster recovery in extreme cases (using S3 as an offsite backup), but for that purpose having the option to buy a drive from Amazon instead of shipping your own might be better since it can save several days.

Mark McAndrew

Dear Jeff,

If AWS can make profitable use of several million volunteered home PCs (assumed-insecure nodes, of course, but incredibly cheap), then you might like to take a quick look at this.

The Charity Engine launches later this year and, with the backing of ten international charities, should quickly become the largest volunteer grid in the world. All it needs now are the founding customers.

Obviously, highly sensitive apps will never be suitable. Public-funded 'big science' projects (climate change, clean energy, particle physics, et al) are ideal, but of greater relevance here is The Charity Engine's massive public cloud becoming a powerful new option for AWS customers.

As The Charity Engine has negligible overheads, it can provide CPU-hours for 'less than cost' (i.e., below the price of the electricity consumed). It will therefore be stunningly cheap to use, at around 1c per CPU-hour or even less: a 1,000 CPU-hour job for $10, a 1,000,000 CPU-hour job for $10,000. These are order-of-magnitude cost savings.
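The pricing claim above is simple arithmetic; a minimal sketch, assuming the hypothetical rate of $0.01 per CPU-hour quoted in the comment, checks the quoted job costs:

```python
# Back-of-the-envelope check of the claimed grid pricing.
# RATE_PER_CPU_HOUR is a hypothetical figure taken from the comment above,
# not an actual published price.
RATE_PER_CPU_HOUR = 0.01  # dollars per CPU-hour


def job_cost(cpu_hours: int, rate: float = RATE_PER_CPU_HOUR) -> float:
    """Total cost in dollars of a job consuming the given CPU-hours."""
    return cpu_hours * rate


print(job_cost(1_000))      # a 1,000 CPU-hour job -> $10
print(job_cost(1_000_000))  # a 1,000,000 CPU-hour job -> $10,000
```

At that rate the two example jobs come out to $10 and $10,000, matching the figures in the comment.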

If there is indeed a market for truly vast amounts of dirt-cheap, embarrassingly parallel processing and/or VERY distributed storage, then we are about to witness the ultimate evolution of the cloud paradigm - a world wide computer.

If AWS would like to be involved, we would welcome the opportunity to meet up and discuss the matter in more detail.

Best regards,

Mark McAndrew
The Charity Engine

PS. Please forgive us contacting you via your blog like this. It is because we've found many spam filters reject the word 'charity'. A sad state of affairs, I'm sure you'll agree.




Steven Roussey

If the data could fit on a Blu-ray disc, that would be a nice option.


How about subscribing to periodic exports onto disk? As suggested here: new hard drives, Blu-ray, whatever works.

The use-case here would be physical backups.

