Is AWS Snowmobile the best route for getting your data into the cloud?

Dec 12, 2016

Did you see the latest AWS release to hit the news last week? For anyone who missed the launch of the ‘AWS Snowmobile’, let me save you some effort by summarizing the story: it is the latest re:Invent announcement from AWS, aimed at solving the issue of moving large volumes of data to the cloud, though it would seem that getting data back from the Cloud remains an open subject.

The newly released ‘Snowmobile’ is a tamper-proof, climate-controlled and waterproof truck, built as a data migration solution that can store up to 100 Petabytes of data. The idea is that you load up your data, which is then driven by road to Amazon’s closest data centre to be uploaded to the Cloud.

Super Speed Data Migration?

Amazon’s Jeff Barr has stated: “Each Snowmobile includes a network cable connected to a high-speed switch capable of supporting one terabit per second of data transfer spread across multiple 40 gigabits per second connections,”

Continuing in a follow-on blog he clarified that, “Assuming that your existing network can transfer data at that rate, you can fill a Snowmobile in about 10 days.”

This has to be the best example yet (don’t forget last year’s re:Invent Snowball) of how even the big guys are still trying to find ways to overcome the challenges that ever-increasing volumes of data present to any enterprise that needs to move data beyond its on-prem location and into the cloud.

Let’s face it, a solution that can ensure secure, super-speed data migration is something that AWS really wants to own. However AWS, is this the best way to do it?

I’ll go on to explain why I find this mystifying in a minute, but for now, I want to talk a bit about the practicalities of the Snowmobile….

So how does the AWS Snowmobile work?

Firstly, I’m going to pick up on Mr Barr’s statement that this 18-wheeler truck is “capable of supporting one terabit per second of data transfer spread across multiple 40 gigabits per second connections.” Whilst this sounds very quick, we did a little calculation: to sustain that upload speed you would actually need 25 connections running at 40Gb per second each (1 terabit per second ÷ 40 gigabits per second = 25). Now I don’t know about you, but I can’t think there would be many businesses with an existing IT infrastructure that has 25 spare 40Gb-per-second connections.
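The sums here are simple enough to check in a few lines of Python (decimal units assumed throughout, so 1 Tb = 1,000 Gb, as network vendors quote them):

```python
# Back-of-the-envelope check on the Snowmobile's quoted link speeds.

TOTAL_TBPS = 1                 # quoted aggregate transfer rate, terabits/s
LINK_GBPS = 40                 # speed of each individual connection, gigabits/s

links_needed = (TOTAL_TBPS * 1000) / LINK_GBPS
print(f"40Gb/s links needed to sustain 1 Tb/s: {links_needed:.0f}")
# → 25
```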

Secondly, imagine you did have these 25 connections and were able to get your 100 PB of data onto the truck in ‘record’ time. Now what? The truck needs to make its way to the nearest Amazon Cloud depot, where the data will need human intervention to be uploaded to the AWS cloud. Once it leaves your premises, your precious data is in the hands of your friendly delivery driver, not to mention other careful road users, until it reaches its destination.

Thirdly, there are other considerations around risk, data change speeds, upload capabilities at every AWS datacenter…

Let’s consider what this means in practical terms.

1. You need to invest in, or have access to, 25 x 40Gb connections

2. The Snowmobile apparently consumes a whopping 370 kW just to power the cooling system. Will that fit with your eco-responsibility plans, or your drive to become carbon neutral?

3. Check you have that much power available to you, physical space in your car park, etc.

4. To achieve the 1 terabit a second quoted, you need to be moving 100 GB of data per second (I confess to being a simpleton and have opted to calculate using 10 bits per byte, i.e. 100 PB = 100,000,000 GB). By my calculations, you are then looking at 11.57 days to upload your data into the 18-wheeler.

5. Then there are at least a couple of days for the drive, and another 11.57 days to upload into AWS. So, by my calculations, you are approaching a month to move the full 100 PB.
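The timeline above can be sketched in the same back-of-the-envelope style. The two-day drive is my own placeholder figure, and the 10-bits-per-byte simplification is kept from the calculation above:

```python
# Rough end-to-end timeline for moving 100 PB via Snowmobile.
# Assumes 10 bits per byte (to allow for protocol overhead),
# so 1 Tb/s ~= 100 GB/s. The drive time is an illustrative guess.

DATA_GB = 100_000_000          # 100 PB expressed in GB (decimal units)
RATE_GB_PER_S = 100            # ~1 Tb/s at 10 bits per byte
DRIVE_DAYS = 2                 # assumed road time to the AWS data centre

upload_days = DATA_GB / RATE_GB_PER_S / 86_400       # seconds -> days
total_days = upload_days + DRIVE_DAYS + upload_days  # fill, drive, unload
print(f"Fill the truck: {upload_days:.2f} days")
print(f"End to end:     {total_days:.2f} days")
```

Roughly 11.57 days each way plus the drive: approaching a month, as noted above.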

Why am I mystified?

So now that I’ve highlighted some of the practicalities of the AWS Snowmobile, I want to get back to why I’ve been mystified about it since I read the news. To do this, I need to give you some background about my clients, for those who don’t know much about what I do.

I work with Enterprise businesses that all have a common problem. They are struggling to overcome business-critical challenges involved in moving large volumes of data, complicated by network latency and packet loss. They could be struggling to achieve a complete weekly backup required by industry regulation, or living in fear that their inability to move data quickly to where it needs to be leaves them at risk, should disaster strike.

The fact is that most businesses need to be able to move, store and analyze data at speed to give them competitive advantage and mitigate business risk. But with data volumes getting larger by the week, traditional methods of moving data over a network are now not able to cope with the demand. In fact, they are buckling under the pressure. Everyone is looking for a new solution, including, it would seem, the mighty AWS.

What people don’t realize is that I have been working with a solution that does the exact job the AWS Snowmobile was built to do, except there is no need to take your data on a little road trip, nor will you need a 370 kW cooling system.

If you take a look at some of my other posts, you’ll see I have been writing avidly for the last few months about transformational technology already out in the marketplace that can solve these data migration challenges. My purpose in writing these posts was to make the point that you shouldn’t operate in a vacuum when it comes to tried and tested technology. You need to be well researched and make it your business to know what is changing, because innovative companies are emerging all the time, creating game-changing solutions designed to fix some of the new (and old) challenges that businesses face.

Could exploiting the network be a better option?

The fact is, Bridgeworks – the company I work for – has technology that will do EXACTLY what the Snowmobile does, except for a few key differentiators.

• It can work within the existing network, accelerating to AWS today.

• If 100 PB really is your goal, then provide us with 25 x 40Gb connections to AWS and we will actually be MUCH quicker.

• Your data wouldn’t need to leave the network at all. Given that you can encrypt at source, we will accelerate secured data at the same speeds AWS quotes, except there is no road trip involved and no additional upload time: just secure data being fed straight to AWS.

Getting data into the Cloud is one thing but don’t you need to get it out too?

Jeff Barr suggests Snowmobiles are ideal for ageing data centres that still rely on racks full of disk and tape drives storing data. His idea is that by moving all that data onto current storage technologies via the cloud, enterprises will spend less time and money trying to squeeze additional performance out of ageing hardware.

That may be true, but at Bridgeworks our technology is all about accelerating data: getting your data not just IN but OUT of the Cloud as fast as you can feed us and your links will allow. The shocking fact is that most networks are used at less than 20% of their potential performance. The common misconception is that bigger pipes = faster speeds, which just doesn’t hold up when the performance killer, latency, is involved.
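The latency point is easy to illustrate: a single TCP stream can never move more than one window of data per round trip, so its ceiling is window ÷ round-trip time, however fat the link underneath. A quick sketch (the window and RTT figures below are illustrative assumptions, not measurements):

```python
# Why a bigger pipe alone doesn't help: a single TCP stream is capped at
# (window size / round-trip time), regardless of the link's raw bandwidth.
# The window and RTT values are illustrative assumptions.

WINDOW_BYTES = 64 * 1024       # a classic 64 KB TCP window
RTT_SECONDS = 0.08             # 80 ms round trip, e.g. a long-haul WAN link

max_bytes_per_s = WINDOW_BYTES / RTT_SECONDS
max_mbps = max_bytes_per_s * 8 / 1_000_000
print(f"Per-stream ceiling: {max_mbps:.1f} Mb/s")
# → 6.6 Mb/s, a tiny fraction of even a 1Gb/s link, let alone 40Gb/s
```

On those assumed figures, the pipe could be 40Gb/s and the stream would still crawl, which is why latency, not bandwidth, is so often the real bottleneck.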

We use machine intelligence (yes, Artificial Intelligence!) to optimize the performance of the pipes so that you get massively accelerated data transfer speeds: up to 200 times acceleration, and even faster in some cases.

The net of this: to do the same transfer that Jeff Barr talked about in the ‘Snowmobile’ press release, give Bridgeworks access to the 25 x 40Gb connections and we could complete the upload in 11.57 days, the difference being that the data would then already be securely in the AWS Cloud. My suggestion for a leap forward would be bigger pipes into the AWS datacenters. That would solve this little problem, given that we already have the technology to exploit the links.

If you don’t believe me, a version of our technology built specifically for AWS customers is already in the marketplace for SAN data, with IP flows launching shortly (https://aws.amazon.com/marketplace/pp/B0147AIBJU).

So before you get on board with the – in my opinion – “retro” option of trucking your data to the cloud via the ‘Snowmobile’, my advice is to do a little digging into what else is out there, because my guess is that the AWS Snowmobile is a short-term solution that will quickly be superseded by transformational technology and 100Gb pipes.

I would be interested to hear your views on the AWS Snowmobile.