The Spectre of Latency

Dec 04, 2015

The latest Bond villain has more than 007 to worry about

With an action-packed James Bond blockbuster, we have come to expect far-fetched fantasy when it comes to technology. From invisible, missile-launching cars to the weapon-enabled Omega watch, Mr Bond has been wowing us with his futuristic spy tech since he hit the silver screen.

Mr Bond always has villains concocting equally far-fetched methods of taking over the world. True to form, Spectre follows the tried and tested 007 format, with our hero foiling the sinister motives of a rogue, omnipotent organisation intent on gathering intelligence data from nine countries.

With the ever-growing importance of data in society today, this plot is particularly relevant to the data age in which we are living, and of course touches on the uncomfortable themes of data sharing and data security. Leaving those emotive subjects behind, the truth is that the data generated minute to minute, in locations all over the world, has considerable value and is fast becoming the currency of the future.

[Warning, spoiler alert!]



As portrayed by Spectre’s Bond villain – Ernst Stavro Blofeld (a convincing performance by Christoph Waltz) – the true potential and maximum value of data is only realised when it can be managed, analysed and assimilated at speed. In the case of Spectre’s devious mission, the key was to quickly collate data from nine different sources across the globe, as part of a wider plan to take complete control of the most powerful nations.

The success of such a strategy is directly related to how fast data can get from its source to its final destination. In this case, that destination is a fictional high-tech data analysis centre in the middle of the desert, thousands of miles away from the locations where the data is generated. In order to succeed, Mr Blofeld would want the data to be absolutely current. With the huge volumes involved, combined with the distance the data is travelling, he is inevitably going to experience some serious delays unless he has thought carefully about his approach to moving the data.
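To put the distance problem in perspective, here is a back-of-the-envelope sketch (the 8,000 km figure is purely illustrative, not from the film) of the physical floor that distance puts under latency: light in optical fibre travels at roughly two-thirds of its speed in a vacuum, so no amount of hardware can push a round trip below that bound.

```python
# Back-of-the-envelope sketch: the physics floor on round-trip time.
# Signals in optical fibre travel at roughly 2/3 the speed of light
# in vacuum, so distance alone sets a hard lower bound on latency.

SPEED_IN_FIBRE_KM_S = 200_000  # ~2/3 of c, a common approximation

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1000

# e.g. a desert data centre a hypothetical 8,000 km from the source:
print(round(min_rtt_ms(8000), 1))  # 80.0 ms, before any routing or queuing
```

And that is the best case: real routes are longer than the straight-line distance, and every switch and router along the way adds more delay on top.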

Evil plot or not, this requirement of getting data to where it needs to be, fast, is no different to the real challenge that faces companies big and small today.

Moving volumes of data is completely at the mercy of a hidden and unchanging villain, one that has become a massive inhibitor for important business functions that rely on getting data to where it needs to be as quickly as possible. That villain’s name is Latency, and its speciality is delay. Whether it’s disaster recovery, cloud, Big Data, backup, business continuity, replication or data migration, every strategy should be planned to work efficiently around the limitations this delay presents.

The reality is that as data volumes increase, people struggle ever harder to get performance from their IT infrastructure, and the movement of data only gets slower.

Continuing with Mr Bond, we suspect that if Spectre were built with the technology organisations are struggling with today, its data would be at best days, and more likely weeks, old before anyone could start working with it. In other words, out of date and useless to Blofeld’s evil plot.

Back in the real world, when it comes to moving big volumes of data, companies simply haven’t evolved and are still architecting around the problem of latency. In other words, the industry has historically struggled to find an ingenious Q-style gadget that turbo-charges data from place to place. Instead it has been throwing bigger IT infrastructure at the problem – at massive additional cost – in the hope that this will win the battle with the spectre of latency (see what we did there!).

The problem is – and we’re sure James Bond would testify – that having a bigger gun doesn’t necessarily make you a more effective agent. Intelligence and smarts are what count, and in the case of data movement, simply laying bigger pipes doesn’t overcome the effects of latency. You could quite easily end up less efficient, without making any impact on velocity at all.

Indulge me: Blofeld is smiling with an evil glint in his eye, flicking the switch on his multi-billion-dollar data intelligence plan. Rather than a deluge of data, he gets a trickle, and the plan is foiled by his failure to plan for the last critical stage – getting the data to where he needs it. Austin Powers gold dust!

Despite companies spending millions on data strategies, this is a common scenario. For the unfortunate Blofeld, a lack of planning around latency would mean James had days (even weeks) to locate Spectre’s underground data centre and destroy it before any self-respecting villain could even think of executing his devilish plan.

OK, so I realise that we have shamelessly elaborated on the latest Bond plot to illustrate a very common problem. In the real world, however, we speak to CEOs, CIOs and analysts every day, all of whom have stories about how debilitating latency can be to business.

Our SCION PORTrockIT and WANrockIT technology was built with this problem front and centre, designed to radically improve the performance of existing IT infrastructure and to accelerate data movement. Bridgeworks has shaken (not stirred) the problem, embedding intelligence for optimisation and acceleration to deliver up to 100 times (and beyond) improvement in performance. Disruptive innovation that even Q would back to paralyse the latency issue.

That has got to be worthy of a pat on the back from M himself, right?!

