Bursting, Big Data and 'Baddies'

Written by Phil Alsop, Editor, DCS Europe. Published 18 February 2016.

Aside from visiting their girlfriends and/or lying on the bathroom floor/in their beds for much of the festive break, various sons played copious amounts of console games. While two were content with taking, respectively, Watford FC and Union Berlin (inspired by a recent study visit to that city) to various national, European and seemingly global footballing successes, one was very much into victories of a different kind: the military type. Whereas the football was self-contained, in that it required no internet connection, the war games were heavily reliant on being online. Unfortunately, there were screams of anguish on several occasions when the server went down and battle was suspended. Had the gaming company responsible for the service had access to extra server capacity via the cloud, there would have been no interruption to the online experience.

Of course, it may be that the company in question had taken the view that, having already banked my son’s annual subscription, it mattered little if he and, presumably, some of his fellow gamers were denied access to their games for a short while: he was unlikely, or unable, to find the same game elsewhere, so ensuring continual online access for all subscribing gamers would have been an unnecessary ‘extra’ cost. Whatever the reason for the loss of service, using the cloud to burst extra capacity is becoming an increasingly important tool in the IT/data centre department’s armoury. The lack of reported Christmas sales-related website crashes suggests that more and more businesses are switching on to the bursting option, and this trend is likely to be a feature of the cloud landscape during 2016.

While bursting simply adds extra capacity via the public cloud, the likeliest cloud trend for 2016 will be the accessing of new services and, in particular, Big Data applications. The cloud offers enterprise users the very real opportunity of renting infrastructure that they could not afford to purchase in order to run Big Data queries that could have a massive, positive impact on their organisations’ day-to-day strategy and profitability. While Big Data is likely to be the headline example of how the public cloud brings previously unaffordable IT and data centre infrastructure/applications within reach of virtually all end users over the coming months, there’s no doubt that there will be others.
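For readers curious about the mechanics, here is a minimal sketch of that bursting logic in Python: a control loop that watches on-premise utilisation and rents (or releases) public cloud capacity as demand crosses a threshold. The function names and thresholds are hypothetical illustrations, not any particular provider’s API; a real deployment would typically lean on a provider’s autoscaling service rather than hand-rolled polling.

```python
import random
import time

# Thresholds are illustrative assumptions, not recommendations.
BURST_THRESHOLD = 0.85    # rent public cloud capacity above 85% utilisation
RELEASE_THRESHOLD = 0.50  # hand it back once utilisation falls below 50%

def onprem_utilisation() -> float:
    """Hypothetical monitoring hook; simulated here with random values."""
    return random.random()

def provision_public_instances(count: int) -> list[str]:
    """Hypothetical provider call; a real version would hit a cloud API."""
    print(f"Bursting: renting {count} public cloud instance(s)")
    return [f"instance-{i}" for i in range(count)]

def release_public_instances(instances: list[str]) -> None:
    """Hypothetical provider call; stops the meter on rented capacity."""
    print(f"Releasing {len(instances)} rented instance(s)")

rented: list[str] = []
for _ in range(10):  # a real control loop would run indefinitely
    load = onprem_utilisation()
    if load > BURST_THRESHOLD and not rented:
        rented = provision_public_instances(count=2)
    elif load < RELEASE_THRESHOLD and rented:
        release_public_instances(rented)
        rented = []
    time.sleep(1)
```

The appeal, as with the gaming example above, is that the extra capacity is paid for only while it is actually needed.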

Perhaps the most interesting of these is security-as-a-service. With the issue of cyber security rarely away from the headlines, most enterprises are acutely aware of the need to improve the security of their IT/data centre assets. Accessing security technology via the cloud just might prove to be the optimum way of addressing the situation: a fixed, manageable monthly cost; a service that is updated and improved regularly; and, most importantly, access to a level of security that could not be afforded if it had to be purchased outright.

So, plenty of new reasons to consider the more specialised technologies and services offered by colocation/public cloud/managed service providers, alongside the already well-understood and well-used ‘non-essential, lowest common denominator’ options, such as email and database management.

However, there remains one significant obstacle to any real disruption of the generally accepted hybrid cloud model of the present time: security. This may seem slightly confusing, given the positivity expressed earlier around security-as-a-service, but, for many end users, IT and data centre infrastructure security remains a major concern. Logical or not, the thought process is that information contained within the ‘four walls’ of an enterprise is much safer than any data hosted in the cloud. The chances are that any particular cloud/managed service provider has a rather better security infrastructure than the average enterprise, so security should not be an issue. However, there’s no denying that the ‘baddies’ enjoy nothing more than targeting blue-chip, household names, and if your data happens to be residing in a high-profile cloud, it could well be at greater risk than if it lies anonymously in your own data centre.

There’s no easy solution to this dilemma, other than carrying out a rigorous risk assessment before making any decision concerning the balance of your public/private/hybrid cloud strategy, coupled with the growing recognition that, wherever your data resides, sooner or later it will be compromised; it’s more about how you react to the inevitable than about trying to mount an impossible defence. That said, by going the colocation/service provider route, end users give themselves the best chance of minimising security issues, as they will have access to a range of security services, including DDoS mitigation, intrusion detection management, managed security monitoring, penetration testing/vulnerability assessments and compliance advice, that are unlikely to be available to the same level in-house.

We can end with one last ‘b’ – balance. It might not be an amazing insight to suggest that the vast majority of end users have already recognised that the hybrid cloud model – balancing private and public infrastructure and applications – is the only sensible option. However, it will be surprising if this balance doesn’t tilt more in favour of the public cloud over the coming months.