Let's stop getting physical

Software-defined data centers are less reliant on people, and that’s a good thing

Written by Max Smolaks, News Editor at Datacenter Dynamics Published Thursday, 19 July 2018 08:04

The modern data center is increasingly being defined by software. Rather than deploying hardware with functionality permanently baked into silicon, more and more organizations are opting for cheaper, more flexible systems that can be configured on the spot.

In a perfectly executed software-defined data center (SDDC), adapting to changing conditions doesn’t require an expensive upgrade, just a few lines of code.

This journey started many years ago with the arrival of server virtualization – a method of abstracting resources like compute and memory, and redistributing them at will. The software-defined approach then seeped into other domains of IT; first storage, then networking. We’ve also seen promises of software-defined power and software-defined cooling.

In SDDC, all of these elements feed into data center infrastructure management software (DCIM) – yet another level of abstraction that collects and analyzes operational data from a myriad of sensors dotted around the facility to help operators make informed decisions.
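The aggregation step a DCIM platform performs can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual software: the sensor names, readings, and the choice of PUE (Power Usage Effectiveness) as the headline metric are all assumptions made for the example.

```python
# Hypothetical sketch of DCIM-style aggregation: collect power readings
# from sensors dotted around the facility and derive a headline metric
# (here, PUE) to help operators make informed decisions.
# All sensor names and values below are invented for illustration.

def pue(readings: dict) -> float:
    """Power Usage Effectiveness = total facility power / IT power."""
    it_power = sum(v for k, v in readings.items() if k.startswith("it_"))
    total_power = sum(readings.values())
    return total_power / it_power

readings = {
    "it_rack_a1": 42.0,      # kW drawn by IT equipment
    "it_rack_a2": 38.0,
    "cooling_crac_1": 25.0,  # kW drawn by supporting infrastructure
    "lighting": 3.0,
}
print(round(pue(readings), 2))  # 1.35
```

A real DCIM suite does far more (trending, alarming, capacity planning), but the principle is the same: one more layer of abstraction turning raw telemetry into decisions.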

This kind of approach is used extensively by the biggest Internet companies, such as Google and Amazon Web Services. Over the course of the past three or four years, the same technology has been filtering down to the world of enterprise computing, with impressive results.

The benefits of SDDC are many: since most of the features are delivered in software, software-defined equipment is usually cheaper, more configurable and enables a high degree of automation, which, in turn, helps cut costs and speed up response to external events.

But SDDC also requires the kind of expertise that was rarely seen on the data center floor: in the facilities of the future, engineers will have to work alongside programmers.

Humans aren’t all that great

One of the main aims of SDDC is to eliminate the need for physical contact with data center equipment. When people and machines interact, terrible things can happen: human error is still one of the biggest challenges to data center availability, responsible for 22 percent of all unplanned outages, according to a widely quoted study by the Ponemon Institute.

Making software responsible for configuration, monitoring and management of your kit minimizes the risk of accidents. Data centers are dangerous buildings that have killed people in the past, and we would all like to avoid this happening in the future.

Software is also much better at repetitive tasks like OS installation, updates and backup operations. And some of the emerging automated management techniques can actually consolidate workloads to ensure that the data center is only using the bare minimum of hardware, helping reduce electricity bills – even though this approach might have a negative impact on performance.

APIs rule

All of this wizardry wouldn't be possible if the industry hadn't mastered application programming interfaces, or APIs. These sets of universal subroutine definitions, protocols and tools serve as the communications channel for software, linking disparate systems from various vendors into a seamless fabric.

Think of APIs as virtual patch cables – server hardware comes in all shapes and sizes, but since the industry has standardized on Ethernet, almost any kind of device will have an exposed Ethernet jack to connect it to others. So it is with software: almost any application designed to send and receive data will have an exposed API to connect it to other applications.
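The "virtual patch cable" idea can be made concrete with a small sketch. The endpoint shape below is entirely invented – a real vendor would publish its own paths and schema – but the pattern is universal: serialize a request in an agreed format, parse the reply the same way, regardless of whose equipment sits on the other end.

```python
import json

# Hypothetical sketch of "plugging into" a vendor's management API.
# The field names ("rack", "setpoint_c", "status") are invented for
# illustration; any real API would document its own schema.

def build_request(rack: str, setpoint_c: float) -> str:
    """Serialize a cooling-setpoint change as the JSON an API might expect."""
    return json.dumps({"rack": rack, "setpoint_c": setpoint_c})

def parse_response(body: str) -> bool:
    """Consume any vendor's reply the same way: parse JSON, check status."""
    return json.loads(body).get("status") == "ok"

request_body = build_request("A-12", 22.5)
print(parse_response('{"status": "ok", "applied": true}'))  # True
```

Because both sides agree on the format, a DCIM platform from one vendor can drive cooling gear from another – the software equivalent of any Ethernet cable fitting any Ethernet jack.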

Our robot overlords

I believe that the natural progression of SDDC will eventually take us to completely automated, ‘lights out’ facilities that will be serviced by robots, not people.

Automated tape libraries for long-term storage have already gone mainstream, and I've seen a robotic switch that can make new connections on the fly. Meanwhile, immersion liquid cooling has eliminated the need for complex cooling setups – instead of networks of pipes, chillers and CRACs, all you require is a tub of mineral oil.

Robots could eventually learn to replace servers and even entire racks, using many of the technologies developed for the shipping and logistics industry – expect Amazon to take the lead on this.

Do not be alarmed

Whether you are running your own data center or leasing space in a colocation facility, chances are you are already using some technologies that fall under the SDDC umbrella. There’s no need to rush and buy new hardware – this approach will likely become standard in time for your next technology refresh.