Scott Noteboom, CEO/Founder, Litbit
Considering the companies I have worked for, it is not uncommon for people to ask me about the secrets, wonders, and magic that occur behind the scenes of an Apple or Yahoo data center, or a stealthy new startup. Because loose lips sink ships, I have a duty to leave the details aside. However, not wanting to fully disappoint, I do like to paint a few strokes of color that allow the curious to use their own imagination: to view our world as a modern equivalent of the Wonderful World of Oz. Besides, after years of working in these places, I have come to realize that we are hardly in Kansas anymore.
Our data center world is just as bright and majestic as Oz:
- We’ve got our own Sparkling Emerald City of servers, network, and storage. We take great pride in modernizing its technology every 3 years or so.
- We’ve got a Yellow Brick Road of application code, whose advancements progress monthly, or even faster, via the latest in continuous-delivery development cycles.
- We work hard to maintain a Golden Aura of Security that protects us from the dangers of the outside world, through fortification updates that can occur daily.
That said, shall we pay no mind to the man behind the curtain?
Just as in Oz, the bright, obvious aspects (the Emerald City, the Yellow Brick Road, and the Golden Aura) can easily distract, hypnotize, or blind us to the not-so-obvious gaps that exist behind the curtain. The Man Behind the Curtain (MBTC) represents those hidden gaps: the biggest weaknesses, dependencies, and vulnerabilities in an ecosystem. Unfortunately, just as in Oz, our data center world has its own MBTC:
The Man Behind the Curtain in the data center world consists of the industrial or “Deep Infrastructure” controls that automate the operation of our power, cooling and physical security systems. Without the controls provided by this MBTC, our data centers no longer shine. They turn to black.
A look behind the Deep Infrastructure controls curtain exposes an equivalent of the insecure, wrinkled old man who controlled the levers and switches in the Land of Oz. Built on technologies that are multiple generations behind those powering the systems and software within the data center, our deep infrastructure controls are just as antiquated:
- Encryption and device-level access and permissions controls (authentication, authorization, and accounting) are non-existent throughout Deep Infrastructure devices. Imagine running your servers without any usernames, passwords, SSH, or encryption.
- The antiquated 8- and 16-bit control electronics still in use in these devices today offer compute and data-processing capabilities commonly 200x less powerful than even a modern $35 Raspberry Pi hobbyist board.
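The authentication gap is visible at the protocol level itself. As a minimal sketch, here is a complete Modbus/TCP "read holding registers" request built with nothing but Python's standard library; the register address and unit ID are hypothetical. Note that the frame has no field anywhere for a credential, session token, or signature: anyone who can reach the controller on the network can issue it.

```python
import struct

def modbus_read_request(transaction_id: int, unit_id: int,
                        start_addr: int, quantity: int) -> bytes:
    """Build a complete Modbus/TCP request for function 0x03
    (Read Holding Registers). The MBAP header carries only a
    transaction id, protocol id (always 0), byte count, and unit id.
    There is no username, key, or signature anywhere in the frame."""
    pdu = struct.pack(">BHH", 0x03, start_addr, quantity)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000,
                       len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_read_request(transaction_id=1, unit_id=17,
                            start_addr=0x006B, quantity=3)
print(frame.hex())  # 12 bytes total; every byte is addressing or data
```

Every byte in the 12-byte frame is routing, addressing, or payload, which is exactly the "no username, password, or encryption" situation described above.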
"Until our deep Infrastructure possesses the capability for continuous technology advancement, from both an electronics and software perspective, the true advantages of the modern, software defined data center cannot be realized"
In a world where robust, automated code distributions within the data center are often pushed bi-weekly or monthly, the legacy embedded code that powers our controls often remains static for years. And when deep infrastructure software upgrades are necessary, code distribution is typically delivered via truck.
From a communications perspective, many of the core protocols used in controls (Modbus, BACnet, etc.) are based on 1970s and 1980s technology. Servers that communicate via 10 Gigabit Ethernet today can be powered or cooled by systems that still talk serially at 9600 baud.
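To put that gap in numbers, here is a quick back-of-the-envelope comparison, assuming common 8N1 serial framing (10 wire bits per data byte) and ignoring Ethernet framing overhead for simplicity:

```python
# Time to move a 1 KiB payload over each link.
payload_bytes = 1024

# 9600 baud serial with 8N1 framing: 10 wire bits per byte -> 960 bytes/s.
serial_bytes_per_sec = 9600 / 10
serial_seconds = payload_bytes / serial_bytes_per_sec

# 10 Gigabit Ethernet, raw line rate.
tengig_bytes_per_sec = 10e9 / 8
tengig_seconds = payload_bytes / tengig_bytes_per_sec

print(f"9600 baud serial: {serial_seconds:.3f} s")    # roughly 1.07 s
print(f"10 GbE:           {tengig_seconds * 1e6:.2f} us")
print(f"ratio:            {serial_seconds / tengig_seconds:,.0f}x slower")
```

Under these assumptions the serial control link is on the order of a million times slower than the server network it supports.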
In a nutshell, the systems that provide the most critical controls in our data centers are currently the most vulnerable from a security and technology perspective. Compounding this challenge is long-standing technical debt, driven by a lack of upgradability in the gear over its 15-year life cycle. Until our Deep Infrastructure possesses the capability for continuous technology advancement, from both an electronics and software perspective, the true advantages of the modern, software-defined data center cannot be realized.
Several important principles come to mind when we look towards the future of deep infrastructure:
1) Security- We must properly secure our control systems with end-to-end encryption and point-level permissions management. All data stored on, or transmitted between, systems should be zero knowledge. Left unfixed, current Deep Infrastructure security weaknesses hold very high potential to become globally catastrophic.
2) Upgradability- Because industrial Deep Infrastructure gear is commonly built to last 15 years, the only way to avoid a greater-than-15-year technology innovation cycle is to build in effective upgrade capability for control electronics (blade-based, enabling evolutionary tracking of both Moore’s law and the price:performance sweet spots for CPU, memory, and storage), networking, and software.
3) Software Defined- Once upgradable, an effective continuous software delivery mechanism must be in place. This mechanism will enable continuous, software-driven improvements in areas such as performance, reliability, security, and efficiency. It shifts Deep Infrastructure from being human-driven and software-dependent to being software-defined and real-time data-driven. This is where the majority of future improvements in cost, efficiency, security, and usability will come from.
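As a minimal sketch of the first principle, device telemetry could at least be authenticated in transit so that receiving systems can detect spoofed or tampered readings. This example uses Python's standard-library hmac module with a hypothetical pre-shared key and sensor name; a production design would pair this with real key management and transport encryption such as TLS, not message authentication alone.

```python
import hashlib
import hmac
import json

# Hypothetical pre-shared key provisioned to one controller at install time.
DEVICE_KEY = b"example-key-provisioned-at-install"

def sign_reading(reading: dict) -> bytes:
    """Serialize a telemetry reading and append an HMAC-SHA256 tag,
    so the receiving system can detect tampering or spoofed devices."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return payload + b"." + tag.encode()

def verify_reading(message: bytes) -> dict:
    """Check the tag in constant time before trusting the payload."""
    payload, _, tag = message.rpartition(b".")
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected.encode(), tag):
        raise ValueError("telemetry failed authentication")
    return json.loads(payload)

msg = sign_reading({"sensor": "crah-07-supply-temp", "celsius": 18.5})
print(verify_reading(msg))
```

Even this small step closes the gap described earlier, where control traffic carries no proof at all of who sent it or whether it was altered in flight.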
It’s time for the antiquated Man Behind the Curtain to retire his role in driving the deep infrastructure controls of our data centers. It’s time to modernize this long-static legacy, before we encounter our own version of Dorothy’s tornado. And, when it comes to where we’d like to see realization of our goals for a modern software driven data center: there’s no place like home.