Why the Operating System Matters

In IT today, we love to note that the infrastructure layer has become commoditized. This started with virtualization, which let us run many virtual machines on a single physical machine. Cloud has taken commoditization further, with a key value proposition of delivering cloud services on any standard server or virtualized environment, enabling easier scalability and faster service delivery, among other benefits.

This commoditization argument has now moved up to the infrastructure software layer, with some conjecturing that the operating system running on top of the hardware, hypervisor, or container is just as easily exchangeable. This is said especially of open source OSes: Ubuntu, Debian, CentOS, and the many other flavors of community Linux are sometimes seen as interchangeable. This has been a particular point of debate in the OpenStack community.

The reality is that it does matter which operating system you use, and not all are created equal. If you are a start-up, running a pilot project, or working in a lab, then almost any free operating system is probably fine. But if you are managing enterprise IT? Not so much.

When I joined Microsoft’s fledgling security products team in the early 2000s, “security” and “Microsoft” in the same sentence were an oxymoron, primarily because of the number of critical vulnerabilities in, and outright exploits against, the Windows operating system. While attacks on Windows continue, Microsoft significantly shifted its resources and priorities toward security, building an anti-malware and security response team, among other initiatives and improvements in and around the product. That happened because the company and its customers realized that the foundational security of the OS was a requirement.

Linux has taken a bite out of the Windows server market, with Red Hat Enterprise Linux leading the way among open source-based operating systems. Why? One reason is Red Hat’s commitment to making Linux more secure and its continued focus on security in each release. Of course, Red Hat delivers many of the other characteristics enterprises require, such as hardening, stability, support, hardware and software certifications, and easy scalability; but it is often the security capabilities of Red Hat Enterprise Linux that lead enterprises to choose it, whether as a Windows replacement or over other Linux flavors.

Foundationally, a secure OS enables secure coding practices, code hardening, and the patching of vulnerabilities. A secure OS also supports your security policies around access control, identity management, encryption, and integration. Not only does it keep the bad guys out; it also enables controlled access for your authorized users.
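That enforcement happens in the OS itself, not in your applications. A minimal sketch of OS-enforced access control, assuming a POSIX shell on Linux (the filenames and the example value are illustrative):

```shell
# The kernel, not the application, enforces who can read this file.

umask 077                       # new files readable/writable by owner only
tmpdir=$(mktemp -d)
secret="$tmpdir/secret.conf"
echo "db_password=example" > "$secret"   # illustrative placeholder value

stat -c '%a' "$secret"          # prints 600: owner read/write, no group/other access

chmod 640 "$secret"             # grant group read, e.g. for a service-account group
stat -c '%a' "$secret"          # prints 640

rm -rf "$tmpdir"
```

Every process on the machine is subject to these checks, which is exactly why the integrity of the OS applying them matters so much.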

This is why a secure operating system needs to be the foundation of OpenStack cloud services, containers, virtualization, application platforms, and other solutions across the IT stack. With the advent of containers, this need becomes even more critical: while containers are ideal for scaling applications, particularly microservices-based apps, they are, in fact, an OS technology, and each container inherits the security strengths and weaknesses of the OS it packages and the kernel it runs on. Therefore, which OS is running in the container matters, particularly for security.
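In practice, that OS choice is made on the first line of a container image definition. A hedged sketch of what this looks like in a Dockerfile, using Red Hat’s Universal Base Image as one example of a maintained, security-patched base (any comparable enterprise base image works the same way):

```dockerfile
# The FROM line pins the OS userland every container from this image inherits,
# including its libraries, package manager, and security update stream.
FROM registry.access.redhat.com/ubi9/ubi

# Apply the base OS's security updates at build time, so known
# vulnerabilities in the inherited userland are patched.
RUN dnf -y update && dnf clean all

# Run as an unprivileged user; the host kernel still enforces isolation.
USER 1001
```

Swap that `FROM` line for an unmaintained community image and every container built on it silently inherits the weaker security posture.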

I emphasize security, but the OS you use has many other implications across the stack. With OpenStack, for example, the OS underpins the many OpenStack services and determines which third-party drivers are certified and supported. The OS also directly impacts the performance of an OpenStack cloud.

Some organizations will continue to “roll their own” Linux, and if you have an IT team that can dedicate itself to the reliability and security of the operating system, that may be a good option for you. In reality, though, few internal IT teams have experts in operating systems, infrastructure, applications, management, and security, as well as the many other emerging technology areas, under one roof. Your IT energy and investment should be directed to the areas that can truly impact the business and where you have core competence.

Most IT organizations are still struggling to modernize existing applications and infrastructure, while the lines of business demand new cloud-native applications to compete in today’s digital economy. The best place for IT to help the business compete is delivering applications that meet this need quickly and cost-effectively.

One way to do that is by using open source tools and software. This is the power of open source. It can drive fast innovation. However, enterprises should have the choice of using more stable, supported versions of open source code. Open standards have emerged to help companies identify best practices and supported projects.

While open source code is behind the majority of new applications, cloud services, and big data solutions, enterprises can also choose vendor-supported versions of these projects, which bring a level of support and services typical of other “packaged” software products.

The key is balancing innovation with security. As I’ve often said, the CMO might be pushing IT for new applications or analytics, but at the end of the day, if there is a data breach or security issue, it will be the CIO whose butt is on the line. Achieving that balance is today’s CIO mandate.

Innovation is needed to give an organization a competitive edge. But security has also become a competitive differentiator. Embracing innovation without security can lead to a PR embarrassment at best, and lost revenue or a destroyed business at worst. Ask any of the companies that have recently suffered a severe data breach and had customer records stolen.

That cool new mobile app might be a high priority to get your users to shop online, but only if they can trust your brand to keep their personal information secure.

IT leaders need to look beyond the hype. They need to maintain security principles and understand when and where security cannot be compromised. One of those places is the operating system.