The Strong Case for Interoperability, Part 1

What is interoperability and how did it get started? The idea of interoperability was born in the world of manufacturing. Before the industrial revolution, machines and inventions were created as one-off systems. Whether you were making weapons, building houses, or fabricating clothes, things took a long time to create and repair because they were essentially made from scratch. Eventually, manufacturers adopted concepts such as interchangeable parts and the assembly line to make production easier. As these practices caught on, it made sense not only to let parts be shared among products, but also to let different products work together. That ability of equipment or systems to work together is the definition of interoperability.

Concepts are great, but how are they actually used?

Let’s stick with non-IT uses right now. The easiest example of interoperability is your car. There are hundreds of car manufacturers in the world making thousands of different models. Similarly, there are hundreds of oil companies and refineries, not to mention thousands of gas stations. Many different companies are involved just so you can fill your car with gasoline. Chances are you can pull into any gas station and the pump will work with your car. Why? Because these companies all worked together to be interoperable. They agreed on a standard that made life easier for the consumer and grew the industry.

The automobile industry is just one example. The medical, military, and manufacturing industries are all big proponents of interoperability. Can you imagine if hospital equipment was different from hospital to hospital? Room to room?

What does interoperability mean in IT?

The term interoperability has been in the computer science lexicon for a long time. It has always been about allowing systems to work together, but its meaning has steadily evolved with the industry. In the early days of computers, code was compiled for specific hardware, and moving it from one computer to another took real effort. As the field evolved, high-level languages were created that could run on multiple platforms. Then networks were created, allowing computers to talk to each other. Standards started to form as different companies tried to work with their partners, but those standards were often still proprietary to a single large vendor such as IBM or Microsoft.

Finally, a more modern form of interoperability developed in the IT world: open standards. Groups dedicated to the formation and evolution of standards, such as ISO, IETF, OASIS, and IEEE, were formed. Companies and individuals began to see the value in working together to become more interoperable and advance their industries.

That’s all good, but why is it not a higher priority in IT software and cyber security?

Given that the concept of interoperability has been around for a long time, we should have this figured out by now, right? Well, yes and no. There are two issues that most companies struggle with when it comes to interoperability:

The first is security. A hacker is trying to get information that they would not normally have the right to access, and that makes many companies and creators of computer software paranoid. The thought process seems to be, “If I reveal as little information as possible about my system, then it is more secure.” This theory, often called security through obscurity, comes mostly from business and the protection of trade secrets. Some companies still believe it, but most security experts will tell you it’s dead wrong. The modern view is that the more open a system is to inspection, the more flaws will be found and reported, and the more its owner can fix to make it better. In this way, embracing open standards and interoperability is actually more secure.

The second is competition. If a company’s product is interoperable, chances are another company offers the same service or product. This leads some companies to rebel against interoperability and instead lock their customers into a proprietary solution. Why face competition if you can avoid it? Almost any experienced IT manager or engineer has a story about a system they simply cannot remove because switching to an alternative platform is just too much effort. Vendors of these systems frequently let them stagnate, opting not to update them or add value for customers. This practice works in the short term, but as the customer base grows, users tend to get frustrated with the lack of options down the road.

What about Fornetix?

The unofficial motto of Fornetix is “In Standards We Trust.” We truly believe that the more interoperable our products are, the better. To that end, we perform interoperability testing with the Key Management Interoperability Protocol (KMIP) community every year. You can view our consistently strong results here. We help develop and support those standards by actively engaging in the KMIP and PKCS #11 technical committees. The end result is better value for our customers and for the cyber security industry as a whole.

Continued: The Strong Case for Interoperability – Part 2 – Transitions