Sometimes two people find each other very attractive, but they are just plain incompatible. Alas, it will never be. Similarly, when technologies intermingle, compatibility can stand in the way. The massive barrier for the Internet of Things is the lack of standardized models, reported Popular Science last year: “The result is that the Internet of Things is actually hundreds of smaller, fractured Internets.”
In other words, the technology press was letting the technology industry know that it was not going to take the Internet of Things seriously until integration was simplified. The lack of a common platform or operating system for IoT devices represented a potential stumbling block for the tech segment that could damage its emerging profitability. After all, the field’s growth potential is staggering. BI Intelligence forecast in September 2014 that the number of connected “things” within the IoT will increase 374% over the next four years, growing from 1.9 billion (the current number) to 9 billion in 2018.
Given the massive expectations for the industry, various companies have released networking protocols designed to create a solid basis for the many items connected to the Internet of Things. Google’s project, called Thread, promises to “securely and reliably connect hundreds of products around the home.” GE, Intel, and Qualcomm have all announced compatibility products of their own as well.
Compatibility in the Cloud
The Internet of Things is not the only large, trendy, growing segment of the computing industry that has been suffering from incompatibility. Everyone understands that the principles behind distributed virtualization are strong for speed, efficiency, and affordability; but it’s for good reason that many people find the whole cloud idea “a bit nebulous,” wrote Vivienne Rojas of the International Organization for Standardization (ISO). Standards were lacking. Now, though, the playing field has changed (see also the IEEE standards mentioned below).
As of October 15, the ISO – which Data Center Knowledge calls “the world’s best known standards body” – has published, in partnership with the International Electrotechnical Commission (IEC), a pair of standards to better describe and delineate the parameters of typical cloud hosting environments. They were developed collaboratively by experts spanning more than 30 nations, included advisors from the International Telecommunication Union, and were managed by a joint technical committee composed of representatives from the ISO and the IEC.
Standard #1 – ISO/IEC 17788: “Cloud computing – Overview and vocabulary”
One of the standards, ISO/IEC 17788, defines various categories of cloud systems, both by the type of service provided and by the level of security built into the environment. The elements of the SPI model – Software as a Service, Platform as a Service, and Infrastructure as a Service – are all described and delimited. The same is true for the various labels – private, public, hybrid, community – that describe the extent to which the cloud is isolated (as a whole or integrated parts) for security, compliance, and general access.
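To make the two axes of that vocabulary concrete, here is a minimal illustrative sketch in Python. The standard defines these as plain vocabulary, not code; the enum names, the `classify` helper, and the label strings are our own informal shorthand for the categories the standard describes.

```python
from enum import Enum

class ServiceCategory(Enum):
    """The SPI service models named in the standard's vocabulary."""
    SAAS = "Software as a Service"
    PAAS = "Platform as a Service"
    IAAS = "Infrastructure as a Service"

class DeploymentModel(Enum):
    """Deployment models describing how isolated the cloud is."""
    PRIVATE = "private"      # dedicated to a single organization
    PUBLIC = "public"        # open to general access
    HYBRID = "hybrid"        # a composition of two or more models
    COMMUNITY = "community"  # shared by organizations with common concerns

def classify(service: ServiceCategory, deployment: DeploymentModel) -> str:
    """Label a cloud offering along the vocabulary's two axes."""
    return f"{service.value} on a {deployment.value} cloud"

print(classify(ServiceCategory.IAAS, DeploymentModel.HYBRID))
# prints "Infrastructure as a Service on a hybrid cloud"
```

The point of the sketch is simply that every offering can now be placed on a shared grid – one axis for the service model, one for the deployment model – instead of each vendor inventing its own labels.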
Standard #2 – ISO/IEC 17789: “Cloud computing – Reference architecture”
The other set of guidelines, rather than focusing on the broad terminology, centers on the more in-depth, engineering topic of reference architecture (template diagrams for deployment, along with general terms so that deployments are organized meaningfully). The standard includes sample infrastructural setups and information related to the functionality and tasks performed by various pieces of a cloud system, along with the way in which all aspects are interrelated.
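As a rough illustration of what a reference architecture organizes, the sketch below maps the major roles ISO/IEC 17789 describes – cloud service customer, provider, and partner – to example activities. The role names follow the standard; the specific activity lists and the dictionary layout are hypothetical, chosen only to show how a shared template lets deployments be described in common terms.

```python
# Hypothetical mapping of reference-architecture roles to activities.
# Role names follow ISO/IEC 17789; the activity lists are illustrative.
reference_roles = {
    "cloud service customer": ["use cloud services", "administer service use"],
    "cloud service provider": ["deliver services", "operate infrastructure"],
    "cloud service partner":  ["audit services", "develop integrations"],
}

def activities_for(role: str) -> list:
    """Return the example activities associated with a role, if any."""
    return reference_roles.get(role, [])

for role, activities in reference_roles.items():
    print(f"{role}: {', '.join(activities)}")
```

Because every deployment diagram starts from the same named roles and components, two organizations can compare their architectures piece by piece rather than translating between private vocabularies.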
Why These New Standards Matter
Rojas explains why these new standards are so important: companies have often had difficulty controlling and managing their cloud projects, resulting in improvised multi-cloud scenarios. Although distributed virtualization – especially in cases that utilize solid state drives – has been praised for its incredibly efficient performance, failure is possible and can require sophisticated problem-solving. These standards will allow companies that have enacted strong cloud policies (enhancing clarity, accounting accuracy, redundancy, and security) to prove their merits through a credible third party.
Rojas notes that the growth of cloud hosting has been haphazard, which makes sense given how quickly its enormous strengths drove adoption: “By maximizing the effectiveness of shared resources, it achieves coherence and economies of scale, much like the electricity grid.” Because so many companies have rushed to realize the growth potential of this branch of technology, incompatibility has become a major concern.
Building on a Sustainable Model
Dr. Donald Deutsch, who heads the joint technical committee that created the parameters for the standards and is the VP of standards strategy and architecture for Oracle, noted that the cloud represents a technological sea change. He believes that the stipulations developed by his team “provide a sound foundation for follow-on standards,” hinting at the future release of ISO/IEC standards related to cloud security, service level agreements, data management, and other areas.
Data Center Knowledge also reported that the IEEE Standards Association recently released cloud standards as well, which we will cover in a future blog article. Obviously these standards will be helpful to the cloud industry, but for customers to benefit from their existence, they need to identify which providers believe enough in their systems to get third-party accreditations. Superb Internet is already certified to meet various standards, including ISO 9001:2008. Get your SSD cloud today.
By Kent Roberts