Cloud Convergence
By Michel Chabroux
Devices and the cloud are now becoming part of the same technology continuum. To make this convergence real, there are three things that must be internalized. Let’s call them the three P’s:
● Perception… moving past the notion that the cloud is only useful in the IT world for tools, email, web searches, and so on.
● Packaging… of software that is harmonized to handle the heterogeneous nature of a global infrastructure.
● Processes… that need to evolve and drive updated design considerations. Software in devices is traditionally very monolithic, while the cloud approach is to think in terms of deployable functions. Software design patterns need to be adapted to this difference.
The cloud has transformed a lot of things in the last few years: how we access storage and compute, how we can perform transactions anywhere and everywhere from a browser or a mobile device, how we interact with businesses, and more. Need examples? Think about the pictures you take on your phone that get stored somewhere, fitness trackers, banking applications, the Natural Language Processing (NLP) that converses with you when you call a company’s toll-free number, and of course, Internet searches. From my vantage point, having worked for 15+ years with businesses making robots, planes, medical equipment, manufacturing controllers, and many other types of systems, the cloud is now part of the critical infrastructure. Am I pushing the envelope a bit too far?
Consider this: critical infrastructure players are connecting their devices to get data out of them and offer new services, provide functional and security updates to the software, introduce net new features, or integrate them into a larger system. Devices are not islands anymore. They are part of a global cyber-physical system that spans from hyperscale clouds to edge clouds to the electro-mechanical edge.
Are the devices actually moving to the cloud? I do not think so, nor would it make sense… at least today. One may argue that 5G introduces high-bandwidth, low-latency connectivity. In actuality, even that is not enough when one expects a response time in the single-digit microseconds. So, something must process the data from the sensors in situ. Granted, a lot of the data will be pushed to the cloud for further processing, but an estimated 75% will still be consumed right there, at the edge.
Beyond connectivity and data, a key component of any system today is its software. For many who deploy software, it is obvious that software is not something to fire and forget. It needs to be managed from development, to deployment, to operation. Cloud technologies have driven a lot of innovation that makes it easier to manage software. One such technology, and one that has enabled a significant amount of scale, is the [in]famous container.
Now, when one refers to a container, a lot is left unspoken and, often, a lot of assumptions are made on all sides. To be clear, containers were introduced to solve specific problems. I find Google’s definition to be just about perfect: “Containers are lightweight packages of your application code together with dependencies such as specific versions of programming language runtimes and libraries required to run your software services.”
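To make that definition concrete, here is a minimal sketch, written in Go against the OCI’s published image-spec types, of the manifest that ties an image together: a content-addressed config blob plus one or more layer blobs. The blob contents, digests, and sizes below are placeholders for illustration, not a real image.

```go
package main

import (
	"encoding/json"
	"fmt"

	digest "github.com/opencontainers/go-digest"
	specs "github.com/opencontainers/image-spec/specs-go"
	v1 "github.com/opencontainers/image-spec/specs-go/v1"
)

func main() {
	// An OCI image is metadata plus content-addressed blobs: a config blob
	// (runtime settings, entry point) and layer blobs (the filesystem with
	// the application and its dependencies), tied together by a manifest.
	cfg := []byte(`{"placeholder":"config"}`) // stand-in for a real config blob
	layer := []byte("placeholder layer")      // stand-in for a real tar+gzip layer

	manifest := v1.Manifest{
		Versioned: specs.Versioned{SchemaVersion: 2},
		MediaType: v1.MediaTypeImageManifest,
		Config: v1.Descriptor{
			MediaType: v1.MediaTypeImageConfig,
			Digest:    digest.FromBytes(cfg),
			Size:      int64(len(cfg)),
		},
		Layers: []v1.Descriptor{{
			MediaType: v1.MediaTypeImageLayerGzip,
			Digest:    digest.FromBytes(layer),
			Size:      int64(len(layer)),
		}},
	}

	out, err := json.MarshalIndent(manifest, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

Because the manifest is just JSON referencing blobs by digest, any OCI-compliant runtime, registry, or tool can consume the same image, which is precisely what makes the format portable across such different targets.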
One of the most direct benefits of this technology is that it provides a solution to the software management conundrum of mission-critical systems. Indeed, businesses are already using it on the IT side and have ironed out most of the kinks. What remains to be done is to connect this last mile and create that end-to-end cloud-based infrastructure.
This may feel like a long and winding way to get to the point, but… that is why, in 2021, we introduced a container engine for our real-time operating system, VxWorks (a key component of our Studio experience), compliant with the Open Container Initiative (OCI) specifications. From one day to the next, you can now take existing infrastructures (e.g., Kubernetes), tools, processes, and workflows and apply them to your intelligent devices, including those that are functionally safe. In other words, you can deploy and manage your VxWorks or Linux applications using the same constructs as any IT system. (By the way, we are not saying that a real-time container for VxWorks runs on Linux or vice versa.)
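As a rough illustration of what “the same constructs as any IT system” looks like in practice, the sketch below builds a plain Kubernetes Deployment with the upstream Go client types and prints it as JSON. The node-selector label, app name, and image reference are hypothetical stand-ins for this example, not actual Wind River or Kubernetes conventions for VxWorks targets.

```go
package main

import (
	"encoding/json"
	"fmt"

	appsv1 "k8s.io/api/apps/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	replicas := int32(1)
	labels := map[string]string{"app": "rtos-app"} // hypothetical app name

	dep := appsv1.Deployment{
		TypeMeta:   metav1.TypeMeta{APIVersion: "apps/v1", Kind: "Deployment"},
		ObjectMeta: metav1.ObjectMeta{Name: "rtos-app"},
		Spec: appsv1.DeploymentSpec{
			Replicas: &replicas,
			Selector: &metav1.LabelSelector{MatchLabels: labels},
			Template: corev1.PodTemplateSpec{
				ObjectMeta: metav1.ObjectMeta{Labels: labels},
				Spec: corev1.PodSpec{
					// Hypothetical label used to steer the pod onto nodes
					// backed by a real-time OS rather than Linux.
					NodeSelector: map[string]string{"node.example.com/os": "rtos"},
					Containers: []corev1.Container{{
						Name:  "rtos-app",
						Image: "registry.example.com/rtos-app:1.0", // hypothetical registry/image
					}},
				},
			},
		},
	}

	out, err := json.MarshalIndent(dep, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

The point is that nothing here is device-specific: the same Deployment, scheduler, and registry machinery that already runs IT workloads can, in principle, be pointed at OCI-compliant device runtimes.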
Why does it actually matter? Cloud convergence delivers savings by leveraging standards (the OCI specifications are open and governed under the Linux Foundation), existing infrastructures, and, most importantly, human knowledge and experience. These changes are steps on the road to digital transformation.
The VxWorks container engine was designed from scratch to meet the stringent demands of mission-critical software. It is compliant with the OCI image, runtime, and distribution specifications. It has been designed to be certifiable up to DO-178C DAL A, and its footprint is less than 400 KB with all options included.
In my next post, I will go over how Wind River will enable you to use a hyperscale cloud to drive digital feedback loops and increase your testing options.