Containers have been around in one form or another since the 1970s, but in 2013 they exploded onto the IT landscape thanks to the launch of Docker. Since then, this “virtualization lite” technology has proliferated in corporate data centers and in the cloud, and it is now hard to imagine how today’s cloud architecture could have been built without them.
Such are the benefits of containers that it was perhaps predictable that they would become so popular. Gartner predicts that by 2022, more than 75% of global organizations will be running containerized applications in production, up from less than 30% in 2020.
But what was unexpected was how good a fit containers would be for large-scale IoT deployments, and how large a role they would therefore play in shaping IoT development. To understand why, let’s take a closer look at what containers are and what they do.
How Containers Work
Container technology is similar, or at least related, to virtualization technology. But one fundamental difference is that a container is far smaller, or “lightweight.” One reason for this is that while a virtual machine simulates a complete server, with an operating system and one or more applications installed, a container is just a runtime environment for an application. That means it includes the application, plus all the dependencies, binaries, and configuration files needed to run it, but not an entire operating system. Instead, it uses the operating system of the host computer, sharing it with all the other containers running on that host.
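This “application plus dependencies, minus the operating system” idea is visible in a container image definition. Below is a minimal sketch of a Dockerfile for a hypothetical Python sensor-reading app (the file names and base image choice are illustrative assumptions, not taken from the article):

```dockerfile
# Minimal sketch: package one app and its dependencies, not a whole OS.
# Start from a slim base image layer rather than installing an operating
# system; the kernel itself is shared with the host at run time.
FROM python:3.12-slim

WORKDIR /app

# Add only the dependencies the application needs...
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# ...plus the application code itself (hypothetical app.py).
COPY app.py .

# The container is just a runtime environment for this one process.
CMD ["python", "app.py"]
```

Because the resulting image carries no guest operating system, it is typically tens of megabytes rather than the gigabytes a virtual machine image would occupy.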
Lighter Than Virtual Machines
Because containers are so lightweight and don’t include an entire operating system, they are highly portable, and far more containers than virtual machines can be run on a single host.
This in turn makes it practical to split applications into many different microservices, each running in its own container. Splitting an application into separate services has a number of benefits, chief among them that each microservice is independent of the others and can be modified or updated without a knock-on effect on the rest.
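One common way to express this split is a Compose file, where each microservice is declared as a separate container. The sketch below is hypothetical; the service names and image references are illustrative, not real published images:

```yaml
# docker-compose sketch: one application split into independent
# microservices, each in its own container and versioned separately.
services:
  data-collector:
    image: example/iot-collector:1.4.2   # hypothetical image
  data-processor:
    image: example/iot-processor:2.0.1   # hypothetical image
  device-controller:
    image: example/iot-controller:1.1.0  # hypothetical image
```

Because each service carries its own version tag, one of them can be rebuilt and redeployed without rebuilding, or even restarting, the others.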
Containers and IoT Development
IoT devices, usually sensors, tend to be deployed “in the field”, often in large numbers. They usually collect large amounts of data, and that data has to be sent somewhere. The devices also need to be controlled, so control messages, as well as firmware updates, have to reach the devices in the field.
The obvious place from which to process the data and to control the devices is the cloud, but there’s a very big problem with this approach, as the people behind many early IoT deployments discovered to their cost. The problem is that it is hard to communicate reliably with large numbers of IoT devices from the cloud, because this involves some combination of SIMs, networking hardware, physical leased lines, broadband connections, cellular towers, and other networking technology. The chances of all of this working reliably for the majority of the time are vanishingly small.
Another problem is that where IoT data is needed in real time, sending it all to the cloud is impractical because of the latency it necessarily introduces. The answer has been to place the data collection and processing functions, and the IoT device control functions, as close to the IoT devices themselves as possible, at the edge of the network.
So what we have ended up with is a scenario where an organization might have a large number of IoT devices deployed in many different locations, with the need to collect the data, analyze it, and also control these IoT devices from a number of different edge locations.
What’s immediately obvious is that these different functions could be offered by a single application, but they could more usefully be offered as discrete microservices: perhaps one for data collection, one for data processing, one for sending data back to the cloud for storage, one for controlling the IoT devices, one for updating their firmware to add new functionality or to close security flaws, and so on. This approach offers a number of benefits, provided each microservice is placed in its own container at the edge.
Container Benefits in IoT Architecture
One major benefit is added security. That’s because certain functions, such as data collection, could be classified as “read-only”, so the container tasked with this would need minimal privileges when it comes to its interaction with the IoT devices. By contrast, a container tasked with firmware updating would need more access privileges to the IoT devices.
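This least-privilege split can be expressed directly in container configuration. The Compose fragment below is a sketch under assumed names (the images and capability choices are illustrative): the read-only collector runs with an immutable filesystem and no Linux capabilities, while the firmware updater is granted only the extra access it needs.

```yaml
services:
  # Data collection is effectively read-only, so this container
  # runs with minimal privileges.
  data-collector:
    image: example/iot-collector:1.4.2  # hypothetical image
    read_only: true                     # immutable root filesystem
    cap_drop: [ALL]                     # drop every Linux capability

  # Firmware updating needs broader access to the devices, so this
  # container is granted more, but still only what it requires.
  firmware-updater:
    image: example/iot-firmware:0.9.3   # hypothetical image
    cap_add: [NET_ADMIN]                # e.g. to manage device links
```

Compromising the collector container then yields very little, while the more privileged updater remains a smaller, more defensible target.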
Not only is this partitioning of functions consistent with security best practices, but it also makes it easier to secure the more privileged containers. That’s because they are smaller than a monolithic application, so their attack surfaces are smaller. And also because, by their very nature, containers are independent of each other. That means (in theory, at least) that developers can update a container offering, say, data collection services, without worrying about how that will affect the container offering a firmware updating function.
The second benefit of containerization in IoT deployments is that container management systems make it very easy to pull new versions of a container image from a registry and push them out to all the edge locations simultaneously. This is particularly important if a vulnerability is discovered in a container running at all the edge locations: the affected container can quickly be updated without touching the others, and because containers are lightweight, the fix can easily be pushed out to the edge.
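On a Kubernetes-managed edge fleet, for example, pushing a patched container to every location can amount to a single manifest change. The sketch below assumes hypothetical names and image tags; it is one possible way to manage such a rollout, not the only one:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: data-collector            # hypothetical microservice
spec:
  replicas: 1
  selector:
    matchLabels:
      app: data-collector
  template:
    metadata:
      labels:
        app: data-collector
    spec:
      containers:
        - name: data-collector
          # Bumping this tag (say, 1.4.2 -> 1.4.3 after a security
          # fix) and re-applying the manifest rolls the new image out
          # to every edge cluster, leaving other containers untouched.
          image: example/iot-collector:1.4.3
```

The orchestrator handles the rollout itself, replacing the old container at each location with the patched version.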
The Future of IoT and Containers
The use of containers has accelerated rapidly since 2013, and the same is true of IoT connections: 2020 marked the turning point where, for the first time, there were more IoT devices connected to networks than non-IoT devices such as smartphones, computers, and laptops, according to IoT Analytics.
What’s clear is that many of these billions of devices aren’t making connections to servers in the cloud. As the number of IoT deployments continues to rise, these new IoT devices will increasingly connect to compute resources located nearby, and these resources will offer an increasingly varied and sophisticated array of IoT control and data-handling functions, all running as microservices packaged in containers.