Serverless computing becomes open source to meet the customer where they are
This content is provided by Red Hat.
Serverless computing's moment has arrived. While the model has been around for several years, the recent shift from proprietary platforms to open source has given it new momentum. Likewise, container standardization, especially around Kubernetes, has opened up new possibilities and use cases and fueled innovation.
“It’s really an iteration on a promise that’s been around for what seems like decades now: if you outsource, say, to a cloud provider, you don’t necessarily have to know, maintain or manage things like servers or databases,” said John Osborne, chief North American public sector architect at Red Hat. “Two of the key characteristics of serverless are that the code is called on demand, usually when an event occurs, and the code can be scaled down to zero when it is no longer needed. Essentially, you’ve offloaded some of your infrastructure to a public cloud platform or provider.”
The term serverless is a bit misleading. There are servers, of course; you just don’t have to know or worry about them, because they are owned and managed by the platform. Osborne likens it to the term wireless: because a laptop isn’t plugged into a wall, we call it wireless, even though the signal may travel 10,000 miles over a fiber-optic cable. The only part that is truly wireless is your living room, but that’s the only part you need to worry about.
One of the main benefits of adopting serverless is that it speeds time to market. There is no need to worry about procurement or installation, which also saves costs. Developers can just start writing code.
“It’s almost considered an easy button, because you’re going to increase velocity for the developers and just get the code into production a lot faster,” Osborne said. “In a lot of cases, you’re not necessarily concerned with managing servers, so you’re offloading some of that responsibility to whoever manages the serverless platform for you. If your vendor can manage their infrastructure with very high availability and reliability, you inherit that for your application as well.”
The main barrier to adoption so far is that proprietary solutions, while FedRAMP certified, have simply failed to meet customers where they are. These function-as-a-service platforms are designed primarily for new applications, Osborne said, but the public sector has many applications that cannot simply be rewritten. They also disrupt existing workflows, and the learning curve is steep.
Containers, meanwhile, have become the de facto mechanism for shipping software. It’s easy to package apps, even most older apps, in a container, and Kubernetes then does much of the heavy lifting for container-based workloads, such as health checking and service discovery. And with Kubernetes, workloads run anywhere: in a public cloud, on premises, at the edge, or any combination of these. That makes Kubernetes an optimal choice for users who want serverless with the flexibility to run existing applications in any environment. Although Kubernetes itself is not a serverless platform, there has been a lot of innovation in this area, especially with the Knative project, which is essentially a serverless extension for Kubernetes.
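To make the Knative model concrete, here is a minimal sketch of a Knative Service definition. The service name and image are hypothetical; the `autoscaling.knative.dev/min-scale` and `max-scale` annotations shown here are Knative Serving’s mechanism for the scale-to-zero behavior described above (a minimum of zero means the platform removes all running instances when no requests or events are arriving).

```yaml
# Sketch of a Knative Service (hypothetical name and image).
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: file-processor            # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Scale to zero when idle; cap scale-out at 10 replicas.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "10"
    spec:
      containers:
        - image: registry.example.com/file-processor:latest  # hypothetical image
```

Because this is an ordinary Kubernetes resource, the same definition can be applied to any cluster running Knative, whether in a public cloud, on premises or at the edge.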
“The idea is that you can run these types of serverless applications in any environment, so you’re not necessarily locked into what the public cloud has to offer; wherever Kubernetes can operate, you can run serverless,” Osborne said. “And because it runs containers, you can support legacy workloads and run them as well, which opens the door to many public sector use cases. Traditionally, public sector IT organizations have managed applications with scalability requirements by simply sizing them for the worst-case scenario. They would provision infrastructure, usually virtual machines, to handle the highest peak and let those machines run 24/7.”
Serverless can relieve some of that pain: the application can spin up when it is needed and scale back down when it is not.
Osborne said he has seen use cases at agencies that receive a huge file, say a 100-gigabyte data file, every day, so they keep server capacity running all day just to process that one file. In other cases, he said, agencies buy complex and expensive ETL tools just to transform simple data sets. Both are good use cases for serverless. And because serverless is event-based, it is also well suited to DevSecOps initiatives: when new code is merged into a repository, the event can trigger containers to spin up and handle testing, builds, integrations and more.
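The daily-file example above can be sketched as an event-driven handler. This is a minimal illustration, not any particular platform’s API: the function name, event fields and the trivial transform are all hypothetical. The point is the shape of the model — nothing runs until the “file arrived” event fires, the handler processes the payload once, and the platform can then scale the container back to zero.

```python
# Sketch of an event-driven file handler (hypothetical names and event shape).
# Instead of servers running 24/7 waiting for the daily file, the serverless
# platform invokes this function once per "file arrived" event.

def handle_file_event(event: dict) -> dict:
    """Process one inbound data file described by an event payload."""
    path = event["path"]        # e.g. where the daily file landed
    records = event["records"]  # stand-in for the file's contents
    # A simple filter-and-normalize pass, standing in for the kind of
    # lightweight transform agencies often buy heavyweight ETL tools for.
    cleaned = [r.strip().lower() for r in records if r.strip()]
    return {"path": path, "processed": len(cleaned), "rows": cleaned}

if __name__ == "__main__":
    sample = {"path": "/data/daily.csv", "records": ["  Alpha ", "", "BETA"]}
    result = handle_file_event(sample)
    print(result["processed"])  # 2
```

The same pattern covers the DevSecOps case: the triggering event is a merge to a repository rather than a file landing, and the handler kicks off tests and builds instead of a data transform.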
“Once you’ve gone down the serverless route, you realize there are a lot of ramifications in terms of the tools, frameworks, workflows and architectural models you can use. If you’re using containers, it’s just a much better way to meet you wherever you are in terms of tooling and workflows, logging, operations and so on,” Osborne said. “Open source is really where all the momentum is right now. It’s a big wave; I tell clients to get ahead of it as much as they can. At least start looking at that kind of development model.”