Serverless and Containers for Modern Applications – Not an Either/Or Choice

According to a recent study by AHEAD, 82% of enterprise application teams are focused on meeting the needs of the business faster. Many are getting there by modernizing applications, drawn by the increased agility, flexibility, and faster time to value for the business.

Before you can build modern applications and modernize legacy apps, you must first determine how you'll use modern cloud services to deliver them.

When to Use Serverless vs. Containers 

Top of mind for many organizations today are serverless and containerized applications. These technologies bring much of what every IT organization wants: flexibility, on-demand compute, scale, and more. But deciding which services to use can become an impediment to delivering the end solution.

As with any technology or solution, choosing between serverless and containers depends on the use case and the goal of the application or project. With serverless and container technologies still perceived as "cutting edge," many organizations want to understand how to adopt them to fit the needs of their technology stack. But where should you start?

Applications 

From an applications perspective, containers and serverless are beneficial because they lower costs (on-demand compute), scale elastically (spin up / spin down), and make your team more efficient (automated build and release). The challenge is that the large majority of applications can't run entirely on serverless functions, either because some processes need to run continuously or because the application can't tolerate the time it takes for serverless compute to spin up. That doesn't mean you can't still use serverless for a portion of the application. When you have processes or functions that run infrequently or are truly event-driven, serverless can be a great fit. In many cases, public cloud-based containers can fill a similar role, giving you comparable cost and scalability benefits without burning idle compute capacity.
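To make the event-driven case concrete, here is a minimal sketch of an AWS Lambda handler in Python, assuming an S3 "ObjectCreated" trigger on an uploads bucket. The bucket, trigger wiring, and field handling are illustrative assumptions, not a prescribed implementation; the point is that the function consumes no compute between uploads and only runs when a file lands in the bucket.

```python
# Minimal sketch: event-driven AWS Lambda handler for S3 "ObjectCreated" events.
# Assumes the standard S3 event payload shape; the bucket and trigger are hypothetical.
import json


def handler(event, context):
    """Log basic metadata for each newly uploaded object."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"].get("size", 0)
        print(json.dumps({"bucket": bucket, "key": key, "size_bytes": size}))
        processed.append(key)
    # Returning a summary makes the function easy to test locally with a sample event.
    return {"processed": len(processed)}
```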

Data and Analytics 

Serverless and containers also play a role in data and analytics, especially when you're processing massive amounts of data on high-performing servers. The ability to spin up hundreds of compute nodes in the public cloud to process data and then spin the environment down completely lets you use on-premises or public cloud compute capacity far more efficiently. Here, containers are well suited to providing complex compute nodes with multiple tools, frameworks, or software packages installed to process data in the most effective manner. There are also cases where code processes incoming data that will be analyzed later, and that data doesn't arrive on a consistent schedule. IoT is a good example: devices send data to a queue sporadically, and the compute must wait for an event before processing it. In these cases, event-driven serverless functions are a fit for serving that single purpose.
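For the sporadic IoT pattern described above, a queue-triggered function is a natural fit. The sketch below assumes an AWS Lambda subscribed to an SQS queue of JSON device readings and a hypothetical DynamoDB table named device_readings; the message fields and table name are illustrative assumptions.

```python
# Minimal sketch: SQS-triggered Lambda that stores sporadic IoT readings for later analysis.
# The queue, the "device_readings" table, and the message fields are assumptions for illustration.
import json

import boto3

table = boto3.resource("dynamodb").Table("device_readings")  # hypothetical table name


def handler(event, context):
    for record in event["Records"]:           # one record per SQS message
        reading = json.loads(record["body"])  # e.g. {"device_id": "d-42", "ts": 1700000000, "temp_c": 21.4}
        table.put_item(
            Item={
                "device_id": reading["device_id"],
                "ts": int(reading["ts"]),
                # boto3's DynamoDB resource rejects Python floats, so store the value as a string.
                "temp_c": str(reading["temp_c"]),
            }
        )
```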

Operations 

Many of the operational activities you perform are not continuous or always running. For example, if you want to spin down your virtual machines at 5 p.m. every day, you have a predetermined schedule and the operation runs only once per day. Keeping a script or application running 24/7 when it executes for only a couple of minutes a day is quite the resource hog. In these scenarios, serverless will likely make much more sense because a specific event triggers each run. The same applies to similar scenarios, such as shutting down a PaaS service when a specific log ID is captured. These are great use cases for AWS Lambda, Azure Functions, or Google Cloud Functions.
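As a concrete example, the 5 p.m. shutdown job could look like the sketch below: a Python AWS Lambda invoked by an EventBridge schedule (for instance, cron(0 17 ? * MON-FRI *)), stopping any running EC2 instances that carry a hypothetical auto-stop=true tag. The tag name and schedule are assumptions for illustration, not a specific recommended configuration.

```python
# Minimal sketch: scheduled AWS Lambda that stops tagged EC2 instances at end of day.
# Assumes an EventBridge schedule trigger and a hypothetical "auto-stop=true" tag.
import boto3

ec2 = boto3.client("ec2")


def handler(event, context):
    # Find running instances tagged for automatic shutdown.
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:auto-stop", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    instance_ids = [
        instance["InstanceId"]
        for reservation in reservations
        for instance in reservation["Instances"]
    ]

    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return {"stopped": instance_ids}
```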

Adoption Challenges 

The biggest challenges we see in the adoption of serverless and container technologies are the management of these services and the security overlay. With serverless, you have less control: you provide only the code, and a newly deployed HTTP-triggered function is typically given a public-facing endpoint by default. Taking the extra step to secure the code and place serverless functions behind API gateways or application firewalls ensures standardization and a consistent security overlay. Container platforms allow an additional level of control and management, providing the ability to enforce standardization across all containers and applications. You can then secure the perimeter of the container platform and install agents to scan for security flaws.

In summary, containers and serverless are both advantageous technologies, and both can play an essential role in modernizing applications. The question we should be asking is not an all-or-nothing choice of one versus the other, but when to use serverless and when to use containers.
