Lambda Cold Starts, What’s the Problem?

Lambda cold starts are a non-issue, or at least they should be


Written by Gert Leenders, AWS Cloud Solution Architect at DPG Media

I’m stating the obvious when I say that Lambda cold starts are a popular topic of discussion, and DPG Media is no exception. I’ve already spent quite some time on cold start discussions at DPG Media’s coffee corner, so why not write it down?

Quite often, I hear people complain about Lambda cold starts. Frequently, this results in hours of discussion and a time-consuming quest for an answer. For me, Lambda cold starts are a non-issue, or at least they should be. Allow me to explain why.

The Law of Large Numbers

Context is everything. If your serverless application only has to deal with one customer every eight hours, then yes, a Lambda cold start can hit you hard; it’s not hard to imagine that it won’t go unnoticed. On the other hand, if you run a service that handles a thousand concurrent users on average, a cold start will most likely get lost in the crowd. I doubt it will even be visible in your 95th percentile statistic. So please bear in mind the context of your service when making architectural decisions in favor of Lambda.
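
To make that 95th percentile claim concrete, here is a minimal sketch with illustrative, assumed numbers (not measurements): once cold starts only affect around one percent of invocations, even a painfully slow cold start never shows up in the p95 statistic.

```python
# Minimal sketch (illustrative numbers, not a benchmark): with enough traffic,
# a ~1% cold start rate is invisible at the 95th percentile.
import random

WARM_MS = 30        # assumed typical warm invocation latency
COLD_MS = 1200      # assumed cold start latency
COLD_RATE = 0.01    # assumed fraction of invocations hitting a cold start

latencies = [
    COLD_MS if random.random() < COLD_RATE else WARM_MS
    for _ in range(100_000)
]

latencies.sort()
p95 = latencies[int(len(latencies) * 0.95)]
print(f"p95 latency: {p95} ms")  # almost certainly prints 30 ms: cold starts don't reach p95
```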

The Law of Instrument

Talking about the Law of Large Numbers brings us seamlessly to the second topic: The Law of Instrument.

Although I love to use Lambda, it’s not a Golden Hammer. If cold starts hurt you badly, especially with a low request count, then maybe Lambda is not the right tool for the job. I agree that it is most likely the cheapest option, but I would argue that your time is too valuable to invest in a project where the extra penny for an ECS Fargate container or a t3a.nano EC2 instance is a deal-breaker. Furthermore, in a setup where the number of requests is low, there’s probably little need to scale, so we can leave Lambda’s benefit of easy scaling out of the equation as well.


Cold Starts are entangled with Lambda

At this point, I should already have convinced you that cold starts are entangled with Lambda. As with every technology, there are both advantages and disadvantages tied to it. It’s the job of a software architect to consider every aspect of a technology and pick the best tool for the job. It’s a trade-off: Lambda offers you a lot of power and scalability at a very low price, but this comes at the cost of a slower response now and then.

What about Provisioned Concurrency?

Since December 2019, it has been possible to enable Provisioned Concurrency for Lambda. It certainly adds value for some use cases. On the other hand, you pay extra for it, and if you cross the configured threshold, you will feel the same pain as before.
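
For completeness, here is a minimal sketch of what enabling it looks like with boto3; the function name, alias, and capacity below are placeholders for illustration, not a recommendation.

```python
# Minimal sketch of enabling Provisioned Concurrency via boto3.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.put_provisioned_concurrency_config(
    FunctionName="my-function",        # hypothetical function name
    Qualifier="live",                  # must be a published version or alias, not $LATEST
    ProvisionedConcurrentExecutions=5, # pre-warmed environments you pay for, used or not
)

# Requests beyond the 5 provisioned concurrent executions fall back to
# on-demand Lambda and can still hit a regular cold start.
```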

Even now, with Provisioned Concurrency available, cold starts are just part of Lambda. I still believe that in most use cases, cold starts should not have a noticeable impact on your application. If they do, you should figure out whether the benefits (flexibility and easy scaling) outweigh the cold starts.

Choice of Programming Language

To wrap up, I want to spend an extra minute on the never-ending debate about the impact of the programming language on Lambda cold starts. What annoys me about this debate is that it is beside the point.

In the cold start cases I’ve seen, the latency overhead added by the choice of language is dwarfed by the latency added by the Lambda function’s own code. A single round trip to AWS SSM Parameter Store already has a much bigger impact than the cold start difference between the fastest and the slowest language.
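
That round trip, by the way, is easy to keep out of the per-request path. A minimal sketch (the parameter name and handler are hypothetical): fetch the parameter once during initialization, so only the cold start pays for the SSM call.

```python
# Minimal sketch: fetch an SSM parameter once, outside the handler, so the
# network round trip is paid on a cold start instead of on every invocation.
import boto3

ssm = boto3.client("ssm")

# Runs once per execution environment, i.e. only during a cold start.
DB_PASSWORD = ssm.get_parameter(
    Name="/my-app/db-password",  # hypothetical parameter name
    WithDecryption=True,
)["Parameter"]["Value"]


def handler(event, context):
    # Warm invocations reuse the cached value; no SSM round trip here.
    return {"statusCode": 200}
```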
