Powertools is a developer toolkit to implement Serverless best practices and increase developer velocity.
In this minor release we are adding support for two new environment variables to configure the log level in Logger.
You can now configure the log level of Logger using two new environment variables: AWS_LAMBDA_LOG_LEVEL and POWERTOOLS_LOG_LEVEL. The new environment variables work alongside the existing LOG_LEVEL variable, which is now considered legacy and will be removed in the future.
Setting the log level now follows this order of precedence:
1. AWS_LAMBDA_LOG_LEVEL environment variable
2. logLevel constructor option, or calling the logger.setLogLevel() method
3. POWERTOOLS_LOG_LEVEL environment variable
We have also added a new section to the docs to highlight the new behavior.
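The precedence described above can be sketched as a small resolution helper. This is a hypothetical illustration of the order, not the Logger's actual internals; the function and parameter names are our own:

```typescript
// Hypothetical sketch of the log level resolution order described above.
const resolveLogLevel = (sources: {
  awsLambdaLogLevel?: string; // AWS_LAMBDA_LOG_LEVEL env variable
  explicitLogLevel?: string; // logLevel option or logger.setLogLevel()
  powertoolsLogLevel?: string; // POWERTOOLS_LOG_LEVEL env variable
}): string =>
  sources.awsLambdaLogLevel ??
  sources.explicitLogLevel ??
  sources.powertoolsLogLevel ??
  'INFO'; // fallback default, assumed for this sketch

// AWS_LAMBDA_LOG_LEVEL wins even when a constructor option is present:
console.log(
  resolveLogLevel({ awsLambdaLogLevel: 'WARN', explicitLogLevel: 'DEBUG' })
); // WARN
```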
add support for AWS_LAMBDA_LOG_LEVEL and POWERTOOLS_LOG_LEVEL (#1795) by @dreamorosi

Contributors: @dreamorosi
This release brings support for the new Node.js 20 AWS Lambda managed runtime as well as tweaking how the Metrics utility emits logs under the hood.
With this release we are excited to announce that Powertools for AWS Lambda (TypeScript) is compatible with the nodejs20.x
AWS Lambda managed runtime.
The toolkit and our public Lambda Layers are both compatible with the new runtime and no code change should be required on your part.
The Metrics utility emits logs using the Embedded Metric Format (EMF). Prior to this release, the logs were emitted using the global console object. As a result, in addition to the payload of the log, AWS Lambda added the request id and timestamp to each log line.
For most customers, and especially those who consume the metrics in Amazon CloudWatch, this is fine as CloudWatch is able to parse the EMF content and create custom metrics. For customers who instead want to send the metrics to third party observability providers the presence of these strings means having an extra parsing step.
To support these use cases, and to align with the behavior of the Logger utility, the Metrics utility now uses a dedicated instance of the Console
object, which allows it to emit only the content of the EMF metric. Just like for the Logger, this behavior can be reverted for development environments by setting the POWERTOOLS_DEV environment variable to a truthy value (i.e. true, yes, 1, on, etc.).
When POWERTOOLS_DEV is enabled, the Metrics utility reverts to using the global console object. This allows customers to place mocks and spies, and optionally override the implementation for testing purposes.
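A hypothetical sketch of that selection logic looks like the following; this mirrors the behavior described above but is not the utility's actual code:

```typescript
import { Console } from 'node:console';

// Hypothetical helper mirroring the behavior described above.
const selectMetricsConsole = (powertoolsDev?: string) => {
  const isDev = ['true', 'yes', '1', 'on'].includes(
    (powertoolsDev ?? '').trim().toLowerCase()
  );
  // Dev mode: use the global console so mocks/spies work out of the box.
  // Otherwise: a dedicated Console that writes plain EMF blobs to stdout/stderr,
  // without Lambda's request id and timestamp decoration.
  return isDev
    ? console
    : new Console({ stdout: process.stdout, stderr: process.stderr });
};
```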
Contributors: @dreamorosi
In this patch release we are fixing a bug that affected the Metrics utility.
When using the utility you can set default dimensions that will be added to every metric emitted by your application.
Before this release, when setting a dimension using an existing key, the emitted EMF blob would contain duplicate keys. This release fixes the bug and makes sure that keys are deduplicated correctly.
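The fix boils down to treating dimensions as a map keyed by name, so that setting an existing key overwrites it instead of appending a duplicate. A minimal sketch of the idea (a hypothetical helper, not the utility's code):

```typescript
// Hypothetical sketch: merging dimensions via an object keyed by name
// guarantees each key appears only once in the emitted EMF blob.
const mergeDimensions = (
  defaultDimensions: Record<string, string>,
  dimensions: Record<string, string>
): Record<string, string> => ({
  ...defaultDimensions,
  ...dimensions, // same key -> later value overwrites, no duplicates
});

const merged = mergeDimensions(
  { environment: 'prod', region: 'eu-west-1' },
  { environment: 'staging' } // overrides the default instead of duplicating it
);
console.log(merged); // { environment: 'staging', region: 'eu-west-1' }
```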
Additionally we have also improved our Gitpod configuration which should make it easier for contributors to get up and running.
Contributors: @am29d, @dreamorosi
In this patch release we have improved the logic around creating AWS SDK clients in the Parameters & Idempotency utilities, and improved our documentation to include sections dedicated to how to contribute and how we manage the project.
Both the Idempotency and Parameters utilities allow you to bring your own AWS SDK client to interact with AWS APIs. This is useful when there's a need to customize the SDK client or share an existing one already used in other parts of the function. Prior to this release, both utilities instantiated a new AWS SDK client by default, only to then replace it with the customer-provided one. In these cases, we were needlessly instantiating a client, leading to wasted cycles.
Starting from this version, both utilities include refactored logic that instantiates a new SDK client only when a valid one is not provided by the customer. This way, customers bringing their own client don't pay the performance hit of instantiating multiple clients.
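The refactored logic can be sketched under simplified, hypothetical types; the class and type names below are ours for illustration only:

```typescript
// Hypothetical minimal type standing in for an AWS SDK v3 client.
type SdkClient = { send: (command: unknown) => Promise<unknown> };

const createDefaultClient = (): SdkClient => ({ send: async () => ({}) });

class ProviderSketch {
  readonly client: SdkClient;

  constructor(options: { awsSdkV3Client?: SdkClient } = {}) {
    // Before: a default client was always created, then discarded when the
    // customer brought their own. Now: instantiate only when none is provided.
    this.client = options.awsSdkV3Client ?? createDefaultClient();
  }
}
```

With this shape, passing your own client reuses it as-is and never pays for a second instantiation.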
As part of this release we have also added a new section to our documentation dedicated to explaining our processes. The section includes our roadmap, the maintainers' handbook, and a few pages dedicated to contributing. These are designed to be a companion to the contributing guidelines, which we also refreshed to make them more focused and to provide a deeper look at specific areas like setting up your development environment, finding your first contribution, the project's conventions, and testing.
set removeComments to false in tsconfig.json (#1754) by @dreamorosi
update @aws-sdk/* dev dependencies (#1771) by @dreamorosi
update aws-xray-sdk-core to latest (#1769) by @dreamorosi
bump @babel/traverse from 7.22.19 to 7.23.2 (#1748) by @dependabot

Contributors: @dreamorosi
This release brings all the generally available utilities to the Lambda Layers, improves the Idempotency utility with the addition of a new @idempotent
class method decorator, and makes Tracer more reliable.
Starting from version 21, which corresponds to this release, our Lambda Layer includes the Idempotency, Parameters, and Batch Processing utilities. The layer comes with its own reduced copy of the AWS SDK for JavaScript v3 clients, so you can easily attach it to Lambda functions running on Node.js 16 without having to bundle the SDK.
The layers are available in most commercial AWS Regions; go here to learn more about how to use them and find the ARN for your region.
If you use decorators you can now make your class methods idempotent thanks to the new @idempotent
decorator.
You can use the decorator on your Lambda handler, as shown in the image above, or on any method that returns a response. This is useful when you want to make a specific part of your code idempotent, for example when your Lambda handler performs multiple side effects and you only want to make part of it safe to retry.
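Conceptually, the decorator wraps the method so that repeated calls with the same payload replay the stored result instead of re-running the side effects. Here is a bare-bones, in-memory sketch of that idea; it is hypothetical, while the real utility persists state in DynamoDB and handles expiry and concurrent executions:

```typescript
// Hypothetical in-memory sketch of what an idempotent wrapper does.
const makeIdempotentSketch = <TOut>(
  fn: (payload: object) => TOut
): ((payload: object) => TOut) => {
  const results = new Map<string, TOut>();
  return (payload: object): TOut => {
    const key = JSON.stringify(payload); // idempotency key from the payload
    if (results.has(key)) {
      return results.get(key) as TOut; // replay: side effects are skipped
    }
    const result = fn(payload);
    results.set(key, result);
    return result;
  };
};

let charges = 0;
const charge = makeIdempotentSketch((_payment: object) => ({ charge: ++charges }));

charge({ orderId: 'A-1' }); // first call: side effect runs
charge({ orderId: 'A-1' }); // retry: replayed, charges is still 1
console.log(charges); // 1
```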
When segments generated by your code are about to be sent to the AWS X-Ray daemon, the AWS X-Ray SDK for Node.js serializes them into a JSON string. If the segment contains exotic objects like BigInt, Set, or Map in the metadata, the serialization can throw an error because the native JSON.stringify() function doesn't know how to handle these objects.
To guard against this type of runtime error, we have wrapped the branches of code in Tracer where this issue could happen in try/catch logic. Now, when an error is thrown during the serialization of a segment within Tracer, we catch it and log a warning instead.
We are also working with the X-Ray team to add a replacer
function to the serialization logic directly in the X-Ray SDK so that the issue can be better mitigated.
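To illustrate the failure mode and the mitigation, here is a sketch of a replacer function; this is an illustrative assumption of ours, not the X-Ray SDK's actual implementation:

```typescript
// JSON.stringify() throws on BigInt and serializes Set/Map as empty objects.
// A replacer can convert these exotic values into JSON-friendly shapes:
const replacer = (_key: string, value: unknown): unknown => {
  if (typeof value === 'bigint') return value.toString();
  if (value instanceof Set) return [...value];
  if (value instanceof Map) return Object.fromEntries(value);
  return value;
};

const metadata = { itemCount: 42n, tags: new Set(['alpha', 'beta']) };
// JSON.stringify(metadata) without a replacer would throw a TypeError here.
const serialized = JSON.stringify(metadata, replacer);
console.log(serialized); // {"itemCount":"42","tags":["alpha","beta"]}
```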
Congratulations to @HaaLeo and @KhurramJalil for making your first contributions to the project, and thank you for helping make Powertools better for everyone!
Note: We have officially started working on the next major version of Powertools for AWS (TypeScript). We have published a Request For Comment (RFC) that details most of the changes we have planned, and in the coming weeks we'll work on an upgrade guide. We would love to hear what you think about our plan and any concerns you might have.
add arm64 to integration test matrix (#1720) by @dreamorosi

Contributors: @HaaLeo, @KhurramJalil, @am29d, @dreamorosi
In this minor release we have removed the upper bound for @middy/core 4.x in our peerDependencies. We only support middy 3.x, and in the last release we added a peerDependencies section to make that explicit. However, some Powertools users already had middy 4.x in their dependencies, and the recent change broke their builds. We have now removed the upper bound to give you the freedom to use @middy/core 4.x, though we do not support it yet. Given the recent requests, we will address this and bring middy 4.x support earlier than we anticipated.
Contributors: @am29d, @dreamorosi, Alexander Melnyk, Alexander Schueren and Release bot[bot]
In this release we are excited to announce the General Availability of two utilities: Idempotency and Batch.
Warning: Breaking Change
We have introduced a breaking change in the Batch utility. We initially followed the Python implementation for this utility, and after multiple reviews we realized that the choice between async and sync processors plays out differently in the Node.js ecosystem compared to Python. Async functions are often preferred over synchronous ones, so we renamed AsyncBatchProcessor to BatchProcessor, making it the default choice. When you need to process a batch synchronously (i.e. SQS FIFO), use the explicit BatchProcessorSync processor. We have added a dedicated section in the documentation to clarify the implications and when to pick the right processor.
We have improved the docs based on the customer feedback we have received over the months. You now have more details on how to implement your own persistence store. In addition, Batch and Idempotency are often used together, so we added a section on using them together; here is an example:
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import type {
  Context,
  SQSBatchResponse,
  SQSEvent,
  SQSRecord,
} from 'aws-lambda';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import {
  IdempotencyConfig,
  makeIdempotent,
} from '@aws-lambda-powertools/idempotency';

const processor = new BatchProcessor(EventType.SQS);
const dynamoDBPersistence = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTable',
});
const idempotencyConfig = new IdempotencyConfig({
  eventKeyJmesPath: 'messageId',
});

const processIdempotently = makeIdempotent(
  async (_record: SQSRecord) => {
    // process your event
  },
  {
    persistenceStore: dynamoDBPersistence,
    config: idempotencyConfig,
  }
);

export const handler = async (
  event: SQSEvent,
  context: Context
): Promise<SQSBatchResponse> => {
  idempotencyConfig.registerLambdaContext(context);

  return processPartialResponse(event, processIdempotently, processor, {
    context,
  });
};
Welcome Tel Aviv! We have added il-central-1 to our regions, and the layer is now available in this region. Of course, we bumped the version to the same number, so you only have to change the region in your ARN:
arn:aws:lambda:il-central-1:094274105915:layer:AWSLambdaPowertoolsTypeScript:19
A big thanks again to @erikayao93 for the work on the Batch Processing utility that goes GA in this release; we appreciate your work!
aws-actions/configure-aws-credentials action (#1663) by @dreamorosi
peerDependencies (#1685) by @dreamorosi
tsconfig files (#1667) by @dreamorosi

Contributors: @am29d, @dreamorosi, @erikayao93, @sthulb and Release bot[bot]
This release brings another new utility to Powertools for AWS Lambda (TypeScript): introducing the Batch Processing utility. The release also improves the Logger utility, which can now include the cause field in error logs.
Warning: This utility is currently released as a beta developer preview and is intended strictly for feedback and testing purposes, not for production workloads. This version and all future versions tagged with the -beta suffix should be treated as not stable. Up until the General Availability release we might introduce significant breaking changes and improvements in response to customer feedback.
The batch processing utility handles partial failures when processing batches from Amazon SQS, Amazon Kinesis Data Streams, and Amazon DynamoDB Streams.
When using SQS, Kinesis Data Streams, or DynamoDB Streams as a Lambda event source, your Lambda functions are triggered with a batch of messages.
If your function fails to process any message from the batch, the entire batch returns to your queue or stream. This same batch is then retried until one of the following happens first: a) your Lambda function returns a successful response, b) the records reach the maximum retry attempts, or c) the records expire.
With this utility, batch records are processed individually: only messages that failed to be processed return to the queue or stream for a further retry.
To get started, install the utility by running:
npm install @aws-lambda-powertools/batch
Then, define a record handler function:
This function will be called by the Batch Processing utility for each record in the batch. If the function throws an error, the record will be marked as failed and reported once the main handler returns.
Record handlers can be either synchronous or asynchronous; in the latter case, the utility will process all the records of your batch concurrently. To learn more about when it's safe to use async handlers, check the dedicated section in our docs.
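To make the record handler contract concrete, here is a self-contained sketch of the partial-failure pattern under simplified, hypothetical shapes; the real utility provides BatchProcessor and processPartialResponse and matches Lambda's SQSBatchResponse contract:

```typescript
// Hypothetical simplified record shape; real SQS records carry more fields.
type RecordLike = { messageId: string; body: string };

// A record handler: throwing marks this single record as failed.
const recordHandler = (record: RecordLike): void => {
  const payload = JSON.parse(record.body);
  if (!payload.orderId) throw new Error(`invalid record ${record.messageId}`);
};

// Only failed records are reported back; successful ones are not retried.
const processBatchSketch = (records: RecordLike[]) => ({
  batchItemFailures: records
    .filter((record) => {
      try {
        recordHandler(record);
        return false;
      } catch {
        return true;
      }
    })
    .map((record) => ({ itemIdentifier: record.messageId })),
});

const response = processBatchSketch([
  { messageId: '1', body: JSON.stringify({ orderId: 'A' }) },
  { messageId: '2', body: JSON.stringify({}) }, // fails validation
]);
console.log(response); // { batchItemFailures: [ { itemIdentifier: '2' } ] }
```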
When using SQS as a Lambda event source, you can specify the EventType.SQS
to process the records. The response will be a SQSBatchResponse
which contains a list of items that failed to be processed.
To learn more about this mode, as well as how to process SQS FIFO queues, check the docs.
When using Kinesis Data Streams as a Lambda event source, you can specify the EventType.KinesisDataStreams
to process the records. The response will be a KinesisStreamBatchResponse
which contains a list of items that failed to be processed.
Learn more on the docs.
When using DynamoDB Streams as a Lambda event source, you can use the BatchProcessor with the EventType.DynamoDBStreams
to process the records. The response will be a DynamoDBBatchResponse
which contains a list of items that failed to be processed.
Check the docs to learn more about this processor.
Starting from this release, when logging an error with the logger.error()
method, the Logger utility will include the cause
field as part of the JSON-formatted log entry:
The cause field is available in the Error class starting from Node.js v16.9.0 and allows you to specify the error that caused the one being thrown. This is useful when you are catching an error and throwing your own, but still want to preserve the original cause of the error.
Congratulations and a big thank you to @erikayao93 for the work on the new Batch Processing utility!
add cause field to formatted error (#1617) by @dreamorosi

Contributors: @am29d, @dreamorosi, @erikayao93
In this release we are excited to announce the developer beta of the new Idempotency utility. This new utility allows you to make your functions idempotent, so that multiple invocations with the same payload return the same result without repeating side effects.
Warning: This utility is currently released as a beta developer preview and is intended strictly for feedback and testing purposes, not for production workloads. This version and all future versions tagged with the -beta suffix should be treated as not stable. Up until the General Availability release we might introduce significant breaking changes and improvements in response to customer feedback.
Before getting started, you need to create a persistent storage layer where the Idempotency utility can store its state; your Lambda functions will need read and write access to it. As of now, Amazon DynamoDB is the only supported persistent storage layer, so you'll need to create a table first. Check the documentation for more information on the persistence layer. Then, configure the persistence layer with your table:
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTableName', // <-- create this table before
});
You can quickly get started by initializing the DynamoDBPersistenceLayer class and using it with the makeIdempotent function wrapper on your Lambda handler.
import { randomUUID } from 'node:crypto';
import { makeIdempotent } from '@aws-lambda-powertools/idempotency';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import type { Context } from 'aws-lambda';
import type { Request, Response, SubscriptionResult } from './types';
const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTableName',
});

const createSubscriptionPayment = async (
  event: Request
): Promise<SubscriptionResult> => {
  // ... create payment
  return {
    id: randomUUID(),
    productId: event.productId,
  };
};

export const handler = makeIdempotent(
  async (event: Request, _context: Context): Promise<Response> => {
    try {
      const payment = await createSubscriptionPayment(event);

      return {
        paymentId: payment.id,
        message: 'success',
        statusCode: 200,
      };
    } catch (error) {
      throw new Error('Error creating payment');
    }
  },
  {
    persistenceStore,
  }
);
If you are using Middy as your middleware engine, you can use the makeHandlerIdempotent middleware to make your Lambda handler idempotent.
import { randomUUID } from 'node:crypto';
import { makeHandlerIdempotent } from '@aws-lambda-powertools/idempotency/middleware';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import middy from '@middy/core';
import type { Context } from 'aws-lambda';
import type { Request, Response, SubscriptionResult } from './types';
const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTableName',
});

const createSubscriptionPayment = async (
  event: Request
): Promise<SubscriptionResult> => {
  // ... create payment
  return {
    id: randomUUID(),
    productId: event.productId,
  };
};

export const handler = middy(
  async (event: Request, _context: Context): Promise<Response> => {
    try {
      const payment = await createSubscriptionPayment(event);

      return {
        paymentId: payment.id,
        message: 'success',
        statusCode: 200,
      };
    } catch (error) {
      throw new Error('Error creating payment');
    }
  }
).use(
  makeHandlerIdempotent({
    persistenceStore,
  })
);
Similar to the Powertools for AWS Lambda (Python) implementation, we have created many options for you to customize the persistence store and the idempotency behavior, such as the idempotency key, record expiration, hash function, table attributes, local cache, payload validation, and more. You can also bring your own AWS SDK for JavaScript v3 client, or pass only client options for the utility to use. See our documentation for more information.
add makeIdempotent function wrapper (#1579) by @dreamorosi

Contributors: @am29d, @dependabot, @dependabot[bot], @dreamorosi, @github-actions[bot], @sthulb and Release bot[bot]
In this release we are excited to announce the General Availability of the Parameters utility. After almost three months of beta, we consider the utility ready for production workloads and its API stable.
The Parameters utility provides high-level functions to retrieve one or multiple parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, AWS AppConfig, Amazon DynamoDB, or your own parameter store.
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-ssm
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can retrieve a single parameter using the getParameter()
high-level function.
import { getParameter } from '@aws-lambda-powertools/parameters/ssm';
export const handler = async (): Promise<void> => {
  // Retrieve a single parameter
  const parameter = await getParameter('/my/parameter');
  console.log(parameter);
};
For multiple parameters, you can use getParameters()
to recursively fetch all parameters under a path:
import { getParameters } from '@aws-lambda-powertools/parameters/ssm';
export const handler = async (): Promise<void> => {
  /**
   * Retrieve multiple parameters from a path prefix recursively.
   * This returns an object with the parameter name as key
   */
  const parameters = await getParameters('/my/path/prefix');
  for (const [key, value] of Object.entries(parameters || {})) {
    console.log(`${key}: ${value}`);
  }
};
Alternatively, you can also fetch multiple parameters using their full name by using the getParametersByName()
function.
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-secrets-manager
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can fetch secrets stored in Secrets Manager using the getSecret()
function:
import { getSecret } from '@aws-lambda-powertools/parameters/secrets';
export const handler = async (): Promise<void> => {
  // Retrieve a single secret
  const secret = await getSecret('my-secret');
  console.log(secret);
};
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-appconfigdata
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can fetch application configurations in AWS AppConfig using the getAppConfig()
function:
import { getAppConfig } from '@aws-lambda-powertools/parameters/appconfig';
export const handler = async (): Promise<void> => {
  // Retrieve a configuration, latest version
  const config = await getAppConfig('my-configuration', {
    environment: 'my-env',
    application: 'my-app',
  });
  console.log(config);
};
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-dynamodb @aws-sdk/util-dynamodb
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can retrieve a single parameter from DynamoDB using the DynamoDBProvider.get()
method:
import { DynamoDBProvider } from '@aws-lambda-powertools/parameters/dynamodb';
const dynamoDBProvider = new DynamoDBProvider({ tableName: 'my-table' });
export const handler = async (): Promise<void> => {
  // Retrieve a value from DynamoDB
  const value = await dynamoDBProvider.get('my-parameter');
  console.log(value);
};
For retrieving multiple parameters, you can use the DynamoDBProvider.getMultiple()
method instead.
If you want to learn more, check the post we have just published on the AWS Compute Blog: Retrieving parameters and secrets with Powertools for AWS Lambda (TypeScript)
A big thank you to all the people who contributed to this utility with PRs, questions, feedback, and bug reports.
IdempotencyPersistenceLayerError (#1552) by @am29d
@aws-sdk/* packages (#1561) by @dreamorosi

Contributors: @am29d, @dreamorosi, @hjgraca and Release bot[bot]