Amazon S3 considerations - Amazon SDK for JavaScript
Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).

The Amazon SDK for JavaScript V3 API Reference Guide describes in detail all the API operations for the Amazon SDK for JavaScript version 3 (V3).

Amazon S3 considerations

Amazon S3 multipart upload

In v2, the Amazon S3 client contains an upload() operation that supports uploading large objects with the multipart upload feature offered by Amazon S3.

In v3, the @aws-sdk/lib-storage package is available. It supports all the features of the v2 upload() operation and works in both Node.js and browser runtimes.

Amazon S3 presigned URL

In v2, the Amazon S3 client contains the getSignedUrl() and getSignedUrlPromise() operations to generate a URL that users can use to upload objects to or download objects from Amazon S3.

In v3, the @aws-sdk/s3-request-presigner package is available. It provides the functionality of both the getSignedUrl() and getSignedUrlPromise() operations. A blog post discussing the details of this package is available.

Amazon S3 region redirects

The Amazon S3 client in v3 supports region redirects (previously known as the Amazon S3 Global Client in v2). If an incorrect Region is passed to the Amazon S3 client and the request fails with a PermanentRedirect (status 301) error, the client can retry the request in the correct Region. Set the followRegionRedirects flag in the client configuration to make the Amazon S3 client follow region redirects and function as a global client.

Note

This feature can result in additional latency, because a failed request is retried with the corrected Region after a PermanentRedirect error with status 301 is received. Use it only if you do not know the Region of your bucket(s) ahead of time.
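Enabling the flag is a one-line change to the client configuration, sketched below with a placeholder Region:

```javascript
// Sketch: enable region redirects on the v3 S3 client.
import { S3Client } from "@aws-sdk/client-s3";

// With followRegionRedirects enabled, a request that fails with a
// PermanentRedirect (status 301) is retried in the bucket's actual Region.
const client = new S3Client({
  region: "us-east-1", // may be wrong for some buckets
  followRegionRedirects: true,
});
```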

Amazon S3 streaming and buffered responses

The v3 SDK prefers not to buffer potentially large responses. This commonly arises with the Amazon S3 GetObject operation, which returned a Buffer in v2 but returns a stream in v3.

For Node.js, you must consume the stream, or garbage collect the client or its request handler, to free the sockets and keep connections open for new traffic.

```javascript
// v2
const get = await s3.getObject({ ... }).promise(); // this buffers (consumes) the stream already

// v3, consume the stream to free the socket
const get = await s3.getObject({ ... }); // object .Body has an unconsumed stream
const str = await get.Body.transformToString(); // consumes the stream
// Other ways to consume the stream include writing it to a file,
// passing it to another consumer like an upload, or buffering it
// into a string or byte array.
```

For more information, see the section on socket exhaustion.