
The Amazon SDK for JavaScript V3 API Reference Guide describes in detail all the API operations for the Amazon SDK for JavaScript version 3 (V3).

Amazon S3 examples using SDK for JavaScript (v3)

The following code examples show you how to perform actions and implement common scenarios by using the Amazon SDK for JavaScript (v3) with Amazon S3.

Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see them in context in their related scenarios and cross-service examples.

Scenarios are code examples that show you how to accomplish a specific task by calling multiple functions within the same service.

Each example includes a link to GitHub, where you can find instructions on how to set up and run the code in context.

Get started

The following code examples show how to get started using Amazon S3.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

import { ListBucketsCommand, S3Client } from "@aws-sdk/client-s3";

// When no region or credentials are provided, the SDK will use the
// region and credentials from the local AWS config.
const client = new S3Client({});

export const helloS3 = async () => {
  const command = new ListBucketsCommand({});

  const { Buckets } = await client.send(command);
  console.log("Buckets: ");
  console.log(Buckets.map((bucket) => bucket.Name).join("\n"));
  return Buckets;
};
  • For API details, see ListBuckets in Amazon SDK for JavaScript API Reference.
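If you save the preceding excerpt as its own module, one way to try it is with a small driver script. This is a minimal sketch; the file names run-hello.js and hello.js are assumptions, and top-level await requires an ES module (a .mjs file or "type": "module" in package.json).

// run-hello.js - a hypothetical driver for the preceding excerpt.
// Adjust the import path to wherever you saved the module.
import { helloS3 } from "./hello.js";

await helloS3();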

Actions

The following code example shows how to use CopyObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Copy the object.

import { S3Client, CopyObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new CopyObjectCommand({
    CopySource: "SOURCE_BUCKET/SOURCE_OBJECT_KEY",
    Bucket: "DESTINATION_BUCKET",
    Key: "NEW_OBJECT_KEY",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};
  • For API details, see CopyObject in Amazon SDK for JavaScript API Reference.
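Note that the CopySource value must be URL-encoded. A hedged sketch of building the parameter for a key that contains special characters; the bucket and key values are illustrative, and encoding each path segment separately is one common approach:

// Encode each segment of the key so characters such as spaces or '+'
// survive the request, while keeping the '/' separators intact.
const sourceBucket = "SOURCE_BUCKET";
const sourceKey = "reports/2023 summary+final.txt";
const copySource = `${sourceBucket}/${sourceKey
  .split("/")
  .map(encodeURIComponent)
  .join("/")}`;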

The following code example shows how to use CreateBucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Create the bucket.

import { CreateBucketCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new CreateBucketCommand({
    // The name of the bucket. Bucket names are unique and have several other constraints.
    // See https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html
    Bucket: "bucket-name",
  });

  try {
    const { Location } = await client.send(command);
    console.log(`Bucket created with location ${Location}`);
  } catch (err) {
    console.error(err);
  }
};
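Because bucket names are globally unique, CreateBucket commonly fails when a name is already taken. The client exports modeled exception classes you can test for; the following is a minimal sketch under that assumption, with a placeholder bucket name.

import {
  BucketAlreadyExists,
  BucketAlreadyOwnedByYou,
  CreateBucketCommand,
  S3Client,
} from "@aws-sdk/client-s3";

const client = new S3Client({});

try {
  await client.send(new CreateBucketCommand({ Bucket: "bucket-name" }));
} catch (err) {
  if (err instanceof BucketAlreadyExists) {
    // Another account already owns a bucket with this name.
    console.error("That bucket name is already taken.");
  } else if (err instanceof BucketAlreadyOwnedByYou) {
    // You already created this bucket in this Region.
    console.error("You already own a bucket with that name.");
  } else {
    throw err;
  }
}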

The following code example shows how to use DeleteBucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Delete the bucket.

import { DeleteBucketCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// Delete a bucket.
export const main = async () => {
  const command = new DeleteBucketCommand({
    Bucket: "test-bucket",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use DeleteBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Delete the bucket policy.

import { DeleteBucketPolicyCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// This will remove the policy from the bucket.
export const main = async () => {
  const command = new DeleteBucketPolicyCommand({
    Bucket: "test-bucket",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use DeleteBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Delete the website configuration from the bucket.

import { DeleteBucketWebsiteCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// Disable static website hosting on the bucket.
export const main = async () => {
  const command = new DeleteBucketWebsiteCommand({
    Bucket: "test-bucket",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use DeleteObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Delete an object.

import { DeleteObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new DeleteObjectCommand({
    Bucket: "test-bucket",
    Key: "test-key.txt",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};
  • For API details, see DeleteObject in Amazon SDK for JavaScript API Reference.
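On a versioning-enabled bucket, DeleteObject without a VersionId only adds a delete marker; passing a VersionId permanently removes that specific version. A minimal sketch, where the version ID is a placeholder:

import { DeleteObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// Permanently delete one specific version of the object.
const response = await client.send(
  new DeleteObjectCommand({
    Bucket: "test-bucket",
    Key: "test-key.txt",
    VersionId: "OBJECT_VERSION_ID",
  }),
);
console.log(response);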

The following code example shows how to use DeleteObjects.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Delete multiple objects.

import { DeleteObjectsCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new DeleteObjectsCommand({
    Bucket: "test-bucket",
    Delete: {
      Objects: [{ Key: "object1.txt" }, { Key: "object2.txt" }],
    },
  });

  try {
    const { Deleted } = await client.send(command);
    console.log(
      `Successfully deleted ${Deleted.length} objects from S3 bucket. Deleted objects:`,
    );
    console.log(Deleted.map((d) => ` • ${d.Key}`).join("\n"));
  } catch (err) {
    console.error(err);
  }
};
  • For API details, see DeleteObjects in Amazon SDK for JavaScript API Reference.
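DeleteObjects reports per-key failures in an Errors array on the response rather than throwing, so it's worth checking that field too. A hedged sketch of inspecting the result; the bucket and keys are placeholders:

import { DeleteObjectsCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

const { Deleted, Errors } = await client.send(
  new DeleteObjectsCommand({
    Bucket: "test-bucket",
    Delete: { Objects: [{ Key: "object1.txt" }, { Key: "object2.txt" }] },
  }),
);

if (Errors?.length) {
  // Each entry includes the Key plus an error Code and Message.
  for (const e of Errors) {
    console.error(`Failed to delete ${e.Key}: ${e.Code} - ${e.Message}`);
  }
}
console.log(`Deleted ${Deleted?.length ?? 0} objects.`);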

The following code example shows how to use GetBucketAcl.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Get the ACL permissions.

import { GetBucketAclCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new GetBucketAclCommand({
    Bucket: "test-bucket",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use GetBucketCors.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Get the CORS policy for the bucket.

import { GetBucketCorsCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new GetBucketCorsCommand({
    Bucket: "test-bucket",
  });

  try {
    const { CORSRules } = await client.send(command);
    CORSRules.forEach((cr, i) => {
      console.log(
        `\nCORSRule ${i + 1}`,
        `\n${"-".repeat(10)}`,
        `\nAllowedHeaders: ${cr.AllowedHeaders.join(" ")}`,
        `\nAllowedMethods: ${cr.AllowedMethods.join(" ")}`,
        `\nAllowedOrigins: ${cr.AllowedOrigins.join(" ")}`,
        `\nExposeHeaders: ${cr.ExposeHeaders.join(" ")}`,
        `\nMaxAgeSeconds: ${cr.MaxAgeSeconds}`,
      );
    });
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use GetBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Get the bucket policy.

import { GetBucketPolicyCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new GetBucketPolicyCommand({
    Bucket: "test-bucket",
  });

  try {
    const { Policy } = await client.send(command);
    console.log(JSON.parse(Policy));
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use GetBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Get the website configuration.

import { GetBucketWebsiteCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new GetBucketWebsiteCommand({
    Bucket: "test-bucket",
  });

  try {
    const { ErrorDocument, IndexDocument } = await client.send(command);
    console.log(
      `Your bucket is set up to host a website. It has an error document:`,
      `${ErrorDocument.Key}, and an index document: ${IndexDocument.Suffix}.`,
    );
  } catch (err) {
    console.error(err);
  }
};
  • For API details, see GetBucketWebsite in Amazon SDK for JavaScript API Reference.

The following code example shows how to use GetObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Download the object.

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new GetObjectCommand({
    Bucket: "test-bucket",
    Key: "hello-s3.txt",
  });

  try {
    const response = await client.send(command);
    // The Body object also has 'transformToByteArray' and 'transformToWebStream' methods.
    const str = await response.Body.transformToString();
    console.log(str);
  } catch (err) {
    console.error(err);
  }
};
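If you want the object on disk rather than in memory as a string, the same Body object can be converted to bytes and written out. A minimal sketch, reusing the bucket and key placeholders from the preceding example:

import { writeFile } from "fs/promises";
import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

const { Body } = await client.send(
  new GetObjectCommand({ Bucket: "test-bucket", Key: "hello-s3.txt" }),
);

// transformToByteArray buffers the whole object in memory; for very large
// objects, consider streaming the Body instead.
await writeFile("hello-s3.txt", await Body.transformToByteArray());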

The following code example shows how to use GetObjectLockConfiguration.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import {
  GetObjectLockConfigurationCommand,
  S3Client,
} from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 */
export const main = async (client, bucketName) => {
  const command = new GetObjectLockConfigurationCommand({
    Bucket: bucketName,
    // Optionally, you can provide additional parameters
    // ExpectedBucketOwner: "ACCOUNT_ID",
  });

  try {
    const { ObjectLockConfiguration } = await client.send(command);
    console.log(`Object Lock Configuration: ${ObjectLockConfiguration}`);
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME");
}

The following code example shows how to use GetObjectRetention.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import { GetObjectRetentionCommand, S3Client } from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 * @param {string} objectKey
 */
export const main = async (client, bucketName, objectKey) => {
  const command = new GetObjectRetentionCommand({
    Bucket: bucketName,
    Key: objectKey,
    // Optionally, you can provide additional parameters
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    // VersionId: "OBJECT_VERSION_ID",
  });

  try {
    const { Retention } = await client.send(command);
    console.log(`Object Retention Settings: ${Retention.Status}`);
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME", "OBJECT_KEY");
}

The following code example shows how to use ListBuckets.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

List the buckets.

import { ListBucketsCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new ListBucketsCommand({});

  try {
    const { Owner, Buckets } = await client.send(command);
    console.log(
      `${Owner.DisplayName} owns ${Buckets.length} bucket${
        Buckets.length === 1 ? "" : "s"
      }:`,
    );
    console.log(`${Buckets.map((b) => ` • ${b.Name}`).join("\n")}`);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use ListObjectsV2.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

List all of the objects in your bucket. If there is more than one object, IsTruncated and NextContinuationToken are used to iterate over the full list.

import {
  S3Client,
  // This command supersedes the ListObjectsCommand and is the recommended way to list objects.
  ListObjectsV2Command,
} from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new ListObjectsV2Command({
    Bucket: "my-bucket",
    // The default and maximum number of keys returned is 1000. This limits it to
    // one for demonstration purposes.
    MaxKeys: 1,
  });

  try {
    let isTruncated = true;

    console.log("Your bucket contains the following objects:\n");
    let contents = "";

    while (isTruncated) {
      const { Contents, IsTruncated, NextContinuationToken } =
        await client.send(command);
      const contentsList = Contents.map((c) => ` • ${c.Key}`).join("\n");
      contents += contentsList + "\n";
      isTruncated = IsTruncated;
      command.input.ContinuationToken = NextContinuationToken;
    }
    console.log(contents);
  } catch (err) {
    console.error(err);
  }
};
  • For API details, see ListObjectsV2 in Amazon SDK for JavaScript API Reference.
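The client also ships paginator helpers that follow IsTruncated and NextContinuationToken for you. A hedged alternative to the manual loop above, using paginateListObjectsV2 with the same placeholder bucket name:

import { S3Client, paginateListObjectsV2 } from "@aws-sdk/client-s3";

const client = new S3Client({});

// The paginator requests pages lazily and handles continuation tokens itself.
for await (const page of paginateListObjectsV2(
  { client },
  { Bucket: "my-bucket" },
)) {
  for (const object of page.Contents ?? []) {
    console.log(object.Key);
  }
}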

The following code example shows how to use PutBucketAcl.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Put the bucket ACL.

import { PutBucketAclCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// Most Amazon S3 use cases don't require the use of access control lists (ACLs).
// We recommend that you disable ACLs, except in unusual circumstances where
// you need to control access for each object individually.
// Consider a policy instead. For more information see https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html.
export const main = async () => {
  // Grant a user FULL_CONTROL access to a bucket.
  const command = new PutBucketAclCommand({
    Bucket: "test-bucket",
    AccessControlPolicy: {
      Grants: [
        {
          Grantee: {
            // The canonical ID of the user. This ID is an obfuscated form of your AWS account number.
            // It's unique to Amazon S3 and can't be found elsewhere.
            // For more information, see https://docs.aws.amazon.com/AmazonS3/latest/userguide/finding-canonical-user-id.html.
            ID: "canonical-id-1",
            Type: "CanonicalUser",
          },
          // One of FULL_CONTROL | READ | WRITE | READ_ACP | WRITE_ACP
          // https://docs.aws.amazon.com/AmazonS3/latest/API/API_Grant.html#AmazonS3-Type-Grant-Permission
          Permission: "FULL_CONTROL",
        },
      ],
      Owner: {
        ID: "canonical-id-2",
      },
    },
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use PutBucketCors.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Add a CORS rule.

import { PutBucketCorsCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// By default, Amazon S3 doesn't allow cross-origin requests. Use this command
// to explicitly allow cross-origin requests.
export const main = async () => {
  const command = new PutBucketCorsCommand({
    Bucket: "test-bucket",
    CORSConfiguration: {
      CORSRules: [
        {
          // Allow all headers to be sent to this bucket.
          AllowedHeaders: ["*"],
          // Allow only GET and PUT methods to be sent to this bucket.
          AllowedMethods: ["GET", "PUT"],
          // Allow only requests from the specified origin.
          AllowedOrigins: ["https://www.example.com"],
          // Allow the entity tag (ETag) header to be returned in the response.
          // The entity tag represents a specific version of the object. The ETag reflects
          // changes only to the contents of an object, not its metadata.
          ExposeHeaders: ["ETag"],
          // How long the requesting browser should cache the preflight response. After
          // this time, the preflight request will have to be made again.
          MaxAgeSeconds: 3600,
        },
      ],
    },
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use PutBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Add the policy.

import { PutBucketPolicyCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new PutBucketPolicyCommand({
    Policy: JSON.stringify({
      Version: "2012-10-17",
      Statement: [
        {
          Sid: "AllowGetObject",
          // Allow this particular user to call GetObject on any object in this bucket.
          Effect: "Allow",
          Principal: {
            AWS: "arn:aws:iam::ACCOUNT-ID:user/USERNAME",
          },
          Action: "s3:GetObject",
          Resource: "arn:aws:s3:::BUCKET-NAME/*",
        },
      ],
    }),
    // Apply the preceding policy to this bucket.
    Bucket: "BUCKET-NAME",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use PutBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Set the website configuration.

import { PutBucketWebsiteCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

// Set up a bucket as a static website.
// The bucket needs to be publicly accessible.
export const main = async () => {
  const command = new PutBucketWebsiteCommand({
    Bucket: "test-bucket",
    WebsiteConfiguration: {
      ErrorDocument: {
        // The object key name to use when a 4XX class error occurs.
        Key: "error.html",
      },
      IndexDocument: {
        // A suffix that is appended to a request that is for a directory.
        Suffix: "index.html",
      },
    },
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to use PutObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Upload the object.

import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const command = new PutObjectCommand({
    Bucket: "test-bucket",
    Key: "hello-s3.txt",
    Body: "Hello S3!",
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};
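PutObject sends the body in a single request, which is fine for small payloads like this one. For large files or streams, the @aws-sdk/lib-storage package provides a multipart Upload helper; the following is a minimal sketch, with the bucket, key, and file path as placeholders:

import { createReadStream } from "fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const upload = new Upload({
  client: new S3Client({}),
  params: {
    Bucket: "test-bucket",
    Key: "large-file.bin",
    Body: createReadStream("./large-file.bin"),
  },
});

// Report progress as parts complete, then wait for the upload to finish.
upload.on("httpUploadProgress", (progress) => console.log(progress.loaded));
await upload.done();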

The following code example shows how to use PutObjectLegalHold.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import { PutObjectLegalHoldCommand, S3Client } from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 * @param {string} objectKey
 */
export const main = async (client, bucketName, objectKey) => {
  const command = new PutObjectLegalHoldCommand({
    Bucket: bucketName,
    Key: objectKey,
    LegalHold: {
      // Set the status to 'ON' to place a legal hold on the object.
      // Set the status to 'OFF' to remove the legal hold.
      Status: "ON",
    },
    // Optionally, you can provide additional parameters
    // ChecksumAlgorithm: "ALGORITHM",
    // ContentMD5: "MD5_HASH",
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    // VersionId: "OBJECT_VERSION_ID",
  });

  try {
    const response = await client.send(command);
    console.log(
      `Object legal hold status: ${response.$metadata.httpStatusCode}`,
    );
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME", "OBJECT_KEY");
}

The following code example shows how to use PutObjectLockConfiguration.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Set the object lock configuration of a bucket.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import {
  PutObjectLockConfigurationCommand,
  S3Client,
} from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 */
export const main = async (client, bucketName) => {
  const command = new PutObjectLockConfigurationCommand({
    Bucket: bucketName,
    // The Object Lock configuration that you want to apply to the specified bucket.
    ObjectLockConfiguration: {
      ObjectLockEnabled: "Enabled",
    },
    // Optionally, you can provide additional parameters
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    // Token: "OPTIONAL_TOKEN",
  });

  try {
    const response = await client.send(command);
    console.log(
      `Object Lock Configuration updated: ${response.$metadata.httpStatusCode}`,
    );
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME");
}

Set the default retention period of a bucket.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import {
  PutObjectLockConfigurationCommand,
  S3Client,
} from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 */
export const main = async (client, bucketName) => {
  const command = new PutObjectLockConfigurationCommand({
    Bucket: bucketName,
    // The Object Lock configuration that you want to apply to the specified bucket.
    ObjectLockConfiguration: {
      ObjectLockEnabled: "Enabled",
      Rule: {
        DefaultRetention: {
          Mode: "GOVERNANCE",
          Years: 3,
        },
      },
    },
    // Optionally, you can provide additional parameters
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    // Token: "OPTIONAL_TOKEN",
  });

  try {
    const response = await client.send(command);
    console.log(
      `Default Object Lock Configuration updated: ${response.$metadata.httpStatusCode}`,
    );
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME");
}

The following code example shows how to use PutObjectRetention.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import { PutObjectRetentionCommand, S3Client } from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 * @param {string} objectKey
 */
export const main = async (client, bucketName, objectKey) => {
  const command = new PutObjectRetentionCommand({
    Bucket: bucketName,
    Key: objectKey,
    BypassGovernanceRetention: false,
    // ChecksumAlgorithm: "ALGORITHM",
    // ContentMD5: "MD5_HASH",
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    Retention: {
      Mode: "GOVERNANCE", // or "COMPLIANCE"
      RetainUntilDate: new Date(new Date().getTime() + 24 * 60 * 60 * 1000),
    },
    // VersionId: "OBJECT_VERSION_ID",
  });

  try {
    const response = await client.send(command);
    console.log(
      `Object Retention settings updated: ${response.$metadata.httpStatusCode}`,
    );
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "BUCKET_NAME", "OBJECT_KEY");
}

Scenarios

The following code example shows how to create a presigned URL for Amazon S3 and upload an object.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Create a presigned URL to upload an object to a bucket.

import https from "https";
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { fromIni } from "@aws-sdk/credential-providers";
import { HttpRequest } from "@smithy/protocol-http";
import {
  getSignedUrl,
  S3RequestPresigner,
} from "@aws-sdk/s3-request-presigner";
import { parseUrl } from "@smithy/url-parser";
import { formatUrl } from "@aws-sdk/util-format-url";
import { Hash } from "@smithy/hash-node";

const createPresignedUrlWithoutClient = async ({ region, bucket, key }) => {
  const url = parseUrl(`https://${bucket}.s3.${region}.amazonaws.com/${key}`);
  const presigner = new S3RequestPresigner({
    credentials: fromIni(),
    region,
    sha256: Hash.bind(null, "sha256"),
  });

  const signedUrlObject = await presigner.presign(
    new HttpRequest({ ...url, method: "PUT" }),
  );
  return formatUrl(signedUrlObject);
};

const createPresignedUrlWithClient = ({ region, bucket, key }) => {
  const client = new S3Client({ region });
  const command = new PutObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(client, command, { expiresIn: 3600 });
};

function put(url, data) {
  return new Promise((resolve, reject) => {
    const req = https.request(
      url,
      { method: "PUT", headers: { "Content-Length": new Blob([data]).size } },
      (res) => {
        let responseBody = "";
        res.on("data", (chunk) => {
          responseBody += chunk;
        });
        res.on("end", () => {
          resolve(responseBody);
        });
      },
    );
    req.on("error", (err) => {
      reject(err);
    });
    req.write(data);
    req.end();
  });
}

export const main = async () => {
  const REGION = "us-east-1";
  const BUCKET = "example_bucket";
  const KEY = "example_file.txt";

  // There are two ways to generate a presigned URL.
  // 1. Use createPresignedUrl without the S3 client.
  // 2. Use getSignedUrl in conjunction with the S3 client and PutObjectCommand.
  try {
    const noClientUrl = await createPresignedUrlWithoutClient({
      region: REGION,
      bucket: BUCKET,
      key: KEY,
    });

    const clientUrl = await createPresignedUrlWithClient({
      region: REGION,
      bucket: BUCKET,
      key: KEY,
    });

    // After you get the presigned URL, you can provide your own file
    // data. Refer to put() above.
    console.log("Calling PUT using presigned URL without client");
    await put(noClientUrl, "Hello World");

    console.log("Calling PUT using presigned URL with client");
    await put(clientUrl, "Hello World");

    console.log("\nDone. Check your S3 console.");
  } catch (err) {
    console.error(err);
  }
};

Create a presigned URL to download an object from a bucket.

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { fromIni } from "@aws-sdk/credential-providers";
import { HttpRequest } from "@smithy/protocol-http";
import {
  getSignedUrl,
  S3RequestPresigner,
} from "@aws-sdk/s3-request-presigner";
import { parseUrl } from "@smithy/url-parser";
import { formatUrl } from "@aws-sdk/util-format-url";
import { Hash } from "@smithy/hash-node";

const createPresignedUrlWithoutClient = async ({ region, bucket, key }) => {
  const url = parseUrl(`https://${bucket}.s3.${region}.amazonaws.com/${key}`);
  const presigner = new S3RequestPresigner({
    credentials: fromIni(),
    region,
    sha256: Hash.bind(null, "sha256"),
  });

  const signedUrlObject = await presigner.presign(new HttpRequest(url));
  return formatUrl(signedUrlObject);
};

const createPresignedUrlWithClient = ({ region, bucket, key }) => {
  const client = new S3Client({ region });
  const command = new GetObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(client, command, { expiresIn: 3600 });
};

export const main = async () => {
  const REGION = "us-east-1";
  const BUCKET = "example_bucket";
  const KEY = "example_file.jpg";

  try {
    const noClientUrl = await createPresignedUrlWithoutClient({
      region: REGION,
      bucket: BUCKET,
      key: KEY,
    });

    const clientUrl = await createPresignedUrlWithClient({
      region: REGION,
      bucket: BUCKET,
      key: KEY,
    });

    console.log("Presigned URL without client");
    console.log(noClientUrl);
    console.log("\n");

    console.log("Presigned URL with client");
    console.log(clientUrl);
  } catch (err) {
    console.error(err);
  }
};
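Anyone holding the presigned URL can fetch the object until it expires, with no additional credentials. A hedged usage sketch that reuses createPresignedUrlWithClient from the preceding example; it assumes Node.js 18+ for the global fetch, and the bucket and key are placeholders:

const url = await createPresignedUrlWithClient({
  region: "us-east-1",
  bucket: "example_bucket",
  key: "example_file.jpg",
});

// A plain HTTP GET; no SDK or credentials are required on the consumer side.
const response = await fetch(url);
console.log(`Download status: ${response.status}`);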

The following code example shows how to list Amazon S3 objects in a web page.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

The following code is the relevant React component that makes calls to the Amazon SDK. A runnable version of the application containing this component can be found at the preceding GitHub link.

import { useEffect, useState } from "react";
import {
  ListObjectsCommand,
  ListObjectsCommandOutput,
  S3Client,
} from "@aws-sdk/client-s3";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers";
import "./App.css";

function App() {
  const [objects, setObjects] = useState<
    Required<ListObjectsCommandOutput>["Contents"]
  >([]);

  useEffect(() => {
    const client = new S3Client({
      region: "us-east-1",
      // Unless you have a public bucket, you'll need access to a private bucket.
      // One way to do this is to create an Amazon Cognito identity pool, attach a role to the pool,
      // and grant the role access to the 's3:GetObject' action.
      //
      // You'll also need to configure the CORS settings on the bucket to allow traffic from
      // this example site. Here's an example configuration that allows all origins. Don't
      // do this in production.
      //[
      //  {
      //    "AllowedHeaders": ["*"],
      //    "AllowedMethods": ["GET"],
      //    "AllowedOrigins": ["*"],
      //    "ExposeHeaders": [],
      //  },
      //]
      //
      credentials: fromCognitoIdentityPool({
        clientConfig: { region: "us-east-1" },
        identityPoolId: "<YOUR_IDENTITY_POOL_ID>",
      }),
    });
    const command = new ListObjectsCommand({ Bucket: "bucket-name" });
    client.send(command).then(({ Contents }) => setObjects(Contents || []));
  }, []);

  return (
    <div className="App">
      {objects.map((o) => (
        <div key={o.ETag}>{o.Key}</div>
      ))}
    </div>
  );
}

export default App;
  • For API details, see ListObjects in Amazon SDK for JavaScript API Reference.

The following code example shows how to:

  • Create a bucket and upload a file to it.

  • Download an object from a bucket.

  • Copy an object to a subfolder in a bucket.

  • List the objects in a bucket.

  • Delete the bucket objects and the bucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

First, import all the necessary modules.

// Used to check if currently running file is this file.
import { fileURLToPath } from "url";
import { readdirSync, readFileSync, writeFileSync } from "fs";

// Local helper utils.
import { dirnameFromMetaUrl } from "@aws-doc-sdk-examples/lib/utils/util-fs.js";
import { Prompter } from "@aws-doc-sdk-examples/lib/prompter.js";
import { wrapText } from "@aws-doc-sdk-examples/lib/utils/util-string.js";

import {
  S3Client,
  CreateBucketCommand,
  PutObjectCommand,
  ListObjectsCommand,
  CopyObjectCommand,
  GetObjectCommand,
  DeleteObjectsCommand,
  DeleteBucketCommand,
} from "@aws-sdk/client-s3";

The preceding imports reference some helper utilities. These utilities are local to the GitHub repository linked at the start of this section. For your reference, see the following implementations of those utilities.

import { fileURLToPath } from "url";

export const dirnameFromMetaUrl = (metaUrl) =>
  fileURLToPath(new URL(".", metaUrl));

import { select, input, confirm, checkbox } from "@inquirer/prompts";

export class Prompter {
  /**
   * @param {{ message: string, choices: { name: string, value: string }[]}} options
   */
  select(options) {
    return select(options);
  }

  /**
   * @param {{ message: string }} options
   */
  input(options) {
    return input(options);
  }

  /**
   * @param {string} prompt
   */
  checkContinue = async (prompt = "") => {
    const prefix = prompt && prompt + " ";
    let ok = await this.confirm({
      message: `${prefix}Continue?`,
    });
    if (!ok) throw new Error("Exiting...");
  };

  /**
   * @param {{ message: string }} options
   */
  confirm(options) {
    return confirm(options);
  }

  /**
   * @param {{ message: string, choices: { name: string, value: string }[]}} options
   */
  checkbox(options) {
    return checkbox(options);
  }
}

export const wrapText = (text, char = "=") => {
  const rule = char.repeat(80);
  return `${rule}\n ${text}\n${rule}\n`;
};

Objects in S3 are stored in 'buckets'. Let's define a function for creating a new bucket.

export const createBucket = async () => {
  const bucketName = await prompter.input({
    message: "Enter a bucket name. Bucket names must be globally unique:",
  });
  const command = new CreateBucketCommand({ Bucket: bucketName });
  await s3Client.send(command);
  console.log("Bucket created successfully.\n");
  return bucketName;
};

Buckets contain 'objects'. This function uploads the contents of a directory to your bucket as objects.

export const uploadFilesToBucket = async ({ bucketName, folderPath }) => {
  console.log(`Uploading files from ${folderPath}\n`);
  const keys = readdirSync(folderPath);
  const files = keys.map((key) => {
    const filePath = `${folderPath}/${key}`;
    const fileContent = readFileSync(filePath);
    return {
      Key: key,
      Body: fileContent,
    };
  });

  for (let file of files) {
    await s3Client.send(
      new PutObjectCommand({
        Bucket: bucketName,
        Body: file.Body,
        Key: file.Key,
      }),
    );
    console.log(`${file.Key} uploaded successfully.`);
  }
};

After uploading objects, check to confirm that they were uploaded correctly. You can use ListObjects for that. You'll be using the 'Key' property, but there are other useful properties in the response also.

export const listFilesInBucket = async ({ bucketName }) => {
  const command = new ListObjectsCommand({ Bucket: bucketName });
  const { Contents } = await s3Client.send(command);
  const contentsList = Contents.map((c) => ` • ${c.Key}`).join("\n");
  console.log("\nHere's a list of files in the bucket:");
  console.log(contentsList + "\n");
};

Sometimes you might want to copy an object from one bucket to another. Use the CopyObject command for that.

export const copyFileFromBucket = async ({ destinationBucket }) => {
  const proceed = await prompter.confirm({
    message: "Would you like to copy an object from another bucket?",
  });

  if (!proceed) {
    return;
  } else {
    const copy = async () => {
      try {
        const sourceBucket = await prompter.input({
          message: "Enter source bucket name:",
        });
        const sourceKey = await prompter.input({
          message: "Enter source key:",
        });
        const destinationKey = await prompter.input({
          message: "Enter destination key:",
        });

        const command = new CopyObjectCommand({
          Bucket: destinationBucket,
          CopySource: `${sourceBucket}/${sourceKey}`,
          Key: destinationKey,
        });
        await s3Client.send(command);
        await copyFileFromBucket({ destinationBucket });
      } catch (err) {
        console.error(`Copy error.`);
        console.error(err);
        const retryAnswer = await prompter.confirm({ message: "Try again?" });
        if (retryAnswer) {
          await copy();
        }
      }
    };
    await copy();
  }
};

There's no SDK method for getting multiple objects from a bucket. Instead, you'll create a list of objects to download and iterate over them.

export const downloadFilesFromBucket = async ({ bucketName }) => {
  const { Contents } = await s3Client.send(
    new ListObjectsCommand({ Bucket: bucketName }),
  );
  const path = await prompter.input({
    message: "Enter destination path for files:",
  });

  for (let content of Contents) {
    const obj = await s3Client.send(
      new GetObjectCommand({ Bucket: bucketName, Key: content.Key }),
    );
    writeFileSync(
      `${path}/${content.Key}`,
      await obj.Body.transformToByteArray(),
    );
  }
  console.log("Files downloaded successfully.\n");
};

It's time to clean up your resources. A bucket must be empty before it can be deleted. These two functions empty and delete the bucket.

export const emptyBucket = async ({ bucketName }) => {
  const listObjectsCommand = new ListObjectsCommand({ Bucket: bucketName });
  const { Contents } = await s3Client.send(listObjectsCommand);
  const keys = Contents.map((c) => c.Key);

  const deleteObjectsCommand = new DeleteObjectsCommand({
    Bucket: bucketName,
    Delete: { Objects: keys.map((key) => ({ Key: key })) },
  });
  await s3Client.send(deleteObjectsCommand);
  console.log(`${bucketName} emptied successfully.\n`);
};

export const deleteBucket = async ({ bucketName }) => {
  const command = new DeleteBucketCommand({ Bucket: bucketName });
  await s3Client.send(command);
  console.log(`${bucketName} deleted successfully.\n`);
};
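One caveat: ListObjects returns at most 1,000 keys per call, so emptyBucket as written only clears the first page. A hedged variant that paginates before deleting, assuming the same module-level s3Client as the functions above; the function name emptyBucketPaginated is illustrative:

import {
  DeleteObjectsCommand,
  paginateListObjectsV2,
} from "@aws-sdk/client-s3";

export const emptyBucketPaginated = async ({ bucketName }) => {
  // Collect every key across all pages.
  const keys = [];
  for await (const page of paginateListObjectsV2(
    { client: s3Client },
    { Bucket: bucketName },
  )) {
    keys.push(...(page.Contents ?? []).map((c) => c.Key));
  }

  // DeleteObjects accepts at most 1,000 keys per request, so delete in batches.
  for (let i = 0; i < keys.length; i += 1000) {
    await s3Client.send(
      new DeleteObjectsCommand({
        Bucket: bucketName,
        Delete: { Objects: keys.slice(i, i + 1000).map((Key) => ({ Key })) },
      }),
    );
  }
};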

The 'main' function pulls everything together. If you run this file directly, the main function is called.

const main = async () => {
  const OBJECT_DIRECTORY = `${dirnameFromMetaUrl(
    import.meta.url,
  )}../../../../resources/sample_files/.sample_media`;

  try {
    console.log(wrapText("Welcome to the Amazon S3 getting started example."));
    console.log("Let's create a bucket.");
    const bucketName = await createBucket();
    await prompter.confirm({ message: continueMessage });

    console.log(wrapText("File upload."));
    console.log(
      "I have some default files ready to go. You can edit the source code to provide your own.",
    );
    await uploadFilesToBucket({
      bucketName,
      folderPath: OBJECT_DIRECTORY,
    });

    await listFilesInBucket({ bucketName });
    await prompter.confirm({ message: continueMessage });

    console.log(wrapText("Copy files."));
    await copyFileFromBucket({ destinationBucket: bucketName });
    await listFilesInBucket({ bucketName });
    await prompter.confirm({ message: continueMessage });

    console.log(wrapText("Download files."));
    await downloadFilesFromBucket({ bucketName });

    console.log(wrapText("Clean up."));
    await emptyBucket({ bucketName });
    await deleteBucket({ bucketName });
  } catch (err) {
    console.error(err);
  }
};

The following code example shows how to get the legal hold configuration of an object in an S3 bucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { fileURLToPath } from "url";
import { GetObjectLegalHoldCommand, S3Client } from "@aws-sdk/client-s3";

/**
 * @param {S3Client} client
 * @param {string} bucketName
 * @param {string} objectKey
 */
export const main = async (client, bucketName, objectKey) => {
  const command = new GetObjectLegalHoldCommand({
    Bucket: bucketName,
    Key: objectKey,
    // Optionally, you can provide additional parameters
    // ExpectedBucketOwner: "ACCOUNT_ID",
    // RequestPayer: "requester",
    // VersionId: "OBJECT_VERSION_ID",
  });

  try {
    const response = await client.send(command);
    console.log(`Legal Hold Status: ${response.LegalHold.Status}`);
  } catch (err) {
    console.error(err);
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main(new S3Client(), "DOC-EXAMPLE-BUCKET", "OBJECT_KEY");
}

The following code example shows how to work with S3 object lock features.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

index.js - Entrypoint for the workflow. This orchestrates all of the steps. Visit GitHub to see the implementation details for Scenario, ScenarioInput, ScenarioOutput, and ScenarioAction.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import * as Scenarios from "@aws-doc-sdk-examples/lib/scenario/index.js";
import {
  exitOnFalse,
  loadState,
  saveState,
} from "@aws-doc-sdk-examples/lib/scenario/steps-common.js";

import { welcome, welcomeContinue } from "./welcome.steps.js";
import {
  confirmCreateBuckets,
  confirmPopulateBuckets,
  confirmSetLegalHoldFileEnabled,
  confirmSetLegalHoldFileRetention,
  confirmSetRetentionPeriodFileEnabled,
  confirmSetRetentionPeriodFileRetention,
  confirmUpdateLockPolicy,
  confirmUpdateRetention,
  createBuckets,
  createBucketsAction,
  populateBuckets,
  populateBucketsAction,
  setLegalHoldFileEnabledAction,
  setLegalHoldFileRetentionAction,
  setRetentionPeriodFileEnabledAction,
  setRetentionPeriodFileRetentionAction,
  updateLockPolicy,
  updateLockPolicyAction,
  updateRetention,
  updateRetentionAction,
} from "./setup.steps.js";

/**
 * @param {Scenarios} scenarios
 * @param {Record<string, any>} initialState
 */
export const getWorkflowStages = (scenarios, initialState = {}) => {
  const client = new S3Client({});

  return {
    deploy: new scenarios.Scenario(
      "S3 Object Locking - Deploy",
      [
        welcome(scenarios),
        welcomeContinue(scenarios),
        exitOnFalse(scenarios, "welcomeContinue"),
        createBuckets(scenarios),
        confirmCreateBuckets(scenarios),
        exitOnFalse(scenarios, "confirmCreateBuckets"),
        createBucketsAction(scenarios, client),
        updateRetention(scenarios),
        confirmUpdateRetention(scenarios),
        exitOnFalse(scenarios, "confirmUpdateRetention"),
        updateRetentionAction(scenarios, client),
        populateBuckets(scenarios),
        confirmPopulateBuckets(scenarios),
        exitOnFalse(scenarios, "confirmPopulateBuckets"),
        populateBucketsAction(scenarios, client),
        updateLockPolicy(scenarios),
        confirmUpdateLockPolicy(scenarios),
        exitOnFalse(scenarios, "confirmUpdateLockPolicy"),
        updateLockPolicyAction(scenarios, client),
        confirmSetLegalHoldFileEnabled(scenarios),
        setLegalHoldFileEnabledAction(scenarios, client),
        confirmSetRetentionPeriodFileEnabled(scenarios),
        setRetentionPeriodFileEnabledAction(scenarios, client),
        confirmSetLegalHoldFileRetention(scenarios),
        setLegalHoldFileRetentionAction(scenarios, client),
        confirmSetRetentionPeriodFileRetention(scenarios),
        setRetentionPeriodFileRetentionAction(scenarios, client),
        saveState,
      ],
      initialState,
    ),
    demo: new scenarios.Scenario(
      "S3 Object Locking - Demo",
      [loadState, replAction(scenarios, client)],
      initialState,
    ),
    clean: new scenarios.Scenario(
      "S3 Object Locking - Destroy",
      [
        loadState,
        confirmCleanup(scenarios),
        exitOnFalse(scenarios, "confirmCleanup"),
        cleanupAction(scenarios, client),
      ],
      initialState,
    ),
  };
};

// Call function if run directly
import { fileURLToPath } from "url";
import { S3Client } from "@aws-sdk/client-s3";
import { cleanupAction, confirmCleanup } from "./clean.steps.js";
import { replAction } from "./repl.steps.js";

if (process.argv[1] === fileURLToPath(import.meta.url)) {
  const objectLockingScenarios = getWorkflowStages(Scenarios);
  Scenarios.parseScenarioArgs(objectLockingScenarios);
}

welcome.steps.js - Output welcome messages to the console.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0

/**
 * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios
 */

/**
 * @param {Scenarios} scenarios
 */
const welcome = (scenarios) =>
  new scenarios.ScenarioOutput(
    "welcome",
    `Welcome to the Amazon Simple Storage Service (S3) Object Locking Workflow Scenario. For this workflow, we will use the AWS SDK for JavaScript to create several S3 buckets and files to demonstrate working with S3 locking features.`,
    { header: true },
  );

/**
 * @param {Scenarios} scenarios
 */
const welcomeContinue = (scenarios) =>
  new scenarios.ScenarioInput(
    "welcomeContinue",
    "Press Enter when you are ready to start.",
    { type: "confirm" },
  );

export { welcome, welcomeContinue };

setup.steps.js - Deploy buckets, objects, and file settings.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import {
  BucketVersioningStatus,
  ChecksumAlgorithm,
  CreateBucketCommand,
  MFADeleteStatus,
  PutBucketVersioningCommand,
  PutObjectCommand,
  PutObjectLockConfigurationCommand,
  PutObjectLegalHoldCommand,
  PutObjectRetentionCommand,
  ObjectLockLegalHoldStatus,
  ObjectLockRetentionMode,
} from "@aws-sdk/client-s3";

/**
 * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios
 */

/**
 * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client
 */

const bucketPrefix = "js-object-locking";

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const createBuckets = (scenarios) =>
  new scenarios.ScenarioOutput(
    "createBuckets",
    `The following buckets will be created:
    ${bucketPrefix}-no-lock with object lock False.
    ${bucketPrefix}-lock-enabled with object lock True.
    ${bucketPrefix}-retention-after-creation with object lock False.`,
    { preformatted: true },
  );

/**
 * @param {Scenarios} scenarios
 */
const confirmCreateBuckets = (scenarios) =>
  new scenarios.ScenarioInput("confirmCreateBuckets", "Create the buckets?", {
    type: "confirm",
  });

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const createBucketsAction = (scenarios, client) =>
  new scenarios.ScenarioAction("createBucketsAction", async (state) => {
    const noLockBucketName = `${bucketPrefix}-no-lock`;
    const lockEnabledBucketName = `${bucketPrefix}-lock-enabled`;
    const retentionBucketName = `${bucketPrefix}-retention-after-creation`;

    await client.send(new CreateBucketCommand({ Bucket: noLockBucketName }));
    await client.send(
      new CreateBucketCommand({
        Bucket: lockEnabledBucketName,
        ObjectLockEnabledForBucket: true,
      }),
    );
    await client.send(
      new CreateBucketCommand({ Bucket: retentionBucketName }),
    );

    state.noLockBucketName = noLockBucketName;
    state.lockEnabledBucketName = lockEnabledBucketName;
    state.retentionBucketName = retentionBucketName;
  });

/**
 * @param {Scenarios} scenarios
 */
const populateBuckets = (scenarios) =>
  new scenarios.ScenarioOutput(
    "populateBuckets",
    `The following test files will be created:
    file0.txt in ${bucketPrefix}-no-lock.
    file1.txt in ${bucketPrefix}-no-lock.
    file0.txt in ${bucketPrefix}-lock-enabled.
    file1.txt in ${bucketPrefix}-lock-enabled.
    file0.txt in ${bucketPrefix}-retention-after-creation.
    file1.txt in ${bucketPrefix}-retention-after-creation.`,
    { preformatted: true },
  );

/**
 * @param {Scenarios} scenarios
 */
const confirmPopulateBuckets = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmPopulateBuckets",
    "Populate the buckets?",
    { type: "confirm" },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const populateBucketsAction = (scenarios, client) =>
  new scenarios.ScenarioAction("populateBucketsAction", async (state) => {
    await client.send(
      new PutObjectCommand({
        Bucket: state.noLockBucketName,
        Key: "file0.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
    await client.send(
      new PutObjectCommand({
        Bucket: state.noLockBucketName,
        Key: "file1.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
    await client.send(
      new PutObjectCommand({
        Bucket: state.lockEnabledBucketName,
        Key: "file0.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
    await client.send(
      new PutObjectCommand({
        Bucket: state.lockEnabledBucketName,
        Key: "file1.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
    await client.send(
      new PutObjectCommand({
        Bucket: state.retentionBucketName,
        Key: "file0.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
    await client.send(
      new PutObjectCommand({
        Bucket: state.retentionBucketName,
        Key: "file1.txt",
        Body: "Content",
        ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
      }),
    );
  });

/**
 * @param {Scenarios} scenarios
 */
const updateRetention = (scenarios) =>
  new scenarios.ScenarioOutput(
    "updateRetention",
    `A bucket can be configured to use object locking with a default retention period. A default retention period will be configured for ${bucketPrefix}-retention-after-creation.`,
    { preformatted: true },
  );

/**
 * @param {Scenarios} scenarios
 */
const confirmUpdateRetention = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmUpdateRetention",
    "Configure default retention period?",
    { type: "confirm" },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const updateRetentionAction = (scenarios, client) =>
  new scenarios.ScenarioAction("updateRetentionAction", async (state) => {
    await client.send(
      new PutBucketVersioningCommand({
        Bucket: state.retentionBucketName,
        VersioningConfiguration: {
          MFADelete: MFADeleteStatus.Disabled,
          Status: BucketVersioningStatus.Enabled,
        },
      }),
    );

    await client.send(
      new PutObjectLockConfigurationCommand({
        Bucket: state.retentionBucketName,
        ObjectLockConfiguration: {
          ObjectLockEnabled: "Enabled",
          Rule: {
            DefaultRetention: {
              Mode: "GOVERNANCE",
              Years: 1,
            },
          },
        },
      }),
    );
  });

/**
 * @param {Scenarios} scenarios
 */
const updateLockPolicy = (scenarios) =>
  new scenarios.ScenarioOutput(
    "updateLockPolicy",
    `Object lock policies can also be added to existing buckets.
    An object lock policy will be added to ${bucketPrefix}-lock-enabled.`,
    { preformatted: true },
  );

/**
 * @param {Scenarios} scenarios
 */
const confirmUpdateLockPolicy = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmUpdateLockPolicy",
    "Add object lock policy?",
    { type: "confirm" },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const updateLockPolicyAction = (scenarios, client) =>
  new scenarios.ScenarioAction("updateLockPolicyAction", async (state) => {
    await client.send(
      new PutObjectLockConfigurationCommand({
        Bucket: state.lockEnabledBucketName,
        ObjectLockConfiguration: {
          ObjectLockEnabled: "Enabled",
        },
      }),
    );
  });

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const confirmSetLegalHoldFileEnabled = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmSetLegalHoldFileEnabled",
    (state) =>
      `Would you like to add a legal hold to file0.txt in ${state.lockEnabledBucketName}?`,
    {
      type: "confirm",
    },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const setLegalHoldFileEnabledAction = (scenarios, client) =>
  new scenarios.ScenarioAction(
    "setLegalHoldFileEnabledAction",
    async (state) => {
      await client.send(
        new PutObjectLegalHoldCommand({
          Bucket: state.lockEnabledBucketName,
          Key: "file0.txt",
          LegalHold: {
            Status: ObjectLockLegalHoldStatus.ON,
          },
        }),
      );
      console.log(
        `Modified legal hold for file0.txt in ${state.lockEnabledBucketName}.`,
      );
    },
    { skipWhen: (state) => !state.confirmSetLegalHoldFileEnabled },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const confirmSetRetentionPeriodFileEnabled = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmSetRetentionPeriodFileEnabled",
    (state) =>
      `Would you like to add a 1 day Governance retention period to file1.txt in ${state.lockEnabledBucketName}?
    Reminder: Only a user with the s3:BypassGovernanceRetention permission will be able to delete this file or its bucket until the retention period has expired.`,
    {
      type: "confirm",
    },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const setRetentionPeriodFileEnabledAction = (scenarios, client) =>
  new scenarios.ScenarioAction(
    "setRetentionPeriodFileEnabledAction",
    async (state) => {
      const retentionDate = new Date();
      retentionDate.setDate(retentionDate.getDate() + 1);
      await client.send(
        new PutObjectRetentionCommand({
          Bucket: state.lockEnabledBucketName,
          Key: "file1.txt",
          Retention: {
            Mode: ObjectLockRetentionMode.GOVERNANCE,
            RetainUntilDate: retentionDate,
          },
        }),
      );
      console.log(
        `Set retention for file1.txt in ${state.lockEnabledBucketName} until ${retentionDate.toISOString().split("T")[0]}.`,
      );
    },
    { skipWhen: (state) => !state.confirmSetRetentionPeriodFileEnabled },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const confirmSetLegalHoldFileRetention = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmSetLegalHoldFileRetention",
    (state) =>
      `Would you like to add a legal hold to file0.txt in ${state.retentionBucketName}?`,
    {
      type: "confirm",
    },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const setLegalHoldFileRetentionAction = (scenarios, client) =>
  new scenarios.ScenarioAction(
    "setLegalHoldFileRetentionAction",
    async (state) => {
      await client.send(
        new PutObjectLegalHoldCommand({
          Bucket: state.retentionBucketName,
          Key: "file0.txt",
          LegalHold: {
            Status: ObjectLockLegalHoldStatus.ON,
          },
        }),
      );
      console.log(
        `Modified legal hold for file0.txt in ${state.retentionBucketName}.`,
      );
    },
    { skipWhen: (state) => !state.confirmSetLegalHoldFileRetention },
  );

/**
 * @param {Scenarios} scenarios
 */
const confirmSetRetentionPeriodFileRetention = (scenarios) =>
  new scenarios.ScenarioInput(
    "confirmSetRetentionPeriodFileRetention",
    (state) =>
      `Would you like to add a 1 day Governance retention period to file1.txt in ${state.retentionBucketName}?
    Reminder: Only a user with the s3:BypassGovernanceRetention permission will be able to delete this file or its bucket until the retention period has expired.`,
    {
      type: "confirm",
    },
  );

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const setRetentionPeriodFileRetentionAction = (scenarios, client) =>
  new scenarios.ScenarioAction(
    "setRetentionPeriodFileRetentionAction",
    async (state) => {
      const retentionDate = new Date();
      retentionDate.setDate(retentionDate.getDate() + 1);
      await client.send(
        new PutObjectRetentionCommand({
          Bucket: state.retentionBucketName,
          Key: "file1.txt",
          Retention: {
            Mode: ObjectLockRetentionMode.GOVERNANCE,
            RetainUntilDate: retentionDate,
          },
          BypassGovernanceRetention: true,
        }),
      );
      console.log(
        `Set retention for file1.txt in ${state.retentionBucketName} until ${retentionDate.toISOString().split("T")[0]}.`,
      );
    },
    { skipWhen: (state) => !state.confirmSetRetentionPeriodFileRetention },
  );

export {
  createBuckets,
  confirmCreateBuckets,
  createBucketsAction,
  populateBuckets,
  confirmPopulateBuckets,
  populateBucketsAction,
  updateRetention,
  confirmUpdateRetention,
  updateRetentionAction,
  updateLockPolicy,
  confirmUpdateLockPolicy,
  updateLockPolicyAction,
  confirmSetLegalHoldFileEnabled,
  setLegalHoldFileEnabledAction,
  confirmSetRetentionPeriodFileEnabled,
  setRetentionPeriodFileEnabledAction,
  confirmSetLegalHoldFileRetention,
  setLegalHoldFileRetentionAction,
  confirmSetRetentionPeriodFileRetention,
  setRetentionPeriodFileRetentionAction,
};

repl.steps.js - View and delete files in the buckets.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import {
  ChecksumAlgorithm,
  DeleteObjectCommand,
  GetObjectLegalHoldCommand,
  GetObjectLockConfigurationCommand,
  GetObjectRetentionCommand,
  ListObjectVersionsCommand,
  PutObjectCommand,
} from "@aws-sdk/client-s3";

/**
 * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios
 */

/**
 * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client
 */

const choices = {
  EXIT: 0,
  LIST_ALL_FILES: 1,
  DELETE_FILE: 2,
  DELETE_FILE_WITH_RETENTION: 3,
  OVERWRITE_FILE: 4,
  VIEW_RETENTION_SETTINGS: 5,
  VIEW_LEGAL_HOLD_SETTINGS: 6,
};

/**
 * @param {Scenarios} scenarios
 */
const replInput = (scenarios) =>
  new scenarios.ScenarioInput(
    "replChoice",
    `Explore the S3 locking features by selecting one of the following choices`,
    {
      type: "select",
      choices: [
        { name: "List all files in buckets", value: choices.LIST_ALL_FILES },
        { name: "Attempt to delete a file.", value: choices.DELETE_FILE },
        {
          name: "Attempt to delete a file with retention period bypass.",
          value: choices.DELETE_FILE_WITH_RETENTION,
        },
        { name: "Attempt to overwrite a file.", value: choices.OVERWRITE_FILE },
        {
          name: "View the object and bucket retention settings for a file.",
          value: choices.VIEW_RETENTION_SETTINGS,
        },
        {
          name: "View the legal hold settings for a file.",
          value: choices.VIEW_LEGAL_HOLD_SETTINGS,
        },
        { name: "Finish the workflow.", value: choices.EXIT },
      ],
    },
  );

/**
 * @param {S3Client} client
 * @param {string[]} buckets
 */
const getAllFiles = async (client, buckets) => {
  /** @type {{bucket: string, key: string, version: string}[]} */
  const files = [];
  for (const bucket of buckets) {
    const objectsResponse = await client.send(
      new ListObjectVersionsCommand({ Bucket: bucket }),
    );
    for (const version of objectsResponse.Versions || []) {
      const { Key, VersionId } = version;
      files.push({ bucket, key: Key, version: VersionId });
    }
  }
  return files;
};

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const replAction = (scenarios, client) =>
  new scenarios.ScenarioAction(
    "replAction",
    async (state) => {
      const files = await getAllFiles(client, [
        state.noLockBucketName,
        state.lockEnabledBucketName,
        state.retentionBucketName,
      ]);

      const fileInput = new scenarios.ScenarioInput(
        "selectedFile",
        "Select a file:",
        {
          type: "select",
          choices: files.map((file, index) => ({
            name: `${index + 1}: ${file.bucket}: ${file.key} (version: ${
              file.version
            })`,
            value: index,
          })),
        },
      );
      const { replChoice } = state;

      switch (replChoice) {
        case choices.LIST_ALL_FILES: {
          const files = await getAllFiles(client, [
            state.noLockBucketName,
            state.lockEnabledBucketName,
            state.retentionBucketName,
          ]);
          state.replOutput = files
            .map(
              (file) => `${file.bucket}: ${file.key} (version: ${file.version})`,
            )
            .join("\n");
          break;
        }
        case choices.DELETE_FILE: {
          /** @type {number} */
          const fileToDelete = await fileInput.handle(state);
          const selectedFile = files[fileToDelete];
          try {
            await client.send(
              new DeleteObjectCommand({
                Bucket: selectedFile.bucket,
                Key: selectedFile.key,
                VersionId: selectedFile.version,
              }),
            );
            state.replOutput = `Deleted ${selectedFile.key} in ${selectedFile.bucket}.`;
          } catch (err) {
            state.replOutput = `Unable to delete object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`;
          }
          break;
        }
        case choices.DELETE_FILE_WITH_RETENTION: {
          /** @type {number} */
          const fileToDelete = await fileInput.handle(state);
          const selectedFile = files[fileToDelete];
          try {
            await client.send(
              new DeleteObjectCommand({
                Bucket: selectedFile.bucket,
                Key: selectedFile.key,
                VersionId: selectedFile.version,
                BypassGovernanceRetention: true,
              }),
            );
            state.replOutput = `Deleted ${selectedFile.key} in ${selectedFile.bucket}.`;
          } catch (err) {
            state.replOutput = `Unable to delete object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`;
          }
          break;
        }
        case choices.OVERWRITE_FILE: {
          /** @type {number} */
          const fileToOverwrite = await fileInput.handle(state);
          const selectedFile = files[fileToOverwrite];
          try {
            await client.send(
              new PutObjectCommand({
                Bucket: selectedFile.bucket,
                Key: selectedFile.key,
                Body: "New content",
                ChecksumAlgorithm: ChecksumAlgorithm.SHA256,
              }),
            );
            state.replOutput = `Overwrote ${selectedFile.key} in ${selectedFile.bucket}.`;
          } catch (err) {
            state.replOutput = `Unable to overwrite object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`;
          }
          break;
        }
        case choices.VIEW_RETENTION_SETTINGS: {
          /** @type {number} */
          const fileToView = await fileInput.handle(state);
          const selectedFile = files[fileToView];
          try {
            const retention = await client.send(
              new GetObjectRetentionCommand({
                Bucket: selectedFile.bucket,
                Key: selectedFile.key,
                VersionId: selectedFile.version,
              }),
            );
            const bucketConfig = await client.send(
              new GetObjectLockConfigurationCommand({
                Bucket: selectedFile.bucket,
              }),
            );
            state.replOutput = `Object retention for ${selectedFile.key} in ${selectedFile.bucket}: ${retention.Retention?.Mode} until ${retention.Retention?.RetainUntilDate?.toISOString()}.
Bucket object lock config for ${selectedFile.bucket}:
Enabled: ${bucketConfig.ObjectLockConfiguration?.ObjectLockEnabled}
Rule: ${JSON.stringify(bucketConfig.ObjectLockConfiguration?.Rule?.DefaultRetention)}`;
          } catch (err) {
            state.replOutput = `Unable to fetch object lock retention: '${err.message}'`;
          }
          break;
        }
        case choices.VIEW_LEGAL_HOLD_SETTINGS: {
          /** @type {number} */
          const fileToView = await fileInput.handle(state);
          const selectedFile = files[fileToView];
          try {
            const legalHold = await client.send(
              new GetObjectLegalHoldCommand({
                Bucket: selectedFile.bucket,
                Key: selectedFile.key,
                VersionId: selectedFile.version,
              }),
            );
            state.replOutput = `Object legal hold for ${selectedFile.key} in ${selectedFile.bucket}:
Status: ${legalHold.LegalHold?.Status}`;
          } catch (err) {
            state.replOutput = `Unable to fetch legal hold: '${err.message}'`;
          }
          break;
        }
        default:
          throw new Error(`Invalid replChoice: ${replChoice}`);
      }
    },
    {
      whileConfig: {
        whileFn: ({ replChoice }) => replChoice !== choices.EXIT,
        input: replInput(scenarios),
        output: new scenarios.ScenarioOutput(
          "REPL output",
          (state) => state.replOutput,
          { preformatted: true },
        ),
      },
    },
  );

export { replInput, replAction, choices };
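Note that DeleteObjectCommand with BypassGovernanceRetention: true works only against GOVERNANCE mode retention, and only when the caller has the s3:BypassGovernanceRetention permission. Object versions under COMPLIANCE mode retention, or under an active legal hold, cannot be deleted until the retention period lapses or the hold is removed.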

clean.steps.js - Destroy all created resources.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import {
  DeleteObjectCommand,
  DeleteBucketCommand,
  ListObjectVersionsCommand,
  GetObjectLegalHoldCommand,
  GetObjectRetentionCommand,
  PutObjectLegalHoldCommand,
} from "@aws-sdk/client-s3";

/**
 * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios
 */

/**
 * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client
 */

/**
 * @param {Scenarios} scenarios
 */
const confirmCleanup = (scenarios) =>
  new scenarios.ScenarioInput("confirmCleanup", "Clean up resources?", {
    type: "confirm",
  });

/**
 * @param {Scenarios} scenarios
 * @param {S3Client} client
 */
const cleanupAction = (scenarios, client) =>
  new scenarios.ScenarioAction("cleanupAction", async (state) => {
    const { noLockBucketName, lockEnabledBucketName, retentionBucketName } =
      state;
    const buckets = [
      noLockBucketName,
      lockEnabledBucketName,
      retentionBucketName,
    ];

    for (const bucket of buckets) {
      /** @type {import("@aws-sdk/client-s3").ListObjectVersionsCommandOutput} */
      let objectsResponse;

      try {
        objectsResponse = await client.send(
          new ListObjectVersionsCommand({
            Bucket: bucket,
          }),
        );
      } catch (e) {
        if (e instanceof Error && e.name === "NoSuchBucket") {
          console.log("Object's bucket has already been deleted.");
          continue;
        } else {
          throw e;
        }
      }

      for (const version of objectsResponse.Versions || []) {
        const { Key, VersionId } = version;

        try {
          const legalHold = await client.send(
            new GetObjectLegalHoldCommand({
              Bucket: bucket,
              Key,
              VersionId,
            }),
          );

          if (legalHold.LegalHold?.Status === "ON") {
            await client.send(
              new PutObjectLegalHoldCommand({
                Bucket: bucket,
                Key,
                VersionId,
                LegalHold: {
                  Status: "OFF",
                },
              }),
            );
          }
        } catch (err) {
          console.log(
            `Unable to fetch legal hold for ${Key} in ${bucket}: '${err.message}'`,
          );
        }

        try {
          const retention = await client.send(
            new GetObjectRetentionCommand({
              Bucket: bucket,
              Key,
              VersionId,
            }),
          );

          if (retention.Retention?.Mode === "GOVERNANCE") {
            await client.send(
              new DeleteObjectCommand({
                Bucket: bucket,
                Key,
                VersionId,
                BypassGovernanceRetention: true,
              }),
            );
          }
        } catch (err) {
          console.log(
            `Unable to fetch object lock retention for ${Key} in ${bucket}: '${err.message}'`,
          );
        }

        await client.send(
          new DeleteObjectCommand({
            Bucket: bucket,
            Key,
            VersionId,
          }),
        );
      }

      await client.send(new DeleteBucketCommand({ Bucket: bucket }));
      console.log(`Delete for ${bucket} complete.`);
    }
  });

export { confirmCleanup, cleanupAction };
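The cleanup ordering matters: a version under a legal hold must have the hold switched OFF before it can be deleted, versions under GOVERNANCE retention are deleted with BypassGovernanceRetention, and DeleteBucketCommand succeeds only after every object version in the bucket has been removed.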

The following code example shows how to upload or download large files to and from Amazon S3.

For more information, see Uploading an object using multipart upload.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Amazon Code Examples Repository.

Upload a large file.

import {
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
  AbortMultipartUploadCommand,
  S3Client,
} from "@aws-sdk/client-s3";

const twentyFiveMB = 25 * 1024 * 1024;

export const createString = (size = twentyFiveMB) => {
  return "x".repeat(size);
};

export const main = async () => {
  const s3Client = new S3Client({});
  const bucketName = "test-bucket";
  const key = "multipart.txt";

  const str = createString();
  const buffer = Buffer.from(str, "utf8");

  let uploadId;

  try {
    const multipartUpload = await s3Client.send(
      new CreateMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
      }),
    );

    uploadId = multipartUpload.UploadId;

    const uploadPromises = [];
    // Multipart uploads require a minimum size of 5 MB per part.
    const partSize = Math.ceil(buffer.length / 5);

    // Upload each part.
    for (let i = 0; i < 5; i++) {
      const start = i * partSize;
      const end = start + partSize;
      uploadPromises.push(
        s3Client
          .send(
            new UploadPartCommand({
              Bucket: bucketName,
              Key: key,
              UploadId: uploadId,
              Body: buffer.subarray(start, end),
              PartNumber: i + 1,
            }),
          )
          .then((d) => {
            console.log("Part", i + 1, "uploaded");
            return d;
          }),
      );
    }

    const uploadResults = await Promise.all(uploadPromises);

    return await s3Client.send(
      new CompleteMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
        UploadId: uploadId,
        MultipartUpload: {
          Parts: uploadResults.map(({ ETag }, i) => ({
            ETag,
            PartNumber: i + 1,
          })),
        },
      }),
    );

    // Verify the output by downloading the file from the Amazon Simple
    // Storage Service (Amazon S3) console. Because the output is a 25 MB
    // string, text editors might struggle to open the file.
  } catch (err) {
    console.error(err);

    if (uploadId) {
      const abortCommand = new AbortMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
        UploadId: uploadId,
      });

      await s3Client.send(abortCommand);
    }
  }
};
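If you don't need fine-grained control over individual parts, the @aws-sdk/lib-storage package provides an Upload helper that splits the body into parts, uploads them concurrently, and aborts the multipart upload on failure. The following is a minimal sketch of that alternative; the bucket name, key, and tuning values are placeholders.

import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const client = new S3Client({});

export const main = async () => {
  const upload = new Upload({
    client,
    params: {
      Bucket: "test-bucket", // placeholder bucket name
      Key: "multipart.txt", // placeholder object key
      Body: "x".repeat(25 * 1024 * 1024),
    },
    queueSize: 4, // number of parts uploaded concurrently
    partSize: 5 * 1024 * 1024, // 5 MB, the minimum part size
  });

  // Progress events fire as parts complete.
  upload.on("httpUploadProgress", ({ loaded, total }) => {
    console.log(`Uploaded ${loaded} of ${total} bytes`);
  });

  await upload.done();
};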

Download a large file.

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { createWriteStream } from "fs";
import { fileURLToPath } from "url";

const s3Client = new S3Client({});
const oneMB = 1024 * 1024;

export const getObjectRange = ({ bucket, key, start, end }) => {
  const command = new GetObjectCommand({
    Bucket: bucket,
    Key: key,
    Range: `bytes=${start}-${end}`,
  });

  return s3Client.send(command);
};

/**
 * @param {string | undefined} contentRange
 */
export const getRangeAndLength = (contentRange) => {
  const [range, length] = contentRange.split("/");
  const [start, end] = range.split("-");
  return {
    start: parseInt(start),
    end: parseInt(end),
    length: parseInt(length),
  };
};

export const isComplete = ({ end, length }) => end === length - 1;

// When downloading a large file, you might want to break it down into
// smaller pieces. Amazon S3 accepts a Range header to specify the start
// and end of the byte range to be downloaded.
const downloadInChunks = async ({ bucket, key }) => {
  const writeStream = createWriteStream(
    fileURLToPath(new URL(`./${key}`, import.meta.url)),
  ).on("error", (err) => console.error(err));

  let rangeAndLength = { start: -1, end: -1, length: -1 };

  while (!isComplete(rangeAndLength)) {
    const { end } = rangeAndLength;
    const nextRange = { start: end + 1, end: end + oneMB };

    console.log(`Downloading bytes ${nextRange.start} to ${nextRange.end}`);

    const { ContentRange, Body } = await getObjectRange({
      bucket,
      key,
      ...nextRange,
    });

    writeStream.write(await Body.transformToByteArray());
    rangeAndLength = getRangeAndLength(ContentRange);
  }
};

export const main = async () => {
  await downloadInChunks({
    bucket: "my-cool-bucket",
    key: "my-cool-object.txt",
  });
};
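For smaller objects, ranged downloads add complexity without much benefit; a single GetObject call is enough. The following is a minimal sketch under that assumption; the bucket name and key are placeholders.

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";

const client = new S3Client({});

export const main = async () => {
  const { Body } = await client.send(
    new GetObjectCommand({
      Bucket: "my-cool-bucket", // placeholder bucket name
      Key: "my-cool-object.txt", // placeholder object key
    }),
  );

  // transformToString buffers the entire body in memory, so reserve
  // this approach for objects that comfortably fit in memory.
  const text = await Body.transformToString();
  console.log(text);
};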

Serverless examples

The following code example shows how to implement a Lambda function that receives an event triggered by uploading an object to an S3 bucket. The function retrieves the S3 bucket name and object key from the event parameter and calls the Amazon S3 API to retrieve and log the content type of the object.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using JavaScript.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { S3Client, HeadObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client();

export const handler = async (event, context) => {
  // Get the object from the event and show its content type.
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(
    event.Records[0].s3.object.key.replace(/\+/g, " "),
  );
  try {
    const { ContentType } = await client.send(
      new HeadObjectCommand({
        Bucket: bucket,
        Key: key,
      }),
    );
    console.log("CONTENT TYPE:", ContentType);
    return ContentType;
  } catch (err) {
    console.log(err);
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
    console.log(message);
    throw new Error(message);
  }
};
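Before wiring up the bucket notification, you can exercise the handler locally by invoking it with a hand-built event that mirrors the S3 notification shape. A minimal sketch follows, assuming the handler above lives in index.mjs; the file name, bucket name, and key are hypothetical, and only the fields the handler reads are included (real events carry many more).

import { handler } from "./index.mjs"; // hypothetical file name for the handler above

const testEvent = {
  Records: [
    {
      s3: {
        bucket: { name: "my-test-bucket" }, // placeholder bucket name
        object: { key: "folder/my%20file.txt" }, // keys arrive URL-encoded
      },
    },
  ],
};

handler(testEvent)
  .then((contentType) => console.log("Returned:", contentType))
  .catch(console.error);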

Consuming an S3 event with Lambda using TypeScript.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { S3Event } from 'aws-lambda';
import { S3Client, HeadObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: process.env.AWS_REGION });

export const handler = async (event: S3Event): Promise<string | undefined> => {
  // Get the object from the event and show its content type.
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(
    event.Records[0].s3.object.key.replace(/\+/g, ' '),
  );
  const params = {
    Bucket: bucket,
    Key: key,
  };
  try {
    const { ContentType } = await s3.send(new HeadObjectCommand(params));
    console.log('CONTENT TYPE:', ContentType);
    return ContentType;
  } catch (err) {
    console.log(err);
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
    console.log(message);
    throw new Error(message);
  }
};