This documentation is for Version 1 of the Amazon CLI only. For documentation related to Version 2 of the Amazon CLI, see the Version 2 User Guide.
Amazon S3 examples using Amazon CLI
The following code examples show you how to perform actions and implement common scenarios by using the Amazon Command Line Interface with Amazon S3.
Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.
Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.
Topics
Actions
The following code example shows how to use abort-multipart-upload.
To abort the specified multipart upload

The following abort-multipart-upload command aborts a multipart upload for the key multipart/01 in the bucket amzn-s3-demo-bucket.

aws s3api abort-multipart-upload \
    --bucket amzn-s3-demo-bucket \
    --key multipart/01 \
    --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R

The upload ID required by this command is output by create-multipart-upload and can also be retrieved with list-multipart-uploads.
                    For API details, see AbortMultipartUpload in Amazon CLI Command Reference. 
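Incomplete multipart uploads keep their parts in storage until they are aborted. As a hedged sketch (the bucket name is a placeholder, and the None guard covers the case where no uploads exist), you could combine list-multipart-uploads with abort-multipart-upload to clean up every in-progress upload in a bucket:

bucket=amzn-s3-demo-bucket

# Emit "key <tab> upload-id" for each in-progress upload, then abort each one.
aws s3api list-multipart-uploads \
    --bucket "$bucket" \
    --query 'Uploads[].[Key,UploadId]' \
    --output text |
while read -r key upload_id; do
    [ "$key" = "None" ] && continue   # no in-progress uploads
    aws s3api abort-multipart-upload \
        --bucket "$bucket" \
        --key "$key" \
        --upload-id "$upload_id"
done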
 
The following code example shows how to use complete-multipart-upload.
The following command completes a multipart upload for the key multipart/01 in the bucket amzn-s3-demo-bucket:

aws s3api complete-multipart-upload --multipart-upload file://mpustruct --bucket amzn-s3-demo-bucket --key 'multipart/01' --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R

The upload ID required by this command is output by create-multipart-upload and can also be retrieved with list-multipart-uploads.

The multipart upload option in the above command takes a JSON structure that describes the parts of the multipart upload that should be reassembled into the complete file. In this example, the file:// prefix is used to load the JSON structure from a file in the local folder named mpustruct.

mpustruct:

{ "Parts": [ { "ETag": "e868e0f4719e394144ef36531ee6824c", "PartNumber": 1 }, { "ETag": "6bb2b12753d66fe86da4998aa33fffb0", "PartNumber": 2 }, { "ETag": "d0a0112e841abec9c9ec83406f0159c8", "PartNumber": 3 } ] }

The ETag value for each part is output each time you upload a part using the upload-part command and can also be retrieved by calling list-parts or calculated by taking the MD5 checksum of each part.

Output:

{ "ETag": "\"3944a9f7a4faab7f78788ff6210f63f0-3\"", "Bucket": "amzn-s3-demo-bucket", "Location": "https://amzn-s3-demo-bucket.s3.amazonaws.com/multipart%2F01", "Key": "multipart/01" }
                    For API details, see CompleteMultipartUpload in Amazon CLI Command Reference. 
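Rather than assembling the mpustruct file by hand, one hedged alternative (the bucket, key, and upload ID below are placeholders) is to let list-parts generate it, since a --query expression can emit exactly the Parts structure that complete-multipart-upload expects:

bucket=amzn-s3-demo-bucket
key=multipart/01
upload_id=EXAMPLE-UPLOAD-ID   # placeholder: value returned by create-multipart-upload

# Keep only the ETag and PartNumber of each uploaded part.
aws s3api list-parts \
    --bucket "$bucket" \
    --key "$key" \
    --upload-id "$upload_id" \
    --query '{Parts: Parts[].{ETag: ETag, PartNumber: PartNumber}}' \
    --output json > mpustruct

aws s3api complete-multipart-upload \
    --bucket "$bucket" \
    --key "$key" \
    --upload-id "$upload_id" \
    --multipart-upload file://mpustruct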
 
The following code example shows how to use copy-object.
The following command copies an object from bucket-1 to bucket-2:

aws s3api copy-object --copy-source bucket-1/test.txt --key test.txt --bucket bucket-2

Output:

{ "CopyObjectResult": { "LastModified": "2015-11-10T01:07:25.000Z", "ETag": "\"589c8b79c230a6ecd5a7e1d040a9a030\"" }, "VersionId": "YdnYvTCVDqRRFA.NFJjy36p0hxifMlkA" }

For API details, see CopyObject in Amazon CLI Command Reference.
The following code example shows how to use cp.
Example 1: Copying a local file to S3

The following cp command copies a single file to a specified bucket and key:

aws s3 cp test.txt s3://amzn-s3-demo-bucket/test2.txt

Output:

upload: test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 2: Copying a local file to S3 with an expiration date

The following cp command copies a single file to a specified bucket and key that expires at the specified ISO 8601 timestamp:

aws s3 cp test.txt s3://amzn-s3-demo-bucket/test2.txt \
    --expires 2014-10-01T20:30:00Z

Output:

upload: test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 3: Copying a file from S3 to S3

The following cp command copies a single S3 object to a specified bucket and key:

aws s3 cp s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket/test2.txt

Output:

copy: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 4: Copying an S3 object to a local file

The following cp command copies a single object to a specified file locally:

aws s3 cp s3://amzn-s3-demo-bucket/test.txt test2.txt

Output:

download: s3://amzn-s3-demo-bucket/test.txt to test2.txt

Example 5: Copying an S3 object from one bucket to another

The following cp command copies a single object to a specified bucket while retaining its original name:

aws s3 cp s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket2/

Output:

copy: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket2/test.txt

Example 6: Recursively copying S3 objects to a local directory

When passed with the parameter --recursive, the following cp command recursively copies all objects under a specified prefix and bucket to a specified directory. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and test2.txt:

aws s3 cp s3://amzn-s3-demo-bucket . \
    --recursive

Output:

download: s3://amzn-s3-demo-bucket/test1.txt to test1.txt
download: s3://amzn-s3-demo-bucket/test2.txt to test2.txt

Example 7: Recursively copying local files to S3

When passed with the parameter --recursive, the following cp command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg:

aws s3 cp myDir s3://amzn-s3-demo-bucket/ \
    --recursive \
    --exclude "*.jpg"

Output:

upload: myDir/test1.txt to s3://amzn-s3-demo-bucket/test1.txt

Example 8: Recursively copying S3 objects to another bucket

When passed with the parameter --recursive, the following cp command recursively copies all objects under a specified bucket to another bucket while excluding some objects by using an --exclude parameter. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and another/test1.txt:

aws s3 cp s3://amzn-s3-demo-bucket/ s3://amzn-s3-demo-bucket2/ \
    --recursive \
    --exclude "another/*"

Output:

copy: s3://amzn-s3-demo-bucket/test1.txt to s3://amzn-s3-demo-bucket2/test1.txt

You can combine --exclude and --include options to copy only objects that match a pattern, excluding all others:

aws s3 cp s3://amzn-s3-demo-bucket/logs/ s3://amzn-s3-demo-bucket2/logs/ \
    --recursive \
    --exclude "*" \
    --include "*.log"

Output:

copy: s3://amzn-s3-demo-bucket/logs/test/test.log to s3://amzn-s3-demo-bucket2/logs/test/test.log
copy: s3://amzn-s3-demo-bucket/logs/test3.log to s3://amzn-s3-demo-bucket2/logs/test3.log

Example 9: Setting the Access Control List (ACL) while copying an S3 object

The following cp command copies a single object to a specified bucket and key while setting the ACL to public-read-write:

aws s3 cp s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket/test2.txt \
    --acl public-read-write

Output:

copy: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket/test2.txt

Note that if you're using the --acl option, ensure that any associated IAM policies include the "s3:PutObjectAcl" action:

aws iam get-user-policy \
    --user-name myuser \
    --policy-name mypolicy

Output:

{ "UserName": "myuser", "PolicyName": "mypolicy", "PolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Action": [ "s3:PutObject", "s3:PutObjectAcl" ], "Resource": [ "arn:aws:s3:::amzn-s3-demo-bucket/*" ], "Effect": "Allow", "Sid": "Stmt1234567891234" } ] } }

Example 10: Granting permissions for an S3 object

The following cp command illustrates the use of the --grants option to grant read access to all users identified by URI and full control to a specific user identified by their Canonical ID:

aws s3 cp file.txt s3://amzn-s3-demo-bucket/ --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=id=79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be

Output:

upload: file.txt to s3://amzn-s3-demo-bucket/file.txt

Example 11: Uploading a local file stream to S3

PowerShell may alter the encoding of or add a CRLF to piped input. The following cp command uploads a local file stream from standard input to a specified bucket and key:

aws s3 cp - s3://amzn-s3-demo-bucket/stream.txt

Example 12: Uploading a local file stream that is larger than 50GB to S3

The following cp command uploads a 51GB local file stream from standard input to a specified bucket and key. The --expected-size option must be provided, or the upload may fail when it reaches the default part limit of 10,000:

aws s3 cp - s3://amzn-s3-demo-bucket/stream.txt --expected-size 54760833024

Example 13: Downloading an S3 object as a local file stream

PowerShell may alter the encoding of or add a CRLF to piped or redirected output. The following cp command downloads an S3 object locally as a stream to standard output. Downloading as a stream is not currently compatible with the --recursive parameter:

aws s3 cp s3://amzn-s3-demo-bucket/stream.txt -

Example 14: Uploading to an S3 access point

The following cp command uploads a single file (mydoc.txt) to the access point (myaccesspoint) at the key (mykey):

aws s3 cp mydoc.txt s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey

Output:

upload: mydoc.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey

Example 15: Downloading from an S3 access point

The following cp command downloads a single object (mykey) from the access point (myaccesspoint) to the local file (mydoc.txt):

aws s3 cp s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey mydoc.txt

Output:

download: s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey to mydoc.txt
                    For API details, see Cp in Amazon CLI Command Reference. 
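Building on examples 11 and 12, a common hedged pattern (the directory, object name, and size estimate are illustrative) is to stream a compressed archive straight into S3 without writing it to local disk; for streams larger than about 50GB, pass --expected-size so the CLI can choose a part size that stays under the 10,000-part limit:

# Stream a gzip-compressed tarball of ./data directly to S3.
# --expected-size is an estimate in bytes and only needs to be roughly correct.
tar -czf - ./data | aws s3 cp - s3://amzn-s3-demo-bucket/backups/data.tar.gz \
    --expected-size 80000000000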
 
The following code example shows how to use create-bucket.
Example 1: To create a bucket

The following create-bucket example creates a bucket named amzn-s3-demo-bucket:

aws s3api create-bucket \
    --bucket amzn-s3-demo-bucket \
    --region us-east-1

Output:

{ "Location": "/amzn-s3-demo-bucket" }

For more information, see Creating a bucket in the Amazon S3 User Guide.

Example 2: To create a bucket with owner enforced

The following create-bucket example creates a bucket named amzn-s3-demo-bucket that uses the bucket owner enforced setting for S3 Object Ownership.

aws s3api create-bucket \
    --bucket amzn-s3-demo-bucket \
    --region us-east-1 \
    --object-ownership BucketOwnerEnforced

Output:

{ "Location": "/amzn-s3-demo-bucket" }

For more information, see Controlling ownership of objects and disabling ACLs in the Amazon S3 User Guide.

Example 3: To create a bucket outside of the us-east-1 region

The following create-bucket example creates a bucket named amzn-s3-demo-bucket in the eu-west-1 region. Regions outside of us-east-1 require the appropriate LocationConstraint to be specified in order to create the bucket in the desired region.

aws s3api create-bucket \
    --bucket amzn-s3-demo-bucket \
    --region eu-west-1 \
    --create-bucket-configuration LocationConstraint=eu-west-1

Output:

{ "Location": "http://amzn-s3-demo-bucket.s3.amazonaws.com/" }

For more information, see Creating a bucket in the Amazon S3 User Guide.
                    For API details, see CreateBucket in Amazon CLI Command Reference. 
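Bucket creation can take a moment to propagate, so a hedged follow-up (the bucket name and region are placeholders) is to block until the new bucket is reachable before writing to it, using the bucket-exists waiter:

bucket=amzn-s3-demo-bucket
region=eu-west-1

aws s3api create-bucket \
    --bucket "$bucket" \
    --region "$region" \
    --create-bucket-configuration LocationConstraint="$region"

# Poll HeadBucket until the bucket responds; the waiter gives up after several attempts.
aws s3api wait bucket-exists --bucket "$bucket"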
 
The following code example shows how to use create-multipart-upload.
The following command creates a multipart upload in the bucket amzn-s3-demo-bucket with the key multipart/01:

aws s3api create-multipart-upload --bucket amzn-s3-demo-bucket --key 'multipart/01'

Output:

{ "Bucket": "amzn-s3-demo-bucket", "UploadId": "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R", "Key": "multipart/01" }

The completed file will be named 01 in a folder called multipart in the bucket amzn-s3-demo-bucket. Save the upload ID, key, and bucket name for use with the upload-part command.
                    For API details, see CreateMultipartUpload in Amazon CLI Command Reference. 
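The parts themselves are sent with upload-part, which is not excerpted in this section. As a hedged end-to-end sketch (the file name, bucket, and key are placeholders), you could split a large file into 100 MiB chunks, capture the upload ID with --query, and upload each chunk in turn:

bucket=amzn-s3-demo-bucket
key=multipart/01

# Split the source file into 100 MiB chunks named part-aa, part-ab, ...
split -b 100M large-file.bin part-

upload_id=$(aws s3api create-multipart-upload \
    --bucket "$bucket" --key "$key" \
    --query UploadId --output text)

part_number=1
for part in part-*; do
    # Each call prints the part's ETag, which complete-multipart-upload needs later.
    aws s3api upload-part \
        --bucket "$bucket" --key "$key" \
        --upload-id "$upload_id" \
        --part-number "$part_number" \
        --body "$part"
    part_number=$((part_number + 1))
done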
 
The following code example shows how to use delete-bucket-analytics-configuration.
To delete an analytics configuration for a bucket

The following delete-bucket-analytics-configuration example removes the analytics configuration for the specified bucket and ID.

aws s3api delete-bucket-analytics-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 1

This command produces no output.

For API details, see DeleteBucketAnalyticsConfiguration in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-cors.
The following command deletes a Cross-Origin Resource Sharing configuration from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-cors --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketCors in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-encryption.
To delete the server-side encryption configuration of a bucket

The following delete-bucket-encryption example deletes the server-side encryption configuration of the specified bucket.

aws s3api delete-bucket-encryption \
    --bucket amzn-s3-demo-bucket

This command produces no output.

For API details, see DeleteBucketEncryption in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-intelligent-tiering-configuration.
To remove an S3 Intelligent-Tiering configuration on a bucket

The following delete-bucket-intelligent-tiering-configuration example removes an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket.

aws s3api delete-bucket-intelligent-tiering-configuration \
    --bucket amzn-s3-demo-bucket \
    --id ExampleConfig

This command produces no output.

For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.

For API details, see DeleteBucketIntelligentTieringConfiguration in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-inventory-configuration.
To delete the inventory configuration of a bucket

The following delete-bucket-inventory-configuration example deletes the inventory configuration with ID 1 for the specified bucket.

aws s3api delete-bucket-inventory-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 1

This command produces no output.

For API details, see DeleteBucketInventoryConfiguration in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-lifecycle.
The following command deletes a lifecycle configuration from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-lifecycle --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketLifecycle in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-metrics-configuration.
To delete a metrics configuration for a bucket

The following delete-bucket-metrics-configuration example removes the metrics configuration for the specified bucket and ID.

aws s3api delete-bucket-metrics-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 123

This command produces no output.

For API details, see DeleteBucketMetricsConfiguration in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-ownership-controls.
To remove the bucket ownership settings of a bucket

The following delete-bucket-ownership-controls example removes the bucket ownership settings of a bucket.

aws s3api delete-bucket-ownership-controls \
    --bucket amzn-s3-demo-bucket

This command produces no output.

For more information, see Setting Object Ownership on an existing bucket in the Amazon S3 User Guide.

For API details, see DeleteBucketOwnershipControls in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-policy.
The following command deletes a bucket policy from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-policy --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketPolicy in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-replication.
The following command deletes a replication configuration from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-replication --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketReplication in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-tagging.
The following command deletes a tagging configuration from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-tagging --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketTagging in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket-website.
The following command deletes a website configuration from a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket-website --bucket amzn-s3-demo-bucket

For API details, see DeleteBucketWebsite in Amazon CLI Command Reference.
The following code example shows how to use delete-bucket.
The following command deletes a bucket named amzn-s3-demo-bucket:

aws s3api delete-bucket --bucket amzn-s3-demo-bucket --region us-east-1

For API details, see DeleteBucket in Amazon CLI Command Reference.
The following code example shows how to use delete-object-tagging.
To delete the tag sets of an object

The following delete-object-tagging example removes the tag set from the object doc1.rtf.

aws s3api delete-object-tagging \
    --bucket amzn-s3-demo-bucket \
    --key doc1.rtf

This command produces no output.

For API details, see DeleteObjectTagging in Amazon CLI Command Reference.
The following code example shows how to use delete-object.
The following command deletes an object named test.txt from a bucket named amzn-s3-demo-bucket:

aws s3api delete-object --bucket amzn-s3-demo-bucket --key test.txt

If bucket versioning is enabled, the output will contain the version ID of the delete marker:

{ "VersionId": "9_gKg5vG56F.TTEUdwkxGpJ3tNDlWlGq", "DeleteMarker": true }

For more information about deleting objects, see Deleting Objects in the Amazon S3 Developer Guide.

For API details, see DeleteObject in Amazon CLI Command Reference.
The following code example shows how to use delete-objects.
The following command deletes an object from a bucket named amzn-s3-demo-bucket:

aws s3api delete-objects --bucket amzn-s3-demo-bucket --delete file://delete.json

delete.json is a JSON document in the current directory that specifies the object to delete:

{ "Objects": [ { "Key": "test1.txt" } ], "Quiet": false }

Output:

{ "Deleted": [ { "DeleteMarkerVersionId": "mYAT5Mc6F7aeUL8SS7FAAqUPO1koHwzU", "Key": "test1.txt", "DeleteMarker": true } ] }
                    For API details, see DeleteObjects in Amazon CLI Command Reference. 
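Because delete-objects accepts up to 1,000 keys per request, a hedged way to clear everything under a prefix (the bucket and prefix are placeholders, and the listing is assumed to fit in a single request) is to let list-objects-v2 build the delete.json document:

bucket=amzn-s3-demo-bucket
prefix=logs/2015/

# Write every key under the prefix into delete.json (assumes 1,000 keys or fewer).
aws s3api list-objects-v2 \
    --bucket "$bucket" \
    --prefix "$prefix" \
    --query '{Objects: Contents[].{Key: Key}, Quiet: `false`}' \
    --output json > delete.json

aws s3api delete-objects --bucket "$bucket" --delete file://delete.json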
 
The following code example shows how to use delete-public-access-block.
To delete the block public access configuration for a bucket

The following delete-public-access-block example removes the block public access configuration on the specified bucket.

aws s3api delete-public-access-block \
    --bucket amzn-s3-demo-bucket

This command produces no output.

For API details, see DeletePublicAccessBlock in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-accelerate-configuration.
To retrieve the accelerate configuration of a bucket

The following get-bucket-accelerate-configuration example retrieves the accelerate configuration for the specified bucket.

aws s3api get-bucket-accelerate-configuration \
    --bucket amzn-s3-demo-bucket

Output:

{ "Status": "Enabled" }

For API details, see GetBucketAccelerateConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-acl.
The following command retrieves the access control list for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-acl --bucket amzn-s3-demo-bucket

Output:

{ "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Grants": [ { "Grantee": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Permission": "FULL_CONTROL" } ] }

For API details, see GetBucketAcl in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-analytics-configuration.
To retrieve the analytics configuration for a bucket with a specific ID

The following get-bucket-analytics-configuration example displays the analytics configuration for the specified bucket and ID.

aws s3api get-bucket-analytics-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 1

Output:

{ "AnalyticsConfiguration": { "StorageClassAnalysis": {}, "Id": "1" } }

For API details, see GetBucketAnalyticsConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-cors.
The following command retrieves the Cross-Origin Resource Sharing configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-cors --bucket amzn-s3-demo-bucket

Output:

{ "CORSRules": [ { "AllowedHeaders": [ "*" ], "ExposeHeaders": [ "x-amz-server-side-encryption" ], "AllowedMethods": [ "PUT", "POST", "DELETE" ], "MaxAgeSeconds": 3000, "AllowedOrigins": [ "http://www.example.com" ] }, { "AllowedHeaders": [ "Authorization" ], "MaxAgeSeconds": 3000, "AllowedMethods": [ "GET" ], "AllowedOrigins": [ "*" ] } ] }

For API details, see GetBucketCors in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-encryption.
To retrieve the server-side encryption configuration for a bucket

The following get-bucket-encryption example retrieves the server-side encryption configuration for the bucket amzn-s3-demo-bucket.

aws s3api get-bucket-encryption \
    --bucket amzn-s3-demo-bucket

Output:

{ "ServerSideEncryptionConfiguration": { "Rules": [ { "ApplyServerSideEncryptionByDefault": { "SSEAlgorithm": "AES256" } } ] } }

For API details, see GetBucketEncryption in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-intelligent-tiering-configuration.
To retrieve an S3 Intelligent-Tiering configuration on a bucket

The following get-bucket-intelligent-tiering-configuration example retrieves an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket.

aws s3api get-bucket-intelligent-tiering-configuration \
    --bucket amzn-s3-demo-bucket \
    --id ExampleConfig

Output:

{ "IntelligentTieringConfiguration": { "Id": "ExampleConfig2", "Filter": { "Prefix": "images" }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] } }

For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.

For API details, see GetBucketIntelligentTieringConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-inventory-configuration.
To retrieve the inventory configuration for a bucket

The following get-bucket-inventory-configuration example retrieves the inventory configuration for the specified bucket with ID 1.

aws s3api get-bucket-inventory-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 1

Output:

{ "InventoryConfiguration": { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "ORC", "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "1", "Schedule": { "Frequency": "Weekly" } } }

For API details, see GetBucketInventoryConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-lifecycle-configuration.
The following command retrieves the lifecycle configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-lifecycle-configuration --bucket amzn-s3-demo-bucket

Output:

{ "Rules": [ { "ID": "Move rotated logs to Glacier", "Prefix": "rotated/", "Status": "Enabled", "Transitions": [ { "Date": "2015-11-10T00:00:00.000Z", "StorageClass": "GLACIER" } ] }, { "Status": "Enabled", "Prefix": "", "NoncurrentVersionTransitions": [ { "NoncurrentDays": 0, "StorageClass": "GLACIER" } ], "ID": "Move old versions to Glacier" } ] }

For API details, see GetBucketLifecycleConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-lifecycle.
The following command retrieves the lifecycle configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-lifecycle --bucket amzn-s3-demo-bucket

Output:

{ "Rules": [ { "ID": "Move to Glacier after sixty days (objects in logs/2015/)", "Prefix": "logs/2015/", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } }, { "Expiration": { "Date": "2016-01-01T00:00:00.000Z" }, "ID": "Delete 2014 logs in 2016.", "Prefix": "logs/2014/", "Status": "Enabled" } ] }

For API details, see GetBucketLifecycle in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-location.
The following command retrieves the location constraint for a bucket named amzn-s3-demo-bucket, if a constraint exists:

aws s3api get-bucket-location --bucket amzn-s3-demo-bucket

Output:

{ "LocationConstraint": "us-west-2" }

For API details, see GetBucketLocation in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-logging.
To retrieve the logging status for a bucket

The following get-bucket-logging example retrieves the logging status for the specified bucket.

aws s3api get-bucket-logging \
    --bucket amzn-s3-demo-bucket

Output:

{ "LoggingEnabled": { "TargetPrefix": "", "TargetBucket": "amzn-s3-demo-bucket-logs" } }

For API details, see GetBucketLogging in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-metrics-configuration.
To retrieve the metrics configuration for a bucket with a specific ID

The following get-bucket-metrics-configuration example displays the metrics configuration for the specified bucket and ID.

aws s3api get-bucket-metrics-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 123

Output:

{ "MetricsConfiguration": { "Filter": { "Prefix": "logs" }, "Id": "123" } }

For API details, see GetBucketMetricsConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-notification-configuration.
The following command retrieves the notification configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-notification-configuration --bucket amzn-s3-demo-bucket

Output:

{ "TopicConfigurations": [ { "Id": "YmQzMmEwM2EjZWVlI0NGItNzVtZjI1MC00ZjgyLWZDBiZWNl", "TopicArn": "arn:aws:sns:us-west-2:123456789012:my-notification-topic", "Events": [ "s3:ObjectCreated:*" ] } ] }

For API details, see GetBucketNotificationConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-notification.
The following command retrieves the notification configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-notification --bucket amzn-s3-demo-bucket

Output:

{ "TopicConfiguration": { "Topic": "arn:aws:sns:us-west-2:123456789012:my-notification-topic", "Id": "YmQzMmEwM2EjZWVlI0NGItNzVtZjI1MC00ZjgyLWZDBiZWNl", "Event": "s3:ObjectCreated:*", "Events": [ "s3:ObjectCreated:*" ] } }

For API details, see GetBucketNotification in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-ownership-controls.
To retrieve the bucket ownership settings of a bucket

The following get-bucket-ownership-controls example retrieves the bucket ownership settings of a bucket.

aws s3api get-bucket-ownership-controls \
    --bucket amzn-s3-demo-bucket

Output:

{ "OwnershipControls": { "Rules": [ { "ObjectOwnership": "BucketOwnerEnforced" } ] } }

For more information, see Viewing the Object Ownership setting for an S3 bucket in the Amazon S3 User Guide.

For API details, see GetBucketOwnershipControls in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-policy-status.
To retrieve the policy status for a bucket indicating whether the bucket is public

The following get-bucket-policy-status example retrieves the policy status for the bucket amzn-s3-demo-bucket.

aws s3api get-bucket-policy-status \
    --bucket amzn-s3-demo-bucket

Output:

{ "PolicyStatus": { "IsPublic": false } }

For API details, see GetBucketPolicyStatus in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-policy.
The following command retrieves the bucket policy for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-policy --bucket amzn-s3-demo-bucket

Output:

{ "Policy": "{\"Version\":\"2008-10-17\",\"Statement\":[{\"Sid\":\"\",\"Effect\":\"Allow\",\"Principal\":\"*\",\"Action\":\"s3:GetObject\",\"Resource\":\"arn:aws:s3:::amzn-s3-demo-bucket/*\"},{\"Sid\":\"\",\"Effect\":\"Deny\",\"Principal\":\"*\",\"Action\":\"s3:GetObject\",\"Resource\":\"arn:aws:s3:::amzn-s3-demo-bucket/secret/*\"}]}" }

Get and put a bucket policy

The following example shows how you can download an Amazon S3 bucket policy, make modifications to the file, and then use put-bucket-policy to apply the modified bucket policy. To download the bucket policy to a file, you can run:

aws s3api get-bucket-policy --bucket amzn-s3-demo-bucket --query Policy --output text > policy.json

You can then modify the policy.json file as needed. Finally, you can apply the modified policy back to the S3 bucket by running:

aws s3api put-bucket-policy --bucket amzn-s3-demo-bucket --policy file://policy.json

For API details, see GetBucketPolicy in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-replication.
The following command retrieves the replication configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-replication --bucket amzn-s3-demo-bucket

Output:

{ "ReplicationConfiguration": { "Rules": [ { "Status": "Enabled", "Prefix": "", "Destination": { "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket-backup", "StorageClass": "STANDARD" }, "ID": "ZmUwNzE4ZmQ4tMjVhOS00MTlkLOGI4NDkzZTIWJjNTUtYTA1" } ], "Role": "arn:aws:iam::123456789012:role/s3-replication-role" } }

For API details, see GetBucketReplication in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-request-payment.
To retrieve the request payment configuration for a bucket

The following get-bucket-request-payment example retrieves the requester pays configuration for the specified bucket.

aws s3api get-bucket-request-payment \
    --bucket amzn-s3-demo-bucket

Output:

{ "Payer": "BucketOwner" }

For API details, see GetBucketRequestPayment in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-tagging.
The following command retrieves the tagging configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-tagging --bucket amzn-s3-demo-bucket

Output:

{ "TagSet": [ { "Value": "marketing", "Key": "organization" } ] }

For API details, see GetBucketTagging in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-versioning.
The following command retrieves the versioning configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-versioning --bucket amzn-s3-demo-bucket

Output:

{ "Status": "Enabled" }

For API details, see GetBucketVersioning in Amazon CLI Command Reference.
The following code example shows how to use get-bucket-website.
The following command retrieves the static website configuration for a bucket named amzn-s3-demo-bucket:

aws s3api get-bucket-website --bucket amzn-s3-demo-bucket

Output:

{ "IndexDocument": { "Suffix": "index.html" }, "ErrorDocument": { "Key": "error.html" } }

For API details, see GetBucketWebsite in Amazon CLI Command Reference.
The following code example shows how to use get-object-acl.
The following command retrieves the access control list for an object in a bucket named amzn-s3-demo-bucket:

aws s3api get-object-acl --bucket amzn-s3-demo-bucket --key index.html

Output:

{ "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Grants": [ { "Grantee": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Permission": "FULL_CONTROL" }, { "Grantee": { "URI": "http://acs.amazonaws.com/groups/global/AllUsers" }, "Permission": "READ" } ] }

For API details, see GetObjectAcl in Amazon CLI Command Reference.
The following code example shows how to use get-object-attributes.
To retrieve metadata from an object without returning the object itself

The following get-object-attributes example retrieves metadata from the object doc1.rtf.

aws s3api get-object-attributes \
    --bucket amzn-s3-demo-bucket \
    --key doc1.rtf \
    --object-attributes "StorageClass" "ETag" "ObjectSize"

Output:

{ "LastModified": "2022-03-15T19:37:31+00:00", "VersionId": "IuCPjXTDzHNfldAuitVBIKJpF2p1fg4P", "ETag": "b662d79adeb7c8d787ea7eafb9ef6207", "StorageClass": "STANDARD", "ObjectSize": 405 }

For more information, see GetObjectAttributes in the Amazon S3 API Reference.

For API details, see GetObjectAttributes in Amazon CLI Command Reference.
The following code example shows how to use get-object-legal-hold.
To retrieve the Legal Hold status of an object

The following get-object-legal-hold example retrieves the Legal Hold status for the specified object.

aws s3api get-object-legal-hold \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --key doc1.rtf

Output:

{ "LegalHold": { "Status": "ON" } }

For API details, see GetObjectLegalHold in Amazon CLI Command Reference.
The following code example shows how to use get-object-lock-configuration.
To retrieve an object lock configuration for a bucket

The following get-object-lock-configuration example retrieves the object lock configuration for the specified bucket.

aws s3api get-object-lock-configuration \
    --bucket amzn-s3-demo-bucket-with-object-lock

Output:

{ "ObjectLockConfiguration": { "ObjectLockEnabled": "Enabled", "Rule": { "DefaultRetention": { "Mode": "COMPLIANCE", "Days": 50 } } } }

For API details, see GetObjectLockConfiguration in Amazon CLI Command Reference.
The following code example shows how to use get-object-retention.
To retrieve the object retention configuration for an object

The following get-object-retention example retrieves the object retention configuration for the specified object.

aws s3api get-object-retention \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --key doc1.rtf

Output:

{ "Retention": { "Mode": "GOVERNANCE", "RetainUntilDate": "2025-01-01T00:00:00.000Z" } }

For API details, see GetObjectRetention in Amazon CLI Command Reference.
The following code example shows how to use get-object-tagging.
To retrieve the tags attached to an object

The following get-object-tagging example retrieves the values for the specified key from the specified object.

aws s3api get-object-tagging \
    --bucket amzn-s3-demo-bucket \
    --key doc1.rtf

Output:

{ "TagSet": [ { "Value": "confidential", "Key": "designation" } ] }

The following get-object-tagging example tries to retrieve the tag sets of the object doc2.rtf, which has no tags.

aws s3api get-object-tagging \
    --bucket amzn-s3-demo-bucket \
    --key doc2.rtf

Output:

{ "TagSet": [] }

The following get-object-tagging example retrieves the tag sets of the object doc3.rtf, which has multiple tags.

aws s3api get-object-tagging \
    --bucket amzn-s3-demo-bucket \
    --key doc3.rtf

Output:

{ "TagSet": [ { "Value": "confidential", "Key": "designation" }, { "Value": "finance", "Key": "department" }, { "Value": "payroll", "Key": "team" } ] }

For API details, see GetObjectTagging in Amazon CLI Command Reference.
The following code example shows how to use get-object-torrent.
The following command creates a torrent for an object in a bucket named amzn-s3-demo-bucket:

aws s3api get-object-torrent --bucket amzn-s3-demo-bucket --key large-video-file.mp4 large-video-file.torrent

The torrent file is saved locally in the current folder. Note that the output filename (large-video-file.torrent) is specified without an option name and must be the last argument in the command.

For API details, see GetObjectTorrent in Amazon CLI Command Reference.
The following code example shows how to use get-object.
The following example uses the get-object command to download an object from Amazon S3:

aws s3api get-object --bucket text-content --key dir/my_images.tar.bz2 my_images.tar.bz2

Note that the outfile parameter is specified without an option name such as "--outfile". The name of the output file must be the last parameter in the command.

The example below demonstrates the use of --range to download a specific byte range from an object. Note that the byte range needs to be prefixed with "bytes=":

aws s3api get-object --bucket text-content --key dir/my_data --range bytes=8888-9999 my_data_range

For more information about retrieving objects, see Getting Objects in the Amazon S3 Developer Guide.

For API details, see GetObject in Amazon CLI Command Reference.
The following code example shows how to use get-public-access-block.
To retrieve the block public access configuration for a bucket

The following get-public-access-block example displays the block public access configuration for the specified bucket.

aws s3api get-public-access-block \
    --bucket amzn-s3-demo-bucket

Output:

{ "PublicAccessBlockConfiguration": { "IgnorePublicAcls": true, "BlockPublicPolicy": true, "BlockPublicAcls": true, "RestrictPublicBuckets": true } }

For API details, see GetPublicAccessBlock in Amazon CLI Command Reference.
The following code example shows how to use head-bucket.
The following command verifies access to a bucket named amzn-s3-demo-bucket:

aws s3api head-bucket --bucket amzn-s3-demo-bucket

If the bucket exists and you have access to it, no output is returned. Otherwise, an error message will be shown. For example:

A client error (404) occurred when calling the HeadBucket operation: Not Found
                    For API details, see HeadBucket in Amazon CLI Command Reference. 
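Because head-bucket reports its result through the exit code, a hedged scripting pattern (the bucket name is a placeholder) is to branch on it, for example creating the bucket only when it is missing or inaccessible:

bucket=amzn-s3-demo-bucket

if aws s3api head-bucket --bucket "$bucket" 2>/dev/null; then
    echo "$bucket exists and is accessible"
else
    # A non-zero exit code can mean the bucket does not exist or access is denied.
    echo "$bucket is missing or not accessible; creating it"
    aws s3 mb "s3://$bucket"
fi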
 
The following code example shows how to use head-object.
The following command retrieves metadata for an object in a bucket named amzn-s3-demo-bucket:

aws s3api head-object --bucket amzn-s3-demo-bucket --key index.html

Output:

{ "AcceptRanges": "bytes", "ContentType": "text/html", "LastModified": "Thu, 16 Apr 2015 18:19:14 GMT", "ContentLength": 77, "VersionId": "null", "ETag": "\"30a6ec7e1a9ad79c203d05a589c8b400\"", "Metadata": {} }

For API details, see HeadObject in Amazon CLI Command Reference.
The following code example shows how to use list-bucket-analytics-configurations.
To retrieve a list of analytics configurations for a bucket

The following list-bucket-analytics-configurations example retrieves a list of analytics configurations for the specified bucket.

aws s3api list-bucket-analytics-configurations \
    --bucket amzn-s3-demo-bucket

Output:

{ "AnalyticsConfigurationList": [ { "StorageClassAnalysis": {}, "Id": "1" } ], "IsTruncated": false }

For API details, see ListBucketAnalyticsConfigurations in Amazon CLI Command Reference.
The following code example shows how to use list-bucket-intelligent-tiering-configurations.
To retrieve all S3 Intelligent-Tiering configurations on a bucket

The following list-bucket-intelligent-tiering-configurations example retrieves all S3 Intelligent-Tiering configurations on a bucket.

aws s3api list-bucket-intelligent-tiering-configurations \
    --bucket amzn-s3-demo-bucket

Output:

{ "IsTruncated": false, "IntelligentTieringConfigurationList": [ { "Id": "ExampleConfig", "Filter": { "Prefix": "images" }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] }, { "Id": "ExampleConfig2", "Status": "Disabled", "Tierings": [ { "Days": 730, "AccessTier": "ARCHIVE_ACCESS" } ] }, { "Id": "ExampleConfig3", "Filter": { "Tag": { "Key": "documents", "Value": "taxes" } }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 365, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] } ] }

For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.

For API details, see ListBucketIntelligentTieringConfigurations in Amazon CLI Command Reference.
The following code example shows how to use list-bucket-inventory-configurations.
To retrieve a list of inventory configurations for a bucket

The following list-bucket-inventory-configurations example lists the inventory configurations for the specified bucket.

aws s3api list-bucket-inventory-configurations \
    --bucket amzn-s3-demo-bucket

Output:

{ "InventoryConfigurationList": [ { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "ORC", "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "1", "Schedule": { "Frequency": "Weekly" } }, { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "CSV", "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "2", "Schedule": { "Frequency": "Daily" } } ], "IsTruncated": false }

For API details, see ListBucketInventoryConfigurations in Amazon CLI Command Reference.
The following code example shows how to use list-bucket-metrics-configurations.
To retrieve a list of metrics configurations for a bucket

The following list-bucket-metrics-configurations example retrieves a list of metrics configurations for the specified bucket.

aws s3api list-bucket-metrics-configurations \
    --bucket amzn-s3-demo-bucket

Output:

{ "IsTruncated": false, "MetricsConfigurationList": [ { "Filter": { "Prefix": "logs" }, "Id": "123" }, { "Filter": { "Prefix": "tmp" }, "Id": "234" } ] }

For API details, see ListBucketMetricsConfigurations in Amazon CLI Command Reference.
The following code example shows how to use list-buckets.
The following command uses the list-buckets command to display the names of all your Amazon S3 buckets (across all regions):

aws s3api list-buckets --query "Buckets[].Name"

The query option filters the output of list-buckets down to only the bucket names.

For more information about buckets, see Working with Amazon S3 Buckets in the Amazon S3 Developer Guide.
                    For API details, see ListBuckets in Amazon CLI Command Reference. 
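If you also want to see which region each bucket lives in, one hedged approach (the output format is illustrative) is to feed the bucket names into get-bucket-location; note that get-bucket-location reports the us-east-1 region as a null constraint:

# Print "bucket-name  region" for every bucket in the account.
for bucket in $(aws s3api list-buckets --query "Buckets[].Name" --output text); do
    region=$(aws s3api get-bucket-location \
        --bucket "$bucket" \
        --query LocationConstraint --output text)
    [ "$region" = "None" ] && region=us-east-1
    echo "$bucket  $region"
done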
 
The following code example shows how to use list-multipart-uploads.
The following command lists all of the active multipart uploads for a bucket named amzn-s3-demo-bucket:

aws s3api list-multipart-uploads --bucket amzn-s3-demo-bucket

Output:

{ "Uploads": [ { "Initiator": { "DisplayName": "username", "ID": "arn:aws:iam::0123456789012:user/username" }, "Initiated": "2015-06-02T18:01:30.000Z", "UploadId": "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R", "StorageClass": "STANDARD", "Key": "multipart/01", "Owner": { "DisplayName": "aws-account-name", "ID": "100719349fc3b6dcd7c820a124bf7aecd408092c3d7b51b38494939801fc248b" } } ], "CommonPrefixes": [] }

In-progress multipart uploads incur storage costs in Amazon S3. Complete or abort an active multipart upload to remove its parts from your account.

For API details, see ListMultipartUploads in Amazon CLI Command Reference.
The following code example shows how to use list-object-versions.
The following command retrieves version information for an object in a bucket named amzn-s3-demo-bucket:

aws s3api list-object-versions --bucket amzn-s3-demo-bucket --prefix index.html

Output:

{ "DeleteMarkers": [ { "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": true, "VersionId": "B2VsEK5saUNNHKcOAJj7hIE86RozToyq", "Key": "index.html", "LastModified": "2015-11-10T00:57:03.000Z" }, { "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "VersionId": ".FLQEZscLIcfxSq.jsFJ.szUkmng2Yw6", "Key": "index.html", "LastModified": "2015-11-09T23:32:20.000Z" } ], "Versions": [ { "LastModified": "2015-11-10T00:20:11.000Z", "VersionId": "Rb_l2T8UHDkFEwCgJjhlgPOZC0qJ.vpD", "ETag": "\"0622528de826c0df5db1258a23b80be5\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 38 }, { "LastModified": "2015-11-09T23:26:41.000Z", "VersionId": "rasWWGpgk9E4s0LyTJgusGeRQKLVIAFf", "ETag": "\"06225825b8028de826c0df5db1a23be5\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 38 }, { "LastModified": "2015-11-09T22:50:50.000Z", "VersionId": "null", "ETag": "\"d1f45267a863c8392e07d24dd592f1b9\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 533823 } ] }
                    For API details, see ListObjectVersions in Amazon CLI Command Reference. 
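In a versioned bucket, a plain delete only adds a delete marker; permanently removing a key means deleting each version ID individually. A hedged, irreversible sketch (the bucket and key are placeholders, and the key is assumed to be the only one matching the prefix):

bucket=amzn-s3-demo-bucket
key=index.html

# Delete every stored version of the key. This cannot be undone.
for version_id in $(aws s3api list-object-versions \
        --bucket "$bucket" --prefix "$key" \
        --query "Versions[?Key=='${key}'].VersionId" --output text); do
    [ "$version_id" = "None" ] && continue   # no stored versions
    aws s3api delete-object --bucket "$bucket" --key "$key" --version-id "$version_id"
done
# Delete markers can be removed the same way by querying DeleteMarkers instead of Versions.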
 
The following code example shows how to use list-objects-v2.
To get a list of objects in a bucket

The following list-objects-v2 example lists the objects in the specified bucket.

aws s3api list-objects-v2 \
    --bucket amzn-s3-demo-bucket

Output:

{ "Contents": [ { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"621503c373607d548b37cff8778d992c\"", "StorageClass": "STANDARD", "Key": "doc1.rtf", "Size": 391 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"a2cecc36ab7c7fe3a71a273b9d45b1b5\"", "StorageClass": "STANDARD", "Key": "doc2.rtf", "Size": 373 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"08210852f65a2e9cb999972539a64d68\"", "StorageClass": "STANDARD", "Key": "doc3.rtf", "Size": 399 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"d1852dd683f404306569471af106988e\"", "StorageClass": "STANDARD", "Key": "doc4.rtf", "Size": 6225 } ] }

For API details, see ListObjectsV2 in Amazon CLI Command Reference.
The following code example shows how to use list-objects.
The following example uses the list-objects command to display the names of all the objects in the specified bucket:

aws s3api list-objects --bucket text-content --query 'Contents[].{Key: Key, Size: Size}'

The example uses the --query argument to filter the output of list-objects down to the key value and size for each object.

For more information about objects, see Working with Amazon S3 Objects in the Amazon S3 Developer Guide.

For API details, see ListObjects in Amazon CLI Command Reference.
The following code example shows how to use list-parts.
The following command lists all of the parts that have been uploaded for a multipart upload with key multipart/01 in the bucket amzn-s3-demo-bucket:

aws s3api list-parts --bucket amzn-s3-demo-bucket --key 'multipart/01' --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R

Output:

{ "Owner": { "DisplayName": "aws-account-name", "ID": "100719349fc3b6dcd7c820a124bf7aecd408092c3d7b51b38494939801fc248b" }, "Initiator": { "DisplayName": "username", "ID": "arn:aws:iam::0123456789012:user/username" }, "Parts": [ { "LastModified": "2015-06-02T18:07:35.000Z", "PartNumber": 1, "ETag": "\"e868e0f4719e394144ef36531ee6824c\"", "Size": 5242880 }, { "LastModified": "2015-06-02T18:07:42.000Z", "PartNumber": 2, "ETag": "\"6bb2b12753d66fe86da4998aa33fffb0\"", "Size": 5242880 }, { "LastModified": "2015-06-02T18:07:47.000Z", "PartNumber": 3, "ETag": "\"d0a0112e841abec9c9ec83406f0159c8\"", "Size": 5242880 } ], "StorageClass": "STANDARD" }

For API details, see ListParts in Amazon CLI Command Reference.
The following code example shows how to use ls.
- Amazon CLI
- 
             
                    Example 1: Listing all user owned buckets The following lscommand lists all of the bucket owned by the user. In this example, the user owns the bucketsamzn-s3-demo-bucketandamzn-s3-demo-bucket2. The timestamp is the date the bucket was created, shown in your machine's time zone. This date can change when making changes to your bucket, such as editing its bucket policy. Note ifs3://is used for the path argument<S3Uri>, it will list all of the buckets as well.aws s3 lsOutput: 2013-07-11 17:08:50 amzn-s3-demo-bucket 2013-07-24 14:55:44 amzn-s3-demo-bucket2Example 2: Listing all prefixes and objects in a bucket The following lscommand lists objects and common prefixes under a specified bucket and prefix. In this example, the user owns the bucketamzn-s3-demo-bucketwith the objectstest.txtandsomePrefix/test.txt. TheLastWriteTimeandLengthare arbitrary. Note that since thelscommand has no interaction with the local filesystem, thes3://URI scheme is not required to resolve ambiguity and may be omitted.aws s3 lss3://amzn-s3-demo-bucketOutput: PRE somePrefix/ 2013-07-25 17:06:27 88 test.txtExample 3: Listing all prefixes and objects in a specific bucket and prefix The following lscommand lists objects and common prefixes under a specified bucket and prefix. However, there are no objects nor common prefixes under the specified bucket and prefix.aws s3 lss3://amzn-s3-demo-bucket/noExistPrefixOutput: NoneExample 4: Recursively listing all prefixes and objects in a bucket The following lscommand will recursively list objects in a bucket. Rather than showingPRE dirname/in the output, all the content in a bucket will be listed in order.aws s3 lss3://amzn-s3-demo-bucket\ --recursiveOutput: 2013-09-02 21:37:53 10 a.txt 2013-09-02 21:37:53 2863288 foo.zip 2013-09-02 21:32:57 23 foo/bar/.baz/a 2013-09-02 21:32:58 41 foo/bar/.baz/b 2013-09-02 21:32:57 281 foo/bar/.baz/c 2013-09-02 21:32:57 73 foo/bar/.baz/d 2013-09-02 21:32:57 452 foo/bar/.baz/e 2013-09-02 21:32:57 896 foo/bar/.baz/hooks/bar 2013-09-02 21:32:57 189 foo/bar/.baz/hooks/foo 2013-09-02 21:32:57 398 z.txtExample 5: Summarizing all prefixes and objects in a bucket The following lscommand demonstrates the same command using the --human-readable and --summarize options. --human-readable displays file size in Bytes/MiB/KiB/GiB/TiB/PiB/EiB. --summarize displays the total number of objects and total size at the end of the result listing:aws s3 lss3://amzn-s3-demo-bucket\ --recursive \ --human-readable \ --summarizeOutput: 2013-09-02 21:37:53 10 Bytes a.txt 2013-09-02 21:37:53 2.9 MiB foo.zip 2013-09-02 21:32:57 23 Bytes foo/bar/.baz/a 2013-09-02 21:32:58 41 Bytes foo/bar/.baz/b 2013-09-02 21:32:57 281 Bytes foo/bar/.baz/c 2013-09-02 21:32:57 73 Bytes foo/bar/.baz/d 2013-09-02 21:32:57 452 Bytes foo/bar/.baz/e 2013-09-02 21:32:57 896 Bytes foo/bar/.baz/hooks/bar 2013-09-02 21:32:57 189 Bytes foo/bar/.baz/hooks/foo 2013-09-02 21:32:57 398 Bytes z.txt Total Objects: 10 Total Size: 2.9 MiBExample 6: Listing from an S3 access point The following lscommand list objects from access point (myaccesspoint):aws s3 lss3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/Output: PRE somePrefix/ 2013-07-25 17:06:27 88 test.txt- 
                    For API details, see Ls in Amazon CLI Command Reference. 
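As a small, hedged addition that is not part of the original examples: because the default aws s3 ls listing prints the creation date, time, and bucket name as whitespace-separated columns, the bucket names alone can be extracted with standard shell tools. The use of awk below is an assumption about your environment, not part of the CLI itself.

# Print only the bucket names (third column of the default `aws s3 ls` output).
aws s3 ls | awk '{print $3}'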
 
- 
                    
The following code example shows how to use mb.
- Amazon CLI
- 
             
Example 1: Create a bucket

The following mb command creates a bucket. In this example, the user makes the bucket amzn-s3-demo-bucket. The bucket is created in the region specified in the user's configuration file:

aws s3 mb s3://amzn-s3-demo-bucket

Output:

make_bucket: s3://amzn-s3-demo-bucket

Example 2: Create a bucket in the specified region

The following mb command creates a bucket in a region specified by the --region parameter. In this example, the user makes the bucket amzn-s3-demo-bucket in the region us-west-1:

aws s3 mb s3://amzn-s3-demo-bucket \
    --region us-west-1

Output:

make_bucket: s3://amzn-s3-demo-bucket
- 
                    For API details, see Mb in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use mv.
- Amazon CLI
- 
             
Example 1: Move a local file to the specified bucket

The following mv command moves a single file to a specified bucket and key.

aws s3 mv test.txt s3://amzn-s3-demo-bucket/test2.txt

Output:

move: test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 2: Move an object to the specified bucket and key

The following mv command moves a single S3 object to a specified bucket and key.

aws s3 mv s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket/test2.txt

Output:

move: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 3: Move an S3 object to the local directory

The following mv command moves a single object to a specified file locally.

aws s3 mv s3://amzn-s3-demo-bucket/test.txt test2.txt

Output:

move: s3://amzn-s3-demo-bucket/test.txt to test2.txt

Example 4: Move an object with its original name to the specified bucket

The following mv command moves a single object to a specified bucket while retaining its original name:

aws s3 mv s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket2/

Output:

move: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket2/test.txt

Example 5: Move all objects and prefixes in a bucket to the local directory

When passed with the parameter --recursive, the following mv command recursively moves all objects under a specified prefix and bucket to a specified directory. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and test2.txt.

aws s3 mv s3://amzn-s3-demo-bucket . \
    --recursive

Output:

move: s3://amzn-s3-demo-bucket/test1.txt to test1.txt
move: s3://amzn-s3-demo-bucket/test2.txt to test2.txt

Example 6: Move all files in a local directory to the specified bucket, except ``.jpg`` files

When passed with the parameter --recursive, the following mv command recursively moves all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg.

aws s3 mv myDir s3://amzn-s3-demo-bucket/ \
    --recursive \
    --exclude "*.jpg"

Output:

move: myDir/test1.txt to s3://amzn-s3-demo-bucket/test1.txt

Example 7: Move all objects and prefixes in a bucket to another bucket, except objects under the specified prefix

When passed with the parameter --recursive, the following mv command recursively moves all objects under a specified bucket to another bucket while excluding some objects by using an --exclude parameter. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and another/test1.txt.

aws s3 mv s3://amzn-s3-demo-bucket/ s3://amzn-s3-demo-bucket2/ \
    --recursive \
    --exclude "amzn-s3-demo-bucket/another/*"

Output:

move: s3://amzn-s3-demo-bucket/test1.txt to s3://amzn-s3-demo-bucket2/test1.txt

Example 8: Move an object to the specified bucket and set the ACL

The following mv command moves a single object to a specified bucket and key while setting the ACL to public-read-write.

aws s3 mv s3://amzn-s3-demo-bucket/test.txt s3://amzn-s3-demo-bucket/test2.txt \
    --acl public-read-write

Output:

move: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 9: Move a local file to the specified bucket and grant permissions

The following mv command illustrates the use of the --grants option to grant read access to all users and full control to a specific user identified by their email address.

aws s3 mv file.txt s3://amzn-s3-demo-bucket/ \
    --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=emailaddress=user@example.com

Output:

move: file.txt to s3://amzn-s3-demo-bucket/file.txt

Example 10: Move a file to an S3 access point

The following mv command moves a single file named mydoc.txt to the access point named myaccesspoint at the key named mykey.

aws s3 mv mydoc.txt s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey

Output:

move: mydoc.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
- 
                    For API details, see Mv in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use presign.
- Amazon CLI
- 
             
Example 1: To create a pre-signed URL with the default one hour lifetime that links to an object in an S3 bucket

The following presign command generates a pre-signed URL for a specified bucket and key that is valid for one hour.

aws s3 presign s3://amzn-s3-demo-bucket/test2.txt

Output:

https://amzn-s3-demo-bucket.s3.us-west-2.amazonaws.com/key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAEXAMPLE123456789%2F20210621%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210621T041609Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=EXAMBLE1234494d5fba3fed607f98018e1dfc62e2529ae96d844123456

Example 2: To create a pre-signed URL with a custom lifetime that links to an object in an S3 bucket

The following presign command generates a pre-signed URL for a specified bucket and key that is valid for one week.

aws s3 presign s3://amzn-s3-demo-bucket/test2.txt \
    --expires-in 604800

Output:

https://amzn-s3-demo-bucket.s3.us-west-2.amazonaws.com/key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAEXAMPLE123456789%2F20210621%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210621T041609Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=EXAMBLE1234494d5fba3fed607f98018e1dfc62e2529ae96d844123456

For more information, see Share an Object with Others in the S3 Developer Guide. A sketch showing how a client might consume the URL follows the API reference link below.
- 
                    For API details, see Presign in Amazon CLI Command Reference. 
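The following is a minimal sketch, not part of the original examples, showing how a generated URL can be consumed by any HTTP client. It assumes curl is available and uses a five-minute expiry purely for illustration.

# Generate a URL valid for 5 minutes and download the object with it.
URL=$(aws s3 presign s3://amzn-s3-demo-bucket/test2.txt --expires-in 300)
curl -o test2.txt "$URL"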
 
- 
                    
The following code example shows how to use put-bucket-accelerate-configuration.
- Amazon CLI
- 
             
To set the accelerate configuration of a bucket

The following put-bucket-accelerate-configuration example enables the accelerate configuration for the specified bucket.

aws s3api put-bucket-accelerate-configuration \
    --bucket amzn-s3-demo-bucket \
    --accelerate-configuration Status=Enabled

This command produces no output.
- 
                    For API details, see PutBucketAccelerateConfiguration in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-acl.
- Amazon CLI
- 
             
This example grants full control to two Amazon users (user1@example.com and user2@example.com) and read permission to everyone:

aws s3api put-bucket-acl --bucket amzn-s3-demo-bucket --grant-full-control emailaddress=user1@example.com,emailaddress=user2@example.com --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers

See http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTacl.html for details on custom ACLs (the s3api ACL commands, such as put-bucket-acl, use the same shorthand argument notation).
- 
                    For API details, see PutBucketAcl in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-analytics-configuration.
- Amazon CLI
- 
             
To set an analytics configuration for the bucket

The following put-bucket-analytics-configuration example configures analytics for the specified bucket.

aws s3api put-bucket-analytics-configuration \
    --bucket amzn-s3-demo-bucket --id 1 \
    --analytics-configuration '{"Id": "1","StorageClassAnalysis": {}}'

This command produces no output.
- 
                    For API details, see PutBucketAnalyticsConfiguration in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-cors.
- Amazon CLI
- 
             
The following example enables PUT, POST, and DELETE requests from www.example.com, and enables GET requests from any domain:

aws s3api put-bucket-cors --bucket amzn-s3-demo-bucket --cors-configuration file://cors.json

cors.json:

{ "CORSRules": [ { "AllowedOrigins": ["http://www.example.com"], "AllowedHeaders": ["*"], "AllowedMethods": ["PUT", "POST", "DELETE"], "MaxAgeSeconds": 3000, "ExposeHeaders": ["x-amz-server-side-encryption"] }, { "AllowedOrigins": ["*"], "AllowedHeaders": ["Authorization"], "AllowedMethods": ["GET"], "MaxAgeSeconds": 3000 } ] }
- 
                    For API details, see PutBucketCors in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-encryption.
- Amazon CLI
- 
             
To configure server-side encryption for a bucket

The following put-bucket-encryption example sets AES256 encryption as the default for the specified bucket.

aws s3api put-bucket-encryption \
    --bucket amzn-s3-demo-bucket \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'

This command produces no output. A way to confirm the setting is sketched after the API reference link below.
- 
                    For API details, see PutBucketEncryption in Amazon CLI Command Reference. 
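As a hedged addition that is not part of the original example, the applied default can be read back with the corresponding get command; the expected output noted in the comment is indicative only.

# Read back the default encryption configuration that was just applied.
# Expect a ServerSideEncryptionConfiguration document listing the AES256 rule.
aws s3api get-bucket-encryption \
    --bucket amzn-s3-demo-bucket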
 
- 
                    
The following code example shows how to use put-bucket-intelligent-tiering-configuration.
- Amazon CLI
- 
             
To update an S3 Intelligent-Tiering configuration on a bucket

The following put-bucket-intelligent-tiering-configuration example updates an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket. The configuration transitions objects under the prefix images that have not been accessed for 90 days to Archive Access, and after 180 days to Deep Archive Access.

aws s3api put-bucket-intelligent-tiering-configuration \
    --bucket amzn-s3-demo-bucket \
    --id "ExampleConfig" \
    --intelligent-tiering-configuration file://intelligent-tiering-configuration.json

Contents of intelligent-tiering-configuration.json:

{ "Id": "ExampleConfig", "Status": "Enabled", "Filter": { "Prefix": "images" }, "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] }

This command produces no output.

For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.
- 
                    For API details, see PutBucketIntelligentTieringConfiguration in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-inventory-configuration.
- Amazon CLI
- 
             
Example 1: To set an inventory configuration for a bucket

The following put-bucket-inventory-configuration example sets a weekly ORC-formatted inventory report for the bucket amzn-s3-demo-bucket.

aws s3api put-bucket-inventory-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 1 \
    --inventory-configuration '{"Destination": { "S3BucketDestination": { "AccountId": "123456789012", "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket", "Format": "ORC" }}, "IsEnabled": true, "Id": "1", "IncludedObjectVersions": "Current", "Schedule": { "Frequency": "Weekly" }}'

This command produces no output.

Example 2: To set an inventory configuration for a bucket

The following put-bucket-inventory-configuration example sets a daily CSV-formatted inventory report for the bucket amzn-s3-demo-bucket.

aws s3api put-bucket-inventory-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 2 \
    --inventory-configuration '{"Destination": { "S3BucketDestination": { "AccountId": "123456789012", "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket", "Format": "CSV" }}, "IsEnabled": true, "Id": "2", "IncludedObjectVersions": "Current", "Schedule": { "Frequency": "Daily" }}'

This command produces no output.
- 
                    For API details, see PutBucketInventoryConfiguration in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-lifecycle-configuration.
- Amazon CLI
- 
             
The following command applies a lifecycle configuration to a bucket named amzn-s3-demo-bucket:

aws s3api put-bucket-lifecycle-configuration --bucket amzn-s3-demo-bucket --lifecycle-configuration file://lifecycle.json

The file lifecycle.json is a JSON document in the current folder that specifies two rules:

{ "Rules": [ { "ID": "Move rotated logs to Glacier", "Prefix": "rotated/", "Status": "Enabled", "Transitions": [ { "Date": "2015-11-10T00:00:00.000Z", "StorageClass": "GLACIER" } ] }, { "Status": "Enabled", "Prefix": "", "NoncurrentVersionTransitions": [ { "NoncurrentDays": 2, "StorageClass": "GLACIER" } ], "ID": "Move old versions to Glacier" } ] }

The first rule moves files with the prefix rotated to Glacier on the specified date. The second rule moves old object versions to Glacier when they are no longer current. For information on acceptable timestamp formats, see Specifying Parameter Values in the Amazon CLI User Guide. A way to confirm the applied configuration is sketched after the API reference link below.
- 
                    For API details, see PutBucketLifecycleConfiguration in Amazon CLI Command Reference. 
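As a hedged addition that is not present in the original example, the stored rules can be read back afterwards; the command below only retrieves what was applied.

# Read back the lifecycle rules that were just applied.
aws s3api get-bucket-lifecycle-configuration \
    --bucket amzn-s3-demo-bucket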
 
- 
                    
The following code example shows how to use put-bucket-lifecycle.
- Amazon CLI
- 
             
The following command applies a lifecycle configuration to the bucket amzn-s3-demo-bucket:

aws s3api put-bucket-lifecycle --bucket amzn-s3-demo-bucket --lifecycle-configuration file://lifecycle.json

The file lifecycle.json is a JSON document in the current folder that specifies two rules:

{ "Rules": [ { "ID": "Move to Glacier after sixty days (objects in logs/2015/)", "Prefix": "logs/2015/", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } }, { "Expiration": { "Date": "2016-01-01T00:00:00.000Z" }, "ID": "Delete 2014 logs in 2016.", "Prefix": "logs/2014/", "Status": "Enabled" } ] }

The first rule moves files to Amazon Glacier after sixty days. The second rule deletes files from Amazon S3 on the specified date. For information on acceptable timestamp formats, see Specifying Parameter Values in the Amazon CLI User Guide.

Each rule in the above example specifies a policy (Transition or Expiration) and file prefix (folder name) to which it applies. You can also create a rule that applies to an entire bucket by specifying a blank prefix:

{ "Rules": [ { "ID": "Move to Glacier after sixty days (all objects in bucket)", "Prefix": "", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } } ] }
- 
                    For API details, see PutBucketLifecycle in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-logging.
- Amazon CLI
- 
             
Example 1: To set bucket policy logging

The following put-bucket-logging example sets the logging policy for amzn-s3-demo-bucket. First, grant the logging service principal permission in your bucket policy using the put-bucket-policy command.

aws s3api put-bucket-policy \
    --bucket amzn-s3-demo-bucket \
    --policy file://policy.json

Contents of policy.json:

{ "Version": "2012-10-17", "Statement": [ { "Sid": "S3ServerAccessLogsPolicy", "Effect": "Allow", "Principal": {"Service": "logging.s3.amazonaws.com"}, "Action": "s3:PutObject", "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/Logs/*", "Condition": { "ArnLike": {"aws:SourceARN": "arn:aws:s3:::SOURCE-BUCKET-NAME"}, "StringEquals": {"aws:SourceAccount": "SOURCE-AWS-ACCOUNT-ID"} } } ] }

To apply the logging policy, use put-bucket-logging.

aws s3api put-bucket-logging \
    --bucket amzn-s3-demo-bucket \
    --bucket-logging-status file://logging.json

Contents of logging.json:

{ "LoggingEnabled": { "TargetBucket": "amzn-s3-demo-bucket", "TargetPrefix": "Logs/" } }

The put-bucket-policy command is required to grant s3:PutObject permissions to the logging service principal.

For more information, see Amazon S3 Server Access Logging in the Amazon S3 User Guide.

Example 2: To set a bucket policy for logging access to only a single user

The following put-bucket-logging example sets the logging policy for amzn-s3-demo-bucket. The Amazon user bob@example.com will have full control over the log files, and no one else has any access. First, grant S3 permission with put-bucket-acl.

aws s3api put-bucket-acl \
    --bucket amzn-s3-demo-bucket \
    --grant-write URI=http://acs.amazonaws.com/groups/s3/LogDelivery \
    --grant-read-acp URI=http://acs.amazonaws.com/groups/s3/LogDelivery

Then apply the logging policy using put-bucket-logging.

aws s3api put-bucket-logging \
    --bucket amzn-s3-demo-bucket \
    --bucket-logging-status file://logging.json

Contents of logging.json:

{ "LoggingEnabled": { "TargetBucket": "amzn-s3-demo-bucket", "TargetPrefix": "amzn-s3-demo-bucket-logs/", "TargetGrants": [ { "Grantee": { "Type": "AmazonCustomerByEmail", "EmailAddress": "bob@example.com" }, "Permission": "FULL_CONTROL" } ] } }

The put-bucket-acl command is required to grant S3's log delivery system the necessary permissions (write and read-acp permissions).

For more information, see Amazon S3 Server Access Logging in the Amazon S3 Developer Guide.
- 
                    For API details, see PutBucketLogging in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-metrics-configuration.
- Amazon CLI
- 
             
To set a metrics configuration for a bucket

The following put-bucket-metrics-configuration example sets a metrics configuration with ID 123 for the specified bucket.

aws s3api put-bucket-metrics-configuration \
    --bucket amzn-s3-demo-bucket \
    --id 123 \
    --metrics-configuration '{"Id": "123", "Filter": {"Prefix": "logs"}}'

This command produces no output.
- 
                    For API details, see PutBucketMetricsConfiguration in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-notification-configuration.
- Amazon CLI
- 
             
To enable the specified notifications to a bucket

The following put-bucket-notification-configuration example applies a notification configuration to a bucket named amzn-s3-demo-bucket. The file notification.json is a JSON document in the current folder that specifies an SNS topic and an event type to monitor.

aws s3api put-bucket-notification-configuration \
    --bucket amzn-s3-demo-bucket \
    --notification-configuration file://notification.json

Contents of notification.json:

{ "TopicConfigurations": [ { "TopicArn": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Events": [ "s3:ObjectCreated:*" ] } ] }

The SNS topic must have an IAM policy attached to it that allows Amazon S3 to publish to it.

{ "Version": "2012-10-17", "Id": "example-ID", "Statement": [ { "Sid": "example-statement-ID", "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": [ "SNS:Publish" ], "Resource": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Condition": { "ArnLike": { "aws:SourceArn": "arn:aws:s3:*:*:amzn-s3-demo-bucket" } } } ] }

A way to confirm the applied configuration is sketched after the API reference link below.
- 
                    For API details, see PutBucketNotificationConfiguration in Amazon CLI Command Reference. 
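As a hedged addition that is not part of the original example, the stored configuration can be read back to confirm it was applied.

# Read back the notification configuration that was just applied.
aws s3api get-bucket-notification-configuration \
    --bucket amzn-s3-demo-bucket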
 
- 
                    
The following code example shows how to use put-bucket-notification.
- Amazon CLI
- 
             
The following command applies a notification configuration to a bucket named amzn-s3-demo-bucket:

aws s3api put-bucket-notification --bucket amzn-s3-demo-bucket --notification-configuration file://notification.json

The file notification.json is a JSON document in the current folder that specifies an SNS topic and an event type to monitor:

{ "TopicConfiguration": { "Event": "s3:ObjectCreated:*", "Topic": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic" } }

The SNS topic must have an IAM policy attached to it that allows Amazon S3 to publish to it:

{ "Version": "2012-10-17", "Id": "example-ID", "Statement": [ { "Sid": "example-statement-ID", "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": [ "SNS:Publish" ], "Resource": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Condition": { "ArnLike": { "aws:SourceArn": "arn:aws:s3:*:*:amzn-s3-demo-bucket" } } } ] }
- 
                    For API details, see PutBucketNotification in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-ownership-controls.
- Amazon CLI
- 
             
To update the bucket ownership settings of a bucket

The following put-bucket-ownership-controls example updates the bucket ownership settings of a bucket.

aws s3api put-bucket-ownership-controls \
    --bucket amzn-s3-demo-bucket \
    --ownership-controls="Rules=[{ObjectOwnership=BucketOwnerEnforced}]"

This command produces no output.

For more information, see Setting Object Ownership on an existing bucket in the Amazon S3 User Guide.
- 
                    For API details, see PutBucketOwnershipControls in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-policy.
- Amazon CLI
- 
             
This example allows all users to retrieve any object in amzn-s3-demo-bucket except those in the MySecretFolder. It also grants put and delete permission to the root user of the Amazon account 123456789012:

aws s3api put-bucket-policy --bucket amzn-s3-demo-bucket --policy file://policy.json

policy.json:

{ "Statement": [ { "Effect": "Allow", "Principal": "*", "Action": "s3:GetObject", "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/*" }, { "Effect": "Deny", "Principal": "*", "Action": "s3:GetObject", "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/MySecretFolder/*" }, { "Effect": "Allow", "Principal": {"AWS": "arn:aws:iam::123456789012:root"}, "Action": ["s3:DeleteObject", "s3:PutObject"], "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/*" } ] }
- 
                    For API details, see PutBucketPolicy in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-replication.
- Amazon CLI
- 
             
To configure replication for an S3 bucket

The following put-bucket-replication example applies a replication configuration to the specified S3 bucket.

aws s3api put-bucket-replication \
    --bucket amzn-s3-demo-bucket1 \
    --replication-configuration file://replication.json

Contents of replication.json:

{ "Role": "arn:aws:iam::123456789012:role/s3-replication-role", "Rules": [ { "Status": "Enabled", "Priority": 1, "DeleteMarkerReplication": { "Status": "Disabled" }, "Filter" : { "Prefix": ""}, "Destination": { "Bucket": "arn:aws:s3:::amzn-s3-demo-bucket2" } } ] }

The destination bucket must have versioning enabled. The specified role must have permission to write to the destination bucket and have a trust relationship that allows Amazon S3 to assume the role.

Example role permission policy:

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetReplicationConfiguration", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::amzn-s3-demo-bucket1" ] }, { "Effect": "Allow", "Action": [ "s3:GetObjectVersion", "s3:GetObjectVersionAcl", "s3:GetObjectVersionTagging" ], "Resource": [ "arn:aws:s3:::amzn-s3-demo-bucket1/*" ] }, { "Effect": "Allow", "Action": [ "s3:ReplicateObject", "s3:ReplicateDelete", "s3:ReplicateTags" ], "Resource": "arn:aws:s3:::amzn-s3-demo-bucket2/*" } ] }

Example trust relationship policy:

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": "sts:AssumeRole" } ] }

This command produces no output.

For more information, see Replication in the Amazon S3 User Guide.
- 
                    For API details, see PutBucketReplication in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-request-payment.
- Amazon CLI
- 
             
Example 1: To enable ``requester pays`` configuration for a bucket

The following put-bucket-request-payment example enables requester pays for the specified bucket.

aws s3api put-bucket-request-payment \
    --bucket amzn-s3-demo-bucket \
    --request-payment-configuration '{"Payer":"Requester"}'

This command produces no output.

Example 2: To disable ``requester pays`` configuration for a bucket

The following put-bucket-request-payment example disables requester pays for the specified bucket.

aws s3api put-bucket-request-payment \
    --bucket amzn-s3-demo-bucket \
    --request-payment-configuration '{"Payer":"BucketOwner"}'

This command produces no output.
- 
                    For API details, see PutBucketRequestPayment in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-tagging.
- Amazon CLI
- 
             
The following command applies a tagging configuration to a bucket named amzn-s3-demo-bucket:

aws s3api put-bucket-tagging --bucket amzn-s3-demo-bucket --tagging file://tagging.json

The file tagging.json is a JSON document in the current folder that specifies tags:

{ "TagSet": [ { "Key": "organization", "Value": "marketing" } ] }

Or apply a tagging configuration to amzn-s3-demo-bucket directly from the command line:

aws s3api put-bucket-tagging --bucket amzn-s3-demo-bucket --tagging 'TagSet=[{Key=organization,Value=marketing}]'
- 
                    For API details, see PutBucketTagging in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-bucket-versioning.
- Amazon CLI
- 
             
The following command enables versioning on a bucket named amzn-s3-demo-bucket:

aws s3api put-bucket-versioning --bucket amzn-s3-demo-bucket --versioning-configuration Status=Enabled

The following command enables versioning and uses an MFA code:

aws s3api put-bucket-versioning --bucket amzn-s3-demo-bucket --versioning-configuration Status=Enabled --mfa "SERIAL 123456"

A way to confirm the versioning state is sketched after the API reference link below.
- 
                    For API details, see PutBucketVersioning in Amazon CLI Command Reference. 
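As a hedged addition that is not part of the original example, the bucket's versioning state can be read back; the expected output noted in the comment is indicative only.

# Read back the bucket's versioning state; expect {"Status": "Enabled"}.
aws s3api get-bucket-versioning \
    --bucket amzn-s3-demo-bucket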
 
- 
                    
The following code example shows how to use put-bucket-website.
- Amazon CLI
- 
             
The following command applies a static website configuration to a bucket named amzn-s3-demo-bucket:

aws s3api put-bucket-website --bucket amzn-s3-demo-bucket --website-configuration file://website.json

The file website.json is a JSON document in the current folder that specifies index and error pages for the website:

{ "IndexDocument": { "Suffix": "index.html" }, "ErrorDocument": { "Key": "error.html" } }
- 
                    For API details, see PutBucketWebsite in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-object-acl.
- Amazon CLI
- 
             
The following command grants full control to two Amazon users (user1@example.com and user2@example.com) and read permission to everyone:

aws s3api put-object-acl --bucket amzn-s3-demo-bucket --key file.txt --grant-full-control emailaddress=user1@example.com,emailaddress=user2@example.com --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers

See http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTacl.html for details on custom ACLs (the s3api ACL commands, such as put-object-acl, use the same shorthand argument notation).
- 
                    For API details, see PutObjectAcl in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-object-legal-hold.
- Amazon CLI
- 
             
To apply a Legal Hold to an object

The following put-object-legal-hold example sets a Legal Hold on the object doc1.rtf.

aws s3api put-object-legal-hold \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --key doc1.rtf \
    --legal-hold Status=ON

This command produces no output.
- 
                    For API details, see PutObjectLegalHold in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-object-lock-configuration.
- Amazon CLI
- 
             
To set an object lock configuration on a bucket

The following put-object-lock-configuration example sets a 50-day object lock on the specified bucket.

aws s3api put-object-lock-configuration \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --object-lock-configuration '{ "ObjectLockEnabled": "Enabled", "Rule": { "DefaultRetention": { "Mode": "COMPLIANCE", "Days": 50 }}}'

This command produces no output. A sketch of creating a bucket with Object Lock enabled follows the API reference link below.
- 
                    For API details, see PutObjectLockConfiguration in Amazon CLI Command Reference. 
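The following is a minimal, hedged sketch that is not part of the original example: it shows one way to create a bucket with S3 Object Lock enabled at creation time, which is the common prerequisite for setting a default retention rule. The region and bucket name are illustrative assumptions.

# Create a bucket with Object Lock enabled (region and name are illustrative).
aws s3api create-bucket \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --region us-west-2 \
    --create-bucket-configuration LocationConstraint=us-west-2 \
    --object-lock-enabled-for-bucket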
 
- 
                    
The following code example shows how to use put-object-retention.
- Amazon CLI
- 
             
To set an object retention configuration for an object

The following put-object-retention example sets an object retention configuration for the specified object until 2025-01-01.

aws s3api put-object-retention \
    --bucket amzn-s3-demo-bucket-with-object-lock \
    --key doc1.rtf \
    --retention '{ "Mode": "GOVERNANCE", "RetainUntilDate": "2025-01-01T00:00:00" }'

This command produces no output.
- 
                    For API details, see PutObjectRetention in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-object-tagging.
- Amazon CLI
- 
             
To set a tag on an object

The following put-object-tagging example sets a tag with the key designation and the value confidential on the specified object.

aws s3api put-object-tagging \
    --bucket amzn-s3-demo-bucket \
    --key doc1.rtf \
    --tagging '{"TagSet": [{ "Key": "designation", "Value": "confidential" }]}'

This command produces no output.

The following put-object-tagging example sets multiple tags on the specified object.

aws s3api put-object-tagging \
    --bucket amzn-s3-demo-bucket-example \
    --key doc3.rtf \
    --tagging '{"TagSet": [{ "Key": "designation", "Value": "confidential" }, { "Key": "department", "Value": "finance" }, { "Key": "team", "Value": "payroll" } ]}'

This command produces no output.
- 
                    For API details, see PutObjectTagging in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use put-object.
- Amazon CLI
- 
             
Example 1: Upload an object to Amazon S3

The following put-object command example uploads an object to Amazon S3.

aws s3api put-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/MySampleImage.png \
    --body MySampleImage.png

For more information about uploading objects, see Uploading Objects (http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html) in the Amazon S3 Developer Guide.

Example 2: Upload a video file to Amazon S3

The following put-object command example uploads a video file.

aws s3api put-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/big-video-file.mp4 \
    --body /media/videos/f-sharp-3-data-services.mp4

For more information about uploading objects, see Uploading Objects (http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html) in the Amazon S3 Developer Guide. A variation that also sets the content type and verifies the upload is sketched after the API reference link below.
- 
                    For API details, see PutObject in Amazon CLI Command Reference. 
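The following is a hedged sketch, not part of the original examples, showing a common variation: setting an explicit Content-Type on upload and then confirming the stored metadata with head-object. The MIME type and file names are illustrative assumptions.

# Upload with an explicit content type (file name and type are illustrative).
aws s3api put-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/MySampleImage.png \
    --body MySampleImage.png \
    --content-type image/png

# Confirm the stored metadata, including ContentType and ETag.
aws s3api head-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/MySampleImage.png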
 
- 
                    
The following code example shows how to use put-public-access-block.
- Amazon CLI
- 
             
To set the block public access configuration for a bucket

The following put-public-access-block example sets a restrictive block public access configuration for the specified bucket.

aws s3api put-public-access-block \
    --bucket amzn-s3-demo-bucket \
    --public-access-block-configuration "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

This command produces no output.
- 
                    For API details, see PutPublicAccessBlock in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use rb.
- Amazon CLI
- 
             
Example 1: Delete a bucket

The following rb command removes a bucket. In this example, the user's bucket is amzn-s3-demo-bucket. Note that the bucket must be empty in order to be removed:

aws s3 rb s3://amzn-s3-demo-bucket

Output:

remove_bucket: amzn-s3-demo-bucket

Example 2: Force delete a bucket

The following rb command uses the --force parameter to first remove all of the objects in the bucket and then remove the bucket itself. In this example, the user's bucket is amzn-s3-demo-bucket and the objects in amzn-s3-demo-bucket are test1.txt and test2.txt:

aws s3 rb s3://amzn-s3-demo-bucket \
    --force

Output:

delete: s3://amzn-s3-demo-bucket/test1.txt
delete: s3://amzn-s3-demo-bucket/test2.txt
remove_bucket: amzn-s3-demo-bucket
- 
                    For API details, see Rb in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use restore-object.
- Amazon CLI
- 
             
To create a restore request for an object

The following restore-object example restores the specified Amazon S3 Glacier object in the bucket my-glacier-bucket for 10 days.

aws s3api restore-object \
    --bucket my-glacier-bucket \
    --key doc1.rtf \
    --restore-request Days=10

This command produces no output. A way to check the restore status afterwards is sketched after the API reference link below.
- 
                    For API details, see RestoreObject in Amazon CLI Command Reference. 
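The following is a hedged sketch, not part of the original example, for checking on the restore afterwards; head-object reports a Restore field while the restoration is in progress or once it has completed.

# Check the restore status; the Restore field reports ongoing-request
# and, once finished, the expiry date of the restored copy.
aws s3api head-object \
    --bucket my-glacier-bucket \
    --key doc1.rtf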
 
- 
                    
The following code example shows how to use rm.
- Amazon CLI
- 
             
Example 1: Delete an S3 object

The following rm command deletes a single S3 object:

aws s3 rm s3://amzn-s3-demo-bucket/test2.txt

Output:

delete: s3://amzn-s3-demo-bucket/test2.txt

Example 2: Delete all contents in a bucket

The following rm command recursively deletes all objects under a specified bucket and prefix when passed with the parameter --recursive. In this example, the bucket amzn-s3-demo-bucket contains the objects test1.txt and test2.txt:

aws s3 rm s3://amzn-s3-demo-bucket \
    --recursive

Output:

delete: s3://amzn-s3-demo-bucket/test1.txt
delete: s3://amzn-s3-demo-bucket/test2.txt

Example 3: Delete all contents in a bucket, except ``.jpg`` files

The following rm command recursively deletes all objects under a specified bucket and prefix when passed with the parameter --recursive, while excluding some objects by using an --exclude parameter. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and test2.jpg:

aws s3 rm s3://amzn-s3-demo-bucket/ \
    --recursive \
    --exclude "*.jpg"

Output:

delete: s3://amzn-s3-demo-bucket/test1.txt

Example 4: Delete all contents in a bucket, except objects under the specified prefix

The following rm command recursively deletes all objects under a specified bucket and prefix when passed with the parameter --recursive, while excluding all objects under a particular prefix by using an --exclude parameter. In this example, the bucket amzn-s3-demo-bucket has the objects test1.txt and another/test.txt:

aws s3 rm s3://amzn-s3-demo-bucket/ \
    --recursive \
    --exclude "another/*"

Output:

delete: s3://amzn-s3-demo-bucket/test1.txt

Example 5: Delete an object from an S3 access point

The following rm command deletes a single object (mykey) from the access point (myaccesspoint).

aws s3 rm s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey

Output:

delete: s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
- 
                    For API details, see Rm in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use select-object-content.
- Amazon CLI
- 
             
To filter the contents of an Amazon S3 object based on an SQL statement

The following select-object-content example filters the object my-data-file.csv with the specified SQL statement and sends the output to a file.

aws s3api select-object-content \
    --bucket amzn-s3-demo-bucket \
    --key my-data-file.csv \
    --expression "select * from s3object limit 100" \
    --expression-type 'SQL' \
    --input-serialization '{"CSV": {}, "CompressionType": "NONE"}' \
    --output-serialization '{"CSV": {}}' "output.csv"

This command produces no terminal output; the filtered records are written to output.csv.
- 
                    For API details, see SelectObjectContent in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use sync.
- Amazon CLI
- 
             
Example 1: Sync all local objects to the specified bucket

The following sync command syncs objects from a local directory to the specified prefix and bucket by uploading the local files to S3. A local file will require uploading if the size of the local file is different than the size of the S3 object, the last modified time of the local file is newer than the last modified time of the S3 object, or the local file does not exist under the specified bucket and prefix. In this example, the user syncs the local current directory to the bucket amzn-s3-demo-bucket. The local current directory contains the files test.txt and test2.txt. The bucket amzn-s3-demo-bucket contains no objects.

aws s3 sync . s3://amzn-s3-demo-bucket

Output:

upload: test.txt to s3://amzn-s3-demo-bucket/test.txt
upload: test2.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 2: Sync all S3 objects from the specified S3 bucket to another bucket

The following sync command syncs objects under a specified prefix and bucket to objects under another specified prefix and bucket by copying S3 objects. An S3 object will require copying if the sizes of the two S3 objects differ, the last modified time of the source is newer than the last modified time of the destination, or the S3 object does not exist under the specified bucket and prefix destination. In this example, the user syncs the bucket amzn-s3-demo-bucket to the bucket amzn-s3-demo-bucket2. The bucket amzn-s3-demo-bucket contains the objects test.txt and test2.txt. The bucket amzn-s3-demo-bucket2 contains no objects:

aws s3 sync s3://amzn-s3-demo-bucket s3://amzn-s3-demo-bucket2

Output:

copy: s3://amzn-s3-demo-bucket/test.txt to s3://amzn-s3-demo-bucket2/test.txt
copy: s3://amzn-s3-demo-bucket/test2.txt to s3://amzn-s3-demo-bucket2/test2.txt

Example 3: Sync all S3 objects from the specified S3 bucket to the local directory

The following sync command syncs files from the specified S3 bucket to the local directory by downloading S3 objects. An S3 object will require downloading if the size of the S3 object differs from the size of the local file, the last modified time of the S3 object is newer than the last modified time of the local file, or the S3 object does not exist in the local directory. Note that when objects are downloaded from S3, the last modified time of the local file is changed to the last modified time of the S3 object. In this example, the user syncs the bucket amzn-s3-demo-bucket to the current local directory. The bucket amzn-s3-demo-bucket contains the objects test.txt and test2.txt. The current local directory has no files:

aws s3 sync s3://amzn-s3-demo-bucket .

Output:

download: s3://amzn-s3-demo-bucket/test.txt to test.txt
download: s3://amzn-s3-demo-bucket/test2.txt to test2.txt

Example 4: Sync all local objects to the specified bucket and delete all files that do not match

The following sync command syncs files in a local directory to objects under a specified prefix and bucket by uploading the local files to S3. Because of the --delete parameter, any files existing under the specified prefix and bucket but not existing in the local directory will be deleted. In this example, the user syncs the local current directory to the bucket amzn-s3-demo-bucket. The local current directory contains the files test.txt and test2.txt. The bucket amzn-s3-demo-bucket contains the object test3.txt:

aws s3 sync . s3://amzn-s3-demo-bucket \
    --delete

Output:

upload: test.txt to s3://amzn-s3-demo-bucket/test.txt
upload: test2.txt to s3://amzn-s3-demo-bucket/test2.txt
delete: s3://amzn-s3-demo-bucket/test3.txt

Example 5: Sync all local objects to the specified bucket except ``.jpg`` files

The following sync command syncs files in a local directory to objects under a specified prefix and bucket by uploading the local files to S3. Because of the --exclude parameter, all files matching the pattern existing both in S3 and locally will be excluded from the sync. In this example, the user syncs the local current directory to the bucket amzn-s3-demo-bucket. The local current directory contains the files test.jpg and test2.txt. The bucket amzn-s3-demo-bucket contains the object test.jpg of a different size than the local test.jpg:

aws s3 sync . s3://amzn-s3-demo-bucket \
    --exclude "*.jpg"

Output:

upload: test2.txt to s3://amzn-s3-demo-bucket/test2.txt

Example 6: Sync all S3 objects from the specified bucket to the local directory, except objects under the specified prefix

The following sync command syncs objects under a specified prefix and bucket to files in a local directory by downloading S3 objects. This example uses the --exclude parameter flag to exclude a specified directory and S3 prefix from the sync command. In this example, the user syncs the bucket amzn-s3-demo-bucket to the local current directory. The local current directory contains the files test.txt and another/test2.txt. The bucket amzn-s3-demo-bucket contains the objects another/test5.txt and test1.txt:

aws s3 sync s3://amzn-s3-demo-bucket/ . \
    --exclude "*another/*"

Output:

download: s3://amzn-s3-demo-bucket/test1.txt to test1.txt

Example 7: Sync all objects between buckets in different regions

The following sync command syncs files between two buckets in different regions:

aws s3 sync s3://my-us-west-2-bucket s3://my-us-east-1-bucket \
    --source-region us-west-2 \
    --region us-east-1

Output:

download: s3://my-us-west-2-bucket/test1.txt to s3://my-us-east-1-bucket/test1.txt

Example 8: Sync to an S3 access point

The following sync command syncs the current directory to the access point (myaccesspoint):

aws s3 sync . s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/

Output:

upload: test.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/test.txt
upload: test2.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/test2.txt

A dry-run variation that previews what sync would do is sketched after the API reference link below.
- 
                    For API details, see Sync in Amazon CLI Command Reference. 
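As a hedged addition that is not part of the original examples, the --dryrun option previews the uploads, downloads, and deletions a sync would perform without making any changes.

# Preview what would change without transferring or deleting anything.
aws s3 sync . s3://amzn-s3-demo-bucket \
    --delete \
    --dryrun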
 
- 
                    
The following code example shows how to use upload-part-copy.
- Amazon CLI
- 
             
To upload part of an object by copying data from an existing object as the data source

The following upload-part-copy example uploads a part by copying data from an existing object as a data source.

aws s3api upload-part-copy \
    --bucket amzn-s3-demo-bucket \
    --key "Map_Data_June.mp4" \
    --copy-source "amzn-s3-demo-bucket/copy_of_Map_Data_June.mp4" \
    --part-number 1 \
    --upload-id "bq0tdE1CDpWQYRPLHuNG50xAT6pA5D.m_RiBy0ggOH6b13pVRY7QjvLlf75iFdJqp_2wztk5hvpUM2SesXgrzbehG5hViyktrfANpAD0NO.Nk3XREBqvGeZF6U3ipiSm"

Output:

{ "CopyPartResult": { "LastModified": "2019-12-13T23:16:03.000Z", "ETag": "\"711470fc377698c393d94aed6305e245\"" } }
- 
                    For API details, see UploadPartCopy in Amazon CLI Command Reference. 
 
- 
                    
The following code example shows how to use upload-part.
- Amazon CLI
- 
             
The following command uploads the first part in a multipart upload initiated with the create-multipart-upload command:

aws s3api upload-part --bucket amzn-s3-demo-bucket --key 'multipart/01' --part-number 1 --body part01 --upload-id "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R"

The body option takes the name or path of a local file for upload (do not use the file:// prefix). The minimum part size is 5 MB. The upload ID is returned by create-multipart-upload and can also be retrieved with list-multipart-uploads. Bucket and key are specified when you create the multipart upload.

Output:

{ "ETag": "\"e868e0f4719e394144ef36531ee6824c\"" }

Save the ETag value of each part for later. They are required to complete the multipart upload. A sketch that splits a large file and uploads each chunk in a loop follows the API reference link below.
- 
                    For API details, see UploadPart in Amazon CLI Command Reference. 
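The following is a minimal, hedged sketch (not from the original example) of uploading a large file in parts. It assumes a POSIX shell, GNU split, and a shell variable UPLOAD_ID holding the ID returned by create-multipart-upload; the file and part names are illustrative.

# Split a large file into 5 MiB chunks named part-aa, part-ab, ...
# (GNU split; file name and UPLOAD_ID are illustrative assumptions).
split -b 5M large-file part-

PART=1
for f in part-*; do
    # Each call prints the ETag for that part; save them for complete-multipart-upload.
    aws s3api upload-part \
        --bucket amzn-s3-demo-bucket \
        --key 'multipart/01' \
        --part-number "$PART" \
        --body "$f" \
        --upload-id "$UPLOAD_ID"
    PART=$((PART + 1))
done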
 
- 
                    
The following code example shows how to use website.
- Amazon CLI
- 
             
Configure an S3 bucket as a static website

The following command configures a bucket named amzn-s3-demo-bucket as a static website. The index document option specifies the file in amzn-s3-demo-bucket that visitors will be directed to when they navigate to the website URL. In this case, the bucket is in the us-west-2 region, so the site would appear at http://amzn-s3-demo-bucket.s3-website-us-west-2.amazonaws.com.

All files in the bucket that appear on the static site must be configured to allow visitors to open them. File permissions are configured separately from the bucket website configuration.

aws s3 website s3://amzn-s3-demo-bucket/ \
    --index-document index.html \
    --error-document error.html

For information on hosting a static website in Amazon S3, see Hosting a Static Website in the Amazon Simple Storage Service Developer Guide.
- 
                    For API details, see Website in Amazon CLI Command Reference. 
 
-