This documentation is for Version 1 of the Amazon CLI only. For documentation related to Version 2 of the Amazon CLI, see the Version 2 User Guide.
Amazon S3 examples using Amazon CLI
The following code examples show you how to perform actions and implement common scenarios by using the Amazon Command Line Interface with Amazon S3.
Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.
Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.
Actions
The following code example shows how to use `abort-multipart-upload`.
To abort the specified multipart upload
The following `abort-multipart-upload` command aborts a multipart upload for the key `multipart/01` in the bucket `my-bucket`:

aws s3api abort-multipart-upload \
    --bucket my-bucket \
    --key multipart/01 \
    --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R
The upload ID required by this command is output by `create-multipart-upload` and can also be retrieved with `list-multipart-uploads`.
For API details, see AbortMultipartUpload in Amazon CLI Command Reference.

The following code example shows how to use `complete-multipart-upload`.
The following command completes a multipart upload for the key `multipart/01` in the bucket `my-bucket`:

aws s3api complete-multipart-upload --multipart-upload file://mpustruct --bucket my-bucket --key 'multipart/01' --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R
The upload ID required by this command is output by `create-multipart-upload` and can also be retrieved with `list-multipart-uploads`.

The multipart upload option in the above command takes a JSON structure that describes the parts of the multipart upload that should be reassembled into the complete file. In this example, the `file://` prefix is used to load the JSON structure from a file in the local folder named `mpustruct`.

mpustruct:
{
  "Parts": [
    { "ETag": "e868e0f4719e394144ef36531ee6824c", "PartNumber": 1 },
    { "ETag": "6bb2b12753d66fe86da4998aa33fffb0", "PartNumber": 2 },
    { "ETag": "d0a0112e841abec9c9ec83406f0159c8", "PartNumber": 3 }
  ]
}
The ETag value for each part is output each time you upload a part using the `upload-part` command, and can also be retrieved by calling `list-parts` or calculated by taking the MD5 checksum of each part.

Output:
{ "ETag": "\"3944a9f7a4faab7f78788ff6210f63f0-3\"", "Bucket": "my-bucket", "Location": "https://my-bucket.s3.amazonaws.com/multipart%2F01", "Key": "multipart/01" }
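For unencrypted uploads, each part's ETag is simply its MD5 checksum, so an `mpustruct`-style file can be assembled locally. The following sketch uses a generated sample file and a deliberately tiny part size for the demo (real S3 parts must be at least 5 MB each, except the last):

```shell
# Sketch: assemble an mpustruct-style Parts list from local MD5 checksums.
# The input file and the 8 KB part size are illustrative only.
dd if=/dev/zero of=bigfile bs=1024 count=24 2>/dev/null  # sample input file
split -b 8192 bigfile part.                              # -> part.aa part.ab part.ac
{
  printf '{ "Parts": ['
  n=0
  sep=""
  for p in part.*; do
    n=$((n + 1))
    etag=$(md5sum "$p" | cut -d' ' -f1)
    printf '%s{ "ETag": "%s", "PartNumber": %d }' "$sep" "$etag" "$n"
    sep=", "
  done
  printf '] }\n'
} > mpustruct
```

In a real upload, the ETags must be the values returned by each `upload-part` call; for unencrypted parts those match the local MD5 checksums.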
For API details, see CompleteMultipartUpload in Amazon CLI Command Reference.

The following code example shows how to use `copy-object`.
The following command copies an object from `bucket-1` to `bucket-2`:

aws s3api copy-object --copy-source bucket-1/test.txt --key test.txt --bucket bucket-2
Output:
{ "CopyObjectResult": { "LastModified": "2015-11-10T01:07:25.000Z", "ETag": "\"589c8b79c230a6ecd5a7e1d040a9a030\"" }, "VersionId": "YdnYvTCVDqRRFA.NFJjy36p0hxifMlkA" }
For API details, see CopyObject in Amazon CLI Command Reference.

The following code example shows how to use `cp`.
Example 1: Copying a local file to S3
The following `cp` command copies a single file to a specified bucket and key:

aws s3 cp test.txt s3://mybucket/test2.txt
Output:
upload: test.txt to s3://mybucket/test2.txt
Example 2: Copying a local file to S3 with an expiration date
The following `cp` command copies a single file to a specified bucket and key that expires at the specified ISO 8601 timestamp:

aws s3 cp test.txt s3://mybucket/test2.txt \
    --expires 2014-10-01T20:30:00Z
Output:
upload: test.txt to s3://mybucket/test2.txt
Example 3: Copying a file from S3 to S3
The following `cp` command copies a single S3 object to a specified bucket and key:

aws s3 cp s3://mybucket/test.txt s3://mybucket/test2.txt
Output:
copy: s3://mybucket/test.txt to s3://mybucket/test2.txt
Example 4: Copying an S3 object to a local file
The following `cp` command copies a single object to a specified file locally:

aws s3 cp s3://mybucket/test.txt test2.txt
Output:
download: s3://mybucket/test.txt to test2.txt
Example 5: Copying an S3 object from one bucket to another
The following `cp` command copies a single object to a specified bucket while retaining its original name:

aws s3 cp s3://mybucket/test.txt s3://mybucket2/
Output:
copy: s3://mybucket/test.txt to s3://mybucket2/test.txt
Example 6: Recursively copying S3 objects to a local directory
When passed with the parameter `--recursive`, the following `cp` command recursively copies all objects under a specified prefix and bucket to a specified directory. In this example, the bucket `mybucket` has the objects `test1.txt` and `test2.txt`:

aws s3 cp s3://mybucket . \
    --recursive

Output:
download: s3://mybucket/test1.txt to test1.txt
download: s3://mybucket/test2.txt to test2.txt
Example 7: Recursively copying local files to S3
When passed with the parameter `--recursive`, the following `cp` command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an `--exclude` parameter. In this example, the directory `myDir` has the files `test1.txt` and `test2.jpg`:

aws s3 cp myDir s3://mybucket/ \
    --recursive \
    --exclude "*.jpg"
Output:
upload: myDir/test1.txt to s3://mybucket/test1.txt
Example 8: Recursively copying S3 objects to another bucket
When passed with the parameter `--recursive`, the following `cp` command recursively copies all objects under a specified bucket to another bucket while excluding some objects by using an `--exclude` parameter. In this example, the bucket `mybucket` has the objects `test1.txt` and `another/test1.txt`:

aws s3 cp s3://mybucket/ s3://mybucket2/ \
    --recursive \
    --exclude "another/*"
Output:
copy: s3://mybucket/test1.txt to s3://mybucket2/test1.txt
You can combine `--exclude` and `--include` options to copy only objects that match a pattern, excluding all others:

aws s3 cp s3://mybucket/logs/ s3://mybucket2/logs/ \
    --recursive \
    --exclude "*" \
    --include "*.log"
Output:
copy: s3://mybucket/logs/test/test.log to s3://mybucket2/logs/test/test.log
copy: s3://mybucket/logs/test3.log to s3://mybucket2/logs/test3.log
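The `--exclude` and `--include` filters are applied in the order given, and the last filter that matches a path decides whether it is copied. A local sketch of that rule, with an illustrative key list and shell `case` patterns standing in for the CLI's globbing (where `*` also matches `/`):

```shell
# Sketch of s3 cp filter evaluation: filters apply in order, last match wins.
copied=""
for key in logs/test/test.log logs/test3.log logs/readme.txt; do
    verdict=include                                  # default: include everything
    case "$key" in *) verdict=exclude ;; esac        # --exclude "*"
    case "$key" in *.log) verdict=include ;; esac    # --include "*.log"
    [ "$verdict" = include ] && copied="$copied $key"
done
copied="${copied# }"
echo "$copied"
```

Only the `.log` keys survive the filter chain, matching the output shown above.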
Example 9: Setting the Access Control List (ACL) while copying an S3 object
The following `cp` command copies a single object to a specified bucket and key while setting the ACL to `public-read-write`:

aws s3 cp s3://mybucket/test.txt s3://mybucket/test2.txt \
    --acl public-read-write
Output:
copy: s3://mybucket/test.txt to s3://mybucket/test2.txt
Note that if you're using the `--acl` option, ensure that any associated IAM policies include the `"s3:PutObjectAcl"` action:

aws iam get-user-policy \
    --user-name myuser \
    --policy-name mypolicy
Output:
{ "UserName": "myuser", "PolicyName": "mypolicy", "PolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Action": [ "s3:PutObject", "s3:PutObjectAcl" ], "Resource": [ "arn:aws:s3:::mybucket/*" ], "Effect": "Allow", "Sid": "Stmt1234567891234" } ] } }
Example 10: Granting permissions for an S3 object
The following `cp` command illustrates the use of the `--grants` option to grant read access to all users identified by URI and full control to a specific user identified by their Canonical ID:

aws s3 cp file.txt s3://mybucket/ --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=id=79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be
Output:
upload: file.txt to s3://mybucket/file.txt
Example 11: Uploading a local file stream to S3
PowerShell may alter the encoding of or add a CRLF to piped input.
The following `cp` command uploads a local file stream from standard input to a specified bucket and key:

aws s3 cp - s3://mybucket/stream.txt
Example 12: Uploading a local file stream that is larger than 50GB to S3
The following `cp` command uploads a 51GB local file stream from standard input to a specified bucket and key. The `--expected-size` option must be provided, or the upload may fail when it reaches the default part limit of 10,000:

aws s3 cp - s3://mybucket/stream.txt --expected-size 54760833024
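The `--expected-size` value is in bytes; the 54760833024 above is 51 GiB. A quick sketch of the arithmetic, and of why the option matters for the 10,000-part limit:

```shell
# Sketch: --expected-size is in bytes. 51 GiB = 51 * 1024^3.
expected_size=$((51 * 1024 * 1024 * 1024))
echo "$expected_size"    # 54760833024
# To stay under the 10,000-part limit, parts must average at least this many bytes:
min_part=$((expected_size / 10000))
echo "$min_part"
```

The CLI uses the expected size to pick a part size large enough that the whole stream fits in at most 10,000 parts.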
Example 13: Downloading an S3 object as a local file stream
PowerShell may alter the encoding of or add a CRLF to piped or redirected output.
The following `cp` command downloads an S3 object locally as a stream to standard output. Downloading as a stream is not currently compatible with the `--recursive` parameter:

aws s3 cp s3://mybucket/stream.txt -
Example 14: Uploading to an S3 access point
The following `cp` command uploads a single file (`mydoc.txt`) to the access point (`myaccesspoint`) at the key (`mykey`):

aws s3 cp mydoc.txt s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
Output:
upload: mydoc.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
Example 15: Downloading from an S3 access point
The following `cp` command downloads a single object (`mykey`) from the access point (`myaccesspoint`) to the local file (`mydoc.txt`):

aws s3 cp s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey mydoc.txt
Output:
download: s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey to mydoc.txt
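Access point URIs used with `aws s3` embed a fixed-shape ARN. A sketch that assembles one from its pieces, using the values from the examples above:

```shell
# Sketch: an S3 access point ARN has the form
#   arn:aws:s3:<region>:<account-id>:accesspoint/<name>
region="us-west-2"
account_id="123456789012"
name="myaccesspoint"
arn="arn:aws:s3:${region}:${account_id}:accesspoint/${name}"
# Append the object key to form the s3:// URI used by aws s3 cp:
echo "s3://${arn}/mykey"
```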
For API details, see Cp in Amazon CLI Command Reference.

The following code example shows how to use `create-bucket`.
Example 1: To create a bucket
The following `create-bucket` example creates a bucket named `my-bucket`:

aws s3api create-bucket \
    --bucket my-bucket \
    --region us-east-1
Output:
{ "Location": "/my-bucket" }
For more information, see Creating a bucket in the Amazon S3 User Guide.

Example 2: To create a bucket with owner enforced
The following `create-bucket` example creates a bucket named `my-bucket` that uses the bucket owner enforced setting for S3 Object Ownership.

aws s3api create-bucket \
    --bucket my-bucket \
    --region us-east-1 \
    --object-ownership BucketOwnerEnforced
Output:
{ "Location": "/my-bucket" }
For more information, see Controlling ownership of objects and disabling ACLs in the Amazon S3 User Guide.

Example 3: To create a bucket outside of the `us-east-1` region
The following `create-bucket` example creates a bucket named `my-bucket` in the `eu-west-1` region. Regions outside of `us-east-1` require the appropriate `LocationConstraint` to be specified in order to create the bucket in the desired region.

aws s3api create-bucket \
    --bucket my-bucket \
    --region eu-west-1 \
    --create-bucket-configuration LocationConstraint=eu-west-1
Output:
{ "Location": "http://my-bucket.s3.amazonaws.com/" }
For more information, see Creating a bucket in the Amazon S3 User Guide.

For API details, see CreateBucket in Amazon CLI Command Reference.

The following code example shows how to use `create-multipart-upload`.
The following command creates a multipart upload in the bucket `my-bucket` with the key `multipart/01`:

aws s3api create-multipart-upload --bucket my-bucket --key 'multipart/01'

Output:
{ "Bucket": "my-bucket", "UploadId": "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R", "Key": "multipart/01" }
The completed file will be named `01` in a folder called `multipart` in the bucket `my-bucket`. Save the upload ID, key, and bucket name for use with the `upload-part` command.
For API details, see CreateMultipartUpload
in Amazon CLI Command Reference.
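Scripts usually capture the `UploadId` field rather than copying it by hand; `--query UploadId --output text` does this directly. A local sketch of the extraction against a saved response (the file name and the upload ID value are illustrative):

```shell
# Sketch: pull UploadId out of a saved create-multipart-upload response.
# In a live session this would be:
#   upload_id=$(aws s3api create-multipart-upload \
#       --bucket my-bucket --key 'multipart/01' --query UploadId --output text)
cat > response.json <<'EOF'
{ "Bucket": "my-bucket", "UploadId": "EXAMPLEUPLOADID123", "Key": "multipart/01" }
EOF
upload_id=$(python3 -c "import json; print(json.load(open('response.json'))['UploadId'])")
echo "$upload_id"
```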
The following code example shows how to use `delete-bucket-analytics-configuration`.
To delete an analytics configuration for a bucket
The following `delete-bucket-analytics-configuration` example removes the analytics configuration for the specified bucket and ID.

aws s3api delete-bucket-analytics-configuration \
    --bucket my-bucket \
    --id 1
This command produces no output.
For API details, see DeleteBucketAnalyticsConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-cors`.
The following command deletes a Cross-Origin Resource Sharing configuration from a bucket named `my-bucket`:

aws s3api delete-bucket-cors --bucket my-bucket
For API details, see DeleteBucketCors in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-encryption`.
To delete the server-side encryption configuration of a bucket
The following `delete-bucket-encryption` example deletes the server-side encryption configuration of the specified bucket.

aws s3api delete-bucket-encryption \
    --bucket my-bucket
This command produces no output.
For API details, see DeleteBucketEncryption in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-intelligent-tiering-configuration`.
To remove an S3 Intelligent-Tiering configuration on a bucket
The following `delete-bucket-intelligent-tiering-configuration` example removes an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket.

aws s3api delete-bucket-intelligent-tiering-configuration \
    --bucket DOC-EXAMPLE-BUCKET \
    --id ExampleConfig
This command produces no output.
For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.
For API details, see DeleteBucketIntelligentTieringConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-inventory-configuration`.
To delete the inventory configuration of a bucket
The following `delete-bucket-inventory-configuration` example deletes the inventory configuration with ID `1` for the specified bucket.

aws s3api delete-bucket-inventory-configuration \
    --bucket my-bucket \
    --id 1
This command produces no output.
For API details, see DeleteBucketInventoryConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-lifecycle`.
The following command deletes a lifecycle configuration from a bucket named `my-bucket`:

aws s3api delete-bucket-lifecycle --bucket my-bucket
For API details, see DeleteBucketLifecycle in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-metrics-configuration`.
To delete a metrics configuration for a bucket
The following `delete-bucket-metrics-configuration` example removes the metrics configuration for the specified bucket and ID.

aws s3api delete-bucket-metrics-configuration \
    --bucket my-bucket \
    --id 123
This command produces no output.
For API details, see DeleteBucketMetricsConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-ownership-controls`.
To remove the bucket ownership settings of a bucket
The following `delete-bucket-ownership-controls` example removes the bucket ownership settings of a bucket.

aws s3api delete-bucket-ownership-controls \
    --bucket DOC-EXAMPLE-BUCKET
This command produces no output.
For more information, see Setting Object Ownership on an existing bucket in the Amazon S3 User Guide.
For API details, see DeleteBucketOwnershipControls in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-policy`.
The following command deletes a bucket policy from a bucket named `my-bucket`:

aws s3api delete-bucket-policy --bucket my-bucket
For API details, see DeleteBucketPolicy in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-replication`.
The following command deletes a replication configuration from a bucket named `my-bucket`:

aws s3api delete-bucket-replication --bucket my-bucket
For API details, see DeleteBucketReplication in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-tagging`.
The following command deletes a tagging configuration from a bucket named `my-bucket`:

aws s3api delete-bucket-tagging --bucket my-bucket
For API details, see DeleteBucketTagging in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket-website`.
The following command deletes a website configuration from a bucket named `my-bucket`:

aws s3api delete-bucket-website --bucket my-bucket
For API details, see DeleteBucketWebsite in Amazon CLI Command Reference.

The following code example shows how to use `delete-bucket`.
The following command deletes a bucket named `my-bucket`:

aws s3api delete-bucket --bucket my-bucket --region us-east-1
For API details, see DeleteBucket in Amazon CLI Command Reference.

The following code example shows how to use `delete-object-tagging`.
To delete the tag sets of an object
The following `delete-object-tagging` example deletes the tag with the specified key from the object `doc1.rtf`.

aws s3api delete-object-tagging \
    --bucket my-bucket \
    --key doc1.rtf
This command produces no output.
For API details, see DeleteObjectTagging in Amazon CLI Command Reference.

The following code example shows how to use `delete-object`.
The following command deletes an object named `test.txt` from a bucket named `my-bucket`:

aws s3api delete-object --bucket my-bucket --key test.txt
If bucket versioning is enabled, the output will contain the version ID of the delete marker:
{ "VersionId": "9_gKg5vG56F.TTEUdwkxGpJ3tNDlWlGq", "DeleteMarker": true }
For more information about deleting objects, see Deleting Objects in the Amazon S3 Developer Guide.
For API details, see DeleteObject in Amazon CLI Command Reference.

The following code example shows how to use `delete-objects`.
The following command deletes an object from a bucket named `my-bucket`:

aws s3api delete-objects --bucket my-bucket --delete file://delete.json

`delete.json` is a JSON document in the current directory that specifies the object to delete:

{ "Objects": [ { "Key": "test1.txt" } ], "Quiet": false }
Output:
{ "Deleted": [ { "DeleteMarkerVersionId": "mYAT5Mc6F7aeUL8SS7FAAqUPO1koHwzU", "Key": "test1.txt", "DeleteMarker": true } ] }
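For more than a handful of keys, `delete.json` is easier to generate than to hand-edit. A minimal sketch with illustrative key names (`delete-objects` accepts up to 1,000 keys per call):

```shell
# Sketch: build a delete.json for several keys.
{
  printf '{ "Objects": ['
  sep=""
  for key in test1.txt test2.txt logs/old.log; do
    printf '%s{ "Key": "%s" }' "$sep" "$key"
    sep=", "
  done
  printf '], "Quiet": false }\n'
} > delete.json
# Then: aws s3api delete-objects --bucket my-bucket --delete file://delete.json
```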
For API details, see DeleteObjects in Amazon CLI Command Reference.

The following code example shows how to use `delete-public-access-block`.
To delete the block public access configuration for a bucket
The following `delete-public-access-block` example removes the block public access configuration on the specified bucket.

aws s3api delete-public-access-block \
    --bucket my-bucket
This command produces no output.
For API details, see DeletePublicAccessBlock in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-accelerate-configuration`.
To retrieve the accelerate configuration of a bucket
The following `get-bucket-accelerate-configuration` example retrieves the accelerate configuration for the specified bucket.

aws s3api get-bucket-accelerate-configuration \
    --bucket my-bucket
Output:
{ "Status": "Enabled" }
For API details, see GetBucketAccelerateConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-acl`.
The following command retrieves the access control list for a bucket named `my-bucket`:

aws s3api get-bucket-acl --bucket my-bucket
Output:
{ "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Grants": [ { "Grantee": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Permission": "FULL_CONTROL" } ] }
For API details, see GetBucketAcl in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-analytics-configuration`.
To retrieve the analytics configuration for a bucket with a specific ID
The following `get-bucket-analytics-configuration` example displays the analytics configuration for the specified bucket and ID.

aws s3api get-bucket-analytics-configuration \
    --bucket my-bucket \
    --id 1
Output:
{ "AnalyticsConfiguration": { "StorageClassAnalysis": {}, "Id": "1" } }
For API details, see GetBucketAnalyticsConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-cors`.
The following command retrieves the Cross-Origin Resource Sharing configuration for a bucket named `my-bucket`:

aws s3api get-bucket-cors --bucket my-bucket
Output:
{ "CORSRules": [ { "AllowedHeaders": [ "*" ], "ExposeHeaders": [ "x-amz-server-side-encryption" ], "AllowedMethods": [ "PUT", "POST", "DELETE" ], "MaxAgeSeconds": 3000, "AllowedOrigins": [ "http://www.example.com" ] }, { "AllowedHeaders": [ "Authorization" ], "MaxAgeSeconds": 3000, "AllowedMethods": [ "GET" ], "AllowedOrigins": [ "*" ] } ] }
For API details, see GetBucketCors in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-encryption`.
To retrieve the server-side encryption configuration for a bucket
The following `get-bucket-encryption` example retrieves the server-side encryption configuration for the bucket `my-bucket`.

aws s3api get-bucket-encryption \
    --bucket my-bucket
Output:
{ "ServerSideEncryptionConfiguration": { "Rules": [ { "ApplyServerSideEncryptionByDefault": { "SSEAlgorithm": "AES256" } } ] } }
For API details, see GetBucketEncryption in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-intelligent-tiering-configuration`.
To retrieve an S3 Intelligent-Tiering configuration on a bucket
The following `get-bucket-intelligent-tiering-configuration` example retrieves an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket.

aws s3api get-bucket-intelligent-tiering-configuration \
    --bucket DOC-EXAMPLE-BUCKET \
    --id ExampleConfig
Output:
{ "IntelligentTieringConfiguration": { "Id": "ExampleConfig2", "Filter": { "Prefix": "images" }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] } }
For more information, see Using S3 Intelligent-Tiering in the Amazon S3 User Guide.
For API details, see GetBucketIntelligentTieringConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-inventory-configuration`.
To retrieve the inventory configuration for a bucket
The following `get-bucket-inventory-configuration` example retrieves the inventory configuration for the specified bucket with ID `1`.

aws s3api get-bucket-inventory-configuration \
    --bucket my-bucket \
    --id 1
Output:
{ "InventoryConfiguration": { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "ORC", "Bucket": "arn:aws:s3:::my-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "1", "Schedule": { "Frequency": "Weekly" } } }
For API details, see GetBucketInventoryConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-lifecycle-configuration`.
The following command retrieves the lifecycle configuration for a bucket named `my-bucket`:

aws s3api get-bucket-lifecycle-configuration --bucket my-bucket
Output:
{ "Rules": [ { "ID": "Move rotated logs to Glacier", "Prefix": "rotated/", "Status": "Enabled", "Transitions": [ { "Date": "2015-11-10T00:00:00.000Z", "StorageClass": "GLACIER" } ] }, { "Status": "Enabled", "Prefix": "", "NoncurrentVersionTransitions": [ { "NoncurrentDays": 0, "StorageClass": "GLACIER" } ], "ID": "Move old versions to Glacier" } ] }
For API details, see GetBucketLifecycleConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-lifecycle`.
The following command retrieves the lifecycle configuration for a bucket named `my-bucket`:

aws s3api get-bucket-lifecycle --bucket my-bucket
Output:
{ "Rules": [ { "ID": "Move to Glacier after sixty days (objects in logs/2015/)", "Prefix": "logs/2015/", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } }, { "Expiration": { "Date": "2016-01-01T00:00:00.000Z" }, "ID": "Delete 2014 logs in 2016.", "Prefix": "logs/2014/", "Status": "Enabled" } ] }
For API details, see GetBucketLifecycle in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-location`.
The following command retrieves the location constraint for a bucket named `my-bucket`, if a constraint exists:

aws s3api get-bucket-location --bucket my-bucket
Output:
{ "LocationConstraint": "us-west-2" }
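Note that for buckets in `us-east-1` this call returns a null `LocationConstraint`, so scripts normalizing the output typically map null back to `us-east-1`. A local sketch against a saved response (the file name is illustrative):

```shell
# Sketch: buckets in us-east-1 report "LocationConstraint": null.
# In a live session the response would come from:
#   aws s3api get-bucket-location --bucket my-bucket > location.json
echo '{ "LocationConstraint": null }' > location.json
region=$(python3 -c "import json; print(json.load(open('location.json'))['LocationConstraint'] or 'us-east-1')")
echo "$region"
```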
For API details, see GetBucketLocation in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-logging`.
To retrieve the logging status for a bucket
The following `get-bucket-logging` example retrieves the logging status for the specified bucket.

aws s3api get-bucket-logging \
    --bucket my-bucket
Output:
{ "LoggingEnabled": { "TargetPrefix": "", "TargetBucket": "my-bucket-logs" } }
For API details, see GetBucketLogging in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-metrics-configuration`.
To retrieve the metrics configuration for a bucket with a specific ID
The following `get-bucket-metrics-configuration` example displays the metrics configuration for the specified bucket and ID.

aws s3api get-bucket-metrics-configuration \
    --bucket my-bucket \
    --id 123
Output:
{ "MetricsConfiguration": { "Filter": { "Prefix": "logs" }, "Id": "123" } }
For API details, see GetBucketMetricsConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-notification-configuration`.
The following command retrieves the notification configuration for a bucket named `my-bucket`:

aws s3api get-bucket-notification-configuration --bucket my-bucket
Output:
{ "TopicConfigurations": [ { "Id": "YmQzMmEwM2EjZWVlI0NGItNzVtZjI1MC00ZjgyLWZDBiZWNl", "TopicArn": "arn:aws:sns:us-west-2:123456789012:my-notification-topic", "Events": [ "s3:ObjectCreated:*" ] } ] }
For API details, see GetBucketNotificationConfiguration in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-notification`.
The following command retrieves the notification configuration for a bucket named `my-bucket`:

aws s3api get-bucket-notification --bucket my-bucket
Output:
{ "TopicConfiguration": { "Topic": "arn:aws:sns:us-west-2:123456789012:my-notification-topic", "Id": "YmQzMmEwM2EjZWVlI0NGItNzVtZjI1MC00ZjgyLWZDBiZWNl", "Event": "s3:ObjectCreated:*", "Events": [ "s3:ObjectCreated:*" ] } }
For API details, see GetBucketNotification in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-ownership-controls`.
To retrieve the bucket ownership settings of a bucket
The following `get-bucket-ownership-controls` example retrieves the bucket ownership settings of a bucket.

aws s3api get-bucket-ownership-controls \
    --bucket DOC-EXAMPLE-BUCKET
Output:
{ "OwnershipControls": { "Rules": [ { "ObjectOwnership": "BucketOwnerEnforced" } ] } }
For more information, see Viewing the Object Ownership setting for an S3 bucket in the Amazon S3 User Guide.
For API details, see GetBucketOwnershipControls in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-policy-status`.
To retrieve the policy status for a bucket indicating whether the bucket is public
The following `get-bucket-policy-status` example retrieves the policy status for the bucket `my-bucket`.

aws s3api get-bucket-policy-status \
    --bucket my-bucket
Output:
{ "PolicyStatus": { "IsPublic": false } }
For API details, see GetBucketPolicyStatus in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-policy`.
The following command retrieves the bucket policy for a bucket named `my-bucket`:

aws s3api get-bucket-policy --bucket my-bucket
Output:
{ "Policy": "{\"Version\":\"2008-10-17\",\"Statement\":[{\"Sid\":\"\",\"Effect\":\"Allow\",\"Principal\":\"*\",\"Action\":\"s3:GetObject\",\"Resource\":\"arn:aws:s3:::my-bucket/*\"},{\"Sid\":\"\",\"Effect\":\"Deny\",\"Principal\":\"*\",\"Action\":\"s3:GetObject\",\"Resource\":\"arn:aws:s3:::my-bucket/secret/*\"}]}" }
Get and put a bucket policy

The following example shows how you can download an Amazon S3 bucket policy, make modifications to the file, and then use `put-bucket-policy` to apply the modified bucket policy. To download the bucket policy to a file, you can run:

aws s3api get-bucket-policy --bucket mybucket --query Policy --output text > policy.json

You can then modify the `policy.json` file as needed. Finally, you can apply the modified policy back to the S3 bucket by running:

aws s3api put-bucket-policy --bucket mybucket --policy file://policy.json
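The edit step in the middle can be scripted too. A minimal sketch that modifies a downloaded policy with Python's json module; the policy content and the added action are illustrative:

```shell
# Sketch: modify a downloaded policy.json before re-applying it.
# Stand-in for the file produced by get-bucket-policy above.
cat > policy.json <<'EOF'
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": "*", "Action": [ "s3:GetObject" ], "Resource": "arn:aws:s3:::mybucket/*" } ] }
EOF
python3 - <<'EOF'
import json
with open("policy.json") as f:
    policy = json.load(f)
# Illustrative edit: also allow s3:GetObjectVersion on the same resource.
policy["Statement"][0]["Action"].append("s3:GetObjectVersion")
with open("policy.json", "w") as f:
    json.dump(policy, f, indent=2)
EOF
# Then: aws s3api put-bucket-policy --bucket mybucket --policy file://policy.json
```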
For API details, see GetBucketPolicy in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-replication`.
The following command retrieves the replication configuration for a bucket named `my-bucket`:

aws s3api get-bucket-replication --bucket my-bucket
Output:
{ "ReplicationConfiguration": { "Rules": [ { "Status": "Enabled", "Prefix": "", "Destination": { "Bucket": "arn:aws:s3:::my-bucket-backup", "StorageClass": "STANDARD" }, "ID": "ZmUwNzE4ZmQ4tMjVhOS00MTlkLOGI4NDkzZTIWJjNTUtYTA1" } ], "Role": "arn:aws:iam::123456789012:role/s3-replication-role" } }
For API details, see GetBucketReplication in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-request-payment`.
To retrieve the request payment configuration for a bucket
The following `get-bucket-request-payment` example retrieves the requester pays configuration for the specified bucket.

aws s3api get-bucket-request-payment \
    --bucket my-bucket
Output:
{ "Payer": "BucketOwner" }
For API details, see GetBucketRequestPayment in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-tagging`.
The following command retrieves the tagging configuration for a bucket named `my-bucket`:

aws s3api get-bucket-tagging --bucket my-bucket
Output:
{ "TagSet": [ { "Value": "marketing", "Key": "organization" } ] }
For API details, see GetBucketTagging in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-versioning`.
The following command retrieves the versioning configuration for a bucket named `my-bucket`:

aws s3api get-bucket-versioning --bucket my-bucket
Output:
{ "Status": "Enabled" }
For API details, see GetBucketVersioning in Amazon CLI Command Reference.

The following code example shows how to use `get-bucket-website`.
The following command retrieves the static website configuration for a bucket named `my-bucket`:

aws s3api get-bucket-website --bucket my-bucket
Output:
{ "IndexDocument": { "Suffix": "index.html" }, "ErrorDocument": { "Key": "error.html" } }
For API details, see GetBucketWebsite in Amazon CLI Command Reference.

The following code example shows how to use `get-object-acl`.
The following command retrieves the access control list for an object in a bucket named `my-bucket`:

aws s3api get-object-acl --bucket my-bucket --key index.html
Output:
{ "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Grants": [ { "Grantee": { "DisplayName": "my-username", "ID": "7009a8971cd538e11f6b6606438875e7c86c5b672f46db45460ddcd087d36c32" }, "Permission": "FULL_CONTROL" }, { "Grantee": { "URI": "http://acs.amazonaws.com/groups/global/AllUsers" }, "Permission": "READ" } ] }
For API details, see GetObjectAcl in Amazon CLI Command Reference.

The following code example shows how to use `get-object-attributes`.
To retrieve metadata from an object without returning the object itself

The following get-object-attributes example retrieves metadata from the object doc1.rtf.

aws s3api get-object-attributes \
    --bucket my-bucket \
    --key doc1.rtf \
    --object-attributes "StorageClass" "ETag" "ObjectSize"
Output:
{ "LastModified": "2022-03-15T19:37:31+00:00", "VersionId": "IuCPjXTDzHNfldAuitVBIKJpF2p1fg4P", "ETag": "b662d79adeb7c8d787ea7eafb9ef6207", "StorageClass": "STANDARD", "ObjectSize": 405 }
For more information, see GetObjectAttributes
in the Amazon S3 API Reference. -
For API details, see GetObjectAttributes
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object-legal-hold
.
- Amazon CLI
-
To retrieve the legal hold status of an object

The following get-object-legal-hold example retrieves the legal hold status for the specified object.

aws s3api get-object-legal-hold \
    --bucket my-bucket-with-object-lock \
    --key doc1.rtf
Output:
{ "LegalHold": { "Status": "ON" } }
-
For API details, see GetObjectLegalHold
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object-lock-configuration
.
- Amazon CLI
-
To retrieve an object lock configuration for a bucket
The following get-object-lock-configuration example retrieves the object lock configuration for the specified bucket.

aws s3api get-object-lock-configuration \
    --bucket my-bucket-with-object-lock
Output:
{ "ObjectLockConfiguration": { "ObjectLockEnabled": "Enabled", "Rule": { "DefaultRetention": { "Mode": "COMPLIANCE", "Days": 50 } } } }
-
For API details, see GetObjectLockConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object-retention
.
- Amazon CLI
-
To retrieve the object retention configuration for an object
The following get-object-retention example retrieves the object retention configuration for the specified object.

aws s3api get-object-retention \
    --bucket my-bucket-with-object-lock \
    --key doc1.rtf
Output:
{ "Retention": { "Mode": "GOVERNANCE", "RetainUntilDate": "2025-01-01T00:00:00.000Z" } }
-
For API details, see GetObjectRetention
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object-tagging
.
- Amazon CLI
-
To retrieve the tags attached to an object
The following get-object-tagging example retrieves the values for the specified key from the specified object.

aws s3api get-object-tagging \
    --bucket my-bucket \
    --key doc1.rtf
Output:
{ "TagSet": [ { "Value": "confidential", "Key": "designation" } ] }
The following get-object-tagging example tries to retrieve the tag sets of the object doc2.rtf, which has no tags.

aws s3api get-object-tagging \
    --bucket my-bucket \
    --key doc2.rtf
Output:
{ "TagSet": [] }
The following get-object-tagging example retrieves the tag sets of the object doc3.rtf, which has multiple tags.

aws s3api get-object-tagging \
    --bucket my-bucket \
    --key doc3.rtf
Output:
{ "TagSet": [ { "Value": "confidential", "Key": "designation" }, { "Value": "finance", "Key": "department" }, { "Value": "payroll", "Key": "team" } ] }
-
For API details, see GetObjectTagging
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object-torrent
.
- Amazon CLI
-
The following command creates a torrent for an object in a bucket named my-bucket:

aws s3api get-object-torrent --bucket my-bucket --key large-video-file.mp4 large-video-file.torrent

The torrent file is saved locally in the current folder. Note that the output filename (large-video-file.torrent) is specified without an option name and must be the last argument in the command.

-
For API details, see GetObjectTorrent
in Amazon CLI Command Reference.
-
The following code example shows how to use get-object
.
- Amazon CLI
-
The following example uses the get-object command to download an object from Amazon S3:

aws s3api get-object --bucket text-content --key dir/my_images.tar.bz2 my_images.tar.bz2

Note that the outfile parameter is specified without an option name such as --outfile. The name of the output file must be the last parameter in the command.
The example below demonstrates the use of --range to download a specific byte range from an object. Note that the byte range needs to be prefixed with "bytes=":

aws s3api get-object --bucket text-content --key dir/my_data --range bytes=8888-9999 my_data_range

For more information about retrieving objects, see Getting Objects in the Amazon S3 Developer Guide.
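Building on the --range option above, a large object can be fetched in fixed-size pieces. The following is a minimal sketch; the helper name, object size, and chunk size are illustrative, not part of the CLI.

```shell
# Emit one "bytes=start-end" range string per chunk for an object of a
# given total size, suitable for get-object's --range option.
ranges() {
    total=$1
    chunk=$2
    start=0
    while [ "$start" -lt "$total" ]; do
        end=$((start + chunk - 1))
        if [ "$end" -ge "$total" ]; then end=$((total - 1)); fi
        echo "bytes=${start}-${end}"
        start=$((end + 1))
    done
}

# Usage (requires AWS credentials); downloads each chunk to its own part
# file, which can then be reassembled locally:
# i=0
# for r in $(ranges 10000 4096); do
#     aws s3api get-object --bucket text-content --key dir/my_data \
#         --range "$r" "part-$i"
#     i=$((i + 1))
# done
# cat part-* > my_data
```

The object's total size can be obtained beforehand from the ContentLength field returned by head-object.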
-
For API details, see GetObject
in Amazon CLI Command Reference.
-
The following code example shows how to use get-public-access-block
.
- Amazon CLI
-
To display the block public access configuration for a bucket

The following get-public-access-block example displays the block public access configuration for the specified bucket.

aws s3api get-public-access-block \
    --bucket my-bucket
Output:
{ "PublicAccessBlockConfiguration": { "IgnorePublicAcls": true, "BlockPublicPolicy": true, "BlockPublicAcls": true, "RestrictPublicBuckets": true } }
-
For API details, see GetPublicAccessBlock
in Amazon CLI Command Reference.
-
The following code example shows how to use head-bucket
.
- Amazon CLI
-
The following command verifies access to a bucket named my-bucket:

aws s3api head-bucket --bucket my-bucket

If the bucket exists and you have access to it, no output is returned. Otherwise, an error message is shown. For example:
A client error (404) occurred when calling the HeadBucket operation: Not Found
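Because head-bucket returns a nonzero exit status on failure, it is convenient for scripting. The following is a minimal sketch; the function and bucket names are illustrative.

```shell
# Return success (0) if the bucket exists and is accessible, and the
# command's nonzero exit status otherwise; output is suppressed.
bucket_exists() {
    aws s3api head-bucket --bucket "$1" >/dev/null 2>&1
}

# Usage (requires AWS credentials):
# if bucket_exists my-bucket; then
#     echo "bucket is accessible"
# else
#     echo "bucket is missing or access is denied"
# fi
```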
-
For API details, see HeadBucket
in Amazon CLI Command Reference.
-
The following code example shows how to use head-object
.
- Amazon CLI
-
The following command retrieves metadata for an object in a bucket named my-bucket:

aws s3api head-object --bucket my-bucket --key index.html
Output:
{ "AcceptRanges": "bytes", "ContentType": "text/html", "LastModified": "Thu, 16 Apr 2015 18:19:14 GMT", "ContentLength": 77, "VersionId": "null", "ETag": "\"30a6ec7e1a9ad79c203d05a589c8b400\"", "Metadata": {} }
-
For API details, see HeadObject
in Amazon CLI Command Reference.
-
The following code example shows how to use list-bucket-analytics-configurations
.
- Amazon CLI
-
To retrieve a list of analytics configurations for a bucket
The following list-bucket-analytics-configurations example retrieves a list of analytics configurations for the specified bucket.

aws s3api list-bucket-analytics-configurations \
    --bucket my-bucket
Output:
{ "AnalyticsConfigurationList": [ { "StorageClassAnalysis": {}, "Id": "1" } ], "IsTruncated": false }
-
For API details, see ListBucketAnalyticsConfigurations
in Amazon CLI Command Reference.
-
The following code example shows how to use list-bucket-intelligent-tiering-configurations
.
- Amazon CLI
-
To retrieve all S3 Intelligent-Tiering configurations on a bucket
The following list-bucket-intelligent-tiering-configurations example retrieves all S3 Intelligent-Tiering configurations on a bucket.

aws s3api list-bucket-intelligent-tiering-configurations \
    --bucket DOC-EXAMPLE-BUCKET
Output:
{ "IsTruncated": false, "IntelligentTieringConfigurationList": [ { "Id": "ExampleConfig", "Filter": { "Prefix": "images" }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] }, { "Id": "ExampleConfig2", "Status": "Disabled", "Tierings": [ { "Days": 730, "AccessTier": "ARCHIVE_ACCESS" } ] }, { "Id": "ExampleConfig3", "Filter": { "Tag": { "Key": "documents", "Value": "taxes" } }, "Status": "Enabled", "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 365, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] } ] }
For more information, see Using S3 Intelligent-Tiering
in the Amazon S3 User Guide. -
For API details, see ListBucketIntelligentTieringConfigurations
in Amazon CLI Command Reference.
-
The following code example shows how to use list-bucket-inventory-configurations
.
- Amazon CLI
-
To retrieve a list of inventory configurations for a bucket
The following list-bucket-inventory-configurations example lists the inventory configurations for the specified bucket.

aws s3api list-bucket-inventory-configurations \
    --bucket my-bucket
Output:
{ "InventoryConfigurationList": [ { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "ORC", "Bucket": "arn:aws:s3:::my-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "1", "Schedule": { "Frequency": "Weekly" } }, { "IsEnabled": true, "Destination": { "S3BucketDestination": { "Format": "CSV", "Bucket": "arn:aws:s3:::my-bucket", "AccountId": "123456789012" } }, "IncludedObjectVersions": "Current", "Id": "2", "Schedule": { "Frequency": "Daily" } } ], "IsTruncated": false }
-
For API details, see ListBucketInventoryConfigurations
in Amazon CLI Command Reference.
-
The following code example shows how to use list-bucket-metrics-configurations
.
- Amazon CLI
-
To retrieve a list of metrics configurations for a bucket
The following list-bucket-metrics-configurations example retrieves a list of metrics configurations for the specified bucket.

aws s3api list-bucket-metrics-configurations \
    --bucket my-bucket
Output:
{ "IsTruncated": false, "MetricsConfigurationList": [ { "Filter": { "Prefix": "logs" }, "Id": "123" }, { "Filter": { "Prefix": "tmp" }, "Id": "234" } ] }
-
For API details, see ListBucketMetricsConfigurations
in Amazon CLI Command Reference.
-
The following code example shows how to use list-buckets
.
- Amazon CLI
-
The following example uses the list-buckets command to display the names of all your Amazon S3 buckets (across all Regions):

aws s3api list-buckets --query "Buckets[].Name"

The query option filters the output of list-buckets down to only the bucket names.

For more information about buckets, see Working with Amazon S3 Buckets in the Amazon S3 Developer Guide.
-
For API details, see ListBuckets
in Amazon CLI Command Reference.
-
The following code example shows how to use list-multipart-uploads
.
- Amazon CLI
-
The following command lists all of the active multipart uploads for a bucket named my-bucket:

aws s3api list-multipart-uploads --bucket my-bucket
Output:
{ "Uploads": [ { "Initiator": { "DisplayName": "username", "ID": "arn:aws:iam::0123456789012:user/username" }, "Initiated": "2015-06-02T18:01:30.000Z", "UploadId": "dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R", "StorageClass": "STANDARD", "Key": "multipart/01", "Owner": { "DisplayName": "aws-account-name", "ID": "100719349fc3b6dcd7c820a124bf7aecd408092c3d7b51b38494939801fc248b" } } ], "CommonPrefixes": [] }
In-progress multipart uploads incur storage costs in Amazon S3. Complete or abort an active multipart upload to remove its parts from your account.
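To clean up every stale upload at once, the Key and UploadId fields shown in the output above can be fed back into abort-multipart-upload. The following is a minimal sketch; the helper name is illustrative.

```shell
# Abort every in-progress multipart upload in the given bucket by
# listing Key/UploadId pairs as tab-separated text and aborting each.
abort_all_uploads() {
    bucket=$1
    aws s3api list-multipart-uploads --bucket "$bucket" \
        --query 'Uploads[].[Key, UploadId]' --output text |
    while read -r key upload_id; do
        # Skip the "None" line the CLI prints when there are no uploads.
        [ -n "$upload_id" ] || continue
        aws s3api abort-multipart-upload \
            --bucket "$bucket" --key "$key" --upload-id "$upload_id"
    done
}

# Usage (requires AWS credentials):
# abort_all_uploads my-bucket
```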
-
For API details, see ListMultipartUploads
in Amazon CLI Command Reference.
-
The following code example shows how to use list-object-versions
.
- Amazon CLI
-
The following command retrieves version information for an object in a bucket named my-bucket:

aws s3api list-object-versions --bucket my-bucket --prefix index.html
Output:
{ "DeleteMarkers": [ { "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": true, "VersionId": "B2VsEK5saUNNHKcOAJj7hIE86RozToyq", "Key": "index.html", "LastModified": "2015-11-10T00:57:03.000Z" }, { "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "VersionId": ".FLQEZscLIcfxSq.jsFJ.szUkmng2Yw6", "Key": "index.html", "LastModified": "2015-11-09T23:32:20.000Z" } ], "Versions": [ { "LastModified": "2015-11-10T00:20:11.000Z", "VersionId": "Rb_l2T8UHDkFEwCgJjhlgPOZC0qJ.vpD", "ETag": "\"0622528de826c0df5db1258a23b80be5\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 38 }, { "LastModified": "2015-11-09T23:26:41.000Z", "VersionId": "rasWWGpgk9E4s0LyTJgusGeRQKLVIAFf", "ETag": "\"06225825b8028de826c0df5db1a23be5\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 38 }, { "LastModified": "2015-11-09T22:50:50.000Z", "VersionId": "null", "ETag": "\"d1f45267a863c8392e07d24dd592f1b9\"", "StorageClass": "STANDARD", "Key": "index.html", "Owner": { "DisplayName": "my-username", "ID": "7009a8971cd660687538875e7c86c5b672fe116bd438f46db45460ddcd036c32" }, "IsLatest": false, "Size": 533823 } ] }
-
For API details, see ListObjectVersions
in Amazon CLI Command Reference.
-
The following code example shows how to use list-objects-v2
.
- Amazon CLI
-
To get a list of objects in a bucket
The following list-objects-v2 example lists the objects in the specified bucket.

aws s3api list-objects-v2 \
    --bucket my-bucket
Output:
{ "Contents": [ { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"621503c373607d548b37cff8778d992c\"", "StorageClass": "STANDARD", "Key": "doc1.rtf", "Size": 391 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"a2cecc36ab7c7fe3a71a273b9d45b1b5\"", "StorageClass": "STANDARD", "Key": "doc2.rtf", "Size": 373 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"08210852f65a2e9cb999972539a64d68\"", "StorageClass": "STANDARD", "Key": "doc3.rtf", "Size": 399 }, { "LastModified": "2019-11-05T23:11:50.000Z", "ETag": "\"d1852dd683f404306569471af106988e\"", "StorageClass": "STANDARD", "Key": "doc4.rtf", "Size": 6225 } ] }
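For buckets with many objects, the CLI's standard client-side pagination options (--max-items and --starting-token) limit how much is returned per call. The following is a minimal sketch; the helper name is illustrative.

```shell
# Print up to 100 object keys per call. When more results remain, the
# CLI output also includes a NextToken value; pass it as the second
# argument to fetch the next page.
list_keys_page() {
    bucket=$1
    token=$2
    if [ -n "$token" ]; then
        aws s3api list-objects-v2 --bucket "$bucket" \
            --max-items 100 --starting-token "$token" \
            --query 'Contents[].Key'
    else
        aws s3api list-objects-v2 --bucket "$bucket" \
            --max-items 100 \
            --query 'Contents[].Key'
    fi
}

# Usage (requires AWS credentials; <NextToken> is a placeholder):
# list_keys_page my-bucket
# list_keys_page my-bucket <NextToken>
```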
-
For API details, see ListObjectsV2
in Amazon CLI Command Reference.
-
The following code example shows how to use list-objects
.
- Amazon CLI
-
The following example uses the list-objects command to display the names of all the objects in the specified bucket:

aws s3api list-objects --bucket text-content --query 'Contents[].{Key: Key, Size: Size}'

The example uses the --query argument to filter the output of list-objects down to the key value and size for each object.

For more information about objects, see Working with Amazon S3 Objects in the Amazon S3 Developer Guide.
-
For API details, see ListObjects
in Amazon CLI Command Reference.
-
The following code example shows how to use list-parts
.
- Amazon CLI
-
The following command lists all of the parts that have been uploaded for a multipart upload with key multipart/01 in the bucket my-bucket:

aws s3api list-parts --bucket my-bucket --key 'multipart/01' --upload-id dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R
Output:
{ "Owner": { "DisplayName": "aws-account-name", "ID": "100719349fc3b6dcd7c820a124bf7aecd408092c3d7b51b38494939801fc248b" }, "Initiator": { "DisplayName": "username", "ID": "arn:aws:iam::0123456789012:user/username" }, "Parts": [ { "LastModified": "2015-06-02T18:07:35.000Z", "PartNumber": 1, "ETag": "\"e868e0f4719e394144ef36531ee6824c\"", "Size": 5242880 }, { "LastModified": "2015-06-02T18:07:42.000Z", "PartNumber": 2, "ETag": "\"6bb2b12753d66fe86da4998aa33fffb0\"", "Size": 5242880 }, { "LastModified": "2015-06-02T18:07:47.000Z", "PartNumber": 3, "ETag": "\"d0a0112e841abec9c9ec83406f0159c8\"", "Size": 5242880 } ], "StorageClass": "STANDARD" }
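The Parts list above contains exactly the PartNumber/ETag pairs that complete-multipart-upload expects in its --multipart-upload structure, so a JMESPath --query can produce that file directly. The following is a minimal sketch; the helper name is illustrative.

```shell
# Emit the JSON structure expected by complete-multipart-upload's
# --multipart-upload option, built from list-parts output.
build_mpustruct() {
    aws s3api list-parts \
        --bucket "$1" --key "$2" --upload-id "$3" \
        --query '{Parts: Parts[].{PartNumber: PartNumber, ETag: ETag}}'
}

# Usage (requires AWS credentials; <upload-id> is a placeholder):
# build_mpustruct my-bucket multipart/01 <upload-id> > mpustruct
# aws s3api complete-multipart-upload --bucket my-bucket \
#     --key multipart/01 --upload-id <upload-id> \
#     --multipart-upload file://mpustruct
```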
-
For API details, see ListParts
in Amazon CLI Command Reference.
-
The following code example shows how to use ls
.
- Amazon CLI
-
Example 1: Listing all user owned buckets
The following ls command lists all of the buckets owned by the user. In this example, the user owns the buckets mybucket and mybucket2. The timestamp is the date the bucket was created, shown in your machine's time zone. This date can change when you make changes to your bucket, such as editing its bucket policy. Note that if s3:// is used for the path argument <S3Uri>, it will also list all of the buckets.

aws s3 ls
Output:
2013-07-11 17:08:50 mybucket 2013-07-24 14:55:44 mybucket2
Example 2: Listing all prefixes and objects in a bucket
The following ls command lists objects and common prefixes under a specified bucket and prefix. In this example, the user owns the bucket mybucket with the objects test.txt and somePrefix/test.txt. The LastWriteTime and Length are arbitrary. Note that since the ls command has no interaction with the local filesystem, the s3:// URI scheme is not required to resolve ambiguity and may be omitted.

aws s3 ls s3://mybucket
Output:
PRE somePrefix/ 2013-07-25 17:06:27 88 test.txt
Example 3: Listing all prefixes and objects in a specific bucket and prefix
The following ls command lists objects and common prefixes under a specified bucket and prefix. However, there are no objects nor common prefixes under the specified bucket and prefix.

aws s3 ls s3://mybucket/noExistPrefix
Output:
None
Example 4: Recursively listing all prefixes and objects in a bucket
The following ls command will recursively list objects in a bucket. Rather than showing PRE dirname/ in the output, all the content in a bucket will be listed in order.

aws s3 ls s3://mybucket \
    --recursive

Output:
2013-09-02 21:37:53 10 a.txt 2013-09-02 21:37:53 2863288 foo.zip 2013-09-02 21:32:57 23 foo/bar/.baz/a 2013-09-02 21:32:58 41 foo/bar/.baz/b 2013-09-02 21:32:57 281 foo/bar/.baz/c 2013-09-02 21:32:57 73 foo/bar/.baz/d 2013-09-02 21:32:57 452 foo/bar/.baz/e 2013-09-02 21:32:57 896 foo/bar/.baz/hooks/bar 2013-09-02 21:32:57 189 foo/bar/.baz/hooks/foo 2013-09-02 21:32:57 398 z.txt
Example 5: Summarizing all prefixes and objects in a bucket
The following ls command demonstrates the same command using the --human-readable and --summarize options. --human-readable displays file size in Bytes/MiB/KiB/GiB/TiB/PiB/EiB. --summarize displays the total number of objects and total size at the end of the result listing:

aws s3 ls s3://mybucket \
    --recursive \
    --human-readable \
    --summarize

Output:
2013-09-02 21:37:53 10 Bytes a.txt 2013-09-02 21:37:53 2.9 MiB foo.zip 2013-09-02 21:32:57 23 Bytes foo/bar/.baz/a 2013-09-02 21:32:58 41 Bytes foo/bar/.baz/b 2013-09-02 21:32:57 281 Bytes foo/bar/.baz/c 2013-09-02 21:32:57 73 Bytes foo/bar/.baz/d 2013-09-02 21:32:57 452 Bytes foo/bar/.baz/e 2013-09-02 21:32:57 896 Bytes foo/bar/.baz/hooks/bar 2013-09-02 21:32:57 189 Bytes foo/bar/.baz/hooks/foo 2013-09-02 21:32:57 398 Bytes z.txt Total Objects: 10 Total Size: 2.9 MiB
Example 6: Listing from an S3 access point
The following ls command lists objects from the access point (myaccesspoint):

aws s3 ls s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/
Output:
PRE somePrefix/ 2013-07-25 17:06:27 88 test.txt
-
For API details, see Ls
in Amazon CLI Command Reference.
-
The following code example shows how to use mb
.
- Amazon CLI
-
Example 1: Create a bucket
The following mb command creates a bucket. In this example, the user makes the bucket mybucket. The bucket is created in the Region specified in the user's configuration file:

aws s3 mb s3://mybucket
Output:
make_bucket: s3://mybucket
Example 2: Create a bucket in the specified region
The following mb command creates a bucket in a Region specified by the --region parameter. In this example, the user makes the bucket mybucket in the Region us-west-1:

aws s3 mb s3://mybucket \
    --region us-west-1
Output:
make_bucket: s3://mybucket
-
For API details, see Mb
in Amazon CLI Command Reference.
-
The following code example shows how to use mv
.
- Amazon CLI
-
Example 1: Move a local file to the specified bucket
The following mv command moves a single file to a specified bucket and key.

aws s3 mv test.txt s3://mybucket/test2.txt
Output:
move: test.txt to s3://mybucket/test2.txt
Example 2: Move an object to the specified bucket and key
The following mv command moves a single S3 object to a specified bucket and key.

aws s3 mv s3://mybucket/test.txt s3://mybucket/test2.txt
Output:
move: s3://mybucket/test.txt to s3://mybucket/test2.txt
Example 3: Move an S3 object to the local directory
The following mv command moves a single object to a specified file locally.

aws s3 mv s3://mybucket/test.txt test2.txt
Output:
move: s3://mybucket/test.txt to test2.txt
Example 4: Move an object with its original name to the specified bucket

The following mv command moves a single object to a specified bucket while retaining its original name:

aws s3 mv s3://mybucket/test.txt s3://mybucket2/
Output:
move: s3://mybucket/test.txt to s3://mybucket2/test.txt
Example 5: Move all objects and prefixes in a bucket to the local directory
When passed with the parameter --recursive, the following mv command recursively moves all objects under a specified prefix and bucket to a specified directory. In this example, the bucket mybucket has the objects test1.txt and test2.txt.

aws s3 mv s3://mybucket . \
    --recursive

Output:
move: s3://mybucket/test1.txt to test1.txt move: s3://mybucket/test2.txt to test2.txt
Example 6: Move all files in a local directory to the specified bucket, except ``.jpg`` files

When passed with the parameter --recursive, the following mv command recursively moves all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg.

aws s3 mv myDir s3://mybucket/ \
    --recursive \
    --exclude "*.jpg"
Output:
move: myDir/test1.txt to s3://mybucket/test1.txt
Example 7: Move all objects in a bucket to another bucket, except objects under a specified prefix

When passed with the parameter --recursive, the following mv command recursively moves all objects under a specified bucket to another bucket while excluding some objects by using an --exclude parameter. In this example, the bucket mybucket has the objects test1.txt and another/test1.txt.

aws s3 mv s3://mybucket/ s3://mybucket2/ \
    --recursive \
    --exclude "mybucket/another/*"
Output:
move: s3://mybucket/test1.txt to s3://mybucket2/test1.txt
Example 8: Move an object to the specified bucket and set the ACL
The following mv command moves a single object to a specified bucket and key while setting the ACL to public-read-write.

aws s3 mv s3://mybucket/test.txt s3://mybucket/test2.txt \
    --acl public-read-write
Output:
move: s3://mybucket/test.txt to s3://mybucket/test2.txt
Example 9: Move a local file to the specified bucket and grant permissions
The following mv command illustrates the use of the --grants option to grant read access to all users and full control to a specific user identified by their email address.

aws s3 mv file.txt s3://mybucket/ \
    --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=emailaddress=user@example.com
Output:
move: file.txt to s3://mybucket/file.txt
Example 10: Move a file to an S3 access point
The following mv command moves a single file named mydoc.txt to the access point named myaccesspoint at the key named mykey.

aws s3 mv mydoc.txt s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
Output:
move: mydoc.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
-
For API details, see Mv
in Amazon CLI Command Reference.
-
The following code example shows how to use presign
.
- Amazon CLI
-
Example 1: To create a pre-signed URL with the default one hour lifetime that links to an object in an S3 bucket
The following presign command generates a pre-signed URL for a specified bucket and key that is valid for one hour.

aws s3 presign s3://DOC-EXAMPLE-BUCKET/test2.txt
Output:
https://DOC-EXAMPLE-BUCKET.s3.us-west-2.amazonaws.com/key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAEXAMPLE123456789%2F20210621%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210621T041609Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=EXAMBLE1234494d5fba3fed607f98018e1dfc62e2529ae96d844123456
Example 2: To create a pre-signed URL with a custom lifetime that links to an object in an S3 bucket
The following presign command generates a pre-signed URL for a specified bucket and key that is valid for one week.

aws s3 presign s3://DOC-EXAMPLE-BUCKET/test2.txt \
    --expires-in 604800
Output:
https://DOC-EXAMPLE-BUCKET.s3.us-west-2.amazonaws.com/key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAEXAMPLE123456789%2F20210621%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210621T041609Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=EXAMBLE1234494d5fba3fed607f98018e1dfc62e2529ae96d844123456
For more information, see Share an Object with Others in the Amazon S3 Developer Guide.

-
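A common use of presign is to hand the URL to a plain HTTP client, so the receiving side needs no AWS credentials. The following is a minimal sketch; the helper name is illustrative and curl is assumed to be available.

```shell
# Generate a pre-signed URL for an S3 URI and download the object with
# curl. Arguments: S3 URI, lifetime in seconds (default 3600), out file.
presign_and_fetch() {
    url=$(aws s3 presign "$1" --expires-in "${2:-3600}") || return 1
    curl -fSs -o "$3" "$url"
}

# Usage (requires AWS credentials and curl):
# presign_and_fetch s3://DOC-EXAMPLE-BUCKET/test2.txt 300 test2.txt
```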
For API details, see Presign
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-accelerate-configuration
.
- Amazon CLI
-
To set the accelerate configuration of a bucket
The following put-bucket-accelerate-configuration example enables the accelerate configuration for the specified bucket.

aws s3api put-bucket-accelerate-configuration \
    --bucket my-bucket \
    --accelerate-configuration Status=Enabled
This command produces no output.
-
For API details, see PutBucketAccelerateConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-acl
.
- Amazon CLI
-
This example grants full control to two Amazon users (user1@example.com and user2@example.com) and read permission to everyone:

aws s3api put-bucket-acl --bucket MyBucket --grant-full-control emailaddress=user1@example.com,emailaddress=user2@example.com --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers
See http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTacl.html for details on custom ACLs (the s3api ACL commands, such as
put-bucket-acl
, use the same shorthand argument notation).-
For API details, see PutBucketAcl
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-analytics-configuration
.
- Amazon CLI
-
To set an analytics configuration for the bucket

The following put-bucket-analytics-configuration example configures analytics for the specified bucket.

aws s3api put-bucket-analytics-configuration \
    --bucket my-bucket \
    --id 1 \
    --analytics-configuration '{"Id": "1","StorageClassAnalysis": {}}'

This command produces no output.
-
For API details, see PutBucketAnalyticsConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-cors
.
- Amazon CLI
-
The following example enables PUT, POST, and DELETE requests from www.example.com, and enables GET requests from any domain:

aws s3api put-bucket-cors --bucket MyBucket --cors-configuration file://cors.json
cors.json:
{
  "CORSRules": [
    {
      "AllowedOrigins": ["http://www.example.com"],
      "AllowedHeaders": ["*"],
      "AllowedMethods": ["PUT", "POST", "DELETE"],
      "MaxAgeSeconds": 3000,
      "ExposeHeaders": ["x-amz-server-side-encryption"]
    },
    {
      "AllowedOrigins": ["*"],
      "AllowedHeaders": ["Authorization"],
      "AllowedMethods": ["GET"],
      "MaxAgeSeconds": 3000
    }
  ]
}
-
For API details, see PutBucketCors
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-encryption
.
- Amazon CLI
-
To configure server-side encryption for a bucket
The following put-bucket-encryption example sets AES256 encryption as the default for the specified bucket.

aws s3api put-bucket-encryption \
    --bucket my-bucket \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'

This command produces no output.
-
For API details, see PutBucketEncryption
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-intelligent-tiering-configuration
.
- Amazon CLI
-
To update an S3 Intelligent-Tiering configuration on a bucket
The following put-bucket-intelligent-tiering-configuration example updates an S3 Intelligent-Tiering configuration, named ExampleConfig, on a bucket. The configuration transitions objects that have not been accessed under the prefix images to Archive Access after 90 days and to Deep Archive Access after 180 days.

aws s3api put-bucket-intelligent-tiering-configuration \
    --bucket DOC-EXAMPLE-BUCKET \
    --id "ExampleConfig" \
    --intelligent-tiering-configuration file://intelligent-tiering-configuration.json
Contents of intelligent-tiering-configuration.json:

{ "Id": "ExampleConfig", "Status": "Enabled", "Filter": { "Prefix": "images" }, "Tierings": [ { "Days": 90, "AccessTier": "ARCHIVE_ACCESS" }, { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" } ] }
This command produces no output.
For more information, see Using S3 Intelligent-Tiering
in the Amazon S3 User Guide. -
For API details, see PutBucketIntelligentTieringConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-inventory-configuration
.
- Amazon CLI
-
Example 1: To set an inventory configuration for a bucket
The following put-bucket-inventory-configuration example sets a weekly ORC-formatted inventory report for the bucket my-bucket.

aws s3api put-bucket-inventory-configuration \
    --bucket my-bucket \
    --id 1 \
    --inventory-configuration '{"Destination": { "S3BucketDestination": { "AccountId": "123456789012", "Bucket": "arn:aws:s3:::my-bucket", "Format": "ORC" }}, "IsEnabled": true, "Id": "1", "IncludedObjectVersions": "Current", "Schedule": { "Frequency": "Weekly" }}'

This command produces no output.
Example 2: To set an inventory configuration for a bucket
The following put-bucket-inventory-configuration example sets a daily CSV-formatted inventory report for the bucket my-bucket.

aws s3api put-bucket-inventory-configuration \
    --bucket my-bucket \
    --id 2 \
    --inventory-configuration '{"Destination": { "S3BucketDestination": { "AccountId": "123456789012", "Bucket": "arn:aws:s3:::my-bucket", "Format": "CSV" }}, "IsEnabled": true, "Id": "2", "IncludedObjectVersions": "Current", "Schedule": { "Frequency": "Daily" }}'

This command produces no output.
-
For API details, see PutBucketInventoryConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-lifecycle-configuration
.
- Amazon CLI
-
The following command applies a lifecycle configuration to a bucket named my-bucket:

aws s3api put-bucket-lifecycle-configuration --bucket my-bucket --lifecycle-configuration file://lifecycle.json

The file lifecycle.json is a JSON document in the current folder that specifies two rules:

{ "Rules": [ { "ID": "Move rotated logs to Glacier", "Prefix": "rotated/", "Status": "Enabled", "Transitions": [ { "Date": "2015-11-10T00:00:00.000Z", "StorageClass": "GLACIER" } ] }, { "Status": "Enabled", "Prefix": "", "NoncurrentVersionTransitions": [ { "NoncurrentDays": 2, "StorageClass": "GLACIER" } ], "ID": "Move old versions to Glacier" } ] }
The first rule moves files with the prefix rotated to Glacier on the specified date. The second rule moves old object versions to Glacier when they are no longer current. For information on acceptable timestamp formats, see Specifying Parameter Values in the Amazon CLI User Guide.

-
For API details, see PutBucketLifecycleConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-lifecycle
.
- Amazon CLI
-
The following command applies a lifecycle configuration to the bucket my-bucket:

aws s3api put-bucket-lifecycle --bucket my-bucket --lifecycle-configuration file://lifecycle.json

The file lifecycle.json is a JSON document in the current folder that specifies two rules:

{ "Rules": [ { "ID": "Move to Glacier after sixty days (objects in logs/2015/)", "Prefix": "logs/2015/", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } }, { "Expiration": { "Date": "2016-01-01T00:00:00.000Z" }, "ID": "Delete 2014 logs in 2016.", "Prefix": "logs/2014/", "Status": "Enabled" } ] }
The first rule moves files to Amazon Glacier after sixty days. The second rule deletes files from Amazon S3 on the specified date. For information on acceptable timestamp formats, see Specifying Parameter Values in the Amazon CLI User Guide.
Each rule in the above example specifies a policy (Transition or Expiration) and a file prefix (folder name) to which it applies. You can also create a rule that applies to an entire bucket by specifying a blank prefix:

{ "Rules": [ { "ID": "Move to Glacier after sixty days (all objects in bucket)", "Prefix": "", "Status": "Enabled", "Transition": { "Days": 60, "StorageClass": "GLACIER" } } ] }
-
For API details, see PutBucketLifecycle
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-logging
.
- Amazon CLI
-
Example 1: To set bucket policy logging
The following
put-bucket-logging
example sets the logging policy for MyBucket. First, grant the logging service principal permission in your bucket policy using theput-bucket-policy
command.aws s3api put-bucket-policy \ --bucket
MyBucket
\ --policyfile://policy.json
Contents of
policy.json
:{ "Version": "2012-10-17", "Statement": [ { "Sid": "S3ServerAccessLogsPolicy", "Effect": "Allow", "Principal": {"Service": "logging.s3.amazonaws.com"}, "Action": "s3:PutObject", "Resource": "arn:aws:s3:::MyBucket/Logs/*", "Condition": { "ArnLike": {"aws:SourceARN": "arn:aws:s3:::SOURCE-BUCKET-NAME"}, "StringEquals": {"aws:SourceAccount": "SOURCE-AWS-ACCOUNT-ID"} } } ] }
To apply the logging policy, use
put-bucket-logging
.aws s3api put-bucket-logging \ --bucket
MyBucket
\ --bucket-logging-statusfile://logging.json
Contents of
logging.json
:{ "LoggingEnabled": { "TargetBucket": "MyBucket", "TargetPrefix": "Logs/" } }
The
put-bucket-policy
command is required to grants3:PutObject
permissions to the logging service principal.For more information, see Amazon S3 Server Access Logging
in the Amazon S3 User Guide. Example 2: To set a bucket policy for logging access to only a single user
The following
put-bucket-logging
example sets the logging policy for MyBucket. The Amazon user bob@example.com will have full control over the log files, and no one else will have any access. First, grant S3 permission withput-bucket-acl
.aws s3api put-bucket-acl \ --bucket
MyBucket
\ --grant-writeURI=http://acs.amazonaws.com/groups/s3/LogDelivery
\ --grant-read-acpURI=http://acs.amazonaws.com/groups/s3/LogDelivery
Then apply the logging policy using
put-bucket-logging
.aws s3api put-bucket-logging \ --bucket
MyBucket
\ --bucket-logging-statusfile://logging.json
Contents of
logging.json
:{ "LoggingEnabled": { "TargetBucket": "MyBucket", "TargetPrefix": "MyBucketLogs/", "TargetGrants": [ { "Grantee": { "Type": "AmazonCustomerByEmail", "EmailAddress": "bob@example.com" }, "Permission": "FULL_CONTROL" } ] } }
The
put-bucket-acl
command is required to grant S3's log delivery system the necessary permissions (write and read-acp permissions).For more information, see Amazon S3 Server Access Logging
in the Amazon S3 Developer Guide. -
For API details, see PutBucketLogging
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-metrics-configuration
.
- Amazon CLI
-
To set a metrics configuration for a bucket
The following
put-bucket-metrics-configuration
example sets a metric configuration with ID 123 for the specified bucket.aws s3api put-bucket-metrics-configuration \ --bucket
my-bucket
\ --id123
\ --metrics-configuration '{"Id": "123", "Filter": {"Prefix": "logs"}}
'This command produces no output.
-
For API details, see PutBucketMetricsConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-notification-configuration
.
- Amazon CLI
-
To enable the specified notifications to a bucket
The following
put-bucket-notification-configuration
example applies a notification configuration to a bucket namedmy-bucket
. The filenotification.json
is a JSON document in the current folder that specifies an SNS topic and an event type to monitor.aws s3api put-bucket-notification-configuration \ --bucket
my-bucket
\ --notification-configurationfile://notification.json
Contents of
notification.json
:{ "TopicConfigurations": [ { "TopicArn": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Events": [ "s3:ObjectCreated:*" ] } ] }
The SNS topic must have an IAM policy attached to it that allows Amazon S3 to publish to it.
{ "Version": "2008-10-17", "Id": "example-ID", "Statement": [ { "Sid": "example-statement-ID", "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": [ "SNS:Publish" ], "Resource": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Condition": { "ArnLike": { "aws:SourceArn": "arn:aws:s3:*:*:my-bucket" } } } ] }
-
For API details, see PutBucketNotificationConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-notification
.
- Amazon CLI
-
The following command applies a notification configuration to a bucket named
my-bucket
:aws s3api put-bucket-notification --bucket
my-bucket
--notification-configurationfile://notification.json
The file
notification.json
is a JSON document in the current folder that specifies an SNS topic and an event type to monitor:{ "TopicConfiguration": { "Event": "s3:ObjectCreated:*", "Topic": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic" } }
The SNS topic must have an IAM policy attached to it that allows Amazon S3 to publish to it:
{ "Version": "2008-10-17", "Id": "example-ID", "Statement": [ { "Sid": "example-statement-ID", "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": [ "SNS:Publish" ], "Resource": "arn:aws:sns:us-west-2:123456789012:s3-notification-topic", "Condition": { "ArnLike": { "aws:SourceArn": "arn:aws:s3:*:*:my-bucket" } } } ] }
-
For API details, see PutBucketNotification
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-ownership-controls
.
- Amazon CLI
-
To update the bucket ownership settings of a bucket
The following
put-bucket-ownership-controls
example updates the bucket ownership settings of a bucket.aws s3api put-bucket-ownership-controls \ --bucket
DOC-EXAMPLE-BUCKET
\ --ownership-controls="Rules=[{ObjectOwnership=BucketOwnerEnforced}]"This command produces no output.
For more information, see Setting Object Ownership on an existing bucket
in the Amazon S3 User Guide. -
For API details, see PutBucketOwnershipControls
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-policy
.
- Amazon CLI
-
This example allows all users to retrieve any object in MyBucket except those in MySecretFolder. It also grants
put
anddelete
permission to the root user of the Amazon account123456789012
:aws s3api put-bucket-policy --bucket
MyBucket
--policyfile://policy.json
policy.json:
{
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::MyBucket/*"
        },
        {
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::MyBucket/MySecretFolder/*"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::123456789012:root"
            },
            "Action": [
                "s3:DeleteObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::MyBucket/*"
        }
    ]
}
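A malformed policy document fails at the service with a generic error, so it can help to check the JSON locally first. A minimal sketch, assuming `python3` is available (the single-statement policy written here is a trimmed illustration, not the full example above):

```shell
# Write an example policy to policy.json, then check it for JSON syntax
# errors before calling put-bucket-policy (json.tool exits non-zero on bad JSON).
cat > policy.json <<'EOF'
{
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::MyBucket/*"
    }
  ]
}
EOF
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"
```

Note that this only catches syntax errors; IAM policy grammar (valid actions, ARN formats) is still validated by the service.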
-
For API details, see PutBucketPolicy
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-replication
.
- Amazon CLI
-
To configure replication for an S3 bucket
The following
put-bucket-replication
example applies a replication configuration to the specified S3 bucket.aws s3api put-bucket-replication \ --bucket
AWSDOC-EXAMPLE-BUCKET1
\ --replication-configurationfile://replication.json
Contents of
replication.json
:{ "Role": "arn:aws:iam::123456789012:role/s3-replication-role", "Rules": [ { "Status": "Enabled", "Priority": 1, "DeleteMarkerReplication": { "Status": "Disabled" }, "Filter" : { "Prefix": ""}, "Destination": { "Bucket": "arn:aws:s3:::AWSDOC-EXAMPLE-BUCKET2" } } ] }
The destination bucket must have versioning enabled. The specified role must have permission to write to the destination bucket and have a trust relationship that allows Amazon S3 to assume the role.
Example role permission policy:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetReplicationConfiguration", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::AWSDOC-EXAMPLE-BUCKET1" ] }, { "Effect": "Allow", "Action": [ "s3:GetObjectVersion", "s3:GetObjectVersionAcl", "s3:GetObjectVersionTagging" ], "Resource": [ "arn:aws:s3:::AWSDOC-EXAMPLE-BUCKET1/*" ] }, { "Effect": "Allow", "Action": [ "s3:ReplicateObject", "s3:ReplicateDelete", "s3:ReplicateTags" ], "Resource": "arn:aws:s3:::AWSDOC-EXAMPLE-BUCKET2/*" } ] }
Example trust relationship policy:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": "sts:AssumeRole" } ] }
This command produces no output.
For more information, see Replication
in the Amazon Simple Storage Service Console User Guide. -
For API details, see PutBucketReplication
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-request-payment
.
- Amazon CLI
-
Example 1: To enable ``requester pays`` configuration for a bucket
The following
put-bucket-request-payment
example enablesrequester pays
for the specified bucket.aws s3api put-bucket-request-payment \ --bucket
my-bucket
\ --request-payment-configuration '{"Payer":"Requester"}
'This command produces no output.
Example 2: To disable ``requester pays`` configuration for a bucket
The following
put-bucket-request-payment
example disablesrequester pays
for the specified bucket.aws s3api put-bucket-request-payment \ --bucket
my-bucket
\ --request-payment-configuration '{"Payer":"BucketOwner"}
'This command produces no output.
-
For API details, see PutBucketRequestPayment
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-tagging
.
- Amazon CLI
-
The following command applies a tagging configuration to a bucket named
my-bucket
:aws s3api put-bucket-tagging --bucket
my-bucket
--taggingfile://tagging.json
The file
tagging.json
is a JSON document in the current folder that specifies tags:{ "TagSet": [ { "Key": "organization", "Value": "marketing" } ] }
Or apply a tagging configuration to
my-bucket
directly from the command line:aws s3api put-bucket-tagging --bucket
my-bucket
--tagging 'TagSet=[{Key=organization,Value=marketing}]
'-
For API details, see PutBucketTagging
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-versioning
.
- Amazon CLI
-
The following command enables versioning on a bucket named
my-bucket
:aws s3api put-bucket-versioning --bucket
my-bucket
--versioning-configurationStatus=Enabled
The following command enables versioning and uses an MFA code. The value of the --mfa option is the serial number (or ARN) of the MFA device, followed by a space and the current code from the device:
aws s3api put-bucket-versioning --bucket
my-bucket
--versioning-configurationStatus=Enabled
--mfa"SERIAL 123456"
-
For API details, see PutBucketVersioning
in Amazon CLI Command Reference.
-
The following code example shows how to use put-bucket-website
.
- Amazon CLI
-
The following command applies a static website configuration to a bucket named
my-bucket
:aws s3api put-bucket-website --bucket
my-bucket
--website-configurationfile://website.json
The file
website.json
is a JSON document in the current folder that specifies index and error pages for the website:{ "IndexDocument": { "Suffix": "index.html" }, "ErrorDocument": { "Key": "error.html" } }
-
For API details, see PutBucketWebsite
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object-acl
.
- Amazon CLI
-
The following command grants
full control
to two Amazon users (user1@example.com and user2@example.com) andread
permission to everyone:aws s3api put-object-acl --bucket
MyBucket
--keyfile.txt
--grant-full-controlemailaddress=user1@example.com,emailaddress=user2@example.com
--grant-readuri=http://acs.amazonaws.com/groups/global/AllUsers
See http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTacl.html for details on custom ACLs (the s3api ACL commands, such as
put-object-acl
, use the same shorthand argument notation).-
For API details, see PutObjectAcl
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object-legal-hold
.
- Amazon CLI
-
To apply a Legal Hold to an object
The following
put-object-legal-hold
example sets a Legal Hold on the objectdoc1.rtf
.aws s3api put-object-legal-hold \ --bucket
my-bucket-with-object-lock
\ --keydoc1.rtf
\ --legal-holdStatus=ON
This command produces no output.
-
For API details, see PutObjectLegalHold
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object-lock-configuration
.
- Amazon CLI
-
To set an object lock configuration on a bucket
The following
put-object-lock-configuration
example sets a 50-day object lock on the specified bucket.aws s3api put-object-lock-configuration \ --bucket
my-bucket-with-object-lock
\ --object-lock-configuration '{ "ObjectLockEnabled": "Enabled", "Rule": { "DefaultRetention": { "Mode": "COMPLIANCE", "Days": 50 }}}
'This command produces no output.
-
For API details, see PutObjectLockConfiguration
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object-retention
.
- Amazon CLI
-
To set an object retention configuration for an object
The following
put-object-retention
example sets an object retention configuration for the specified object until 2025-01-01.aws s3api put-object-retention \ --bucket
my-bucket-with-object-lock
\ --keydoc1.rtf
\ --retention '{ "Mode": "GOVERNANCE", "RetainUntilDate": "2025-01-01T00:00:00" }
'This command produces no output.
-
For API details, see PutObjectRetention
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object-tagging
.
- Amazon CLI
-
To set a tag on an object
The following
put-object-tagging
example sets a tag with the keydesignation
and the valueconfidential
on the specified object.aws s3api put-object-tagging \ --bucket
my-bucket
\ --keydoc1.rtf
\ --tagging '{"TagSet": [{ "Key": "designation", "Value": "confidential" }]}
'This command produces no output.
The following
put-object-tagging
example sets multiple tags on the specified object.aws s3api put-object-tagging \ --bucket
my-bucket-example
\ --keydoc3.rtf
\ --tagging '{"TagSet": [{ "Key": "designation", "Value": "confidential" }, { "Key": "department", "Value": "finance" }, { "Key": "team", "Value": "payroll" } ]}
'This command produces no output.
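Hand-writing the `TagSet` JSON is error-prone when the tags come from variables. A sketch of a small helper that builds the document from `KEY=VALUE` arguments (this function is illustrative, not part of the AWS CLI):

```shell
# Build a TagSet document for --tagging from KEY=VALUE arguments.
# Illustrative helper; assumes python3 is available.
build_tagset() {
  python3 - "$@" <<'EOF'
import json, sys
# Each argument is KEY=VALUE; split on the first "=" only.
tags = [{"Key": k, "Value": v} for k, v in (a.split("=", 1) for a in sys.argv[1:])]
print(json.dumps({"TagSet": tags}))
EOF
}
build_tagset designation=confidential department=finance
```

The printed JSON can be passed directly as the `--tagging` argument.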
-
For API details, see PutObjectTagging
in Amazon CLI Command Reference.
-
The following code example shows how to use put-object
.
- Amazon CLI
-
The following example uses the
put-object
command to upload an object to Amazon S3:aws s3api put-object --bucket
text-content
--keydir-1/my_images.tar.bz2
--bodymy_images.tar.bz2
The following example shows an upload of a video file, with the file specified using Windows file system syntax:
aws s3api put-object --bucket
text-content
--keydir-1/big-video-file.mp4
--body e:\media\videos\f-sharp-3-data-services.mp4For more information about uploading objects, see Uploading Objects in the Amazon S3 Developer Guide.
-
For API details, see PutObject
in Amazon CLI Command Reference.
-
The following code example shows how to use put-public-access-block
.
- Amazon CLI
-
To set the block public access configuration for a bucket
The following
put-public-access-block
example sets a restrictive block public access configuration for the specified bucket.aws s3api put-public-access-block \ --bucket
my-bucket
\ --public-access-block-configuration"BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"
This command produces no output.
-
For API details, see PutPublicAccessBlock
in Amazon CLI Command Reference.
-
The following code example shows how to use rb
.
- Amazon CLI
-
Example 1: Delete a bucket
The following
rb
command removes a bucket. In this example, the user's bucket ismybucket
. Note that the bucket must be empty in order to be removed:aws s3 rb
s3://mybucket
Output:
remove_bucket: mybucket
Example 2: Force delete a bucket
The following
rb
command uses the--force
parameter to first remove all of the objects in the bucket and then remove the bucket itself. In this example, the user's bucket ismybucket
and the objects inmybucket
aretest1.txt
andtest2.txt
:aws s3 rb
s3://mybucket
\ --forceOutput:
delete: s3://mybucket/test1.txt delete: s3://mybucket/test2.txt remove_bucket: mybucket
-
For API details, see Rb
in Amazon CLI Command Reference.
-
The following code example shows how to use restore-object
.
- Amazon CLI
-
To create a restore request for an object
The following
restore-object
example restores the specified Amazon S3 Glacier object for the bucketmy-glacier-bucket
for 10 days.aws s3api restore-object \ --bucket
my-glacier-bucket
\ --keydoc1.rtf
\ --restore-requestDays=10
This command produces no output.
-
For API details, see RestoreObject
in Amazon CLI Command Reference.
-
The following code example shows how to use rm
.
- Amazon CLI
-
Example 1: Delete an S3 object
The following
rm
command deletes a single s3 object:aws s3 rm
s3://mybucket/test2.txt
Output:
delete: s3://mybucket/test2.txt
Example 2: Delete all contents in a bucket
The following
rm
command recursively deletes all objects under a specified bucket and prefix when passed with the parameter--recursive
. In this example, the bucketmybucket
contains the objectstest1.txt
andtest2.txt
:aws s3 rm
s3://mybucket
\ --recursiveOutput:
delete: s3://mybucket/test1.txt delete: s3://mybucket/test2.txt
Example 3: Delete all contents in a bucket, except ``.jpg`` files
The following
rm
command recursively deletes all objects under a specified bucket and prefix when passed with the parameter--recursive
while excluding some objects by using an--exclude
parameter. In this example, the bucketmybucket
has the objectstest1.txt
andtest2.jpg
:aws s3 rm
s3://mybucket/
\ --recursive \ --exclude"*.jpg"
Output:
delete: s3://mybucket/test1.txt
Example 4: Delete all contents in a bucket, except objects under the specified prefix
The following
rm
command recursively deletes all objects under a specified bucket and prefix when passed with the parameter--recursive
while excluding all objects under a particular prefix by using an--exclude
parameter. In this example, the bucketmybucket
has the objectstest1.txt
andanother/test.txt
:aws s3 rm
s3://mybucket/
\ --recursive \ --exclude"another/*"
Output:
delete: s3://mybucket/test1.txt
Example 5: Delete an object from an S3 access point
The following
rm
command deletes a single object (mykey
) from the access point (myaccesspoint
).aws s3 rm
s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
Output:
delete: s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/mykey
-
For API details, see Rm
in Amazon CLI Command Reference.
-
The following code example shows how to use select-object-content
.
- Amazon CLI
-
To filter the contents of an Amazon S3 object based on an SQL statement
The following
select-object-content
example filters the objectmy-data-file.csv
with the specified SQL statement and sends output to a file.aws s3api select-object-content \ --bucket
my-bucket
\ --keymy-data-file.csv
\ --expression"select * from s3object limit 100"
\ --expression-type 'SQL
' \ --input-serialization '{"CSV": {}, "CompressionType": "NONE"}
' \ --output-serialization '{"CSV": {}}
'"output.csv"
The filtered records are written to output.csv; the command produces no terminal output.
-
For API details, see SelectObjectContent
in Amazon CLI Command Reference.
-
The following code example shows how to use sync
.
- Amazon CLI
-
Example 1: Sync all local objects to the specified bucket
The following
sync
command syncs objects from a local directory to the specified prefix and bucket by uploading the local files to S3. A local file will require uploading if the size of the local file is different than the size of the S3 object, the last modified time of the local file is newer than the last modified time of the S3 object, or the local file does not exist under the specified bucket and prefix. In this example, the user syncs the bucketmybucket
to the local current directory. The local current directory contains the filestest.txt
andtest2.txt
. The bucketmybucket
contains no objects.aws s3 sync
.
s3://mybucket
Output:
upload: test.txt to s3://mybucket/test.txt upload: test2.txt to s3://mybucket/test2.txt
Example 2: Sync all S3 objects from the specified S3 bucket to another bucket
The following
sync
command syncs objects under a specified prefix and bucket to objects under another specified prefix and bucket by copying S3 objects. An S3 object will require copying if the sizes of the two S3 objects differ, the last modified time of the source is newer than the last modified time of the destination, or the S3 object does not exist under the specified bucket and prefix destination.In this example, the user syncs the bucket
mybucket
to the bucketmybucket2
. The bucketmybucket
contains the objectstest.txt
andtest2.txt
. The bucketmybucket2
contains no objects:aws s3 sync
s3://mybucket
s3://mybucket2
Output:
copy: s3://mybucket/test.txt to s3://mybucket2/test.txt copy: s3://mybucket/test2.txt to s3://mybucket2/test2.txt
Example 3: Sync all S3 objects from the specified S3 bucket to the local directory
The following
sync
command syncs files from the specified S3 bucket to the local directory by downloading S3 objects. An S3 object will require downloading if the size of the S3 object differs from the size of the local file, the last modified time of the S3 object is newer than the last modified time of the local file, or the S3 object does not exist in the local directory. Take note that when objects are downloaded from S3, the last modified time of the local file is changed to the last modified time of the S3 object. In this example, the user syncs the bucketmybucket
to the current local directory. The bucketmybucket
contains the objectstest.txt
andtest2.txt
. The current local directory has no files:aws s3 sync
s3://mybucket
.
Output:
download: s3://mybucket/test.txt to test.txt download: s3://mybucket/test2.txt to test2.txt
Example 4: Sync all local objects to the specified bucket and delete all files that do not match
The following
sync
command syncs objects under a specified prefix and bucket to files in a local directory by uploading the local files to S3. Because of the--delete
parameter, any files existing under the specified prefix and bucket but not existing in the local directory will be deleted. In this example, the user syncs the bucketmybucket
to the local current directory. The local current directory contains the filestest.txt
andtest2.txt
. The bucketmybucket
contains the objecttest3.txt
:aws s3 sync
.
s3://mybucket
\ --deleteOutput:
upload: test.txt to s3://mybucket/test.txt upload: test2.txt to s3://mybucket/test2.txt delete: s3://mybucket/test3.txt
Example 5: Sync all local objects to the specified bucket except ``.jpg`` files
The following
sync
command syncs objects under a specified prefix and bucket to files in a local directory by uploading the local files to S3. Because of the--exclude
parameter, all files matching the pattern existing both in S3 and locally will be excluded from the sync. In this example, the user syncs the bucketmybucket
to the local current directory. The local current directory contains the filestest.jpg
andtest2.txt
. The bucketmybucket
contains the objecttest.jpg
of a different size than the localtest.jpg
:aws s3 sync
.
s3://mybucket
\ --exclude"*.jpg"
Output:
upload: test2.txt to s3://mybucket/test2.txt
Example 6: Sync all objects from the specified bucket to the local directory, except objects under the specified prefix
The following
sync
command syncs objects under a specified prefix and bucket to files in a local directory by downloading S3 objects. This example uses the--exclude
parameter to exclude a specified directory and S3 prefix from thesync
command. In this example, the user syncs the bucketmybucket
to the local current directory. The local current directory contains the filestest.txt
andanother/test2.txt
. The bucketmybucket
contains the objectsanother/test5.txt
andtest1.txt
:aws s3 sync
s3://mybucket/
.
\ --exclude"*another/*"
Output:
download: s3://mybucket/test1.txt to test1.txt
Example 7: Sync all objects between buckets in different regions
The following
sync
command syncs files between two buckets in different regions:aws s3 sync
s3://my-us-west-2-bucket
s3://my-us-east-1-bucket
\ --source-regionus-west-2
\ --regionus-east-1
Output:
copy: s3://my-us-west-2-bucket/test1.txt to s3://my-us-east-1-bucket/test1.txt
Example 8: Sync to an S3 access point
The following
sync
command syncs the current directory to the access point (myaccesspoint
):aws s3 sync
.
s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/
Output:
upload: test.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/test.txt upload: test2.txt to s3://arn:aws:s3:us-west-2:123456789012:accesspoint/myaccesspoint/test2.txt
-
For API details, see Sync
in Amazon CLI Command Reference.
-
The following code example shows how to use upload-part-copy
.
- Amazon CLI
-
To upload part of an object by copying data from an existing object as the data source
The following
upload-part-copy
example uploads a part by copying data from an existing object as a data source.aws s3api upload-part-copy \ --bucket
my-bucket
\ --key"Map_Data_June.mp4"
\ --copy-source"my-bucket/copy_of_Map_Data_June.mp4"
\ --part-number1
\ --upload-id"bq0tdE1CDpWQYRPLHuNG50xAT6pA5D.m_RiBy0ggOH6b13pVRY7QjvLlf75iFdJqp_2wztk5hvpUM2SesXgrzbehG5hViyktrfANpAD0NO.Nk3XREBqvGeZF6U3ipiSm"
Output:
{ "CopyPartResult": { "LastModified": "2019-12-13T23:16:03.000Z", "ETag": "\"711470fc377698c393d94aed6305e245\"" } }
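When copying only a slice of the source object into a part, upload-part-copy also accepts the --copy-source-range parameter. A sketch of the byte-range arithmetic for fixed-size parts (the 5 MiB part size and part number here are illustrative; ranges are zero-based and inclusive):

```shell
# Compute the --copy-source-range value for part N of a multipart copy,
# assuming equal 5 MiB parts (illustrative sizes).
part=2
part_size=$((5 * 1024 * 1024))
start=$(( (part - 1) * part_size ))
end=$(( part * part_size - 1 ))
echo "bytes=${start}-${end}"
```

The printed value is passed as, for example, --copy-source-range "bytes=5242880-10485759"; the final part's end offset is the object size minus one.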
-
For API details, see UploadPartCopy
in Amazon CLI Command Reference.
-
The following code example shows how to use upload-part
.
- Amazon CLI
-
The following command uploads the first part in a multipart upload initiated with the
create-multipart-upload
command:aws s3api upload-part --bucket
my-bucket
--key 'multipart/01
' --part-number1
--bodypart01
--upload-id"dfRtDYU0WWCCcH43C3WFbkRONycyCpTJJvxu2i5GYkZljF.Yxwh6XG7WfS2vC4to6HiV6Yjlx.cph0gtNBtJ8P3URCSbB7rjxI5iEwVDmgaXZOGgkk5nVTW16HOQ5l0R"
The
body
option takes the name or path of a local file for upload (do not use the file:// prefix). The minimum part size is 5 MB. Upload ID is returned bycreate-multipart-upload
and can also be retrieved withlist-multipart-uploads
. Bucket and key are specified when you create the multipart upload.Output:
{ "ETag": "\"e868e0f4719e394144ef36531ee6824c\"" }
Save the ETag value of each part for later. They are required to complete the multipart upload.
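One way to collect the saved ETags into the JSON structure that complete-multipart-upload expects via --multipart-upload file://mpustruct (a sketch; the ETag values below are placeholders for the values returned by your upload-part calls, and python3 is assumed for the local syntax check):

```shell
# Build mpustruct from the ETags returned by each upload-part call.
# The ETag values are placeholders; note each ETag keeps its embedded quotes.
cat > mpustruct <<'EOF'
{
  "Parts": [
    { "PartNumber": 1, "ETag": "\"e868e0f4719e394144ef36531ee6824c\"" },
    { "PartNumber": 2, "ETag": "\"d41d8cd98f00b204e9800998ecf8427e\"" }
  ]
}
EOF
python3 -m json.tool mpustruct > /dev/null && echo "mpustruct is valid JSON"
```

Parts must be listed in ascending PartNumber order, and each ETag must match the value returned when that part was uploaded.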
-
For API details, see UploadPart
in Amazon CLI Command Reference.
-
The following code example shows how to use website
.
- Amazon CLI
-
Configure an S3 bucket as a static website
The following command configures a bucket named
my-bucket
as a static website. The index document option specifies the file inmy-bucket
that visitors will be directed to when they navigate to the website URL. In this case, the bucket is in the us-west-2 region, so the site would appear athttp://my-bucket.s3-website-us-west-2.amazonaws.com
.All files in the bucket that appear on the static site must be configured to allow visitors to open them. File permissions are configured separately from the bucket website configuration.
aws s3 website
s3://my-bucket/
\ --index-documentindex.html
\ --error-documenterror.html
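The website URL can be derived from the bucket name and Region. A sketch following the endpoint format shown above (assumed from the us-west-2 example; some older Regions use a dot, as in `s3-website.REGION`, instead of a dash):

```shell
# Construct the static website endpoint from a bucket name and Region.
# Dash-style endpoint format, as in the us-west-2 example above.
bucket=my-bucket
region=us-west-2
echo "http://${bucket}.s3-website-${region}.amazonaws.com"
```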
For information on hosting a static website in Amazon S3, see Hosting a Static Website
in the Amazon Simple Storage Service Developer Guide. -
For API details, see Website
in Amazon CLI Command Reference.
-