Using Batch Operations with S3 Express One Zone - Amazon Simple Storage Service
Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).

Using Batch Operations with S3 Express One Zone

You can use Amazon S3 Batch Operations to perform operations on objects stored in S3 buckets. To learn more about S3 Batch Operations, see Performing large-scale batch operations on Amazon S3 objects.

The following topics discuss performing batch operations on objects stored in the S3 Express One Zone storage class in directory buckets.

Using Batch Operations with directory buckets

You can perform the Copy operation and the Invoke Amazon Lambda function operation on objects that are stored in directory buckets. With Copy, you can copy objects between buckets of the same type (for example, from one directory bucket to another directory bucket). You can also copy between general purpose buckets and directory buckets. With Invoke Amazon Lambda function, you can use a Lambda function to perform actions on objects in your directory bucket with code that you define.

Copying objects

You can copy objects between buckets of the same type, or between directory buckets and general purpose buckets. When you copy to a directory bucket, you must use the correct Amazon Resource Name (ARN) format for this bucket type. The ARN format for a directory bucket is arn:aws-cn:s3express:region:account-id:bucket/bucket-base-name--x-s3.
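As a quick illustration, the documented ARN format can be assembled and sanity-checked with a small helper. The helper name, Region, account ID, and bucket name below are hypothetical placeholders; only the ARN format itself comes from the documentation above.

```python
import re

# Directory bucket ARN format documented above:
# arn:aws-cn:s3express:region:account-id:bucket/bucket-base-name--x-s3
ARN_PATTERN = re.compile(
    r"^arn:aws-cn:s3express:[a-z0-9-]+:\d{12}:bucket/.+--x-s3$"
)

def directory_bucket_arn(region: str, account_id: str, bucket_name: str) -> str:
    """Build the ARN for a directory bucket. bucket_name must already be the
    full directory bucket name, which ends with the --x-s3 suffix."""
    return f"arn:aws-cn:s3express:{region}:{account_id}:bucket/{bucket_name}"

# Hypothetical example values
arn = directory_bucket_arn("cn-north-1", "111122223333", "my-bucket--cnn1-az1--x-s3")
print(arn)
print(bool(ARN_PATTERN.match(arn)))
```

Passing a malformed bucket name (one without the --x-s3 suffix) produces an ARN that fails the pattern check, which is a cheap way to catch the common mistake of using a general purpose bucket name in a directory bucket ARN.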

You can also populate your directory bucket with data by using the Import action in the S3 console. Import is a streamlined method for creating Batch Operations jobs to copy objects from general purpose buckets to directory buckets. For Import copy jobs from general purpose buckets to directory buckets, S3 automatically generates a manifest. For more information, see Importing objects to a directory bucket and Specifying a manifest.

Invoking Lambda functions (LambdaInvoke)

There are special requirements for using Batch Operations to invoke Lambda functions that act on directory buckets. For example, you must structure your Lambda request by using a v2 JSON invocation schema, and specify InvocationSchemaVersion 2.0 when you create the job. For more information, see Invoke Amazon Lambda function.
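As a sketch only, a handler written against the v2 JSON invocation schema might look like the following. The field names reflect the v2.0 schema (each task carries a plain bucket name in s3Bucket, and the response must echo invocationSchemaVersion "2.0"); the per-object logic is a hypothetical no-op, and the sample event values are placeholders. See the Invoke Amazon Lambda function topic for the authoritative schema.

```python
import json

def lambda_handler(event, context):
    """Sketch of a Batch Operations handler for the v2.0 invocation schema."""
    results = []
    for task in event["tasks"]:
        bucket = task["s3Bucket"]  # directory bucket name (v2.0 field)
        key = task["s3Key"]
        # ... act on the object here (hypothetical no-op) ...
        results.append({
            "taskId": task["taskId"],
            "resultCode": "Succeeded",
            "resultString": f"processed s3://{bucket}/{key}",
        })
    return {
        "invocationSchemaVersion": "2.0",
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }

# Local smoke test with a hypothetical v2.0-style event
event = {
    "invocationSchemaVersion": "2.0",
    "invocationId": "example-invocation-id",
    "tasks": [{"taskId": "t1", "s3Bucket": "my-bucket--cnn1-az1--x-s3", "s3Key": "obj.txt"}],
}
print(json.dumps(lambda_handler(event, None), indent=2))
```

Remember to also specify InvocationSchemaVersion 2.0 when you create the Batch Operations job; the handler alone is not enough.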

Key differences

The following is a list of key differences when you're using Batch Operations to perform bulk operations on objects that are stored in directory buckets with the S3 Express One Zone storage class:

  • Amazon S3 automatically encrypts all new objects that are uploaded to an S3 bucket. The default encryption configuration of an S3 bucket is always enabled and is at a minimum set to server-side encryption with Amazon S3 managed keys (SSE-S3). For directory buckets, only SSE-S3 is supported. If you make a CopyObject request that sets server-side encryption with customer-provided keys (SSE-C) or server-side encryption with Amazon Key Management Service (Amazon KMS) keys (SSE-KMS) on a directory bucket (source or destination), the response returns an HTTP 400 (Bad Request) error.

  • Objects in directory buckets can't be tagged. You can only specify an empty tag set. By default, Batch Operations copies tags. If you copy an object that has tags between a general purpose bucket and a directory bucket, you receive a 501 (Not Implemented) response.

  • S3 Express One Zone offers you the option to choose the checksum algorithm that is used to validate your data during uploads or downloads. You can select one of the following Secure Hash Algorithm (SHA) or Cyclic Redundancy Check (CRC) data-integrity check algorithms: CRC32, CRC32C, SHA-1, and SHA-256. MD5-based checksums are not supported with the S3 Express One Zone storage class.

  • By default, all Amazon S3 buckets have the S3 Object Ownership setting set to bucket owner enforced, and access control lists (ACLs) are disabled. For directory buckets, this setting can't be modified. You can copy an object from general purpose buckets to directory buckets. However, you can't override the default ACL when you copy to or from a directory bucket.

  • Regardless of how you specify your manifest, the list itself must be stored in a general purpose bucket. Batch Operations can't import existing manifests from (or save generated manifests to) directory buckets. However, objects described within the manifest can be stored in directory buckets.

  • Batch Operations can't use a directory bucket as the location for an S3 Inventory report, because Inventory reports don't support directory buckets. To create a manifest for objects in a directory bucket, list the objects by using the ListObjectsV2 API operation, and then insert the list into a CSV file.
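The last point above can be sketched as follows: after listing the directory bucket's objects with ListObjectsV2, write a two-column (bucket, key) CSV manifest and upload it to a general purpose bucket. The helper below is an illustrative sketch that handles only the CSV-building step; the bucket name and keys are hypothetical stand-ins for values you would collect from paginated ListObjectsV2 responses.

```python
import csv
import io

def build_manifest_csv(bucket: str, keys: list) -> str:
    """Build a CSV manifest with one bucket,key row per object and no
    header row. The keys would typically come from paginated
    ListObjectsV2 responses against the directory bucket."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for key in keys:
        writer.writerow([bucket, key])
    return buf.getvalue()

# Hypothetical object keys as returned by ListObjectsV2
manifest = build_manifest_csv(
    "my-bucket--cnn1-az1--x-s3",
    ["logs/a.txt", "logs/b.txt"],
)
print(manifest)
```

Remember that the resulting manifest file must be stored in a general purpose bucket, not a directory bucket, before you reference it in a Batch Operations job.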

Granting access

To perform copy jobs, you must have the following permissions:

  • To copy objects from one directory bucket to another directory bucket, you must have the s3express:CreateSession permission.

  • To copy objects from directory buckets to general purpose buckets, you must have the s3express:CreateSession permission and the s3:PutObject permission to write the object copy to the destination bucket.

  • To copy objects from general purpose buckets to directory buckets, you must have the s3express:CreateSession permission and the s3:GetObject permission to read the source object that is being copied.

    For more information, see CopyObject in the Amazon Simple Storage Service API Reference.

  • To invoke a Lambda function, you must grant permissions to your resource based on your Lambda function. To determine which permissions are required, check the corresponding API operations.
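For illustration, the permissions listed above for a copy job from a general purpose bucket to a directory bucket could be expressed in an identity-based policy like the one generated below. The ARNs are hypothetical placeholders, and this is a sketch of just the two documented permissions, not a complete Batch Operations role setup.

```python
import json

# Hypothetical ARNs; replace with your own bucket names and account ID.
SOURCE_GP_BUCKET_ARN = "arn:aws-cn:s3:::amzn-s3-demo-source-bucket"
DEST_DIR_BUCKET_ARN = (
    "arn:aws-cn:s3express:cn-north-1:111122223333:"
    "bucket/my-bucket--cnn1-az1--x-s3"
)

# Copying from a general purpose bucket to a directory bucket requires
# s3:GetObject on the source objects and s3express:CreateSession on the
# destination directory bucket, per the list above.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": f"{SOURCE_GP_BUCKET_ARN}/*",
        },
        {
            "Effect": "Allow",
            "Action": "s3express:CreateSession",
            "Resource": DEST_DIR_BUCKET_ARN,
        },
    ],
}
print(json.dumps(policy, indent=2))
```

For the reverse direction (directory bucket to general purpose bucket), you would instead pair s3express:CreateSession on the source directory bucket with s3:PutObject on the destination objects.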