Importing data from Amazon S3 into an Aurora PostgreSQL DB cluster
You can import data that's been stored using Amazon Simple Storage Service (Amazon S3) into a table on an Aurora PostgreSQL DB cluster instance. To do this, you first install the Aurora PostgreSQL aws_s3 extension. This extension provides the functions that you use to import data from an Amazon S3 bucket. A bucket is an Amazon S3 container for objects and files. The data can be in a comma-separated value (CSV) file, a text file, or a compressed (gzip) file. Following, you can learn how to install the extension and how to import data from Amazon S3 into a table.
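At a high level, the workflow is two SQL statements: one to install the aws_s3 extension, and one call to the extension's aws_s3.table_import_from_s3 function, which takes the target table, an optional column list, import options, and an S3 URI built with aws_commons.create_s3_uri. As a minimal sketch, the statements can be composed like this (the table name t1, the bucket my-bucket, the file data.csv, and the Region are hypothetical placeholders):

```python
def build_import_sql(table, columns, bucket, file_path, region):
    """Compose the two statements used to import a CSV file from Amazon S3.

    The aws_s3 and aws_commons function names are the ones the extension
    provides; the table, bucket, file, and Region values are placeholders.
    """
    # Install the extension (CASCADE also installs its aws_commons dependency).
    install = "CREATE EXTENSION IF NOT EXISTS aws_s3 CASCADE;"
    # Import the file into the table. An empty column list means
    # "use the table's columns in order".
    import_stmt = (
        "SELECT aws_s3.table_import_from_s3("
        f"'{table}', '{columns}', '(format csv)', "
        f"aws_commons.create_s3_uri('{bucket}', '{file_path}', '{region}'));"
    )
    return install, import_stmt


install, import_stmt = build_import_sql(
    "t1", "", "my-bucket", "data.csv", "us-east-1"
)
print(install)
print(import_stmt)
```

You would run the resulting statements with psql or another client connected to the DB cluster; the cluster also needs IAM permission to read the bucket, which is covered later in this topic's setup steps.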
Your database must be running PostgreSQL version 10.7 or higher to import from Amazon S3 into Aurora PostgreSQL.
If you don't have data stored on Amazon S3, you need to first create a bucket and store the data. For more information, see the following topics in the Amazon Simple Storage Service User Guide.
Cross-account import from Amazon S3 is supported. For more information, see Granting cross-account permissions in the Amazon Simple Storage Service User Guide.
You can use a customer managed key for encryption while importing data from Amazon S3. For more information, see KMS keys stored in AWS KMS in the Amazon Simple Storage Service User Guide.
Note
Importing data from Amazon S3 isn't supported for Aurora Serverless v1. It is supported for Aurora Serverless v2.