Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China.

Write a batch of DynamoDB items using an Amazon SDK

The following code examples show how to write a batch of DynamoDB items. A single BatchWriteItem request can include up to 25 put or delete requests and up to 16 MB of data, which is why the examples below send items in chunks of 25. DynamoDB can also return requests it did not process in the UnprocessedItems element of the response; production code should resubmit those items, typically with exponential backoff.

.NET
Amazon SDK for .NET
Tip

To learn how to set up and run this example, see GitHub.

Writes a batch of items to the movie table.

/// <summary>
/// Loads the contents of a JSON file into a list of movies to be
/// added to the DynamoDB table.
/// </summary>
/// <param name="movieFileName">The full path to the JSON file.</param>
/// <returns>A generic list of movie objects.</returns>
public static List<Movie> ImportMovies(string movieFileName)
{
    if (!File.Exists(movieFileName))
    {
        return null;
    }

    using var sr = new StreamReader(movieFileName);
    string json = sr.ReadToEnd();
    var allMovies = JsonConvert.DeserializeObject<List<Movie>>(json);

    // Now return the first 250 entries.
    return allMovies.GetRange(0, 250);
}

/// <summary>
/// Writes 250 items to the movie table.
/// </summary>
/// <param name="client">The initialized DynamoDB client object.</param>
/// <param name="movieFileName">A string containing the full path to
/// the JSON file containing movie data.</param>
/// <returns>A long integer value representing the number of movies
/// imported from the JSON file.</returns>
public static async Task<long> BatchWriteItemsAsync(
    AmazonDynamoDBClient client,
    string movieFileName)
{
    var movies = ImportMovies(movieFileName);
    if (movies is null)
    {
        Console.WriteLine("Couldn't find the JSON file with movie data.");
        return 0;
    }

    var context = new DynamoDBContext(client);
    var bookBatch = context.CreateBatchWrite<Movie>();
    bookBatch.AddPutItems(movies);

    Console.WriteLine("Adding imported movies to the table.");
    await bookBatch.ExecuteAsync();
    return movies.Count;
}
  • For API details, see BatchWriteItem in Amazon SDK for .NET API Reference.

Go
SDK for Go V2
Tip

To learn how to set up and run this example, see GitHub.

// TableBasics encapsulates the Amazon DynamoDB service actions used in the examples.
// It contains a DynamoDB service client that is used to act on the specified table.
type TableBasics struct {
    DynamoDbClient *dynamodb.Client
    TableName      string
}

// AddMovieBatch adds a slice of movies to the DynamoDB table. The function sends
// batches of 25 movies to DynamoDB until all movies are added or it reaches the
// specified maximum.
func (basics TableBasics) AddMovieBatch(movies []Movie, maxMovies int) (int, error) {
    var err error
    var item map[string]types.AttributeValue
    written := 0
    batchSize := 25 // DynamoDB allows a maximum batch size of 25 items.
    start := 0
    end := start + batchSize
    for start < maxMovies && start < len(movies) {
        var writeReqs []types.WriteRequest
        if end > len(movies) {
            end = len(movies)
        }
        for _, movie := range movies[start:end] {
            item, err = attributevalue.MarshalMap(movie)
            if err != nil {
                log.Printf("Couldn't marshal movie %v for batch writing. Here's why: %v\n", movie.Title, err)
            } else {
                writeReqs = append(
                    writeReqs,
                    types.WriteRequest{PutRequest: &types.PutRequest{Item: item}},
                )
            }
        }
        _, err = basics.DynamoDbClient.BatchWriteItem(context.TODO(), &dynamodb.BatchWriteItemInput{
            RequestItems: map[string][]types.WriteRequest{basics.TableName: writeReqs}})
        if err != nil {
            log.Printf("Couldn't add a batch of movies to %v. Here's why: %v\n", basics.TableName, err)
        } else {
            written += len(writeReqs)
        }
        start = end
        end += batchSize
    }

    return written, err
}
  • For API details, see BatchWriteItem in Amazon SDK for Go API Reference.

Java
SDK for Java 2.x
Tip

To learn how to set up and run this example, see GitHub.

Inserts many items into a table by using the enhanced client.

public static void putBatchRecords(DynamoDbEnhancedClient enhancedClient) {
    try {
        DynamoDbTable<Customer> mappedTable = enhancedClient.table("Customer", TableSchema.fromBean(Customer.class));
        LocalDate localDate = LocalDate.parse("2020-04-07");
        LocalDateTime localDateTime = localDate.atStartOfDay();
        Instant instant = localDateTime.toInstant(ZoneOffset.UTC);

        Customer record2 = new Customer();
        record2.setCustName("Fred Pink");
        record2.setId("id110");
        record2.setEmail("fredp@noserver.com");
        record2.setRegistrationDate(instant);

        Customer record3 = new Customer();
        record3.setCustName("Susan Pink");
        record3.setId("id120");
        record3.setEmail("spink@noserver.com");
        record3.setRegistrationDate(instant);

        BatchWriteItemEnhancedRequest batchWriteItemEnhancedRequest =
            BatchWriteItemEnhancedRequest.builder()
                .writeBatches(WriteBatch.builder(Customer.class)
                    .mappedTableResource(mappedTable)
                    .addPutItem(r -> r.item(record2))
                    .addPutItem(r -> r.item(record3))
                    .build())
                .build();

        // Add these two items to the table.
        enhancedClient.batchWriteItem(batchWriteItemEnhancedRequest);
        System.out.println("done");

    } catch (DynamoDbException e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}
  • For API details, see BatchWriteItem in Amazon SDK for Java 2.x API Reference.

JavaScript
SDK for JavaScript V3
Tip

To learn how to set up and run this example, see GitHub.

Create the client.

// Create the DynamoDB service client module using ES6 syntax.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
// Set the AWS Region.
export const REGION = "REGION"; // For example, "us-east-1".
// Create an Amazon DynamoDB service client object.
export const ddbClient = new DynamoDBClient({ region: REGION });

Create the document client.

// Create a service client module using ES6 syntax.
import { DynamoDBDocumentClient } from "@aws-sdk/lib-dynamodb";
import { ddbClient } from "./ddbClient.js";

// Set the AWS Region.
const REGION = "REGION"; // For example, "us-east-1".

const marshallOptions = {
  // Whether to automatically convert empty strings, blobs, and sets to `null`.
  convertEmptyValues: false, // false, by default.
  // Whether to remove undefined values while marshalling.
  removeUndefinedValues: false, // false, by default.
  // Whether to convert typeof object to map attribute.
  convertClassInstanceToMap: false, // false, by default.
};

const unmarshallOptions = {
  // Whether to return numbers as a string instead of converting them to native JavaScript numbers.
  wrapNumbers: false, // false, by default.
};

const translateConfig = { marshallOptions, unmarshallOptions };

// Create the DynamoDB document client.
const ddbDocClient = DynamoDBDocumentClient.from(ddbClient, translateConfig);

export { ddbDocClient };

Add the items to the table.

import fs from "fs"; import * as R from "ramda"; import { ddbDocClient } from "../libs/ddbDocClient.js"; import { BatchWriteCommand } from "@aws-sdk/lib-dynamodb"; export const writeData = async () => { // Before you run this example, download 'movies.json' from https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GettingStarted.Js.02.html, // and put it in the same folder as the example. // Get the movie data parse to convert into a JSON object. const allMovies = JSON.parse(fs.readFileSync("moviedata.json", "utf8")); // Split the table into segments of 25. const dataSegments = R.splitEvery(25, allMovies); const TABLE_NAME = "TABLE_NAME" try { // Loop batch write operation 10 times to upload 250 items. for (let i = 0; i < 10; i++) { const segment = dataSegments[i]; for (let j = 0; j < 25; j++) { const params = { RequestItems: { [TABLE_NAME]: [ { // Destination Amazon DynamoDB table name. PutRequest: { Item: { year: segment[j].year, title: segment[j].title, info: segment[j].info, }, }, }, ], }, }; const data = ddbDocClient.send(new BatchWriteCommand(params)); } console.log("Success, table updated."); } } catch (error) { console.log("Error", error); } }; writeData();
  • For API details, see BatchWriteItem in Amazon SDK for JavaScript API Reference.

SDK for JavaScript V2
Tip

To learn how to set up and run this example, see GitHub.

// Load the AWS SDK for Node.js.
var AWS = require('aws-sdk');
// Set the region.
AWS.config.update({region: 'REGION'});

// Create the DynamoDB service object.
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

var params = {
  RequestItems: {
    "TABLE_NAME": [
      {
        PutRequest: {
          Item: {
            "KEY": { "N": "KEY_VALUE" },
            "ATTRIBUTE_1": { "S": "ATTRIBUTE_1_VALUE" },
            "ATTRIBUTE_2": { "N": "ATTRIBUTE_2_VALUE" }
          }
        }
      },
      {
        PutRequest: {
          Item: {
            "KEY": { "N": "KEY_VALUE" },
            "ATTRIBUTE_1": { "S": "ATTRIBUTE_1_VALUE" },
            "ATTRIBUTE_2": { "N": "ATTRIBUTE_2_VALUE" }
          }
        }
      }
    ]
  }
};

ddb.batchWriteItem(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});

PHP
SDK for PHP
Tip

To learn how to set up and run this example, see GitHub.

public function writeBatch(string $TableName, array $Batch, int $depth = 2)
{
    if (--$depth <= 0) {
        throw new Exception("Max depth exceeded. Please try with fewer batch items or increase depth.");
    }

    $marshal = new Marshaler();
    $total = 0;
    foreach (array_chunk($Batch, 25) as $Items) {
        foreach ($Items as $Item) {
            $BatchWrite['RequestItems'][$TableName][] = ['PutRequest' => ['Item' => $marshal->marshalItem($Item)]];
        }
        try {
            echo "Batching another " . count($Items) . " for a total of " . ($total += count($Items)) . " items!\n";
            $response = $this->dynamoDbClient->batchWriteItem($BatchWrite);
            $BatchWrite = [];
        } catch (Exception $e) {
            echo "uh oh...";
            echo $e->getMessage();
            die();
        }
        if ($total >= 250) {
            echo "250 movies is probably enough. Right? We can stop there.\n";
            break;
        }
    }
}
  • For API details, see BatchWriteItem in Amazon SDK for PHP API Reference.

Python
SDK for Python (Boto3)
Tip

To learn how to set up and run this example, see GitHub.

class Movies: """Encapsulates an Amazon DynamoDB table of movie data.""" def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource self.table = None def write_batch(self, movies): """ Fills an Amazon DynamoDB table with the specified data, using the Boto3 Table.batch_writer() function to put the items in the table. Inside the context manager, Table.batch_writer builds a list of requests. On exiting the context manager, Table.batch_writer starts sending batches of write requests to Amazon DynamoDB and automatically handles chunking, buffering, and retrying. :param movies: The data to put in the table. Each item must contain at least the keys required by the schema that was specified when the table was created. """ try: with self.table.batch_writer() as writer: for movie in movies: writer.put_item(Item=movie) except ClientError as err: logger.error( "Couldn't load data into table %s. Here's why: %s: %s", self.table.name, err.response['Error']['Code'], err.response['Error']['Message']) raise
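
The batch_writer context manager hides the chunking and retry work described in the docstring. As a rough sketch only, the following code shows roughly equivalent logic with the low-level Boto3 client. The table name "Movies", the year/title item attributes, and the batch_write_manually helper are assumptions for illustration; they are not part of the example above.

import boto3

def batch_write_manually(movies, table_name="Movies"):
    """Illustrative only: send items in chunks of 25 and retry unprocessed ones."""
    client = boto3.client("dynamodb")
    for start in range(0, len(movies), 25):
        # BatchWriteItem accepts at most 25 put or delete requests per call.
        request_items = {
            table_name: [
                {
                    "PutRequest": {
                        "Item": {
                            "year": {"N": str(movie["year"])},
                            "title": {"S": movie["title"]},
                        }
                    }
                }
                for movie in movies[start:start + 25]
            ]
        }
        # Items DynamoDB couldn't process come back in UnprocessedItems;
        # resubmit them until the map is empty (add backoff in real code).
        while request_items:
            response = client.batch_write_item(RequestItems=request_items)
            request_items = response.get("UnprocessedItems", {})
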
  • For API details, see BatchWriteItem in Amazon SDK for Python (Boto3) API Reference.

Ruby
SDK for Ruby
Tip

To learn how to set up and run this example, see GitHub.

# Fills an Amazon DynamoDB table with the specified data. Items are sent in
# batches of 25 until all items are written.
#
# @param movies [Enumerable] The data to put in the table. Each item must contain at least
#                            the keys required by the schema that was specified when the
#                            table was created.
def write_batch(movies)
  index = 0
  slice_size = 25
  while index < movies.length
    movie_items = []
    movies[index, slice_size].each do |movie|
      movie_items.append({put_request: { item: movie }})
    end
    @dynamo_resource.batch_write_item({request_items: { @table.name => movie_items }})
    index += slice_size
  end
rescue Aws::Errors::ServiceError => e
  puts("Couldn't load data into table #{@table.name}. Here's why:")
  puts("\t#{e.code}: #{e.message}")
  raise
end
  • For API details, see BatchWriteItem in Amazon SDK for Ruby API Reference.

For a complete list of Amazon SDK developer guides and code examples, see Using DynamoDB with an Amazon SDK. This topic also includes information about getting started and details about previous SDK versions.