Introduction
In the first part of the series we set up the sample application, which has API Gateway in front of Lambda functions that communicate with an Aurora Serverless v2 PostgreSQL database via the Data API to create products and retrieve them (by id). In the second part we dove deeper into the Data API for Aurora Serverless v2 itself and its capabilities, like executing SQL statements with the AWS SDK for Java. In this part of the series, we'll take a look at the Data API capability to batch a SQL statement over an array of data for bulk update and insert operations. A batched SQL statement can provide a significant performance improvement over individual insert and update statements.
Using the Data API for Aurora Serverless v2 to batch a SQL statement over an array of data for bulk update and insert operations
To demonstrate the batching capability, I used the same sample application but added the possibility to create multiple products within one HTTP PUT /products request (see the CreateProductsDataApiFunction Lambda definition in the SAM template). The corresponding Lambda function expects a JSON array like the following as the HTTP body:
[
  {
    "name": "Calendar A3",
    "price": "23.56"
  },
  {
    "name": "Photobook A4",
    "price": "45.21"
  },
  {
    "name": "Mug red",
    "price": "10.31"
  }
]
In this concrete example we'll create 3 products. We can convert this JSON into a list of products using Gson:
List<Product> products = new Gson()
    .fromJson(body, new TypeToken<List<Product>>(){}.getType());
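This conversion assumes a simple Product POJO. Here is a minimal sketch of what it could look like; the field names are derived from the JSON payload above and the SQL statement used below, and the actual class in the sample application may differ:

import java.math.BigDecimal;

// Minimal Product POJO matching the JSON payload above (a sketch;
// the real class in the sample application may look different).
public class Product {

    private long id;
    private String name;
    private BigDecimal price;

    public long getId() { return id; }
    public void setId(long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public BigDecimal getPrice() { return price; }
    public void setPrice(BigDecimal price) { this.price = price; }
}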
The only major difference between ExecuteStatementRequest and BatchExecuteStatementRequest is that with the latter we need to pass parameterSets, a collection of collections of SQL parameters (one parameter set per row). Below is the code sample for it.
final String CREATE_PRODUCT_SQL =
    "INSERT INTO tbl_product (id, name, price) VALUES (:id, :name, :price);";
final Set<Set<SqlParameter>> parameterSets = new HashSet<>();
for (final Product product : products) {
    // Generate the product id from the database sequence.
    final long productId = getNextSequenceValue("product_id");
    product.setId(productId);
    final SqlParameter productIdParam = SqlParameter.builder()
        .name("id")
        .value(Field.builder().longValue(productId).build())
        .build();
    final SqlParameter productNameParam = SqlParameter.builder()
        .name("name")
        .value(Field.builder().stringValue(product.getName()).build())
        .build();
    final SqlParameter productPriceParam = SqlParameter.builder()
        .name("price")
        .value(Field.builder().doubleValue(product.getPrice().doubleValue()).build())
        .build();
    // One parameter set per product, i.e. per row to insert.
    parameterSets.add(Set.of(productIdParam, productNameParam, productPriceParam));
}
In the code above, for each product we generate its id by asking the product_id sequence for its next value and construct the 3 SqlParameters (id, name and price) required to create one product. We do the same for all products in the request, collecting one parameter set per product into parameterSets.
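The getNextSequenceValue helper isn't shown here; a minimal sketch of how it could be implemented with the Data API itself (assuming a PostgreSQL sequence and the same dbClusterArn, dbSecretStoreArn and rdsDataClient used elsewhere in this article) might look like this:

import software.amazon.awssdk.services.rdsdata.model.ExecuteStatementRequest;
import software.amazon.awssdk.services.rdsdata.model.ExecuteStatementResponse;

// A sketch of the id-generation helper: asks a PostgreSQL sequence for
// its next value via the Data API (assumed implementation, not shown
// in the original sample).
private long getNextSequenceValue(final String sequenceName) {
    final ExecuteStatementRequest request = ExecuteStatementRequest.builder()
        .database("")
        .resourceArn(dbClusterArn)
        .secretArn(dbSecretStoreArn)
        // The sequence name comes from our own code, not from user input.
        .sql("SELECT nextval('" + sequenceName + "')")
        .build();
    final ExecuteStatementResponse response = rdsDataClient.executeStatement(request);
    // The response contains a single record with a single long column.
    return response.records().get(0).get(0).longValue();
}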
Then we create the BatchExecuteStatementRequest by passing all the required parameters, including the SQL statement and the parameter sets created above, so that the Data API iterates over them and executes the same SQL statement with each set of SQL parameters.
final BatchExecuteStatementRequest request = BatchExecuteStatementRequest.builder()
    .database("")
    .resourceArn(dbClusterArn)
    .secretArn(dbSecretStoreArn)
    .sql(CREATE_PRODUCT_SQL)
    .parameterSets(parameterSets)
    // .formatRecordsAs(RecordsFormatType.JSON)
    .build();
Then we use RdsDataClient to execute the previously created BatchExecuteStatementRequest.
final BatchExecuteStatementResponse response =
    rdsDataClient.batchExecuteStatement(request);
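The rdsDataClient instance itself was created in the previous part of the series; for completeness, a minimal sketch of its creation (assuming region and credentials are resolved from the Lambda environment by default):

import software.amazon.awssdk.services.rdsdata.RdsDataClient;

// Data API client from the AWS SDK for Java v2, reused across invocations.
private static final RdsDataClient rdsDataClient = RdsDataClient.create();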
Afterwards, we can use the BatchExecuteStatementResponse to iterate over the result of each execution of the batched statement (which is an insert or update SQL statement), like this:
for (final UpdateResult updateResult : response.updateResults()) {
    ....
}
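What goes inside the loop depends on the use case. For example, a simple (hypothetical) handling could count the executions and log the generated fields of each UpdateResult, keeping in mind that for PostgreSQL generated fields are only returned if the statement contains a RETURNING clause:

// A sketch of possible result handling: the Data API returns one
// UpdateResult per parameter set in the batch.
int executedStatements = 0;
for (final UpdateResult updateResult : response.updateResults()) {
    // For PostgreSQL, generatedFields() is only populated if the SQL
    // statement contains a RETURNING clause.
    System.out.println("generated fields: " + updateResult.generatedFields());
    executedStatements++;
}
System.out.println("batch executed " + executedStatements + " statements");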
Conclusion
In this part of the series we took a look at the Data API capability to batch a SQL statement over an array of data for bulk update and insert operations, and learned about the batchExecuteStatement capability of RdsDataClient, which uses the BatchExecuteStatement request/response model.
In the next part of the series, we'll explore how to work with transactions with the Data API for Aurora Serverless v2.