AWS Aurora: bulk upsert of records using pre-formed SQL statements

Is there a way to batch insert/update records into AWS Aurora using “pre-formed” PostgreSQL statements from Python?

My scenario: I have an AWS Lambda function that receives data changes (insert/modify/remove) from DynamoDB via Kinesis and needs to apply them to a PostgreSQL instance in AWS Aurora.

All I’ve managed to find from an internet search is Boto3’s batch_execute_statement method in the RDS Data Service client, which requires populating a list of parameters for each individual record (see the sketch below).
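
For reference, a minimal sketch of what that batch_execute_statement call looks like; note that it takes a single SQL template plus one parameter set per record, so it cannot accept distinct pre-formed statements. The ARNs, database, table, and columns below are placeholders:

```python
import boto3

client = boto3.client("rds-data")

# Placeholder ARNs -- substitute your Aurora cluster and Secrets Manager ARNs.
client.batch_execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-cluster",
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:my-db-secret",
    database="mydb",
    # One SQL template, executed once per parameter set below.
    sql="INSERT INTO events (id, payload) VALUES (:id, :payload)",
    parameterSets=[
        [
            {"name": "id", "value": {"longValue": 1}},
            {"name": "payload", "value": {"stringValue": "created"}},
        ],
        [
            {"name": "id", "value": {"longValue": 2}},
            {"name": "payload", "value": {"stringValue": "modified"}},
        ],
    ],
)
```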

If possible, I would like a mechanism where I can supply many “pre-formed” INSERT/UPDATE/DELETE PostgreSQL statements to the database in a single batch operation.

Many thanks in advance for any assistance.

Answer

I used psycopg2 via a SQLAlchemy engine’s raw connection (instead of Boto3) and looped through my list of SQL statements, executing each one in turn.
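
A minimal sketch of that approach, assuming an Aurora PostgreSQL endpoint; the connection string, table, and statements below are placeholders:

```python
import sqlalchemy

# Placeholder connection string -- substitute your Aurora endpoint and credentials.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://user:password@my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com:5432/mydb"
)

# Pre-formed statements, e.g. built from the DynamoDB change records.
statements = [
    "INSERT INTO events (id, payload) VALUES (1, 'created')",
    "UPDATE events SET payload = 'modified' WHERE id = 1",
    "DELETE FROM events WHERE id = 2",
]

conn = engine.raw_connection()  # underlying psycopg2 connection
try:
    with conn.cursor() as cur:
        for sql in statements:
            cur.execute(sql)
    conn.commit()  # apply the whole batch as one transaction
except Exception:
    conn.rollback()  # undo everything if any statement fails
    raise
finally:
    conn.close()
```

Committing once after the loop keeps the whole batch atomic: either every statement applies or none do.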
