I am trying to write a Lambda function that tags EC2 instances as they go from the pending to the running state. However, I have a problem reading the CSV file that holds my EC2 instance tags. Currently, I have gotten to where the Lambda returns the following result. However, I need a list of dictionaries, because the rest of
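A sketch of turning CSV text into the list-of-dicts shape that `ec2.create_tags` expects, using the stdlib `csv.DictReader`. The two-column `Key,Value` layout and the `sample` data are assumptions, since the excerpt doesn't show the actual file:

```python
import csv
import io

def load_tags(csv_text):
    """Parse CSV text into a list of {'Key': ..., 'Value': ...} dicts."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

# Hypothetical layout: a header row, then one Key/Value pair per line.
sample = "Key,Value\nName,web-server\nEnv,prod\n"
tags = load_tags(sample)
# tags == [{'Key': 'Name', 'Value': 'web-server'}, {'Key': 'Env', 'Value': 'prod'}]
```

The resulting list can be passed straight to `ec2.create_tags(Resources=[instance_id], Tags=tags)` from the handler.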
Tag: aws-lambda
Parsing JSON in AWS Lambda Python
For a personal project I’m trying to write an AWS Lambda in Python 3.9 that will delete a newly created user if the creator is not myself. For this, the logs in CloudWatch Logs will trigger my Lambda (via CloudTrail and EventBridge). Therefore, I will receive the JSON request as my event. But I have trouble parsing it…
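A sketch of the usual approach: EventBridge wraps the CloudTrail record under the event's top-level `detail` key, so the handler typically receives a dict and no `json.loads` is needed. The ARNs and the helper name are hypothetical:

```python
def check_create_user(event, my_arn):
    """Return (should_delete, new_user_name) from a CreateUser event.

    Assumes the standard EventBridge/CloudTrail shape: the CloudTrail
    record sits under event['detail'].
    """
    detail = event["detail"]
    creator = detail["userIdentity"]["arn"]
    new_user = detail["requestParameters"]["userName"]
    return creator != my_arn, new_user

# Minimal stand-in event for local testing.
sample_event = {
    "detail": {
        "eventName": "CreateUser",
        "userIdentity": {"arn": "arn:aws:iam::123456789012:user/alice"},
        "requestParameters": {"userName": "suspicious-user"},
    }
}
should_delete, name = check_create_user(sample_event, "arn:aws:iam::123456789012:user/me")
# should_delete is True, name == 'suspicious-user'
```

If `should_delete` is true, the handler would call `iam.delete_user(UserName=name)` via boto3.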
How does a Python AWS Lambda interact with the uploaded file?
I’m trying to do the following: when I upload a file to my S3 storage, the Lambda picks up this JSON file and converts it into a CSV file. How can I specify in the Lambda code which file it must pick? In my local example, I provide the name of the file… but how can I manage that
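The file name doesn't need to be hard-coded: the S3 notification event passed to the handler carries the bucket and key of the object that triggered it. A sketch, with a minimal stand-in event:

```python
from urllib.parse import unquote_plus

def uploaded_object(event):
    """Return (bucket, key) for the object that fired this S3 notification."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # Keys arrive URL-encoded (spaces become '+'), so decode before get_object.
    key = unquote_plus(record["s3"]["object"]["key"])
    return bucket, key

sample = {"Records": [{"s3": {"bucket": {"name": "my-bucket"},
                              "object": {"key": "incoming/report+1.json"}}}]}
# uploaded_object(sample) -> ('my-bucket', 'incoming/report 1.json')
```

Inside the handler you would then call `s3.get_object(Bucket=bucket, Key=key)` to read exactly the file that was uploaded.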
AWS Lambda Python Boto3 – item count of a DynamoDB table
I am trying to count the total number of items in the DynamoDB table. The Boto3 documentation says of the item_count attribute: (integer) — The number of items in the specified table. DynamoDB updates this value approximately every six hours. Recent changes might not be reflected in this value. I populated about 100 records into that table. The output shows 0 records. Answer As
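Since `item_count` is only refreshed roughly every six hours, a live count needs a paginated Scan with `Select='COUNT'`. A sketch, where `table` is assumed to be a boto3 Table resource:

```python
def count_items(table):
    """Exact live item count via a paginated Scan.

    table.item_count reads stale metadata; Scan with Select='COUNT'
    returns per-page counts without transferring the items themselves.
    """
    total = 0
    kwargs = {"Select": "COUNT"}
    while True:
        resp = table.scan(**kwargs)
        total += resp["Count"]
        if "LastEvaluatedKey" not in resp:
            return total
        # More pages remain: resume from where the last page stopped.
        kwargs["ExclusiveStartKey"] = resp["LastEvaluatedKey"]
```

Note that a Scan reads the whole table, so this consumes read capacity proportional to the table size.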
S3 notifications generating multiple events and how to handle them
There is this S3 notification feature described here: Amazon S3 event notifications are designed to be delivered at least once. Typically, event notifications are delivered in seconds but can sometimes take a minute or longer. and discussed here. I thought I could mitigate the duplicates a bit by deleting files I have already processed. The problem is, when a second
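With at-least-once delivery, the usual pattern is to make the handler idempotent rather than delete files: derive a stable id for each event record (the `sequencer` field orders events for a given key) and claim it with a DynamoDB conditional write. The table name and key attribute `pk` are assumptions:

```python
def dedupe_id(record):
    """Stable identifier for one S3 event record."""
    s3 = record["s3"]
    return "{}:{}:{}".format(
        s3["bucket"]["name"], s3["object"]["key"], s3["object"]["sequencer"]
    )

def mark_processed(table, event_id):
    """Atomically claim an event; returns False if another invocation already did.

    The put only succeeds when no item with that key exists yet, so duplicate
    deliveries of the same event are skipped instead of reprocessed.
    """
    try:
        table.put_item(
            Item={"pk": event_id},
            ConditionExpression="attribute_not_exists(pk)",
        )
        return True
    except table.meta.client.exceptions.ConditionalCheckFailedException:
        return False
```

The handler then only does real work when `mark_processed` returns `True`.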
Install newer version of sqlite3 on AWS Lambda for use with Python
I have a Python script running in a Docker container on AWS Lambda. I’m using the recommended AWS image (public.ecr.aws/lambda/python:3.9), which comes with SQLite version 3.7.17 (from 2013!). When I test the container locally on my M1 Mac, I see this: However, I use newer SQLite features, so I need to find a way to use a newer version of
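A common workaround is installing the third-party `pysqlite3-binary` wheel (which statically links a recent SQLite) in the image and aliasing it to the stdlib name, so the rest of the code is unchanged. Whether a prebuilt wheel exists for your target architecture (e.g. arm64 vs x86_64) is an assumption to verify:

```python
# In the Dockerfile (assumption: a matching manylinux wheel is available):
#   RUN pip install pysqlite3-binary

try:
    import pysqlite3 as sqlite3  # modern SQLite bundled inside the wheel
except ImportError:
    import sqlite3  # stdlib fallback: the image's system build (3.7.17)

print("SQLite version:", sqlite3.sqlite_version)
```

Because the local M1 test and the deployed x86_64 Lambda can resolve to different builds, printing `sqlite3.sqlite_version` at cold start is a cheap sanity check.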
Lambda to call incoming webhooks
I have been using a bash script to call a webhook that triggers an Azure DevOps pipeline; now I want to use a Lambda function to do the same thing, but I am having an issue with indentation. The error is below and I am not sure why it is not working. Any idea why? Trying this now and it comes up with: But the error comes
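Indentation errors in a handler usually mean the function body isn't consistently indented (mixing tabs and spaces is a frequent culprit). A sketch of a minimal handler that POSTs JSON with the stdlib `urllib.request`, so no extra dependency is needed; the URL and payload are placeholders:

```python
import json
import urllib.request

# Placeholder -- substitute your actual incoming-webhook URL.
WEBHOOK_URL = "https://example.com/hooks/pipeline-trigger"

def build_webhook_request(url, body):
    """Build the POST request; separate from the handler so it is testable offline."""
    payload = json.dumps(body).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def lambda_handler(event, context):
    # Every line of the body indented by the same four spaces.
    req = build_webhook_request(WEBHOOK_URL, {"resource": "trigger"})
    with urllib.request.urlopen(req) as resp:  # network call happens here
        return {"statusCode": resp.status}
```

Configuring the editor to insert four spaces per indent (never tabs) avoids the class of error entirely.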
using pop on multidimensional lists python with dynamoDB
I want to pop an item from a list of lists. So I have a Scan for all DynamoDB items and want to “pop” one field from each list. For example: The list: From this example, I have a bunch of results, and for every result list I will remove the item “credentials”, like: However, this isn’t working. Answer You’re
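The likely fix: `pop()` must be called on each inner dict (where it takes a key), not on the outer list (where it takes an index). A sketch with stand-in scan data:

```python
def strip_field(items, field="credentials"):
    """Remove one attribute from every item in a Scan result, in place."""
    for item in items:
        # dict.pop with a default of None: no KeyError if the attribute is absent.
        item.pop(field, None)
    return items

scan_items = [{"id": 1, "credentials": "secret"}, {"id": 2}]
# strip_field(scan_items) -> [{'id': 1}, {'id': 2}]
```

With a real response this would be `strip_field(response["Items"])` after `table.scan()`.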
AWS Aurora: bulk upsert of records using pre-formed SQL Statements
Is there a way of doing a batch insert/update of records into AWS Aurora using “pre-formed” Postgresql statements, using Python? My scenario: I have an AWS lambda that receives data changes (insert/modify/remove) from DynamoDB via Kinesis, which then needs to apply them to an instance of Postgres in AWS Aurora. All I’ve managed to find doing an Internet search is
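One common route is psycopg2's `execute_values`, which expands a single `VALUES %s` placeholder into one batched statement, combined with `ON CONFLICT ... DO UPDATE` for the upsert. A sketch; the table/column names are placeholders and `conn` is assumed to be an open psycopg2 connection:

```python
def build_upsert_sql(table, columns, key_col):
    """INSERT ... ON CONFLICT DO UPDATE with a VALUES %s placeholder
    for psycopg2.extras.execute_values to expand."""
    col_list = ", ".join(columns)
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in columns if c != key_col)
    return (f"INSERT INTO {table} ({col_list}) VALUES %s "
            f"ON CONFLICT ({key_col}) DO UPDATE SET {updates}")

def upsert_rows(conn, table, columns, key_col, rows):
    # Import deferred so the SQL builder stays usable without psycopg2 installed.
    from psycopg2.extras import execute_values
    with conn.cursor() as cur:
        execute_values(cur, build_upsert_sql(table, columns, key_col), rows)
    conn.commit()
```

`rows` is a sequence of tuples in column order, e.g. the values extracted from each Kinesis record; one round-trip covers the whole batch.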
Python script won't power off or on due to DB status
I have created an API in AWS Gateway which uses AWS Lambda. The Lambda uses a python script. I can use the API gateway to invoke the Lambda which can power on or off RDS clusters. In my AWS account I have 4 RDS clusters. If all 4 are powered off I can use the API to power them all
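A sketch of gating the toggle on cluster state: read every cluster's status via `describe_db_clusters`, and only start when all are `stopped` or stop when all are `available`, skipping transitional states. The decision helper is an assumption about the intended behavior:

```python
def cluster_statuses(rds):
    """Current status string for every cluster in the account/region."""
    resp = rds.describe_db_clusters()
    return [c["Status"] for c in resp["DBClusters"]]

def next_action(statuses):
    """Only toggle when every cluster is settled."""
    if statuses and all(s == "stopped" for s in statuses):
        return "start"
    if statuses and all(s == "available" for s in statuses):
        return "stop"
    return "wait"  # starting/stopping/mixed: do nothing this invocation

# next_action(['stopped', 'stopped'])    -> 'start'
# next_action(['available', 'stopping']) -> 'wait'
```

In the handler, `start` maps to `rds.start_db_cluster(DBClusterIdentifier=...)` per cluster and `stop` to `rds.stop_db_cluster(...)`; returning `wait` avoids the API error from toggling a cluster mid-transition.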