Update an existing task definition
Using boto3, we can create a new task definition. How do we update an existing task definition? Is it just another call with the changes and the same family name? Answer: You can't do this. You have to create a new revision of the existing task definition, and then you will also have to update your ECS service.
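A minimal sketch of that flow, assuming placeholder names for the family, cluster, and service: describe the current task definition, register a new revision with the changed container definitions, and point the service at it.

    import boto3

    ecs = boto3.client("ecs")

    # Fetch the latest revision of the existing family (placeholder name)
    current = ecs.describe_task_definition(taskDefinition="my-family")["taskDefinition"]

    # Register a new revision; modify containerDefinitions (and carry over
    # cpu/memory/taskRoleArn/etc. as needed) before registering
    new_revision = ecs.register_task_definition(
        family=current["family"],
        containerDefinitions=current["containerDefinitions"],
    )

    # Point the ECS service at the new revision
    ecs.update_service(
        cluster="my-cluster",
        service="my-service",
        taskDefinition=new_revision["taskDefinition"]["taskDefinitionArn"],
    )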
Tag: boto3
Get tables from AWS Glue using boto3
I need to harvest table and column names from the AWS Glue crawler metadata catalogue. I used boto3 but keep getting only 100 tables even though there are more. Setting up NextToken doesn't …
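The usual fix is to let a boto3 paginator handle NextToken for you. A sketch, assuming a placeholder database name:

    import boto3

    glue = boto3.client("glue")

    # get_tables caps each response at 100 tables; the paginator follows NextToken
    paginator = glue.get_paginator("get_tables")

    for page in paginator.paginate(DatabaseName="my_database"):
        for table in page["TableList"]:
            print(table["Name"])
            for column in table["StorageDescriptor"]["Columns"]:
                print("  ", column["Name"], column.get("Type"))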
How to parse Boto3 200 response for copy_object request
I am new to Python and I'm writing an AWS Lambda that copies files from one bucket to another. I am using the Boto3 library and have come across the following in the documentation: A copy request …
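For reference, a copy_object call returns a parsed dict rather than raw XML, so the 200 response can be inspected directly. A sketch with placeholder bucket and key names:

    import boto3

    s3 = boto3.client("s3")

    response = s3.copy_object(
        Bucket="destination-bucket",
        Key="path/to/copy.txt",
        CopySource={"Bucket": "source-bucket", "Key": "path/to/original.txt"},
    )

    # boto3 raises ClientError on most failures; for a 200 response the
    # interesting parts are the status code and the CopyObjectResult block
    status = response["ResponseMetadata"]["HTTPStatusCode"]
    etag = response["CopyObjectResult"]["ETag"]
    print(f"copy returned HTTP {status}, new ETag {etag}")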
List of all roles with attached policies with Boto3
Found a useful thread here that helped me get part of a script to get a list of all roles and their attached policies: response = client.list_attached_role_policies( RoleName='MyRoleName' ) I am …
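Since both calls are paginated, a sketch that walks every role and collects its attached policy names might look like this:

    import boto3

    iam = boto3.client("iam")

    roles_paginator = iam.get_paginator("list_roles")
    policies_paginator = iam.get_paginator("list_attached_role_policies")

    for page in roles_paginator.paginate():
        for role in page["Roles"]:
            attached = []
            for policy_page in policies_paginator.paginate(RoleName=role["RoleName"]):
                attached.extend(p["PolicyName"] for p in policy_page["AttachedPolicies"])
            print(role["RoleName"], attached)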
AWS Lambda function with placeholders
I am working on an AWS Lambda function written in Python. I have a Python function that reads an IAM policy from a file and populates it using the function. This is my function; the name of the file is …
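One way to populate placeholders in a policy file is string.Template substitution before parsing the JSON. A sketch, where policy.json and the bucket_name placeholder are hypothetical names chosen for illustration:

    import json
    from string import Template

    def render_policy(template_path, **values):
        # Read a policy template and substitute ${...} placeholders
        with open(template_path) as f:
            template = Template(f.read())
        return json.loads(template.substitute(**values))

    def lambda_handler(event, context):
        # "policy.json" and "bucket_name" are hypothetical, not from the question
        policy = render_policy("policy.json", bucket_name=event["bucket_name"])
        return policy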
Load JSON data from CloudTrail into DynamoDB using Boto
I am working on a Boto3 script that can load the attributes from CloudTrail into DynamoDB. The format of my CloudTrail logs is JSON. I am fairly new to DynamoDB and I am not sure where I am making a …
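A common pattern is to read the CloudTrail log file, pull the events out of its top-level "Records" list, and write them with a batch writer. A sketch, with the table name, local file name, and key attributes assumed for illustration:

    import json
    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("cloudtrail-events")   # hypothetical table name

    # CloudTrail log files wrap individual events in a top-level "Records" list
    with open("cloudtrail-log.json") as f:        # hypothetical local file
        records = json.load(f)["Records"]

    with table.batch_writer() as batch:
        for record in records:
            batch.put_item(Item={
                "eventID": record["eventID"],      # assumed partition key
                "eventName": record["eventName"],
                "eventTime": record["eventTime"],
                "eventSource": record["eventSource"],
            })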
Signature Error while updating S3 object metadata through boto3
I have a Lambda function that takes an S3 object from S3 events and updates it with custom metadata. Here is the boto3 script. When I run the script, it gives me the following error: An error occurred (SignatureDoesNotMatch) when calling the CopyObject operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.
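One common cause of SignatureDoesNotMatch on CopyObject is an unencoded CopySource string; passing CopySource as a dict lets boto3 handle the encoding, and MetadataDirective="REPLACE" is required for the new metadata to stick. A sketch with placeholder bucket and key:

    import boto3

    s3 = boto3.client("s3")

    bucket = "my-bucket"          # placeholder values
    key = "path/to/object.txt"

    # Copy the object onto itself with replaced metadata
    s3.copy_object(
        Bucket=bucket,
        Key=key,
        CopySource={"Bucket": bucket, "Key": key},
        Metadata={"processed": "true"},
        MetadataDirective="REPLACE",
    )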
Include only .gz extension files from S3 bucket
I want to process/download .gz files from an S3 bucket. There are more than 10,000 files on S3, so I am using: import boto3 s3 = boto3.resource('s3') bucket = s3.Bucket('my-bucket') objects = bucket…
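The collection returned by objects.all() or objects.filter() paginates automatically, so filtering by extension in the loop scales past 10,000 keys. A sketch, with the bucket name and prefix as placeholders:

    import os
    import boto3

    s3 = boto3.resource("s3")
    bucket = s3.Bucket("my-bucket")

    # The collection pages through all keys; keep only .gz objects
    for obj in bucket.objects.filter(Prefix="logs/"):   # Prefix is optional
        if obj.key.endswith(".gz"):
            bucket.download_file(obj.key, os.path.basename(obj.key))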
How to create a DynamoDB table using serverless.yml and delete its items using Python boto3?
I've created the DynamoDB table using serverless.yml as below: resources: Resources: myTable: Type: AWS::DynamoDB::Table DeletionPolicy: Retain Properties: TableName: …
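For the deletion side, one approach is to scan the table and delete each item by its key attributes, reading the key names from the table's own key schema. A sketch, assuming a placeholder table name:

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("my-table")   # use the TableName from serverless.yml

    # Key attribute names come from the table itself
    key_names = [k["AttributeName"] for k in table.key_schema]

    scan = table.scan()
    with table.batch_writer() as batch:
        while True:
            for item in scan["Items"]:
                batch.delete_item(Key={k: item[k] for k in key_names})
            if "LastEvaluatedKey" not in scan:
                break
            scan = table.scan(ExclusiveStartKey=scan["LastEvaluatedKey"])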
Amazon S3 boto3 how to iterate through objects in a bucket?
In a Flask app, I was trying to iterate through objects in an S3 bucket and print the key/filename, but my_bucket.objects.all() returns only the first object in the bucket. It's not returning all the objects. The output is [001.pdf] instead of [001, 002, 003, 004, 005]. Answer: You are exiting the […]
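The truncated answer suggests the loop (or the function) is being exited early, most likely by a return inside the loop. A sketch that collects every key first and returns once, with a placeholder bucket name:

    import boto3

    s3 = boto3.resource("s3")
    my_bucket = s3.Bucket("my-bucket")   # placeholder bucket name

    # Build the full list of keys instead of returning inside the loop,
    # which would stop after the first object
    keys = [obj.key for obj in my_bucket.objects.all()]
    print(keys)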