Answer Upgrade pip as follows: curl https://bootstrap.pypa.io/get-pip.py | python. Depending on your setup, you may have to run it with sudo python instead.
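A sketch of the same upgrade driven from Python itself, as an alternative to piping get-pip.py; the helper names are mine, and the command targets whatever interpreter runs the script:

```python
import subprocess
import sys

def pip_upgrade_command():
    # Build the upgrade command for the interpreter running this script;
    # "python -m pip" avoids picking up a pip belonging to another install.
    return [sys.executable, "-m", "pip", "install", "--upgrade", "pip"]

def upgrade_pip():
    # Runs the upgrade; prepend "sudo" manually if a system-wide
    # install needs elevated permissions.
    subprocess.check_call(pip_upgrade_command())
```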
Tag: amazon-web-services
Download a folder from S3 using Boto3
Using the Boto3 Python SDK, I was able to download files using the method bucket.download_file(). Is there a way to download an entire folder? Answer Quick and dirty, but it works: assuming you want to download the directory foo/bar from S3, the for-loop iterates over all the files whose path starts with Prefix=foo/bar.
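A minimal sketch of that loop; local_path_for is a helper of mine that mirrors the key layout on disk, and boto3 is imported lazily so the helper works without AWS access:

```python
import os

def local_path_for(key, prefix, dest_dir):
    # Map a key like "foo/bar/a/b.txt" under prefix "foo/bar"
    # to "<dest_dir>/a/b.txt".
    relative = key[len(prefix):].lstrip("/")
    return os.path.join(dest_dir, relative)

def download_s3_folder(bucket_name, prefix, dest_dir):
    import boto3  # lazy import so local_path_for is usable without boto3
    bucket = boto3.resource("s3").Bucket(bucket_name)
    for obj in bucket.objects.filter(Prefix=prefix):
        if obj.key.endswith("/"):
            continue  # skip zero-byte "directory" placeholder keys
        target = local_path_for(obj.key, prefix, dest_dir)
        if os.path.dirname(target):
            os.makedirs(os.path.dirname(target), exist_ok=True)
        bucket.download_file(obj.key, target)
```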
AWS Lambda – unable to import module ‘lambda_function’
Like many others before me, I’m trying to run an AWS Lambda function, and when I try to test it I get “errorMessage”: “Unable to import module ‘lambda_function’”. My Handler is set to lambda_function.lambda_handler, and I indeed have a file named lambda_function.py which contains a function called lambda_handler. Here’s a screenshot as proof. Everything was working fine when I was
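For reference, the minimal layout that Handler string expects: a file named lambda_function.py at the root of the deployment package, containing a function lambda_handler. Zipping the parent folder instead of the files themselves, so the module is not at the archive root, is a common cause of this error. The body below is a trivial placeholder of mine:

```python
# lambda_function.py -- must sit at the top level of the deployment zip,
# because the Handler "lambda_function.lambda_handler" is read as
# "<module name>.<function name>".
def lambda_handler(event, context):
    # Trivial body just to show the required two-argument signature.
    return {"statusCode": 200, "body": "ok"}
```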
Writing a pickle file to an s3 bucket in AWS
I’m trying to write a pandas dataframe as a pickle file into an S3 bucket in AWS. I know that I can write the dataframe new_df as a CSV to an S3 bucket as follows: I’ve tried using the same code as above with to_pickle(), but with no success. Answer I’ve found the solution: you need to write the pickle into a BytesIO buffer first.
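A sketch of that fix, assuming a generic picklable object (for a DataFrame, df.to_pickle(buffer) writes into the same kind of buffer); bucket and key names are placeholders, and boto3 is imported lazily so the serialization helper stands alone:

```python
import io
import pickle

def pickle_bytes(obj):
    # Serialize into an in-memory buffer instead of a local file.
    buffer = io.BytesIO()
    pickle.dump(obj, buffer)
    return buffer.getvalue()

def upload_pickle(obj, bucket_name, key):
    import boto3  # lazy import so pickle_bytes is usable without boto3
    boto3.resource("s3").Object(bucket_name, key).put(Body=pickle_bytes(obj))
```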
List out auto scaling group names with a specific application tag using boto3
I was trying to fetch the auto scaling groups whose Application tag value is ‘CCC’. The list is as below. The script I wrote gives output that includes one ASG without the CCC tag. The output I am getting is as below, whereas ‘prd-dcc-ein-w2’ is an ASG with a different tag, ‘gweb’. And the last one (dev-ccc-msp-agt-asg) in the
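A likely cause of the stray ASG is matching the tag value with a substring test rather than an exact comparison. A sketch with an exact match, paginating because describe_auto_scaling_groups returns results a page at a time (the tag key ‘Application’ is taken from the question, and boto3 is imported lazily so the matcher is usable on its own):

```python
def has_tag(tags, key, value):
    # Exact match on both key and value; a substring check such as
    # `value in t["Value"]` would also match unrelated tags.
    return any(t.get("Key") == key and t.get("Value") == value for t in tags)

def asg_names_with_tag(key, value):
    import boto3  # lazy import so has_tag is testable without AWS access
    client = boto3.client("autoscaling")
    names = []
    for page in client.get_paginator("describe_auto_scaling_groups").paginate():
        for asg in page["AutoScalingGroups"]:
            if has_tag(asg.get("Tags", []), key, value):
                names.append(asg["AutoScalingGroupName"])
    return names
```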
Get the auto id for inserted row into Redshift table using psycopg2 in Python
I am inserting a record into an Amazon Redshift table from Python 2.7 using the psycopg2 library, and I would like to get back the auto-generated primary id for the inserted row. I have tried the usual approaches I could find here and on other websites via Google search, e.g.: I receive an error on the cur.execute line: Does anybody
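One reason the usual approaches fail: Redshift does not implement INSERT ... RETURNING, so the PostgreSQL idiom errors out. A sketch of the common workaround of reading the id back inside the same transaction; the table and column names are illustrative, and SELECT MAX(id) is only safe without concurrent writers (with them, select by a unique business key instead):

```python
def insert_and_get_id(conn, name):
    # Redshift lacks INSERT ... RETURNING, so insert and then read the id
    # back inside one transaction. Illustrative schema: my_table(id, name).
    with conn.cursor() as cur:
        cur.execute("INSERT INTO my_table (name) VALUES (%s)", (name,))
        # MAX(id) assumes no concurrent inserts into this table.
        cur.execute("SELECT MAX(id) FROM my_table")
        row_id = cur.fetchone()[0]
    conn.commit()
    return row_id
```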
aws boto3 grab subnet info
I’m trying to grab a list of subnets from AWS. I have a working version for VPCs that I have modified: I keep getting: subnets = list(ec2.Subnet.filters(Filters=filters)) AttributeError: ‘function’ object has no attribute ‘filters’ From everything I’m reading and other examples, this should work. Any ideas? Answer To access the subnets collection of the ec2 resource,
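The error comes from calling filters on the Subnet class; the collection lives on the resource itself, and the method is filter(). A sketch, with the VPC id as a placeholder and boto3 imported lazily so the filter-building helper stands alone:

```python
def vpc_filter(vpc_id):
    # Filter structure expected by the Filters= argument.
    return [{"Name": "vpc-id", "Values": [vpc_id]}]

def list_subnet_ids(vpc_id):
    import boto3  # lazy import so vpc_filter is usable without boto3
    ec2 = boto3.resource("ec2")
    # ec2.subnets is a collection; filter() (not Subnet.filters) narrows it.
    return [subnet.id for subnet in ec2.subnets.filter(Filters=vpc_filter(vpc_id))]
```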
Adding API to Usage Plan using Serverless Framework
My serverless.yaml file is as follows: I want to add this API to a Usage Plan. How is this done? Answer I used the AWS CLI with the following command
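The exact CLI command is not shown above, so here is a hedged sketch of the boto3 equivalent: attaching an API stage to an existing usage plan via update_usage_plan. The plan id, API id, and stage name are placeholders, and the client is passed in so the call can be exercised without AWS access:

```python
def add_api_stage_to_usage_plan(apigateway_client, usage_plan_id, api_id, stage):
    # The /apiStages patch path takes values of the form "<apiId>:<stageName>".
    return apigateway_client.update_usage_plan(
        usagePlanId=usage_plan_id,
        patchOperations=[
            {"op": "add", "path": "/apiStages", "value": f"{api_id}:{stage}"}
        ],
    )
```

With a real client this would be called as add_api_stage_to_usage_plan(boto3.client("apigateway"), ...).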
How to update metadata of an existing object in AWS S3 using python boto3?
boto3 documentation does not clearly specify how to update the user metadata of an already existing S3 object. Answer It can be done using the copy_from() method –
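A sketch of that approach: S3 metadata cannot be edited in place, so the object is copied onto itself with MetadataDirective='REPLACE' to swap in the new metadata. The object is passed in so the helper can be exercised with a stub; with boto3 it would be boto3.resource("s3").Object(bucket, key):

```python
def replace_metadata(s3_object, new_metadata):
    # Copy the object onto itself; REPLACE swaps in the new metadata,
    # whereas the default COPY directive would keep the old metadata.
    s3_object.copy_from(
        CopySource={"Bucket": s3_object.bucket_name, "Key": s3_object.key},
        Metadata=new_metadata,
        MetadataDirective="REPLACE",
    )
```

Note that this rewrites the whole object, so existing metadata you want to keep must be included in new_metadata.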
Boto3: grabbing only selected objects from the S3 resource
I can grab and read all the objects in my AWS S3 bucket via and then would give me the path within the bucket. Is there a way to filter beforehand for only those files under a certain starting path (a directory in the bucket), so that I’d avoid looping over all the objects and filtering afterwards? Answer Use the
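A sketch of the prefix filter the answer is pointing at: objects.filter(Prefix=...) narrows the listing server-side, so only matching keys are ever returned. The bucket is passed in so the helper can be exercised with a stub; with boto3 it would be boto3.resource("s3").Bucket(name):

```python
def keys_with_prefix(bucket, prefix):
    # Only keys starting with `prefix` are returned by the listing,
    # so no client-side filtering loop is needed.
    return [obj.key for obj in bucket.objects.filter(Prefix=prefix)]
```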