I have a Python script that collects the details of unused security groups. I want it to write the results to a CSV file and upload that file to an S3 bucket.
When I test it on my local machine, it writes the CSV locally. But when I run it as a Lambda function, it needs somewhere to save the CSV, so I am using S3.
import boto3
import csv

ses = boto3.client('ses')

def lambda_handler(event, context):
    with open('https://unused******- 1.amazonaws.com/Unused.csv', 'w') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow(['Account Name', 'Region', 'Id'])
        ec2 = boto3.resource('ec2')
        sgs = list(ec2.security_groups.all())
        insts = list(ec2.instances.all())
        all_sgs = set([sg.group_id for sg in sgs])
        all_inst_sgs = set([sg['GroupId'] for inst in insts
                            for sg in inst.security_groups])
        unused_sgs = all_sgs - all_inst_sgs
        for elem in unused_sgs:
            writer.writerow([Account_Name, region, elem])
I want to write the result of “elem” into a CSV file and upload it to an S3 bucket. Kindly advise.
Answer
By using StringIO(), you don’t need to save the CSV locally — just write into the in-memory buffer and upload its contents to S3. Try my code and let me know if something is wrong; I can’t test it right now, but it has worked for similar cases.
import boto3
import csv
import io

s3 = boto3.client('s3')
ses = boto3.client('ses')

def lambda_handler(event, context):
    csvio = io.StringIO()
    writer = csv.writer(csvio)
    writer.writerow(['Account Name', 'Region', 'Id'])
    ec2 = boto3.resource('ec2')
    sgs = list(ec2.security_groups.all())
    insts = list(ec2.instances.all())
    all_sgs = set([sg.group_id for sg in sgs])
    all_inst_sgs = set([sg['GroupId'] for inst in insts
                        for sg in inst.security_groups])
    unused_sgs = all_sgs - all_inst_sgs
    for elem in unused_sgs:
        # Account_Name and region come from the question's code; define them
        # (or hard-code your values) before deploying.
        writer.writerow([Account_Name, region, elem])
    s3.put_object(Body=csvio.getvalue(),
                  ContentType='application/vnd.ms-excel',
                  Bucket='bucket',
                  Key='name_of.csv')
    csvio.close()
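If it helps, the StringIO part can be verified locally without any AWS credentials — the buffer's getvalue() is exactly the string you would pass as Body to s3.put_object. A minimal sketch (the row values here are made-up examples):

```python
import csv
import io

# Write CSV rows into an in-memory buffer instead of a local file.
csvio = io.StringIO()
writer = csv.writer(csvio)
writer.writerow(['Account Name', 'Region', 'Id'])
writer.writerow(['my-account', 'us-east-1', 'sg-0123456789abcdef0'])

# getvalue() returns the complete CSV text; this is what gets uploaded.
csv_text = csvio.getvalue()
csvio.close()
print(csv_text)
```

Note that csv.writer terminates rows with \r\n by default, which is fine for CSV consumers like Excel.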