Copy a large number of files in S3 within the same bucket

I have a “directory” in an S3 bucket with ~80 TB of data, and I need to copy everything to another directory in the same bucket:

source = s3://mybucket/abc/process/

destination = s3://mybucket/cde/process/

I already tried aws s3 sync, but it worked only for the big files; about 50 TB are still left to copy. I’m thinking about using boto3 code like the example below, but I don’t know how to handle multiple files/directories recursively.

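A single-object copy of the kind the question describes might look like the sketch below; the helper names `dest_key_for` and `copy_one` are illustrative choices, not from the question.

```python
def dest_key_for(source_key: str,
                 source_prefix: str = "abc/process/",
                 dest_prefix: str = "cde/process/") -> str:
    """Map a key under the source prefix to the same key under the destination."""
    return dest_prefix + source_key[len(source_prefix):]

def copy_one(s3, bucket: str, source_key: str) -> None:
    """Copy a single object within the same bucket.

    s3 is a boto3.resource("s3"); s3.meta.client.copy performs a managed,
    multipart-capable server-side copy, so large files work too.
    """
    # The source is given as a {"Bucket": ..., "Key": ...} dict; the
    # destination bucket and key are separate positional arguments.
    s3.meta.client.copy({"Bucket": bucket, "Key": source_key},
                        bucket, dest_key_for(source_key))
```

This would be called as, e.g., `copy_one(boto3.resource("s3"), "mybucket", "abc/process/file1.txt")`, where `file1.txt` is only an illustrative name.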

How can I do this using boto3?


Answer

While there may be better ways of doing this using bucket policies, it can be done with boto3. First, you will need to get a list of the contents of the bucket:

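That listing step might be sketched as follows. The client is passed in as a parameter (a design choice to keep the function easy to test), and a paginator is used because a single `list_objects_v2` call returns at most 1,000 keys, far too few for a prefix holding 80 TB of files:

```python
def list_source_objects(client, source_bucket: str, source_prefix: str):
    """Yield every object listed under source_prefix in source_bucket.

    client is a boto3.client("s3"); the paginator transparently issues
    repeated list_objects_v2 calls until all keys have been returned.
    """
    paginator = client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=source_bucket, Prefix=source_prefix):
        # "Contents" is absent from a page when nothing matches the prefix.
        yield from page.get("Contents", [])
```

With a real client this becomes `contents = list(list_source_objects(boto3.client("s3"), source_bucket, source_prefix))`.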

where source_bucket is the name of the bucket and source_prefix is the name of the folder.
Next, iterate over the contents and, for each item, call the s3.meta.client.copy method, like so:


Each entry in the contents is a dictionary, so you must use ‘Key’ to get the name of the item, and use split to break it into a prefix and a file name.

User contributions licensed under: CC BY-SA