Copying existing files in an S3 bucket to another S3 bucket

0 votes

I have an existing S3 bucket which contains a large number of files. I want to run a Lambda function every minute and copy those files to another destination S3 bucket.

My function is:

import boto3

s3 = boto3.resource('s3')
s3_client = boto3.client('s3')

def lambda_handler(event, context):
    bucket = 'test-bucket-for-transfer-check'
    try:
        # List up to 5 objects from the source bucket.
        response = s3_client.list_objects(
            Bucket=bucket,
            MaxKeys=5
        )

        for record in response['Contents']:
            key = record['Key']
            copy_source = {
                'Bucket': bucket,
                'Key': key
            }
            try:
                # Copy each object to the destination bucket under the same key.
                destbucket = s3.Bucket('serverless-demo-s3-bucket')
                destbucket.copy(copy_source, key)
                print('{} transferred to destination bucket'.format(key))

            except Exception as e:
                print(e)
                print('Error copying object {} from bucket {}.'.format(key, bucket))
                raise e
    except Exception as e:
        print(e)
        raise e

Now how can I make sure the function copies new files each time it runs?

Sep 14, 2018 in AWS by bug_seeker
• 15,510 points
15,001 views

1 answer to this question.

0 votes

Suppose the two buckets in question are Bucket-A and Bucket-B,
and the task to be done is to copy files from Bucket-A --> Bucket-B.

  1. Lambda-1 (the code in the question) reads the files from Bucket-A and copies them one by one to Bucket-B in a loop.
  2. In the same loop it also puts one entry per file into a DynamoDB table, say "Copy_Logs", with columns File_Key and Flag, where File_Key is the object key of the file and Flag is set to false to record the state of the copy operation (a sketch of this follows the list).
  3. Now configure events on Bucket-B to invoke a Lambda-2 on every put and every multipart upload (see the wiring sketch below).
  4. Lambda-2 then reads the object key from the S3 notification payload and updates the respective record in the DynamoDB table, setting Flag to true (handler sketched at the end of this answer).
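
A minimal sketch of steps 1 and 2, assuming a DynamoDB table named Copy_Logs with a string partition key File_Key already exists (the table and column names come from the steps above; the bucket names are the ones from the question):

import boto3

s3 = boto3.resource('s3')
s3_client = boto3.client('s3')
# Assumes an existing DynamoDB table 'Copy_Logs' with partition key 'File_Key'.
copy_logs = boto3.resource('dynamodb').Table('Copy_Logs')

def lambda_handler(event, context):
    source_bucket = 'test-bucket-for-transfer-check'
    dest_bucket = s3.Bucket('serverless-demo-s3-bucket')

    response = s3_client.list_objects(Bucket=source_bucket, MaxKeys=5)
    for record in response.get('Contents', []):
        key = record['Key']
        # Step 1: copy the object to Bucket-B under the same key.
        dest_bucket.copy({'Bucket': source_bucket, 'Key': key}, key)
        # Step 2: log the copy with Flag=False; Lambda-2 flips it to True
        # once the object actually lands in Bucket-B.
        copy_logs.put_item(Item={'File_Key': key, 'Flag': False})

Writing Flag as false first means any row still showing false after a run marks a file whose copy was never confirmed.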

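For step 3, the notification can be configured from the S3 console or programmatically. A sketch using boto3, where the Lambda ARN is a placeholder; note that this call replaces the bucket's entire existing notification configuration, and S3 must separately be granted permission to invoke the function:

import boto3

s3_client = boto3.client('s3')

# Invoke Lambda-2 whenever an object is put, or a multipart upload
# completes, in Bucket-B. The function ARN below is a placeholder.
s3_client.put_bucket_notification_configuration(
    Bucket='serverless-demo-s3-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:Lambda-2',
            'Events': [
                's3:ObjectCreated:Put',
                's3:ObjectCreated:CompleteMultipartUpload'
            ]
        }]
    }
)
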
Now the DynamoDB table "Copy_Logs" tells you which files were copied successfully and which were not.
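
For step 4, a sketch of the Lambda-2 handler. S3 URL-encodes object keys in notification payloads, hence the unquote_plus call:

import boto3
from urllib.parse import unquote_plus

# Same table Lambda-1 writes to.
copy_logs = boto3.resource('dynamodb').Table('Copy_Logs')

def lambda_handler(event, context):
    # One notification can carry several records.
    for record in event['Records']:
        # S3 URL-encodes keys in the notification payload.
        key = unquote_plus(record['s3']['object']['key'])
        # Mark this file's copy as confirmed.
        copy_logs.update_item(
            Key={'File_Key': key},
            UpdateExpression='SET #f = :t',
            ExpressionAttributeNames={'#f': 'Flag'},
            ExpressionAttributeValues={':t': True}
        )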

answered Sep 14, 2018 by Priyaj
• 58,020 points
