I have an existing S3 bucket that contains a large number of files. I want to run a Lambda function every minute and copy those files to another destination S3 bucket.
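To trigger the function every minute I'm planning on an EventBridge (CloudWatch Events) schedule rule. A rough sketch of that wiring in boto3 is below; the rule name and function ARN are just placeholders, not values from my actual setup:

    import boto3

    events = boto3.client('events')
    lambda_client = boto3.client('lambda')

    # Placeholder name/ARN, substitute your own
    RULE_NAME = 'copy-files-every-minute'
    FUNCTION_ARN = 'arn:aws:lambda:us-east-1:123456789012:function:copy-files'

    # Schedule rule that fires once per minute
    rule = events.put_rule(
        Name=RULE_NAME,
        ScheduleExpression='rate(1 minute)',
        State='ENABLED'
    )

    # Allow EventBridge to invoke the Lambda function
    lambda_client.add_permission(
        FunctionName=FUNCTION_ARN,
        StatementId='allow-eventbridge-invoke',
        Action='lambda:InvokeFunction',
        Principal='events.amazonaws.com',
        SourceArn=rule['RuleArn']
    )

    # Point the rule at the Lambda function
    events.put_targets(
        Rule=RULE_NAME,
        Targets=[{'Id': 'copy-lambda', 'Arn': FUNCTION_ARN}]
    )
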
My function is:
import boto3

s3 = boto3.resource('s3')
clientname = boto3.client('s3')

def lambda_handler(event, context):
    bucket = 'test-bucket-for-transfer-check'
    try:
        # List up to 5 objects from the source bucket
        response = clientname.list_objects(
            Bucket=bucket,
            MaxKeys=5
        )
        for record in response['Contents']:
            key = record['Key']
            copy_source = {
                'Bucket': bucket,
                'Key': key
            }
            try:
                # Copy each object into the destination bucket under the same key
                destbucket = s3.Bucket('serverless-demo-s3-bucket')
                destbucket.copy(copy_source, key)
                print('{} transferred to destination bucket'.format(key))
            except Exception as e:
                print(e)
                print('Error copying object {} from bucket {}.'.format(key, bucket))
                raise e
    except Exception as e:
        print(e)
        raise e
Now, how can I make sure the function copies only new files each time it runs?
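For example, would deleting (or otherwise marking) each source object after a successful copy be the right approach, so that the next run's list_objects call only returns files that still need transferring? A rough, untested sketch of what I mean is below; copy_and_remove is just an illustrative helper reusing my bucket names:

    import boto3

    s3 = boto3.resource('s3')
    clientname = boto3.client('s3')

    def copy_and_remove(key,
                        source_bucket='test-bucket-for-transfer-check',
                        dest_bucket='serverless-demo-s3-bucket'):
        # Copy one object to the destination bucket under the same key
        copy_source = {'Bucket': source_bucket, 'Key': key}
        s3.Bucket(dest_bucket).copy(copy_source, key)
        # Deleting the source object acts as the "already processed" marker,
        # so the next scheduled run no longer lists it; an alternative would be
        # tagging the object or recording the key in a DynamoDB table.
        clientname.delete_object(Bucket=source_bucket, Key=key)
        print('{} transferred and removed from source bucket'.format(key))

Is that a reasonable way to do it, or is there a better pattern (for instance, S3 event notifications instead of polling on a schedule)?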