Simple way to migrate S3 buckets across AWS accounts

Is it possible to migrate S3 buckets across AWS accounts?

Yeah, this can be done simply from your Linux machine. You can use the AWS CLI command “aws s3” to migrate S3 buckets across AWS accounts.

Prerequisite

IAM user(s) with privileges to access the S3 buckets.

How to migrate S3 buckets across AWS accounts?

This can be done in different ways. Here I am going to explain two simple methods.
You can copy the contents of one S3 bucket to another using the “aws s3 cp” command.

Method I

Copy the content to an EC2 instance, then upload it to the destination bucket.

Hopefully you know that you cannot create a bucket with the same name in the destination account while it still exists in the source account, because S3 bucket names are unique across the entire S3 namespace, across all accounts. So you first copy the contents to a temporary bucket in the destination account. Once the contents have been moved, you can simply remove the S3 bucket from the source account, recreate it in the destination AWS account, and then move the contents over from the temporary bucket. Cool!!

Step 1: Create IAM users on the source and destination AWS accounts with read and write access to the S3 buckets.
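
If you prefer to create the users from the CLI, here is a minimal sketch. The user name s3-migration-user and the broad AmazonS3FullAccess managed policy are only examples; repeat this on both accounts and scope the policy down to the specific buckets for production use.

# create the user, attach S3 permissions and generate access keys for awscli
aws iam create-user --user-name s3-migration-user
aws iam attach-user-policy --user-name s3-migration-user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name s3-migration-user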

Step 2: Create an EC2 instance to copy the content from the source bucket. You can also do this from any Linux machine, even your own Linux PC. However, I suggest using an instance in the same AWS account (with good network speed) to speed up the migration process.

Step 3: Configure those IAM users on your Linux instance.

How to configure an IAM user in awscli?

You can simply configure awscli with the IAM user's credentials by executing the command aws configure.

The above command prompts for four values: the AWS Access Key ID of the IAM user, the AWS Secret Access Key of the IAM user, the default region name and the default output format. Read more…
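
Since this method needs credentials for both accounts on the same machine, you can keep the source user as the default profile and configure the destination user as a named profile (user2 here, matching the commands used later):

aws configure                    # source account IAM user (default profile)
aws configure --profile user2    # destination account IAM user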

Step 4: Check the permissions of the IAM user.

You can check this by listing buckets from your Linux instance.

aws s3 ls
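
If you also configured the destination IAM user as a named profile, verify it the same way:

aws s3 ls --profile user2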

Step 5: Copy the contents from the source bucket to the local EC2 instance. You can use the following command:

aws s3 cp s3://mybucket /path/to/local/ --recursive

Run the above command inside a screen session so that the transfer keeps running even if your SSH connection drops.
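
For example, assuming screen is installed on the instance:

screen -S s3-migration          # start a named screen session
aws s3 cp s3://mybucket /path/to/local/ --recursive
# detach with Ctrl-a d; reattach later with: screen -r s3-migration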

Step 6: After the download has completed, start copying from the local folder to the destination S3 bucket.

aws s3 cp /path/to/local/ s3://mybucket --recursive --profile user2

Here user2 is the awscli profile of the IAM user who has access to the destination S3 bucket.

Step 7: Cross-check that all the content has migrated correctly to the destination bucket.

You can use the following commands to verify that the files were copied correctly to the destination bucket:

Source bucket:

aws s3 ls --summarize --human-readable --recursive s3://bucket-name | tail -n 2

Destination bucket:

aws s3 ls --summarize --human-readable --recursive s3://bucket-name --profile user2 | tail -n 2
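
Each listing ends with the total object count and total size; these should match on both buckets. The output looks something like this (the numbers here are only illustrative):

Total Objects: 1240
   Total Size: 5.6 GiB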

Method II

By using a single IAM user which has access to both the source and destination S3 buckets. Please see the steps below:

Step 1: Create an IAM user on the source AWS account with privileges to access its S3 buckets.

Step 2: Give that IAM user access to the destination account's S3 bucket.

How to give permission to an IAM user on a remote S3 bucket?

You can give an IAM user from another AWS account permission by adding a custom bucket policy to the destination bucket. You can use a policy like the following (replace the account ID, IAM user name and bucket name with your own):

----
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE-ACCOUNT-ID:user/iam-user-name"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ]
        }
    ]
}
----
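
You can attach this policy from the S3 console of the destination account, or with the AWS CLI using destination-account credentials. Assuming the JSON above is saved locally as policy.json:

aws s3api put-bucket-policy --bucket bucket-name --policy file://policy.json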

Step 3: Check that the IAM user has access to both buckets.

You can check this by listing buckets from your Linux instance.

aws s3 ls s3://source-bucket
aws s3 ls s3://destination-bucket

Step 4: Start copying process

aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive

You can also use the “sync” subcommand to copy/sync content between buckets. Note that sync is recursive by default, so it does not need the --recursive flag:

aws s3 sync s3://source-bucket/ s3://destination-bucket/
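
One thing to watch out for: when a user from the source account uploads into a bucket owned by the destination account, the copied objects may still be owned by the source account unless the destination bucket uses the “Bucket owner enforced” Object Ownership setting. Adding the bucket-owner-full-control canned ACL to the copy avoids that; this option needs the s3:PutObjectAcl permission on the destination bucket, which the bucket policy above already includes.

aws s3 sync s3://source-bucket/ s3://destination-bucket/ --acl bucket-owner-full-control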

After the migration has completed, remove the source bucket, create a bucket with the same name in the destination account, move the contents over from the temporary bucket, and then remove the temporary bucket.
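
Here is a rough sketch of that final swap, assuming destination-account credentials are configured as a user2 profile (as in Method I) and using placeholder bucket names. Only delete the source bucket after you have verified the copy, and note that it can take a little while before a deleted bucket name becomes available again:

aws s3 rb s3://bucket-name --force                        # delete the source bucket and all of its contents
aws s3 mb s3://bucket-name --profile user2                # recreate the same name in the destination account
aws s3 sync s3://temporary-bucket/ s3://bucket-name/ --profile user2
aws s3 rb s3://temporary-bucket --force --profile user2   # remove the temporary bucket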

That’s it!!!!
