Dump Mongo DB and move it to an S3 bucket.

Requirement: Create a script that takes a MongoDB dump and moves the dump to an AWS S3 bucket.

Prerequisites: SSH access to the MongoDB server, an IAM user with AWS S3 full [or write] access, aws-cli installed on the server, and familiarity with the Mongo commands used for dump creation.

Since we need to move the dump to an S3 bucket, the first step is configuring the IAM user; only then can we push the dump to the bucket. To configure the IAM credentials, you need the aws-cli tool installed on the machine. The document added below will help you with that:

How to install AWS command line interface (awscli) on Linux?

Yeah, this should help you manage many things from your Linux server or local machine. AWS provides a command line interface tool; its package name is awscli. The awscli package is available in commonly used package managers like YUM, APT, APT-GET, etc.

However, it's not recommended to install it using YUM or APT, because the packaged version is not guaranteed to be the latest; installing via pip is. Read More….
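Once aws-cli is installed, store the IAM user's access keys as a named profile. A minimal sketch, assuming your IAM access keys are already created and that you name the profile user2 (the same profile name used in the script further down):

aws configure --profile user2
# AWS Access Key ID [None]: <your-access-key-id>
# AWS Secret Access Key [None]: <your-secret-access-key>
# Default region name [None]: <your-region>
# Default output format [None]: json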

What’s the Mongo command to create a dump?

This is the core command we're going to use here:

mongodump --db <db-name> -o /directory/to/dump

Example

mongodump --db crybit -o /backup

What command do you use if Mongo is listening on a non-default port?

If Mongo is not listening on the default port, you need to specify that port in every Mongo command. For example, if your Mongo daemon is listening on port 27057, you need to mention the port explicitly.

This can be done by adding the "--port" switch to the normal command.

Example:

mongodump --port 27057 --db <db-name> -o /directory/to/dump

How to check the port associated with the Mongo daemon?

From telnet or from the Mongo configuration file.
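For example, a quick check could look like this (assuming the default config path /etc/mongod.conf and the default port 27017; netstat is just another way to confirm what the daemon is actually bound to):

# Check the port set in the config file
grep -i port /etc/mongod.conf

# Confirm what the mongod daemon is actually listening on
netstat -plnt | grep mongod

# Or test a specific port directly
telnet 127.0.0.1 27017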

Once you're ready with these things, the next step is moving the dump to an S3 bucket. You can use the "aws s3 sync" or "aws s3 cp" command to copy files from the server to the bucket.
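For example (the bucket name, folder, and profile below are just the ones used in my script; replace them with yours, and <db-dump-dir> is a placeholder for the dump directory mongodump created):

# Sync the whole local dump directory to the bucket
aws s3 sync /data/database-backup/ s3://my-db-backup-bucket/full-db-backup --profile user2

# Or copy a single dump directory recursively
aws s3 cp /data/database-backup/<db-dump-dir> s3://my-db-backup-bucket/full-db-backup/ --recursive --profile user2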

Next, think it through and create a script. That will definitely help you next time.

I'd like to share the script I created for this purpose. You can edit and use it as per your requirements.

#!/bin/bash
# This will create a full db dump and store it in the s3 bucket my-db-backup-bucket under the folder full-db-backup
# You can modify all variables as you wish
# Here we are writing the dump to /data/database-backup/, so make sure that directory exists.
# Here we're not keeping dumps locally.

echo "Enter DB name: "
read db

echo "Removing current backups from /data/database-backup/"
rm -rf /data/database-backup/*

echo "Creating db dump in /data/database-backup/"
mongodump --db "$db" -o "/data/database-backup/$db-$(date +%F:%R)"

if [ $? -ne 0 ]; then
 echo "Something went wrong!!"
else
 echo "All set"
 echo "Moving backup to the s3 bucket my-db-backup-bucket"
 nohup aws s3 sync /data/database-backup/ s3://my-db-backup-bucket/full-db-backup --profile user2 &
 echo "Done!"
fi

Here the IAM user is configured as the profile user2.

Change the code as you wish and dump painlessly.
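For instance, if you save it as mongo-s3-backup.sh (the name is just an example), you can run it manually; if you want to schedule it with cron, hard-code the db name in place of the read prompt first, since cron jobs can't answer the interactive prompt:

chmod +x mongo-s3-backup.sh
./mongo-s3-backup.sh

# Example cron entry: run the backup daily at 02:00
0 2 * * * /path/to/mongo-s3-backup.sh >> /var/log/mongo-s3-backup.log 2>&1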

Also read:

Ways to monitor Prometheus exporters?

Of course, we need monitoring for every bit of our infrastructure, and this bit is a critical one: it collects your infrastructure metrics, which you'll use for analysing system performance. So the data in your Prometheus is very critical. Read more: https://www.crybit.com/ways-to-monitor-prometheus-exporters/

