BASH – back up sites and MySQL databases to AWS S3

The script finds all folders in a given directory, archives them one by one, copies each archive to an S3 bucket and deletes it locally. It also retrieves the list of MySQL databases, excluding the system databases, dumps and compresses each one in turn, uploads it to the same S3 bucket and deletes the local copy.

#!/bin/bash

DATE=$(date +%d-%m-%Y)
WEB_PATH="/var/www/html" # Directory that contains one folder per site

SITE_LIST="$(ls "$WEB_PATH")"
# All databases, excluding the "Database" header line and the system databases
DB_LIST="$(mysql -u root -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)")"
S3_BUCKET="s3://artem-services"

################################ BACKUP SITES ################################

for SITE in $SITE_LIST
do
	if [ -f "/tmp/$SITE.tar.gz" ] # Check if there is an old archive left
	then
		rm "/tmp/$SITE.tar.gz"
	fi

	# Archive the site folder, upload it to the bucket and remove the local copy
	tar -zcvf "/tmp/$SITE.tar.gz" --directory="$WEB_PATH/$SITE/" ./
	aws s3 cp "/tmp/$SITE.tar.gz" "$S3_BUCKET/$DATE/SITE/$SITE.tar.gz"
	rm "/tmp/$SITE.tar.gz"
done

############################### BACKUP DATABASES ##############################

for DB in $DB_LIST
do
	if [ -f "/tmp/$DB.gz" ] # Check if there is an old archive left
	then
		rm "/tmp/$DB.gz"
	fi

	# Dump and compress the database, upload it to the bucket and remove the local copy
	mysqldump -u root "$DB" | gzip -c > "/tmp/$DB.gz"
	aws s3 cp "/tmp/$DB.gz" "$S3_BUCKET/$DATE/DATABASE/$DB.gz"
	rm "/tmp/$DB.gz"
done
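
To restore from one of these backups, the archive just needs to be copied back from the bucket and unpacked. The sketch below is a minimal example, assuming a hypothetical backup date, site name (mysite) and database name (mydb); the bucket and path layout match the script above.

#!/bin/bash

DATE="01-01-2024" # Hypothetical backup date in the same %d-%m-%Y format
S3_BUCKET="s3://artem-services"

# Restore a site: download the archive and unpack it into /var/www/html/mysite
aws s3 cp "$S3_BUCKET/$DATE/SITE/mysite.tar.gz" /tmp/mysite.tar.gz
mkdir -p /var/www/html/mysite
tar -zxvf /tmp/mysite.tar.gz --directory=/var/www/html/mysite/

# Restore a database: download the compressed dump and load it into MySQL
aws s3 cp "$S3_BUCKET/$DATE/DATABASE/mydb.gz" /tmp/mydb.gz
mysql -u root -e "CREATE DATABASE IF NOT EXISTS mydb;"
gunzip -c /tmp/mydb.gz | mysql -u root mydb

To run the backup automatically, the script can be scheduled with cron, for example once a day at 02:00 (the script path here is only an assumption):

0 2 * * * /opt/scripts/backup-to-s3.sh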
