Sharing scheduled website data backup scripts (keeping the most recent 30 backups)

Posted by 散盡浮華 on 2016-12-07

Backups are one of the most basic day-to-day tasks in operations, and doing them well is a key part of keeping systems stable. Below are two simple backup scripts I have used:

1) Website data backup
Back up the site data under /var/www/vhosts/www.kevin.com and /var/www/vhosts/www.grace.com to
/Data/code-backup/www.kevin.com and /Data/code-backup/www.grace.com respectively.

[root@huanqiu_web5 code-backup]# cat web_code_backup.sh
#!/bin/bash
  
# Back up the website data
/bin/tar -zvcf /Data/code-backup/www.kevin.com/www.kevin.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.kevin.com
/bin/tar -zvcf /Data/code-backup/www.grace.com/www.grace.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.grace.com
  
# Delete backup files older than one week
find /Data/code-backup/www.kevin.com -type f -mtime +7 -exec rm -f {} \;
find /Data/code-backup/www.grace.com -type f -mtime +7 -exec rm -f {} \;
  
[root@huanqiu_web5 ~]# crontab -l
# Back up website data at 5:00 a.m. every day
0 5 * * * /bin/bash -x /Data/code-backup/web_code_backup.sh > /dev/null 2>&1
 
The resulting backups look like this:
[root@huanqiu_web5 ~]# ls /Data/code-backup/www.kevin.com/
www.kevin.com_20170322_174328.tar.gz
[root@xqsj_web5 ~]# ls /Data/code-backup/www.grace.com/
www.grace.com_20170322_174409.tar.gz
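The script above prunes by age (anything older than 7 days). To match the article's stated goal of keeping the most recent 30 backups regardless of age, the backup-and-prune step could be folded into a small helper like the sketch below; the function name and the retention count are assumptions, not part of the original script.

```shell
#!/bin/bash
# Hypothetical helper: archive $src into $dest, then keep only the $keep
# newest archives. The paths in the commented call mirror the script above.
backup_and_prune() {
    local src="$1" dest="$2" keep="$3"
    mkdir -p "$dest"
    tar -zcf "${dest}/$(basename "$src")_$(date +%Y%m%d_%H%M%S).tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
    # List archives newest-first and remove everything after the first $keep.
    ls -1t "$dest"/*.tar.gz 2>/dev/null | tail -n +"$((keep + 1))" | xargs -r rm -f
}

# backup_and_prune /var/www/vhosts/www.kevin.com /Data/code-backup/www.kevin.com 30
```

Unlike pruning by mtime, this keeps a fixed number of copies even if the cron job stops running for a while.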

2) Database backup (automatically deleting backup files older than 10 days)
The database service is an Aliyun MySQL instance. We take scheduled full backups of the remote database and store them locally, just in case. Remote MySQL dumps are best packed and compressed.

[root@huanqiuPC crontab]# pwd
/Data/Mysql_Bakup/crontab
[root@huanqiuPC crontab]# cat backup_db_wangshibo.sh
#!/bin/bash
MYSQL="/usr/bin/mysql"
MYSQLDUMP="/usr/bin/mysqldump"
BACKUP_DIR="/Data/Mysql_Bakup"
#DB_SOCKET="/var/lib/mysql/mysql.sock"
DB_hostname="110.120.11.9"
DBNAME="wangshibo"
DB_USER="db_wangshibo"
DB_PASS="mhxzk3rfzh"
TIME=`date +%Y%m%d%H%M%S`
LOCK_FILE="${BACKUP_DIR}/lock_file.tmp"
BKUP_LOG="${BACKUP_DIR}/${TIME}_bkup.log"
DEL_BAK=`date -d '10 days ago' '+%Y%m%d'`
## Exit if another run still holds the lock file
if [[ -f $LOCK_FILE ]];then
    exit 255
else
    echo $$ > $LOCK_FILE
fi

##dump databases##
echo ${TIME} >> ${BKUP_LOG}
echo "=======Start Bakup============" >>${BKUP_LOG}
${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} |gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
echo "=======Finished Bakup============" >>${BKUP_LOG}
/bin/rm -f ${LOCK_FILE}

##del back 10 days before##
/bin/rm -f ${BACKUP_DIR}/${DEL_BAK}*.gz
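One caveat with the lock handling above: if the script is killed between creating and removing the lock file, the stale lock blocks every later run. A sketch of the same guard with an EXIT trap, so the lock is removed on any exit path (the lock path here is illustrative):

```shell
#!/bin/bash
# Same lock-file guard as the script above, plus an EXIT trap so the lock
# is cleaned up even when the dump step fails part-way through.
LOCK_FILE="${LOCK_FILE:-$(mktemp -u /tmp/bkup_lock.XXXXXX)}"
if [[ -f $LOCK_FILE ]]; then
    exit 255
fi
echo $$ > "$LOCK_FILE"
trap 'rm -f "$LOCK_FILE"' EXIT
# ... the mysqldump work would go here; the lock disappears on any exit.
```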

Run the backup on a schedule:

[root@huanqiuPC Mysql_Bakup]# crontab -l
10 0,6,12,18 * * * /bin/bash /Data/Mysql_Bakup/crontab/backup_db_wangshibo.sh >/dev/null 2>&1

After the script runs, the backup looks like this:

[root@huanqiuPC crontab]# cd /Data/Mysql_Bakup
[root@huanqiuPC Mysql_Bakup]# ls
20161202061001.wangshibo.gz

Syncing the production database to the beta environment (overwriting the beta database):
Copy the scheduled backup archive above to the beta machine, unpack it, log in to MySQL, and overwrite the beta database manually with the source command.
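Those manual steps can be scripted. A minimal sketch, assuming the compressed dump has already been copied to the beta machine; the helper name, MYSQL_CMD override, and connection flags are placeholders, not from the article:

```shell
#!/bin/bash
# Hypothetical restore helper: decompress a dump and feed it to the mysql
# client, equivalent to running `source` inside the client. MYSQL_CMD and
# the default connection flags are placeholders.
restore_dump() {
    local dump_gz="$1"
    local mysql_cmd="${MYSQL_CMD:-mysql -h127.0.0.1 -uroot -p}"
    gunzip -c "$dump_gz" | $mysql_cmd
}

# restore_dump /tmp/20161202061001.wangshibo.gz
```

Piping through gunzip -c avoids keeping an uncompressed copy of the dump on disk.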

Another example

[root@backup online_bak]# cat rsync.sh      (the sync in this script is rate-limited to 3 MB/s and keeps roughly the last month of backups)
#!/bin/bash

# ehr data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.27/tomcat_data/
NUM1=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I1=$( /usr/bin/expr $NUM1 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I1 p"|xargs rm -rf

# zp data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/data/tomcat8/webapps /data/bak/online_bak/192.168.34.33/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.33/tomcat_data/
NUM2=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I2=$( /usr/bin/expr $NUM2 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I2 p"|xargs rm -rf

cd /data/bak/online_bak/192.168.34.33/upload
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/home/zrx_hr/upload /data/bak/online_bak/192.168.34.33/upload/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.34.33/upload
NUM3=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I3=$( /usr/bin/expr $NUM3 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I3 p"|xargs rm -rf

# zabbix mysql backup----------------------------------------------------------
/bin/mkdir /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`
/data/mysql/bin/mysqldump -hlocalhost -uroot -pBKJK-@@@-12345 --databases zabbix > /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`/zabbix.sql

cd /data/bak/online_bak/192.168.16.21/mysql_data/
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`

cd /data/bak/online_bak/192.168.16.21/mysql_data/
NUM4=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I4=$( /usr/bin/expr $NUM4 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I4 p"|xargs rm -rf

[root@backup online_bak]# pwd
/data/bak/online_bak
[root@backup online_bak]# ls
192.168.16.21    rsync.sh
192.168.34.27  192.168.34.33  
[root@backup online_bak]# ll
total 10K
drwxr-xr-x   3 root root   23 Aug 19 17:47 192.168.16.21
drwxr-xr-x   4 root root   41 Aug 19 18:30 192.168.34.27
drwxr-xr-x   4 root root   37 Aug 19 18:17 192.168.34.33
-rwxr-xr-x   1 root root 6.3K Aug 19 19:20 rsync.sh

[root@backup online_bak]# ll 192.168.16.21/
total 4.0K
drwxr-xr-x  2 root root   28 Aug 19 19:43 mysql_data

[root@backup online_bak]# ll 192.168.16.21/mysql_data/
total 1.5G
-rw-r--r-- 1 root root 1.5G Aug 19 19:43 20170819.tar.gz

[root@backup online_bak]# ll 192.168.34.27
total 4.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data

[root@backup online_bak]# ll 192.168.34.27/tomcat_data/
total 3.9G
......
-rw-r--r-- 1 root root 140M Aug 19 11:06 20170818.tar.gz
-rw-r--r-- 1 root root 140M Aug 19 19:26 20170819.tar.gz

[root@backup online_bak]# ll 192.168.34.33
total 8.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data
drwxr-xr-x  2 root root   28 Aug 19 19:30 upload

[root@backup online_bak]# crontab -l
# online backup
0 2 * * * /bin/bash -x /data/bak/online_bak/rsync.sh > /dev/null 2>&1
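rsync.sh repeats the same rsync → tar → prune sequence four times. One possible refactor is to factor it into a function; the function names are assumptions, while the SSH port, bandwidth limit, and 30-archive retention mirror the original script (note the original prunes with `grep 2017`, which breaks at year rollover, so the sketch sorts by mtime instead):

```shell
#!/bin/bash
# Keep only the $2 newest *.tar.gz archives in directory $1.
prune_archives() {
    ls -1t "$1"/*.tar.gz 2>/dev/null | tail -n +"$(($2 + 1))" | xargs -r rm -f
}

# Hypothetical refactor of the repeated block in rsync.sh: pull one remote
# directory, archive it by date, delete the working copy, prune old archives.
sync_and_archive() {
    local remote="$1" local_dir="$2" keep="${3:-30}"
    local day; day=$(date +%Y%m%d)
    mkdir -p "$local_dir" && cd "$local_dir" || return 1
    rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 "$remote" "$day"
    tar -zcf "${day}.tar.gz" "$day" && rm -rf "$day"
    prune_archives "$local_dir" "$keep"
}

# sync_and_archive 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data
```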


To sort a directory's entries by modification time and take the most recently modified one:
[work@qd-op-comm01 xcspam]$ ls
bin                    xcspam-20170802145542  xcspam-20170807204545  xcspam-20170814115753  xcspam-20170818115806  xcspam-20170824162641  xcspam-20170831173616  
xcspam                 xcspam-20170802194447  xcspam-20170808163425  xcspam-20170815191150  xcspam-20170821122949  xcspam-20170824165020  xcspam-20170831191347
xcspam-20170731154018  xcspam-20170803113809  xcspam-20170808195340  xcspam-20170815210032  xcspam-20170821153300  xcspam-20170829100941  xcspam-20170904105109
xcspam-20170801190647  xcspam-20170807150022  xcspam-20170809103648  xcspam-20170816141022  xcspam-20170822173600  xcspam-20170831135623  xcspam-20170911120519
xcspam-20170802142921  xcspam-20170807164137  xcspam-20170809111246  xcspam-20170816190704  xcspam-20170823101913  xcspam-20170831160115  xcspam-20170911195802
[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -1
xcspam-20170911195802

[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -2|head -1   // the second most recently modified entry
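`ls -rtd … | tail -1` relies on ls sorting by mtime. An equivalent sketch using find (GNU findutils assumed), which also copes with names containing spaces:

```shell
# Print the most recently modified xcspam* entry in the current directory.
# GNU find's -printf is assumed; %T@ is the mtime as a Unix timestamp.
latest=$(find . -maxdepth 1 -name 'xcspam*' -printf '%T@ %p\n' \
         | sort -n | tail -1 | cut -d' ' -f2-)
echo "$latest"
```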


Automatically delete backups beyond the most recent 30, i.e. keep only the 30 newest backups. The script below can serve as a general-purpose script:

[root@qw-backup01 caiwu]# cat delete_30days_before.sh 
#!/bin/bash

cd `pwd`

NUM=`ls -l|awk '{print $9}'|wc -l`
I=$( /usr/bin/expr $NUM - 31 )
ls -l|awk '{print $9}'|sed -n "1,$I p"|xargs rm -rf

[root@qw-backup01 caiwu]# ls
201901100.des3  20190141.des3  20190150.des3  20190159.des3  20190168.des3  20190177.des3  20190186.des3  20190195.des3
20190133.des3   20190142.des3  20190151.des3  20190160.des3  20190169.des3  20190178.des3  20190187.des3  20190196.des3
20190134.des3   20190143.des3  20190152.des3  20190161.des3  20190170.des3  20190179.des3  20190188.des3  20190197.des3
20190135.des3   20190144.des3  20190153.des3  20190162.des3  20190171.des3  20190180.des3  20190189.des3  20190198.des3
20190136.des3   20190145.des3  20190154.des3  20190163.des3  20190172.des3  20190181.des3  20190190.des3  20190199.des3
20190137.des3   20190146.des3  20190155.des3  20190164.des3  20190173.des3  20190182.des3  20190191.des3  delete_30days_before.sh
20190138.des3   20190147.des3  20190156.des3  20190165.des3  20190174.des3  20190183.des3  20190192.des3
20190139.des3   20190148.des3  20190157.des3  20190166.des3  20190175.des3  20190184.des3  20190193.des3
20190140.des3   20190149.des3  20190158.des3  20190167.des3  20190176.des3  20190185.des3  20190194.des3

Run the script:
[root@qw-backup01 caiwu]# sh -x delete_30days_before.sh 
+ cd /data/backup/caiwu
++ ls -l
++ awk '{print $9}'
++ wc -l
+ NUM=70
++ /usr/bin/expr 70 - 31
+ I=39
+ ls -l
+ awk '{print $9}'
+ sed -n '1,39 p'
+ xargs rm -rf

Checking again, only the most recent 30 backups remain:
[root@qw-backup01 caiwu]# ls
20190170.des3  20190174.des3  20190178.des3  20190182.des3  20190186.des3  20190190.des3  20190194.des3  20190198.des3
20190171.des3  20190175.des3  20190179.des3  20190183.des3  20190187.des3  20190191.des3  20190195.des3  20190199.des3
20190172.des3  20190176.des3  20190180.des3  20190184.des3  20190188.des3  20190192.des3  20190196.des3  delete_30days_before.sh
20190173.des3  20190177.des3  20190181.des3  20190185.des3  20190189.des3  20190193.des3  20190197.des3
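The general-purpose script above counts entries with `ls -l | awk`, which sorts alphabetically and also counts ls's "total" header line. A variant that sorts by modification time instead, so "the most recent 30" holds even when names don't sort chronologically; the function name and retention value are assumptions, and note it would also delete the script itself if it lives in the same directory and is older than the 30 newest entries:

```shell
#!/bin/bash
# Keep only the $2 newest entries in directory $1; everything older is removed.
keep_newest() {
    local dir="$1" keep="$2"
    # ls -1t lists newest first; tail skips the first $keep entries.
    ls -1t "$dir" | tail -n +"$((keep + 1))" | while read -r name; do
        rm -rf "${dir:?}/${name}"
    done
}

# keep_newest /data/backup/caiwu 30
```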
