I put together a couple of lines for backing up a MySQL database and the directories with the web (HTTP) files.
First, we make a MySQL dump of the zabbix database (the --ignore-table switches exclude the bulky history and trends tables, which are unnecessary in a backup and can occupy gigabytes) and compress it:
mysqldump --ignore-table=zabbix.history --ignore-table=zabbix.history_uint --ignore-table=zabbix.trends --ignore-table=zabbix.trends_uint -u USERNAME -h localhost -pPASSWORD zabbix | gzip -c > /backups/zabbix_`date +%Y-%m-%d`.sql.gz
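For reference, restoring such a dump later is just a matter of piping it back into mysql (the file name below is an example; substitute the actual date):
gunzip -c /backups/zabbix_YYYY-MM-DD.sql.gz | mysql -u USERNAME -pPASSWORD zabbix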
The second step is to archive the Zabbix web frontend files:
tar -cvjf /backups/`date +%Y-%m-%d`_zabbix.tar.bz2 /usr/share/zabbix/
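Since tar strips the leading "/" from member names when creating the archive, such an archive can later be unpacked back to its original location by extracting from the root (file name again an example):
tar -xvjf /backups/YYYY-MM-DD_zabbix.tar.bz2 -C /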
Instead of the local /backups/ directory, you can mount a network share and write the backups to it.
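For example, an NFS share could be mounted like this (the server address and export path are placeholders for your own setup):
mount -t nfs 192.168.1.100:/export/backups /backups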
Both lines can be added to /etc/crontab so that the backups run daily. Alternatively, put them into a script, make the script executable, and specify the path to it in /etc/crontab, for example to run every day at 3 AM:
0 3 * * * root /backups/script.sh > /dev/null 2>&1
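Such a script might look like this (a minimal sketch; the path /backups/script.sh, USERNAME and PASSWORD are placeholders carried over from the commands above):
#!/bin/bash
# Back up the zabbix database and web frontend files.
DEST=/backups
DATE=$(date +%Y-%m-%d)
# Dump the database, skipping the bulky history/trends tables, and compress it
mysqldump --ignore-table=zabbix.history --ignore-table=zabbix.history_uint --ignore-table=zabbix.trends --ignore-table=zabbix.trends_uint -u USERNAME -h localhost -pPASSWORD zabbix | gzip -c > "$DEST/zabbix_$DATE.sql.gz"
# Archive the web frontend files
tar -cjf "$DEST/${DATE}_zabbix.tar.bz2" /usr/share/zabbix/
Make it executable with chmod +x /backups/script.sh, and the cron entry above will run it as root every night.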
See also my articles:
Using and configuring CRON
Script to delete old files