14
Is there any free tool, or even a script (.bat), that I can use to make an automatic backup of my database?
I would like the backup to be generated as a single .sql file.
13
I have a script that does just that: it backs up MariaDB/MySQL and all of PostgreSQL, and rsyncs the files to another location.
#!/bin/bash
BASE_DIR=/tmp/backups
TMP_DIR="$BASE_DIR/$(date +%Y%m%d-%H%M%S)-$RANDOM"
DEST_DIR=/mnt/backups/database
if [ -d "$TMP_DIR" ] ; then
    rm -rf "$TMP_DIR"
fi
mkdir -p "$TMP_DIR"
cd "$TMP_DIR"
# one compressed dump per server
pg_dumpall -U postgres | gzip > postgresql.sql.gz
mysqldump --all-databases -u root | gzip > mysql.sql.gz
# copy everything to the destination, if it exists
if [ -d "$DEST_DIR" ] ; then
    rsync -a "$BASE_DIR/" "$DEST_DIR"
fi
And this script is invoked once a day by cron:
$ crontab -l
@daily /usr/local/bin/backup
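Since both dumps in the script above are piped through gzip, it is worth checking that a copy is intact before relying on it. A minimal sketch, using a tiny fabricated "dump" for demonstration; in real use you would point `gunzip -t` at the files the script writes:

```shell
#!/bin/bash
# Create a tiny gzipped "dump" just for demonstration; in practice this
# would be the mysql.sql.gz / postgresql.sql.gz files the script produces.
printf 'CREATE TABLE t (id INT);\n' | gzip > mysql.sql.gz

# gzip stores a CRC, so -t catches truncated or corrupted archives
if gunzip -t mysql.sql.gz; then
    echo "dump OK"
else
    echo "dump corrupt" >&2
fi
```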
3
I am not a MySQL expert, but here is my answer; it is a little simpler and, I believe, more correct too.
To take a backup while the database is online, the binary log must be enabled.
Otherwise you will have integrity problems between tables during restoration (transactions that were in progress during the dump).
$ mysqldump --all-databases -F > /tmp/backup.mysql-full.sql
$ mysqladmin flush-logs
Then copy the binary logs generated during this process. To restore this backup, you must apply the dump and then the logs:
$ mysql < backup.mysql-full.sql
$ mysqlbinlog mysql.bin.XXX > dump.XX1.sql
$ mysql -f < dump.XX1.sql
$ mysqlbinlog mysql.bin.XX2 > dump.XX2.sql
$ mysql < dump.XXX.sql
# and so on... applying each binary log.
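The repetitive mysqlbinlog/mysql steps above can be wrapped in a loop. A hedged sketch (the function name is mine; the filenames follow the answer's examples, and the zero-padded binlog names sort correctly under the shell glob):

```shell
#!/bin/bash
# Sketch: load the full dump, then replay every binary log in order.
restore_full() {
    mysql < backup.mysql-full.sql
    for binlog in mysql.bin.*; do
        mysqlbinlog "$binlog" | mysql -f
    done
}
```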
I don’t remember why, but if I’m not mistaken the user grants are not saved in this dump, so you need to generate them manually too! I use this shell function, which I took from another question on Stack Overflow (in English):
mygrants()
{
mysql -B -N $@ -e "SELECT DISTINCT CONCAT(
'SHOW GRANTS FOR ''', user, '''@''', host, ''';'
) AS query FROM mysql.user" | \
mysql $@ | \
sed 's/\(GRANT .*\)/\1;/;s/^\(Grants for .*\)/## \1 ##/;/##/{x;p;x;}'
}
A dump does not include your my.cnf configuration file, so make sure to copy it manually as well!
I use the xtrabackup tool from Percona (free) to back up the database while it is online.
In that case, though, it is not a .sql backup; it is essentially a copy of the datafiles plus the binary logs.
I ran backup and restore tests and it worked perfectly.
It also performs much better than a dump, because it is a "raw" copy.
The disadvantage is that the data is not in text format.
For reference, I back up a database that runs the Zabbix monitoring tool, with 7 GB of data (the size of the mysql folder where the data lives). The backup takes at most 3 minutes. Considering that it is a virtual machine (VMware ESXi) on a desktop, I find that time excellent.
In my scripts I was previously using xtrabackup, but as my database grew I started to see some negative effects on the environment during backups. I looked for alternative methods and started using snapshots, thereby copying a complete image of the database.
The process of locking the database, generating the snapshot and releasing it again takes no more than 5 seconds.
For this I used the LVM snapshot feature, running the following steps (a very simplified version):
FLUSH TABLES WITH READ LOCK;                          -- in a MySQL session
lvcreate --snapshot myvg/lvmysql -n lv_snap1 -L 10G   # in the shell
UNLOCK TABLES;                                        -- in the same MySQL session
mount /dev/myvg/lv_snap1 /mnt/bkp                     # in the shell
Some precautions are needed when using snapshots, as a snapshot has a useful "life", which is the space allocated to it. That lifetime is the time you have to copy its contents somewhere else and then destroy the snapshot.
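One subtlety the steps above gloss over: FLUSH TABLES WITH READ LOCK only holds while the session that issued it stays connected, so the lock and the lvcreate must happen in a single mysql session. A hedged sketch of the full cycle (the volume names come from the answer; the mount point, copy destination, and use of the client's `system` command are my assumptions):

```shell
#!/bin/bash
# Sketch only: lock, snapshot, and unlock happen in ONE mysql session,
# because FLUSH TABLES WITH READ LOCK is released when the client disconnects.
snapshot_backup() {
    mysql -u root <<'SQL'
FLUSH TABLES WITH READ LOCK;
-- still holding the lock: take the snapshot via the client's "system" command
system lvcreate --snapshot myvg/lvmysql -n lv_snap1 -L 10G
UNLOCK TABLES;
SQL
    # copy the frozen datafiles off the snapshot, then destroy it
    mount -o ro /dev/myvg/lv_snap1 /mnt/bkp
    rsync -a /mnt/bkp/ /mnt/backups/database/
    umount /mnt/bkp
    lvremove -f /dev/myvg/lv_snap1
}
```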
3
There is a class (very good, I recommend it) that does this.
It is very simple to understand and use, and you can configure numerous things, such as which tables should be included in the backup, whether to back up the settings as well, etc.
3
There are a thousand ways. Take a look at this open source project, which may solve your problem:
http://sourceforge.net/projects/automysqlbackup/
Or read this blog post, which suggests 10 different ways:
http://www.noupe.com/how-tos/10-ways-to-automatically-manually-backup-mysql-database.html
This project is very cool. I will try to use it.
3
In linux you can use crontab to perform this task daily.
Type:
crontab -e
Add this line, replacing the values with your server's:
0 23 * * * mysqldump -h localhost -u usuario -psenha meuBanco > backup_meuBanco_$(date +%s).sql 2>&1
Save (ESC :wq ENTER) and that's it: every day at 23:00 a dump will be made, saving your database.
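One thing the cron line above does not handle is cleanup: a new `date +%s`-stamped file appears every day and nothing removes the old ones. A hedged sketch of a purge step (the 7-day window is an assumption; for demonstration it fabricates two dumps in a temporary directory, where in real use BACKUP_DIR would be the directory cron writes to):

```shell
#!/bin/bash
# Demonstration setup: a temporary directory with one "old" and one fresh
# fake dump, named after the cron line's backup_meuBanco_<timestamp>.sql pattern.
BACKUP_DIR=$(mktemp -d)
touch "$BACKUP_DIR/backup_meuBanco_old.sql"
touch -d '10 days ago' "$BACKUP_DIR/backup_meuBanco_old.sql"
touch "$BACKUP_DIR/backup_meuBanco_new.sql"

# keep only the last 7 days of dumps
find "$BACKUP_DIR" -name 'backup_*.sql' -mtime +7 -delete

ls "$BACKUP_DIR"
```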
2
On Amazon AWS server instances (EC2), I adapted my own solutions to avoid having to pay for other software. For Windows, I implemented the solution in C# using the AWS .NET library. For Linux, I installed the AWS CLI and set up the security keys. From there I created the script below to perform the backup. If you want, you can also configure retention on S3 itself so that it purges old backups automatically.
#!/bin/bash
##########################################
# Back up a MySQL database and send it to S3
# Created by Andre Mesquita
##########################################

# VARIABLES
DATAHORA=$(date +%Y%m%d-%H%M)
FILESDIR=/sites/tmp
TAR=/bin/tar
RM=/bin/rm
IPBANCO="127.0.0.1"
USERBANCO='your user'
SENHABANCO='your password'
NOMEDOBANCO='database name'
NOMEDOBUCKET='meusbackups'

# Enter the backup directory
cd "$FILESDIR"

# Dump the database
echo 'Backing up the database...'
mysqldump --host=$IPBANCO --user=$USERBANCO --password=$SENHABANCO --databases $NOMEDOBANCO > ./backup_$NOMEDOBANCO.sql

# Compress the backup
echo 'Compressing the files...'
$TAR zcf ${NOMEDOBANCO}_database.bkp_$DATAHORA.tar.gz ./backup_$NOMEDOBANCO.sql

echo 'Sending it to the backup repository...'
aws s3 cp ./${NOMEDOBANCO}_database.bkp_$DATAHORA.tar.gz s3://$NOMEDOBUCKET/${NOMEDOBANCO}_database.bkp_$DATAHORA.tar.gz

echo 'Removing the local files...'
$RM ${NOMEDOBANCO}_database.bkp_$DATAHORA.tar.gz
$RM backup_$NOMEDOBANCO.sql

echo ' '
echo ' Operation finished. '
echo ' '
I hope it helps.
To send files to S3 in a Windows environment, I have published my tools at https://github.com/avmesquita/AWS-Tools
1
I don’t know if this is what you need, but here is complete code to back up your database. Just save the code in a PHP file, create a folder called "db_bkp", and don’t forget to fill in the database access data. Your hosting server will have an option called "cron task" or something similar; just go there, enter the address of that file, and you’re done.
<?php
ini_set('display_errors',1); ini_set('display_startup_erros',1); error_reporting(E_ALL);//force php to show any error message
backup_tables(DB_HOST,DB_USER,DB_PASS,DB_NAME);//don't forget to fill with your own database access informations
function backup_tables($host,$user,$pass,$name)
{
$link = mysqli_connect($host,$user,$pass);
mysqli_select_db($link, $name);
$tables = array();
$result = mysqli_query($link, 'SHOW TABLES');
$i=0;
while($row = mysqli_fetch_row($result))
{
$tables[$i] = $row[0];
$i++;
}
$return = "";
foreach($tables as $table)
{
$result = mysqli_query($link, 'SELECT * FROM '.$table);
$num_fields = mysqli_num_fields($result);
$return .= 'DROP TABLE IF EXISTS '.$table.';';
$row2 = mysqli_fetch_row(mysqli_query($link, 'SHOW CREATE TABLE '.$table));
$return.= "\n\n".$row2[1].";\n\n";
// one INSERT statement per row
while($row = mysqli_fetch_row($result))
{
    $return.= 'INSERT INTO '.$table.' VALUES(';
    for($j=0; $j < $num_fields; $j++)
    {
        if (isset($row[$j])) { $return.= '"'.addslashes($row[$j]).'"'; } else { $return.= '""'; }
        if ($j < ($num_fields-1)) { $return.= ','; }
    }
    $return.= ");\n";
}
$return.="\n\n\n";
}
//save file
$handle = fopen('db_bkp/db-backup-'.time().'-'.(md5(implode(',',$tables))).'.sql','w+');//Don't forget to create the folder it is saved into, "db_bkp" in this case
fwrite($handle, $return);
fclose($handle);
echo "Backup completed successfully";//success message
}
?>
0
I created this to use without mysqldump:
$HOST = "localhost";        # HOST
$USER_NAME = "root";        # DATABASE USER
$USER_PWRD = "123456";      # DATABASE PASSWORD
$DB_PRIMARY = "data base";  # PRIMARY DATABASE NAME

// TABLE SELECTED FOR BACKUP
$DB_SELECT = "$DB_PRIMARY";

// BACKUP CONNECTION
$CONNECT_BACKUP = new mysqli("$HOST", "$USER_NAME", "$USER_PWRD", "$DB_SELECT");
// SET THE CONNECTION CHARSET
$CONNECT_BACKUP->set_charset('utf8');
if ($CONNECT_BACKUP->connect_errno) {
    echo "Failed to connect to MySQL: (" . $CONNECT_BACKUP->connect_errno . ") " . $CONNECT_BACKUP->connect_error;
}

// BEGIN GENERATION FUNCTIONS
// SQL QUERY ARRAY
function sqlQuery( $SQL_COMAND, $CONNECT )
{
    $RESULT_COMAND = mysqli_query($CONNECT, $SQL_COMAND);
    while ($ROW_COMAND[] = mysqli_fetch_assoc($RESULT_COMAND)) { }
    array_pop($ROW_COMAND);
    return $ROW_COMAND;
}

// SQL QUERY
function sqlQueryView( $SQL_COMAND, $CONNECT )
{
    $RESULT_COMAND = mysqli_query($CONNECT, $SQL_COMAND);
    while ($ROW_COMAND[] = mysqli_fetch_assoc($RESULT_COMAND)) { }
    array_pop($ROW_COMAND);
    foreach ($ROW_COMAND as $KEY => $VALUE) {
        foreach ($VALUE as $KEY2 => $VALUE2) {
            $COLS[] = $VALUE2;
        }
    }
    return $COLS;
}

// SQL INSERT
function sqlInsert( $DB_TABLE, $ROW_BACKUP )
{
    if ( $ROW_BACKUP != null ) {
        $SQL_INSERT = 'INSERT INTO `'.$DB_TABLE.'`'."\n";
        $SQL_INSERT .= '(';
        foreach ($ROW_BACKUP[0] as $KEY => $VALUE) {
            $COLS[] = "`$KEY`";
        }
        $COLS_IMP = implode( ', ', $COLS );
        $SQL_INSERT .= $COLS_IMP;
        $SQL_INSERT .= ")"."\n";
        $SQL_INSERT .= "VALUES"."\n";
        foreach ($ROW_BACKUP as $KEY => $VALUE) {
            $SQL_INSERT .= "(";
            foreach ($VALUE as $KEY2 => $VALUE2) {
                $ROWS[] = "'".$VALUE2."'";
            }
            $ROWS_IMP = implode( ', ', $ROWS ); unset($ROWS);
            $SQL_INSERT .= $ROWS_IMP;
            $SQL_INSERT .= "), "."\n";
        }
        $SQL_INSERT = trim($SQL_INSERT, '), '."\n");
        $SQL_INSERT .= "); "."\n"."\n";
    }
    return $SQL_INSERT;
}
// END GENERATION FUNCTIONS

// BEGIN SQL SCRIPT GENERATION
$SQL_TABLES = 'SHOW TABLES FROM `'.$DB_SELECT.'`;';
$ARRAY_ROWS_TABLES = sqlQuery( $SQL_TABLES, $CONNECT_BACKUP );
foreach ($ARRAY_ROWS_TABLES as $KEY => $VALUE) {
    foreach ($VALUE as $KEY2 => $VALUE2) {
        $SELECT_IN_TABLES[] = 'SELECT * FROM `'.$VALUE2.'`;'."\n";
        $SQL_SELECTS[] = $SELECT_IN_TABLES;
        unset($SELECT_IN_TABLES);
    }
}
foreach ($SQL_SELECTS as $KEY => $VALUE) {
    foreach ($VALUE as $KEY2 => $VALUE2) {
        $DB_NAME_X = explode('`', $VALUE2);
        $DB_NAME_Q = $DB_NAME_X[1];
        // SQL CREATE TABLE
        $SQL_CREATE_TABLES = 'SHOW CREATE TABLE `'.$DB_NAME_Q.'`;';
        $ARRAY_ROWS_CREATE_TABLES = sqlQueryView( $SQL_CREATE_TABLES, $CONNECT_BACKUP );
        $SQL_CREATE .= $ARRAY_ROWS_CREATE_TABLES[1].";"."\n"."\n";
        $SQL_QUERY = sqlQuery( $VALUE2, $CONNECT_BACKUP );
        $SQL_INSERTS .= sqlInsert( $DB_NAME_Q, $SQL_QUERY );
    }
}
// END SQL SCRIPT GENERATION

// BEGIN SHOW FINISHED SQL SCRIPT
//DEBUG print_r ( $SQL_CREATE );
//DEBUG print_r ( $SQL_INSERTS );
// END SHOW FINISHED SQL SCRIPT
Thank you. That’s exactly what I needed.
– lionbtt
But this way, if you don’t take the database offline, couldn’t it create integrity problems?
– ceinmart
Well, in my case it’s not a big problem. But if the database has a lot of concurrent access, that could be a problem. If you are using InnoDB (or some other transactional engine), just add the --single-transaction option.
– jpkrohling
In time: note that this script is not for professional environments with high concurrency and availability requirements. For such environments, RTFM is far more advisable than a Stack Overflow answer, especially since the professional should decide the tactics based on the environment the systems run in (e.g., it may be better to back up from a slave’s data).
– jpkrohling
There is an open source tool, Spoon, that helped me a lot. Not only with backups, but with other tasks too. Search for Pentaho Data Integration.
– Thiago Cunha