Database control (MySQL) in a production environment


For a good part of my project I worked with a git repository on a local server (Debian 7). This week I migrated to GitLab without any complications; I made the change only for portability, since I am always working in one place or another outside the same network.

With this migration, I ran into a problem with controlling my database, which is already in production.

The problem is this: in the production environment, since the repository was previously on the same server as the database, I had a script that read the database every 24 hours and dumped the tables that had changes into a folder structure separated by date (done in PHP, a very simple thing).

Something like this:

bd_bckps/
        /01-01-2017/..
                   /customers.sql
                   /logs.sql
        /02-01-2017/..
                   /logs.sql
                   /internal_users.sql
        /[continues for each day]
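
In shell, the idea was roughly this (a simplified sketch, not the actual code: the real script was PHP and only dumped tables with changes, and DB_USER, DB_PASS and DB_NAME here are placeholders):

    #!/bin/sh
    # Simplified sketch of the daily dump. The real script was PHP and
    # dumped only changed tables; this version dumps every table.
    # DB_USER, DB_PASS and DB_NAME are placeholders.
    DB_USER="user"
    DB_PASS="secret"
    DB_NAME="mydb"
    DEST="bd_bckps/$(date +%d-%m-%Y)"

    mkdir -p "$DEST"
    for TABLE in $(mysql -u"$DB_USER" -p"$DB_PASS" -N -e "SHOW TABLES" "$DB_NAME"); do
        # One .sql file per table inside the date-named folder.
        mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" "$TABLE" > "$DEST/$TABLE.sql"
    done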

Having moved to GitLab, the database backup still works, but it now sits outside the versioned structure and the general backup. On top of that, future changes to the development database that need to be promoted to production will be out of control.

Well, I believe this form of database control that I use is not very safe.

The question is:

Is there any way to automate a git push at a scheduled time? (Shell script?)

Or:

Which database version-control method would be recommended for a project with a massive amount of data?

P.S.: I had not yet worked with a database that would grow this large, and I do not have much experience with MySQL.

1 answer

You could use crontab: for example, every day at midnight, change into the repository folder, commit with the current date and time, and push:

0 0 * * * cd /directory && git add . && git commit -m "$(date)" && git push origin master >/dev/null 2>&1
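
If you need anything beyond a one-liner (for instance, skipping the commit and push when nothing has changed), you can point cron at a small wrapper script instead. A sketch, assuming the repository lives in /directory and the script is saved at a path of your choosing:

    #!/bin/sh
    # Wrapper for the cron job above; /directory is the repository path.
    cd /directory || exit 1
    git add .
    # Commit and push only if something was actually staged.
    if ! git diff --cached --quiet; then
        git commit -m "backup $(date)"
        git push origin master
    fi

Then the crontab entry just runs the script (the path is a placeholder):

    0 0 * * * /path/to/backup.sh >/dev/null 2>&1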
