Scenario
Some time ago, I tried to use Git to restore some backups, mostly small files. Git behaved very well, versioning them as long as the changes from one commit to the next were not large. On one specific server, however, there were large binary files that Git could not handle; I could not even make the initial commit.
Problem
Git did not behave well with these files (the errors were memory-related), so the real limits of handling binaries with Git remain an open question for me. Of course, handling binaries is not what Git is designed for, but the information I found at the time was not clear enough.
Questions
- What is the relationship between the size limit of a binary file that can be committed to Git and the machine's processing power and memory?
- Is it safe to keep binaries in Git, even small ones, when they are versioned across many commits?
- What methods can we use to tune Git so that it behaves better when versioning binaries cannot be avoided? (See the sketch after this list.)
You may cite solutions like Git Annex or Git Bup, but only as a complement to the answer; the question refers to pure Git behavior, without plugins or forks.
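For reference, a minimal sketch of the kind of pure-Git settings an answer might address. All values and file patterns below are illustrative assumptions, not recommendations; the right numbers depend on the machine and the repository.

```bash
# Sketch of pure-Git knobs for large binaries; the values are
# illustrative assumptions and should be adapted to the machine.

# Blobs above this size are stored deflated as-is, skipping the
# memory-hungry delta-compression search (Git's default is 512 MiB).
git config core.bigFileThreshold 100m

# Cap the memory each packing thread may spend searching for deltas,
# and split the resulting data into packs of a manageable size.
git config pack.windowMemory 256m
git config pack.packSizeLimit 1g

# Lower the zlib compression effort; already-compressed binaries
# barely shrink, so the extra CPU work buys little.
git config core.compression 1

# Per-path attributes: mark ISO images as binary and exempt them from
# delta compression ("*.iso" is a hypothetical pattern).
printf '%s\n' '*.iso binary' '*.iso -delta' >> .gitattributes
```

The common thread is trading disk space for memory: the delta-compression search is what tends to exhaust RAM on multi-gigabyte blobs, so these settings either bound it or bypass it entirely.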
When you say very large files, what are we talking about? Megabytes, gigabytes, tens of gigabytes?
– Alexandre Marcondes
Too big = gigabytes (~15+)
– hernandev
Can you paste the memory failure message you are getting here? The error can occur at several points in the process, and the fix is different for each one.
– Alexandre Marcondes