I found the solution in Michael Pryor's answer on Stack Overflow.
According to him, you need to change a MySQL configuration setting and also pass an extra parameter to the client on the command line, giving max_allowed_packet a high value (he uses 100M).
Change in my.cnf (or my.ini on Windows):
max_allowed_packet=100M
Alternatively, you can run the following commands on the server:
set global net_buffer_length=1000000;
set global max_allowed_packet=1000000000;
Change in the client call:
mysql --max_allowed_packet=100M -u usuario -p banco < dump.sql
(I restarted the server before calling the client.)
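To confirm the new limit is actually in effect (a quick sanity check, not part of Pryor's answer), ask the server for the current value; note that SET GLOBAL only affects connections opened after the change, which is another reason a restart or a fresh connection helps:
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
-- the value is reported in bytes; 100M corresponds to 104857600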
Did your dump.sql have INSERT queries limited to X rows each? Or did a single INSERT handle all the rows? – Zuul
I have one INSERT with several VALUES per table (this was generated by mysqldump). How do I check whether there is a limit of X rows? – bfavaretto
When the tables are very large, the INSERTs are repeated every 50 rows, for example. Otherwise a single INSERT with many VALUES exceeds the packet limit. Several INSERTs produce smaller packets and are more manageable. – Zuul
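(For illustration only, with a hypothetical table t: an extended dump packs all rows into one statement, which travels as a single, possibly huge, packet, while a split dump repeats the INSERT and keeps each packet small.)
-- one extended INSERT: every row in a single packet
INSERT INTO t (id, val) VALUES (1,'a'),(2,'b'),(3,'c'),(4,'d');
-- split every 2 rows: each statement is sent as a smaller packet
INSERT INTO t (id, val) VALUES (1,'a'),(2,'b');
INSERT INTO t (id, val) VALUES (3,'c'),(4,'d');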
My file is hard to inspect because it is so large, but I have now discovered that there are indeed several INSERTs per table. I can't tell how many rows per INSERT. Still, it kept giving the error until I increased max_allowed_packet. – bfavaretto
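(A rough way to inspect a dump that size without opening it in an editor, assuming standard mysqldump output in dump.sql:)
# count how many INSERT statements the dump contains
grep -c '^INSERT INTO' dump.sql
# length of the longest line; if this exceeds max_allowed_packet, that statement alone can blow the limit
awk '{ if (length($0) > max) max = length($0) } END { print max }' dump.sql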
I use gvim to inspect text files larger than 1GB. So far I have managed to open an 8GB file without problems on Linux :)
– Zuul
I'm using vim too, on Mac, but when scrolling it sometimes skips a whole "screen" (I think it's because the lines are too long).
– bfavaretto
Those huge lines are probably the reason the packet limit is exceeded!
– Zuul
I think so too, but the solution I posted below solved it (which makes sense). If you know of a solution at the other end, one that forces mysqldump to generate smaller packets, that would be very useful! – bfavaretto
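(mysqldump does have options in that direction, worth confirming against your version's documentation: --skip-extended-insert writes one row per INSERT, and --net-buffer-length caps the size of each generated multi-row INSERT. A sketch with the same usuario/banco placeholders:)
# cap each multi-row INSERT at roughly 16 KB
mysqldump --net-buffer-length=16384 -u usuario -p banco > dump.sql
# or one row per INSERT (much slower to import)
mysqldump --skip-extended-insert -u usuario -p banco > dump.sql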