I’m posting this question here, but I’ve already searched Google and Stack Overflow itself, tested changing several variables, and tried disabling the foreign keys and indexes, and I still can’t get the import to finish. I really don’t understand why an import using MySQL LOAD DATA is this slow.
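To be clearer about what I mean by disabling the foreign keys and indexes, the session switches I experimented with were along these lines (a rough sketch; the exact statements I ran may have differed):

-- sketch of the session-level switches tried around the load (exact statements may differ)
SET foreign_key_checks = 0;
SET unique_checks = 0;
SET autocommit = 0;
-- ... run the LOAD DATA INFILE here ...
COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;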
I have a CSV with the structure below containing approximately 94,000 lines:
nome;email;empresa;cidade;estado;datanasc;ativo;grupo;
I want to perform two imports with the same file (two queries): one for the group table (using dummy columns and taking the group name from the last field), and another to import the other columns and ignore the last field. So far so good; I believe the commands themselves are fine. I really think the problem is something with the file, its structure, or the PC.
--- FIRST (it has already run for 12 hours and did not finish)
LOAD DATA INFILE 'C:/caminho/arquivo.csv'
REPLACE INTO TABLE grupo
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(@dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, nome)
SET id = NULL, ativo = 1;
--- SECOND (never starts, because the first one never finishes)
LOAD DATA INFILE 'C:/caminho/arquivo.csv'
REPLACE INTO TABLE assinante
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(nome, email, empresa, cidade, estado, datanasc, ativo, @dummy)
SET id = NULL;
Okay, I don’t have the best machine, but the file isn’t that big: it’s about 8 MB. My setup is Windows 64-bit with 4 GB of RAM and MySQL 5.6.
With small files it works beautifully and quickly.
In my my.ini I have the following settings:
innodb_buffer_pool_size = 2G
innodb_log_file_size = 256M
Please, I don’t know what else to do. This process is supposed to be fast, and I don’t want to have to import line by line via PHP, since LOAD DATA is supposed to be the more performant approach. Grateful for the help.
Andre, I found these answers on SO, see if they help you: http://dba.stackexchange.com/a/31791
– JcSaint
Thanks for the reply. I had also found that link and tested those methods. The problem is that the table is InnoDB and some of those tips don’t work with that engine. What I did was dynamically create a MyISAM table, import the records into it, and then move them to the official (InnoDB) table. It’s extremely fast that way, a matter of seconds; the downside is that I have to create a "temporary" table for it to work.
– Andre
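A rough sketch of the MyISAM staging approach Andre describes (the staging table name assinante_tmp and the CREATE/ALTER steps are assumed here; the real DDL is not shown in the thread):

-- assinante_tmp is a hypothetical staging table, created as a MyISAM copy of assinante
CREATE TABLE assinante_tmp LIKE assinante;
ALTER TABLE assinante_tmp ENGINE = MyISAM;

-- bulk load into the MyISAM staging table (no InnoDB overhead during the load)
LOAD DATA INFILE 'C:/caminho/arquivo.csv'
INTO TABLE assinante_tmp
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(nome, email, empresa, cidade, estado, datanasc, ativo, @dummy);

-- move the rows into the official InnoDB table, then drop the staging table
INSERT INTO assinante (nome, email, empresa, cidade, estado, datanasc, ativo)
SELECT nome, email, empresa, cidade, estado, datanasc, ativo
FROM assinante_tmp;

DROP TABLE assinante_tmp;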
My machine has only 2 GB of RAM and I don’t use MySQL to manage CSV data; I use a NoSQL database that reads files in CSV format directly. 500 MB of data can be handled in 1 second (most queries). You should turn your CSV into an INSERT script. Your 8 MB file is small.
– user101090
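Turning the CSV into an INSERT script, as suggested above, would just mean generating statements along these lines (the values shown are placeholders, not data from the real file):

-- hypothetical output of a CSV-to-INSERT conversion; all values are placeholders
INSERT INTO assinante (nome, email, empresa, cidade, estado, datanasc, ativo) VALUES
('Fulano de Tal', 'fulano@exemplo.com', 'Empresa X', 'Sao Paulo', 'SP', '1980-01-01', 1),
('Ciclana de Tal', 'ciclana@exemplo.com', 'Empresa Y', 'Curitiba', 'PR', '1985-05-20', 1);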