Good afternoon,
Currently I have a PHP script that reads a CSV file of 6801 lines and performs an INSERT for each record. At the moment I manage to insert 19 records per second.
Besides this INSERT routine, which runs every 5 minutes, users will also be connecting to the database, etc.
I would like to somehow raise that rate of 19 records/sec while still keeping the simultaneous connections performing well.
Here is the source of the reading function:
error_reporting(0); // note: this hides PHP warnings; consider enabling them while debugging
date_default_timezone_set('America/Sao_Paulo');
set_time_limit(600);

$server   = "localhost";
$username = "root";
$password = "";
$database = "datab";

$conecta = mysql_connect($server, $username, $password) or print(mysql_error());
mysql_select_db($database, $conecta) or print(mysql_error());

function lerCsv() {
    global $conecta; // the connection is created outside this function

    $handle = fopen("csv.csv", "r");
    $num    = 1;
    $inicio = date("Y-m-d H:i:s");
    Logger("Insertion started -> " . $inicio);

    // fgetcsv() returns false at end of file, which ends the loop
    while (($linhas = fgetcsv($handle, 0)) !== false) {
        // escape each field so quotes in the CSV cannot break the statement
        list($a, $b, $c, $d, $e, $f) = array_map('mysql_real_escape_string', $linhas);

        $sql = "insert into bldatant values(null, '$a', '$b', '$c', '$d', '$e', '$f')";
        if (mysql_query($sql)) {
            Logger("Record " . $num . " inserted successfully. " . $sql);
            $num++;
        } else {
            Logger("Error inserting record " . $num . ". Error -> " . mysql_error());
        }

        // note: this UPDATE runs once for every CSV row, doubling the number of queries
        $sql = "update blparams set valorparam=NOW() where nomeparam='BP_LASTUPD'";
        mysql_query($sql);
    }

    fclose($handle);
    mysql_close($conecta);
}
Post the source code; a lot of things can influence that.
– rray
@rray added :)
– Lucas Torres
There are a number of adjustments that could be made. One would be to insert directly into the database, eliminating the CSV file, for example via a web service. Another would be to upgrade the hardware, putting in an SSD. You could also look at the server status to see where the bottleneck in your database is and address it via hardware or via the database configuration file (tuning). There are so many possibilities that it would be better if you specified what kind of adjustment you would like to apply.
– SneepS NinjA
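For the tuning route SneepS NinjA mentions, here is a minimal sketch of two InnoDB settings that commonly affect insert throughput. This assumes the table uses the InnoDB engine (the question does not say), and the values are only illustrative:

# my.cnf / my.ini -- illustrative values, adjust to the machine
[mysqld]
innodb_buffer_pool_size        = 1G  # memory cache for data and indexes
innodb_flush_log_at_trx_commit = 2   # flush the log roughly once per second instead of at
                                     # every commit (trades a little durability for speed)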
Study how to do batch operations in MySQL (bulk insert, bulk update). Basically you disable autocommit for the current connection, do a batch of, say, 100 operations at a time, and only then commit. This speeds things up a lot when the number of consecutive inserts/updates is large.
– Piovezan
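A minimal sketch of the batching Piovezan describes, using the same legacy mysql_* extension as the question. The batch size of 100 is illustrative, and this assumes an InnoDB table (MyISAM ignores transactions); $handle and $conecta are as in the question:

mysql_query("SET autocommit=0", $conecta); // disable autocommit for this connection only

$num = 0;
while (($linhas = fgetcsv($handle, 0)) !== false) {
    list($a, $b, $c, $d, $e, $f) = array_map('mysql_real_escape_string', $linhas);
    mysql_query("insert into bldatant values(null, '$a', '$b', '$c', '$d', '$e', '$f')", $conecta);

    if (++$num % 100 == 0) {
        mysql_query("COMMIT", $conecta); // make the last batch of 100 rows permanent
    }
}
mysql_query("COMMIT", $conecta);           // commit whatever rows are left over
mysql_query("SET autocommit=1", $conecta); // restore the default behaviour

Each COMMIT pays the disk-flush cost once for the whole batch instead of once per row, which is where the speedup comes from.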
@Sneepsninja I am currently running it on my machine, an i5 with 4 GB of RAM; in the future it will go to online hosting, probably Locaweb, so I will have no way to touch the hardware. I think it's better to adjust the code and the database.
– Lucas Torres
@Piovezan I will take a look at that, but by disabling autocommit don't I run the risk of losing transactions?
– Lucas Torres
Autocommit automatically commits after each insert/update, that is, it makes the operation take effect. When you disable it, you have to commit explicitly in code after each batch of inserts/updates (that batch, which is everything that happens between one commit and the next, is what's called a transaction). Once this is done, you can enable autocommit again if you have any other operation to do; otherwise it is not necessary. It only applies to the connection you are currently using and does not interfere with other connections. Also learn about rollback, which undoes the transaction in case something goes wrong.
– Piovezan
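As a hedged illustration of the commit/rollback flow Piovezan describes, again with the question's legacy mysql_* calls (the error-handling shape here is an assumption, not part of the original code):

mysql_query("SET autocommit=0", $conecta);

$ok = mysql_query($sql, $conecta); // $sql is one of the batched INSERTs from the sketch above

if ($ok) {
    mysql_query("COMMIT", $conecta);   // the batch becomes permanent
} else {
    mysql_query("ROLLBACK", $conecta); // undo everything since the last commit
    Logger("Batch discarded: " . mysql_error());
}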