I need to read an 80 GB file with PHP and load it into a PostgreSQL database. The file is plain text with a fixed-column layout.
The idea is not to split the file, because that would make the process unviable, so I need to read it in a single pass even if it takes several hours.
I believe the problem here is memory.
What is the best way, if possible, to read this file without having to increase the memory limit? Is it possible to read the file in pieces so that memory does not overflow? Which function would I use for that?
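In case it helps, here is a minimal sketch of reading the file line by line with fgets(), which keeps memory usage roughly constant regardless of file size. The file path, delimiter, table name, columns, and connection credentials below are all placeholders, not taken from the question; adjust them to the real layout:

<?php
// Sketch: stream the file line by line so only one line is held in memory at a time.
// Path, delimiter, table name, columns and credentials are placeholders.
$handle = fopen('/path/to/bigfile.txt', 'r');
if ($handle === false) {
    die('Could not open the file');
}

$pdo = new PDO('pgsql:host=localhost;dbname=mydb', 'user', 'password');
$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO import_table (col1, col2) VALUES (?, ?)');

$count = 0;
while (($line = fgets($handle)) !== false) {           // reads a single line per iteration
    $fields = explode("\t", rtrim($line, "\r\n"));     // adapt to the actual column layout
    $stmt->execute([$fields[0], $fields[1]]);

    if (++$count % 10000 === 0) {                      // commit in batches to keep the transaction small
        $pdo->commit();
        $pdo->beginTransaction();
    }
}

$pdo->commit();
fclose($handle);

For a one-time bulk load, PostgreSQL's COPY command (or \copy in psql) is usually much faster than row-by-row INSERTs, if running the import outside PHP is an option.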
Don't forget to adjust the configuration directives: the script execution time limit in php.ini, disabling safe mode, and so on. I believe you already know this.
– Fabiano Monteiro
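For reference, the runtime settings mentioned in the comment can also be adjusted at the top of the script when it runs from the CLI; the values below are only placeholders:

<?php
// Sketch of the runtime adjustments; values are placeholders.
set_time_limit(0);                  // remove the execution time limit for the long import
ini_set('memory_limit', '256M');    // not strictly needed if the file is streamed line by line
// safe_mode can only be set in php.ini, and it was removed entirely in PHP 5.4+.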
@Fabianomonteiro Yes, I'm already taking the time limit and safe mode into account. Thank you!
– Rodrigo Vicentin