I have a file of almost 700 MB that contains numerous lines of JSON. I need to process each JSON line by line and insert it into my database.
The thing is, today I'm using the following code:
using (StreamReader arquivo = new StreamReader(System.IO.File.OpenRead(file), Encoding.UTF8))
{
    while (arquivo.Peek() > -1)
    {
        string linha = arquivo.ReadLine();
        // process 'linha': parse the JSON and insert it into the database.
    }
}
How can I read and process the lines in parallel to make this faster?
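Something along these lines is what I have in mind, just as a rough sketch: the file path, the degree of parallelism and the InsertIntoDatabase helper are only placeholders, not real code from my project. File.ReadLines streams the file lazily, so the 700 MB is never loaded at once, and Parallel.ForEach spreads the per-line work over a limited number of threads:

using System.IO;
using System.Text;
using System.Threading.Tasks;

class ParallelLineImport
{
    static void Main()
    {
        string file = "data.json"; // illustrative path

        // Limit the number of simultaneous inserts (4 is an arbitrary choice).
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

        // File.ReadLines enumerates the file lazily, one line at a time.
        Parallel.ForEach(File.ReadLines(file, Encoding.UTF8), options, linha =>
        {
            if (!string.IsNullOrWhiteSpace(linha))
            {
                InsertIntoDatabase(linha); // hypothetical helper
            }
        });
    }

    // Hypothetical placeholder for the real per-line database insert.
    static void InsertIntoDatabase(string json)
    {
        // parse 'json' and execute the INSERT here
    }
}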
Why don't you load it all at once and process it directly in memory? Link1 - to read one line at a time, Link2
– Andre Mesquita
The file is 700 MB and unfortunately there are many files. I won't have the resources available for that.
– Jhonathan
Take a look at this thread from SOen: Soen-1. Look at this case too: Soen-2
– Andre Mesquita
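A bounded producer-consumer is one way to reconcile the memory limit mentioned above with parallel inserts. This is only a minimal sketch under assumptions: the buffer capacity, the number of worker tasks and the InsertIntoDatabase helper are illustrative, not anything from the original project.

using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

class BoundedImport
{
    static void Main()
    {
        string file = "data.json"; // illustrative path

        // The buffer holds at most 10,000 lines, which caps memory use.
        var lines = new BlockingCollection<string>(boundedCapacity: 10000);

        // Producer: a single thread reads the file sequentially.
        var producer = Task.Run(() =>
        {
            foreach (string linha in File.ReadLines(file, Encoding.UTF8))
            {
                lines.Add(linha); // blocks while the buffer is full
            }
            lines.CompleteAdding();
        });

        // Consumers: a few tasks take lines from the buffer and insert them.
        Task[] consumers = Enumerable.Range(0, 4)
            .Select(_ => Task.Run(() =>
            {
                foreach (string linha in lines.GetConsumingEnumerable())
                {
                    InsertIntoDatabase(linha); // hypothetical insert helper
                }
            }))
            .ToArray();

        producer.Wait();
        Task.WaitAll(consumers);
    }

    // Hypothetical placeholder for the real per-line database insert.
    static void InsertIntoDatabase(string json)
    {
        // parse 'json' and write it to the database here
    }
}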