The scenario behind my question is the following: I have a system that reads a file with around 3 million lines. Each line has a separator; I split out each item on the line, apply the proper treatment to each of those items, and turn each line into an object. That object is then added to a list, which is later saved to the database. Defective lines must go to a separate list so that a file with the defective lines can be generated.
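Roughly, the flow looks like the sketch below (a minimal sketch, assuming Java; the ';' separator, the Item record, the parsing rule, and the saveAll call are all illustrative, not my real code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Illustrative record; the real object holds the fields parsed from each line.
record Item(String field1, String field2) {}

public class FileLoader {

    // Parses one line; throws if the line is malformed (illustrative rule).
    static Item parseLine(String line) {
        String[] parts = line.split(";");            // assumed separator
        if (parts.length < 2 || parts[0].isBlank()) {
            throw new IllegalArgumentException("defective line");
        }
        return new Item(parts[0].trim(), parts[1].trim());
    }

    public static void main(String[] args) throws IOException {
        List<Item> items = new ArrayList<>();        // millions of objects accumulate here
        List<String> defective = new ArrayList<>();  // defective lines kept for the error file

        try (BufferedReader reader = Files.newBufferedReader(Path.of("input.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                try {
                    items.add(parseLine(line));      // every parsed object stays in memory
                } catch (IllegalArgumentException e) {
                    defective.add(line);
                }
            }
        }

        Files.write(Path.of("defective.txt"), defective);
        // saveAll(items);  // hypothetical call that persists the whole list at once
    }
}
```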
The problem occurs when the number of lines reaches, for example, 5 million: the process runs out of memory. Has anyone been through this, and how did you solve it?