I have a "small" problem with a DataTable.
From an SQL query I get a huge number of records (> 1,000,000), and although the query itself is relatively fast, loading the DataTable via Fill() is extremely slow and in some cases throws a System.OutOfMemoryException, because the process running the code exceeds 4 GB (the build is forced to x86).
My question is whether there is any way to reduce the DataTable loading time while also keeping memory usage under the limit.
Note that this data will later be used for XML serialization.
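For reference, this is roughly the pattern that triggers the problem (connection string, query, and table name are illustrative, not from the question):

```csharp
using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM BigTable", connection))
{
    // Fill() buffers every row of the result set in memory at once;
    // with more than 1,000,000 rows this is what pushes the process
    // past the 4 GB limit of an x86 build.
    adapter.Fill(table);
}
```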
Not using DataTable? :) Really. Other technologies were created precisely because it was problematic.
– Maniero
Yes, if there really is no other option we may have to change! Would using a DataReader and filling the objects one by one be faster? Or using an IEnumerable<T>?
– João Martins
Much faster. Doing it by hand or using Dapper will be the quickest options. Even Entity Framework will be much better most of the time, especially Core, although a little more complex.
– Maniero
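A minimal sketch of the "by hand" option: mapping rows from a SqlDataReader into objects one at a time, so only a single row lives in memory per iteration (the Record type, column names, and connection string are hypothetical):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical record type for illustration.
public class Record
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class RecordLoader
{
    public static IEnumerable<Record> ReadRecords(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM BigTable", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // yield return streams one row at a time to the caller
                    // instead of buffering the whole result set.
                    yield return new Record
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    };
                }
            }
        }
    }
}
```

With Dapper the same mapping would be roughly `connection.Query<Record>(sql, buffered: false)`, which also streams instead of materializing everything.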
Thanks @Maniero, I think that with one of these options we can overcome the problem!
– João Martins
Also think about paging; after all, if the data alone is already heavy, imagine an XML with more than 1,000,000 records...
– Rovann Linhalis
@Rovannlinhalis in this case it has to be a single file, however large it is.
– João Martins
Complicated... but then I would think of using a DataReader to read and write as you go, without storing everything in memory.
– Rovann Linhalis
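The read-and-write-as-you-go idea can be sketched by pairing a SqlDataReader with an XmlWriter, so each row goes straight from the database to the file and memory use stays flat regardless of row count (query, element names, and paths are assumptions for illustration):

```csharp
using System.Data.SqlClient;
using System.Xml;

public static class XmlExporter
{
    public static void ExportToXml(string connectionString, string path)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM BigTable", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            using (var writer = XmlWriter.Create(path, new XmlWriterSettings { Indent = true }))
            {
                writer.WriteStartElement("Records");
                while (reader.Read())
                {
                    // Each row is written to disk immediately; nothing is
                    // accumulated in a DataTable or a list.
                    writer.WriteStartElement("Record");
                    writer.WriteElementString("Id", reader.GetInt32(0).ToString());
                    writer.WriteElementString("Name", reader.GetString(1));
                    writer.WriteEndElement();
                }
                writer.WriteEndElement();
            }
        }
    }
}
```

This keeps the output a single file, as required, while never holding more than one row in memory.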