Consider the following scenario:
using (TransactionScope scope = new TransactionScope())
{
    // Load every record that has to be persisted.
    var allObjects = myStaticClass.GetAllObjects();

    // Add each one to the DbSet so EF tracks it for insertion.
    foreach (var myObject in allObjects)
        dbContext.MyObject.Add(myObject);

    // Persist everything with a single SaveChanges call.
    dbContext.SaveChanges();
    scope.Complete();
}
In the loop I have about 250,000 records to process, i.e., that many objects to add to the DbSet and then store in the database all at once through the DbContext's SaveChanges().
The problem is the number of objects added to the DbSet: at a certain point it causes a System.OutOfMemoryException (memory usage gets close to 2 GB), because the process runs as x86, which I cannot change.
What I need, urgently, is a way to store the 250,000 records while avoiding the System.OutOfMemoryException, so the process can run to completion.
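A common way to keep memory bounded is to save in batches, with a short-lived DbContext per batch, so the change tracker never holds all 250,000 entities at once. The following is a minimal sketch, not code from the question: the batch size of 1,000 and the MyDbContext type are assumptions, and the AutoDetectChangesEnabled flag shown is the EF6 API (in EF Core it lives on ChangeTracker instead).

using System;
using System.Linq;
using System.Transactions;

// Sketch: batch the inserts so the change tracker's memory stays
// bounded by one batch instead of all 250,000 entities.
var allObjects = myStaticClass.GetAllObjects().ToList();
const int batchSize = 1000; // assumed value; tune to the memory budget

// The default TransactionScope timeout (1 minute) is likely too short
// for a quarter-million inserts, so it is raised here.
var options = new TransactionOptions { Timeout = TimeSpan.FromMinutes(30) };
using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
{
    for (int i = 0; i < allObjects.Count; i += batchSize)
    {
        // A fresh context per batch; each connection enlists in the
        // ambient transaction (note: opening several connections under
        // one scope may escalate it to MSDTC).
        using (var batchContext = new MyDbContext())
        {
            // EF6: skip change detection on every Add to speed up bulk adds.
            batchContext.Configuration.AutoDetectChangesEnabled = false;

            foreach (var myObject in allObjects.Skip(i).Take(batchSize))
                batchContext.MyObject.Add(myObject);

            batchContext.SaveChanges();
        }
    }

    scope.Complete();
}

If atomicity across all 250,000 records is not required, dropping the outer TransactionScope and committing each batch independently reduces memory and transaction-log pressure further.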
Hi @Augusto. That sounds much more viable and code-friendly. Thank you. Could you at least withdraw the negative vote from my question?
– João Martins
Edit the question so I can remove it.
– Augusto Vasques