I am using Entity Framework to insert and update thousands of records.
At first it was slow, but after adding the code below the speed improved:
db.Configuration.AutoDetectChangesEnabled = false;
db.Configuration.ValidateOnSaveEnabled = false;
That solved the speed problem.
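(For reference, this is roughly how those two flags are scoped for the bulk run; the context class name PdvContext is just a placeholder, not my real class:)

// Minimal sketch (EF6): disable change detection and validation for a bulk run,
// restoring the defaults afterwards. "PdvContext" is an illustrative name.
using (var db = new PdvContext())
{
    db.Configuration.AutoDetectChangesEnabled = false;
    db.Configuration.ValidateOnSaveEnabled = false;
    try
    {
        // ... thousands of inserts/updates ...
        db.SaveChanges();
    }
    finally
    {
        // Restore defaults in case this context instance outlives the bulk run.
        db.Configuration.AutoDetectChangesEnabled = true;
        db.Configuration.ValidateOnSaveEnabled = true;
    }
}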
However, I can't get to the end of the load: as the inserts go on, the model keeps growing and memory consumption keeps climbing, until I get an exception when usage reaches 1.5 GB.
Note: I have already tried using AsNoTracking().
I'm also recreating the model (the context) from time to time, but that doesn't lower consumption; it only goes up.
Has anyone run into this, or does anyone have an idea?
Part of the code:
foreach (var prd in produtoGradeAux)
{
    if (dbPdv.Database.Connection.State != ConnectionState.Open)
        dbPdv.Database.Connection.Open();

    using (var transaction = dbPdv.Database.Connection.BeginTransaction(System.Data.IsolationLevel.ReadUncommitted))
    {
        dbPdv.Database.UseTransaction(transaction);
        i++;

        Produto prodAux = null;
        var pAux = dbPdv.produto_grade.AsNoTracking().FirstOrDefault(x => x.produto_GradeIdGw == prd.produto_gradeID);

        if (prd.cd_grade.Trim().Length > 6)
            pAux = dbPdv.produto_grade.AsNoTracking().FirstOrDefault(x => x.cd_grade.Trim() == prd.cd_grade.Trim());

        if (prd.cd_grade.Trim().Length > 6)
            prodAux = dbPdv.produto_grade.AsNoTracking()
                .Where(x => x.Produto.cd_ref.Trim() == prd.cd_grade.Trim().Substring(0, 6))
                .Select(x => x.Produto)
                .FirstOrDefault();

        int lnFamiliaId, lnGrupoId, lnUnidadeId, lnMarcaId, lnLinhaId;
        RetGrupos(prd, out lnFamiliaId, out lnGrupoId, out lnUnidadeId, out lnMarcaId, out lnLinhaId, dbPdv);

        if (pAux == null)
        {
            // Record does not exist yet: insert the product and its related data.
            if (prodAux == null)
                prodAux = RetProduto(dbPdv, prd.Produto, lnFamiliaId, lnGrupoId, lnUnidadeId, lnMarcaId, lnLinhaId);
            pAux = RetProdutoGrade(dbPdv, prodAux, prd);
            SetProdutoEan(dbPdv, prd, pAux);
            SetProdutoCf(dbPdv, prd, pAux);
            SetProdutoEstoque(dbPdv, lojaAux, prd, pAux);
            SetProdutoPreco(dbPdv, prd, pAux);
        }
        else
        {
            // Record already exists: update the product and its related data.
            AtuProdutoGrade(dbPdv, prd, pAux);
            AtuProduto(dbPdv, prd, pAux, lnFamiliaId, lnGrupoId, lnUnidadeId, lnMarcaId, lnLinhaId);
            AtuProdutoEan(dbPdv, prd, pAux);
            AtuProduto_Cf(dbPdv, prd, pAux);
            AtuProduto_Preco(dbPdv, prd, pAux);
            AtuProdutoEstoque(dbPdv, lojaAux, prd, pAux);
        }

        transaction.Commit();

        // Try to improve performance: recreate the context every 1000 records.
        if (i % 1000 == 0)
        {
            HabilitaDb(dbPdv);
            dbPdv.Dispose();
            dbPdv = GetDbPdv(pdvAux);
            DesabilitaDb(dbPdv);
        }
    }
}
You've done what you should; it's hard to help much more without knowing what your code looks like. Have you thought about doing this with plain SQL instead of EF? EF doesn't need to be used for everything. It's also possible that you have a memory leak. I answered this (a specific case) earlier: http://answall.com/a/84277/101
– Maniero
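(For illustration, a minimal sketch of that "plain SQL" suggestion using EF6's raw SQL API; the table and column names here are assumptions, not taken from the question's schema:)

// Sketch: issue the update directly in SQL instead of materializing entities,
// so nothing is added to the change tracker. Names are illustrative.
dbPdv.Database.ExecuteSqlCommand(
    "UPDATE produto_grade SET cd_grade = @p0 WHERE produto_GradeIdGw = @p1",
    prd.cd_grade, prd.produto_gradeID);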
Thanks for the answer. I'll put part of the code in the question.
– PachecoDt
Each entity added to the context stays there as long as the context exists, or until you clear the context. Sequential processing of large volumes of entities should be done in batches: for example, every 100 entities, persist them and remove them from the context, as in the sketch below.
– Caffé
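(A sketch of that batching idea in EF6, persisting every 100 entities and detaching what was saved so the change tracker does not grow without bound; the method and collection names are illustrative, and it assumes using System.Data.Entity; and using System.Linq;:)

// Sketch: persist in batches of 100 and detach persisted entities.
void SaveInBatches(DbContext db, IEnumerable<produto_grade> entities)
{
    int count = 0;
    foreach (var e in entities)
    {
        db.Set<produto_grade>().Add(e);
        if (++count % 100 == 0)
        {
            db.SaveChanges();
            // Detach everything just persisted so it can be garbage-collected.
            foreach (var entry in db.ChangeTracker.Entries().ToList())
                entry.State = EntityState.Detached;
        }
    }
    db.SaveChanges(); // flush the final partial batch
}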
Thanks, @Caffé. I'm doing it 1000 at a time... I tried 500 as well, but I didn't see much difference.
– PachecoDt
Setting your code aside (I didn't spend much time trying to understand it), here is an algorithm suggestion: create a context; after X entities have been read, processed, and persisted, send the changes to the database, destroy the context, and create a new context for the next X entities (see the sketch below). If you want to reuse the same context, you would have to detach each entity so it can be freed from memory; I see no need for that, as destroying the context and creating a new one is fine.
– Caffé
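(A sketch of that dispose-and-recreate pattern, with X = 100; GetDbPdv and DesabilitaDb are the question's own helpers, which presumably create the context and disable change detection on it, and ProcessOne is a placeholder for the per-record work:)

// Sketch: a fresh context per batch of X entities.
const int batchSize = 100;
var dbPdv = GetDbPdv(pdvAux);
DesabilitaDb(dbPdv);
int n = 0;
foreach (var prd in produtoGradeAux)
{
    ProcessOne(dbPdv, prd); // read/process/persist one entity (placeholder)
    if (++n % batchSize == 0)
    {
        dbPdv.SaveChanges();      // send pending changes to the database
        dbPdv.Dispose();          // destroy the context and everything it tracks
        dbPdv = GetDbPdv(pdvAux); // new context for the next batch
        DesabilitaDb(dbPdv);
    }
}
dbPdv.SaveChanges(); // flush the final partial batch
dbPdv.Dispose();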
Good idea, @Caffé. I'm already using dbPdv.Dispose(); and then recreating it through GetDbPdv() into the dbPdv variable. Does that not work? Or would it really not be viable?
– PachecoDt