I have an ASP.NET MVC project with Entity Framework 4 and SQL Server, where there is an Action that receives a .csv file via POST.
Considering memory usage, processing time, and database resources, which of these two code options is the more recommended?
The file can have approximately 15,000 lines.
Option 1:
var postedFile = Request.Files[0];
using (var reader = new StreamReader(postedFile.InputStream))
{
    using (TransactionScope scope = new TransactionScope())
    {
        while (!reader.EndOfStream)
        {
            var line = reader.ReadLine();
            var values = line.Split(';');
            // Validations
            db.Produtos.Add(new Produto { Descricao = values[0], Valor = values[1] });
        }
        db.SaveChanges();
        scope.Complete();
    }
}
Option 2:
var postedFile = Request.Files[0];
using (var reader = new StreamReader(postedFile.InputStream))
{
    var produtos = new List<Produto>();
    while (!reader.EndOfStream)
    {
        var line = reader.ReadLine();
        var values = line.Split(';');
        // Validations
        var produto = new Produto
        {
            Descricao = values[0],
            Valor = values[1]
        };
        produtos.Add(produto);
    }
    using (TransactionScope scope = new TransactionScope())
    {
        produtos.ForEach(p => db.Produtos.Add(p));
        db.SaveChanges();
        scope.Complete();
    }
}
Is there a better Option 3?
Are you saving the .csv file in your database as an array of bytes? – user46523
No, no, I want to do everything in memory... – Jedaias Rodrigues
Good, I think I almost got it: are you reading the file and adding each line to your database? A template of the .csv file would help a lot! – user46523
@John that's exactly it, and the format is Descrição;Valor, per the data in the question. – Marciano.Andrade
@John, my question is not how to do it, but how best to do it. The above is just an example; in this case we have two columns, a description and a value, separated by a semicolon. The idea is to read the file without saving it to disk, but this raises some questions: which solution requires less memory? By creating a TransactionScope with 15,000 items, could I hurt the performance of the application or the database? – Jedaias Rodrigues
So @JedaiasRodrigues, you didn't post this in your question; a 15,000-line scenario is complicated even for pure SqlClient. I understood the in-memory reading, but prerequisites were missing that you only mentioned now; for example, 15,000 lines is a worrying factor for Entity Framework. That said, the first option is the best, with a try/catch rollback added. – user46523
@JedaiasRodrigues what version of ASP.NET MVC? – user46523
@João understood, I'll edit the question. I'm using MVC 4; the project is old. – Jedaias Rodrigues
@JedaiasRodrigues, I advise you not to use List&lt;T&gt;.ForEach: it makes the code harder to debug, is slower, and assigns to the list a responsibility that is not its own. When reading the CSV, be careful with escape characters and the other rules of the format, so I advise you to use CsvHelper. Finally, for performance reasons, avoid saving the 15,000 records at once; call SaveChanges every 100~1000 records, or better yet use SqlBulkCopy. – Tobias Mesquita
As I told you, and as @TobiasMesquita said above, you can use SqlBulkCopy. I needed to save 100,000 records in my database. Using a foreach to save directly would take about 5 minutes. Instead, I used the foreach to add the data to a list, and used a bulk insert to write the data. The task went from 5 minutes to about 15 seconds. If you need it, let me know and I'll post it later. – Randrade
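The SqlBulkCopy approach mentioned in the comments could look roughly like this. It is a sketch under assumptions: the destination table name `dbo.Produtos`, its column names, and the `connectionString` variable are all hypothetical, and the `produtos` list is the one built in Option 2.

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: bulk-insert the parsed rows, bypassing EF entirely.
// Table name, column names, and connectionString are assumptions.
var table = new DataTable();
table.Columns.Add("Descricao", typeof(string));
table.Columns.Add("Valor", typeof(string));

foreach (var produto in produtos)
{
    table.Rows.Add(produto.Descricao, produto.Valor);
}

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Produtos"; // hypothetical table name
    bulk.BatchSize = 5000;
    bulk.WriteToServer(table);
}
```

SqlBulkCopy streams the rows to SQL Server in a single bulk operation, which is why the commenters report it being orders of magnitude faster than per-entity inserts through Entity Framework.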