I have a web application in ASP.NET MVC with C# and received a new requirement: read a text file with thousands of lines, each line containing a set of data that will be used to insert into and update the database.
My question is which best practices to adopt in this development. From my limited experience, I know that processing such a large volume of data takes a long time, sometimes more than an hour, and ends up causing a timeout. I believe that simply increasing the timeout is not the best solution.
I also need to show the user the processing status, preferably in real time. What would my options be?
You can consider bulk operations, as discussed in this answer: http://answall.com/a/9344/3084. Bulk operations are provided by the database itself, and their purpose is exactly this: loading large volumes of data.
– cantoni
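Besides the database-side `BULK INSERT`, the same idea is available from C# through `SqlBulkCopy` in ADO.NET. A minimal sketch, assuming SQL Server and a hypothetical staging table named `ImportStaging` (the connection string, file name, and column layout are placeholders, not from the original question):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkImport
{
    static void Main()
    {
        // Shape of the staging table (hypothetical columns).
        var table = new DataTable();
        table.Columns.Add("Code", typeof(string));
        table.Columns.Add("Description", typeof(string));

        // Fill the DataTable from the file; the parsing of each
        // line into column values is specific to the file layout.
        foreach (var line in File.ReadLines("input.txt"))
        {
            // table.Rows.Add(parsedCode, parsedDescription);
        }

        using (var conn = new SqlConnection("<connection string>"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "ImportStaging";
            bulk.BatchSize = 5000;    // send rows to the server in batches
            bulk.BulkCopyTimeout = 0; // disable the timeout for the copy
            bulk.WriteToServer(table);
        }
    }
}
```

Loading into a staging table first and then running set-based insert/update statements against the real tables is usually much faster than issuing one `INSERT`/`UPDATE` per line.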
If I understood correctly, Bulk would let me save the text file data into a database table. In my case, the "columns" of each row are not separated by a character or tab (FIELDTERMINATOR), which seems to be a prerequisite for Bulk to work. The files follow their own layout and I am not at liberty to change it. I would first have to edit row by row, adding something to separate the columns. And how would each field be validated? I can validate the file beforehand, but the performance question remains.
– Tiago Azevedo Borges
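If the file is fixed-width (columns defined by position rather than a delimiter), the rows can be split in C# before the bulk load, without altering the original file. A minimal sketch, assuming a hypothetical layout where the first 10 characters are a code and the next 30 a description (positions and field names are invented for illustration):

```csharp
using System;
using System.IO;

class FixedWidthParser
{
    static void Main()
    {
        foreach (var line in File.ReadLines("input.txt"))
        {
            // Basic structural validation: skip lines shorter
            // than the expected record length.
            if (line.Length < 40) continue;

            // Cut the positional fields out of the line.
            string code = line.Substring(0, 10).Trim();
            string description = line.Substring(10, 30).Trim();

            // Per-field validation would go here, before the row
            // is queued for the bulk load or written to a
            // delimited staging file.
            Console.WriteLine($"{code}|{description}");
        }
    }
}
```

The parsed values can then feed a `DataTable` for `SqlBulkCopy`, or be written out as a delimited file that `BULK INSERT` with `FIELDTERMINATOR` accepts; validation happens once per row during this pass, so it adds little to the total cost.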