Create JSON with a large amount of data and return it with Web API

Scenario: I have a Web API that calls a method in my service layer, which executes a query that returns 500k result rows with 90 columns.

Method that makes the query and traverses the results:

   using (var command = Session.Connection.CreateCommand())
   {
      command.CommandText = dataSource.Query;

      using (var reader = command.ExecuteReader())
      {
         while (reader.Read())
         {
            // process each row here
         }
      }
   }

Today, to build the JSON, I use the JsonTextWriter class from Json.NET:

// the whole document is buffered in this StringBuilder before being returned
StringBuilder sb = new StringBuilder(int.MaxValue);
StringWriter sw = new StringWriter(sb);

using (JsonWriter jrWriter = new JsonTextWriter(sw))
{
   using (var command = Session.Connection.CreateCommand())
   {
      command.CommandText = dataSource.Query;

      using (var reader = command.ExecuteReader())
      {
         jrWriter.WriteStartArray();

         while (reader.Read())
         {
            int fields = reader.FieldCount;

            jrWriter.WriteStartObject();

            for (int i = 0; i < fields; i++)
            {
               // write the column name first, then its value
               jrWriter.WritePropertyName(reader.GetName(i));

               if (reader.IsDBNull(i))
               {
                  jrWriter.WriteValue("");
               }
               else if (reader[i] is string)
               {
                  jrWriter.WriteValue(reader[i].ToString().Trim());
               }
               else if (reader[i] is bool)
               {
                  jrWriter.WriteValue((bool)reader[i] ? "true" : "false");
               }
               else if (reader[i] is DateTime)
               {
                  jrWriter.WriteValue((DateTime)reader[i]);
               }
               else
               {
                  jrWriter.WriteValue(reader[i].ToString().Trim().Replace(",", "."));
               }
            }

            jrWriter.WriteEndObject();
         }

         jrWriter.WriteEndArray();
      }
   }
}

Problem:

After iterating over 150k records, a System.OutOfMemoryException is thrown.

I have tried several other solutions to send this mass of data to the front end, but I always run into the same exception.

I also tried converting each record to an array of bytes and accumulating those into one byte array; that way I was able to iterate over all the records, but when returning the result to the front end the System.OutOfMemoryException pops up again.

If someone has already needed to send a lot of data over Web API, I am open to tips on a better way to do this, because I believe the problem is always having to convert everything to a string.

  • Not only in C#, but in several other languages, building the whole array before returning the data takes up a lot of memory. Is there no way to partition this execution?

  • Have you thought about the performance of the page that will display this information? You should consider paging it.

  • Unfortunately there is no way I can page this; I will display this information in a PIVOT component for data analysis, i.e., the entire data set must be available.

  • Every character in your JSON takes up memory and also needs to be processed... As optimized as your code may be, there is no miracle in a situation like this: to be really fast you will need a bigger server.

  • Besides the memory overflow, this operation will take a long time to generate the result for your users. I assume you are using SQL Server, correct? A better approach would be to create SSIS packages that export a file asynchronously and, once the export completes, return the path to the file.

  • I think the best way to move a lot of data is a paged query. https://channel9.msdn.com/Blogs/ASP-NET-Site-Videos/paging-and-querying

  • @Diegosilvalima If you really want to do this, generate the JSON in a file rather than in memory and, at the end, stream the file back as the response.

  • I also ran into this problem, but when generating JSON in Delphi; it is not a C# problem but an array problem. What I did to avoid the error was to partition the information.
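Following the comment's suggestion of generating the JSON in a file and streaming it back, here is a minimal sketch with Json.NET that writes straight through a StreamWriter, so only the current row is ever held in memory. The temp-file path and the three fake rows are placeholders for the real reader loop:

```csharp
using System;
using System.IO;
using Newtonsoft.Json;

class ExportDemo
{
    static void Main()
    {
        var path = Path.Combine(Path.GetTempPath(), "export.json");

        // JsonTextWriter flushes to disk as it goes: nothing accumulates
        // in a StringBuilder, so memory use stays flat regardless of row count.
        using (var sw = new StreamWriter(path))
        using (var writer = new JsonTextWriter(sw))
        {
            writer.WriteStartArray();
            for (int row = 0; row < 3; row++)        // stands in for reader.Read()
            {
                writer.WriteStartObject();
                writer.WritePropertyName("id");      // reader.GetName(i)
                writer.WriteValue(row);              // reader[i]
                writer.WriteEndObject();
            }
            writer.WriteEndArray();
        }

        Console.WriteLine(File.ReadAllText(path));   // [{"id":0},{"id":1},{"id":2}]
    }
}
```

The Web API action can then return this file as a StreamContent wrapped around a FileStream, so the response itself is also streamed to the client instead of being built in memory.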


1 answer


@Diegosilvalima

I see four possible solutions.

1) Partition the fetch: instead of taking the 500k rows at once, fire chained async requests that fetch 50k at a time (for example), building your PIVOT and freeing memory as you go.

2) You will need much more RAM; there is no other way. You are holding EVERYTHING in memory, and with 500k rows, depending on their size, there is not much to do.

3) Mix RAM and file. Process everything gradually, saving it to a file (or a database) to build the final JSON; then read the JSON file back, mounting the PIVOT as you read and freeing memory.

4) RECOMMENDED: paging. A PIVOT is fully capable of displaying paged information. Think of your end user: what human user of a system can see (or even be interested in seeing) 500k rows from a single table? You will make them wait much longer for the result, and the page will freeze (I am fairly sure Chrome will offer to kill the tab for not responding).
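The chunked fetching from items 1 and 4 can be sketched as below. The in-memory table and the FetchPage helper are stand-ins for the real query, which on SQL Server 2012+ would page with ORDER BY ... OFFSET @skip ROWS FETCH NEXT @take ROWS ONLY:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class PagingDemo
{
    // Stand-in for one paged query round-trip against the database.
    public static List<int> FetchPage(IReadOnlyList<int> table, int page, int pageSize) =>
        table.Skip(page * pageSize).Take(pageSize).ToList();

    static void Main()
    {
        var table = Enumerable.Range(1, 10).ToList();   // pretend these are the 500k rows
        const int pageSize = 4;

        var total = 0;
        for (int page = 0; ; page++)
        {
            var rows = FetchPage(table, page, pageSize);
            if (rows.Count == 0) break;                  // no more pages

            total += rows.Count;                         // feed this chunk to the PIVOT
            Console.WriteLine($"page {page}: {rows.Count} rows");
        }                                                // each chunk is released here

        Console.WriteLine($"total {total}");             // total 10
    }
}
```

Only one page is alive at a time, so peak memory is bounded by the page size rather than by the full result set.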

I can't imagine any scenario where such a large volume of information cannot be paged.

If you really can't paginate (you will know your reasons), I think the only solution for you is item 2: more RAM, much more.

If you are dealing with data analysis, there is another approach that does not involve improving the C# code at all: pre-computing the data, transforming it into smaller, already computed/analyzed structures. Even so, that will not remove the need for more memory.

And remember: if this system is multi-user, think about what could happen if 3 users request this same report at the same time on the same server.
