Query takes time to complete

Viewed 113 times
Good afternoon, everyone. I'm having a problem executing a query in C# using Npgsql.

            try
            {
                bdConn.Open();

                NpgsqlCommand sql = new NpgsqlCommand();
                sql.Connection = bdConn;
                sql.CommandText = query;
                sql.CommandTimeout = 600000; // note: this value is in seconds

                using (NpgsqlDataReader dr = sql.ExecuteReader())
                {
                    while (dr.Read())
                    {
                        // dr[0] is used as the file name, dr[1] holds the XML
                        string xml = dr[1].ToString();
                        xml = xml.Replace("<![CDATA[", "").Replace("]]>", "");

                        switch (cbbDocumento.Text)
                        {
                            case "NFe": System.IO.File.WriteAllText(tbPasta.Text + "\\" + dr[0].ToString() + "-nfe.xml", xml); break;
                            case "NFCe": System.IO.File.WriteAllText(tbPasta.Text + "\\" + dr[0].ToString() + "-nfce.xml", xml); break;
                            case "CTe": System.IO.File.WriteAllText(tbPasta.Text + "\\" + dr[0].ToString() + "-cte.xml", xml); break;
                            case "MDFe": System.IO.File.WriteAllText(tbPasta.Text + "\\" + dr[0].ToString() + "-mdfe.xml", xml); break;
                        }
                    }
                }
            }
            finally
            {
                bdConn.Close();
            }

In this code, after ExecuteReader() the connection remains open, and while there are still records to be read the query keeps running on the database.

This ends up consuming a lot of processing on my database server.

Is there any way to run a query and close the connection before starting to work with the returned data?

Thanks in advance.

Update: this query returns about 80 thousand records, and I save each result to disk. The problem is that it takes about 30 minutes to save that many records, and this is "killing" the database server.
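One way to try what you describe, sketched below: buffer all rows in memory, close the connection, and only then write the files. This assumes the ~80 thousand rows fit in RAM; the names `bdConn`, `query`, and `tbPasta` and the two-column result layout are taken from the code above, and `suffix` is a stand-in for the `cbbDocumento` switch. It is a sketch, not a tested solution.

```csharp
// Read everything first so the connection can be closed
// before any file I/O starts.
var rows = new List<(string Nome, string Xml)>();

try
{
    bdConn.Open();
    using (var cmd = new NpgsqlCommand(query, bdConn))
    using (var dr = cmd.ExecuteReader())
    {
        while (dr.Read())
            rows.Add((dr[0].ToString(), dr[1].ToString()));
    }
}
finally
{
    bdConn.Close(); // the database is released here
}

// The slow disk work now happens with no open connection.
string suffix = "-nfe.xml"; // stand-in for the cbbDocumento switch
foreach (var (nome, xml) in rows)
{
    var clean = xml.Replace("<![CDATA[", "").Replace("]]>", "");
    System.IO.File.WriteAllText(
        System.IO.Path.Combine(tbPasta.Text, nome + suffix), clean);
}
```

Whether this actually helps depends on where the time goes: if the 30 minutes are spent on disk writes rather than on the query, the server load should drop but the total time will not.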

  • You can't tell from this what is going on. You could close it, but that wouldn't change anything. The DataReader is usually the fastest way; the problem seems to be in how you are using it. Either the query is bad, or there is simply too much data. You would need to provide information that lets us evaluate it; as it stands, the question is not clear and cannot be reproduced.

  • But does DataReader.Read() still use the database connection? Or does it just consume the query result that was produced by ExecuteReader()?

  • It still uses the connection, but that doesn't mean anything. You're focusing on something that isn't the problem.

  • I’ve changed the question now. See if you can help me.

  • Maybe the biggest cost is writing all these files

  • You can read and "save" the entire DataReader into a list or something similar, close the connection, and then do the file-writing part using that auxiliary variable. I don't know how much it will improve performance, or whether it will make things even worse, but it may be worth testing.

  • Bring all the data you want to process into memory, close the connection, and process the writing with Parallel.For to speed up the file output. The way it is now, you will hit the timeout and get an exception.

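The last suggestion could look something like the sketch below, assuming the rows have already been buffered into an in-memory list `rows` of (Nome, Xml) pairs and the connection is closed. `Parallel.ForEach` and the degree of parallelism are illustrative choices, not something from the question:

```csharp
// Parallelize only the file writes; the database connection is
// already closed at this point, so it is not held during the I/O.
Parallel.ForEach(rows,
    new ParallelOptions { MaxDegreeOfParallelism = 4 },
    row =>
    {
        var clean = row.Xml.Replace("<![CDATA[", "").Replace("]]>", "");
        System.IO.File.WriteAllText(
            System.IO.Path.Combine(tbPasta.Text, row.Nome + "-nfe.xml"), clean);
    });
```

Note that disk throughput, not CPU, is likely the bottleneck, so the gain from parallel writes may be modest.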
No answers
