Context
Here at the company we have a very complex query that encompasses several business rules, used to feed a workflow, and for now I am evaluating it on SQL Server.
The performance of this query was poor, and as the tables grew the DBA warned us to reevaluate it because it was becoming very expensive for the database.
When I started, the query had a cost of [Estimated Subtree Cost = 19,891]. From there I began evaluating it and removed some optional fields, unnecessary joins, and subselects from the projection, and the cost increased instead of decreasing [Estimated Subtree Cost = 20,232].
When running the reduced query I got the result in 23 s, while the original query (the one with the lower cost) took 1 min 18 s.
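For reference, a minimal sketch of how the two timings above can be measured with SET STATISTICS TIME/IO; the object names are hypothetical placeholders, since the real query is confidential:

    -- Sketch only: compare actual elapsed time and I/O of the two variants,
    -- instead of relying solely on the Estimated Subtree Cost.
    SET STATISTICS TIME ON;  -- prints CPU and elapsed time per statement
    SET STATISTICS IO ON;    -- prints logical/physical reads per table

    -- dbo.vw_WorkflowOriginal and dbo.vw_WorkflowReduced are placeholders
    -- for the original query and the reduced query.
    SELECT * FROM dbo.vw_WorkflowOriginal; -- lower estimated cost, ran in ~1 min 18 s
    SELECT * FROM dbo.vw_WorkflowReduced;  -- higher estimated cost, ran in ~23 s

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;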
Question
I am not very experienced in evaluating query execution plans, which raised a doubt: is this cost based on the amount of data returned? On the execution time? On the cost to the database?
Can the query with the higher cost be better than the original one with the lower cost?
Note: I cannot share the query because it contains confidential data, and it is complex enough that it would be hard to create a similar fake query.
"There are some common false assumptions about the estimated cost of execution. In particular, people often assume that the estimated cost of execution is a good indicator of how long the query takes to be executed and that this estimate allows differentiating good plans from bad plans. But this is not true.": http://technet.microsoft.com/pt-br/magazine/2007.11.sqlquery.aspx The article is very informative. The cropped part is just a decoy :-)
– Caffé
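As an aside to the point in that article (a minimal sketch, not taken from the article, and assuming the queries of interest are still in the plan cache), SQL Server's recorded runtime statistics can be compared with the cached estimated plans via the DMVs:

    -- Sketch: actual execution statistics vs. the cached (estimated) plan.
    -- total_elapsed_time is reported in microseconds.
    SELECT TOP (20)
        qs.execution_count,
        qs.total_elapsed_time / 1000 AS total_elapsed_ms,
        qs.total_logical_reads,
        st.text       AS query_text,
        qp.query_plan AS estimated_plan_xml  -- the Estimated Subtree Cost lives in this XML
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
    ORDER BY qs.total_elapsed_time DESC;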