What is the best way to fix slow report generation?


Some time ago I developed a reports page where users can apply filters. It works, but as the data in the database grows it keeps getting slower, and I don't know the best way to solve this. Some of the things I have considered:

  • Is it always better to return the data already aggregated by the query, or is it better in some cases to return all the rows and do these calculations in PHP, for example?

  • Am I writing the queries properly? What I mean is that I am doing the counts in several sub-queries. Here is an example:

    SELECT DISTINCT t.nome_tag,
        (
            SELECT count(vw.tag_id) FROM view_relatorio vw
            WHERE vw.tag_id = t.id AND vw.cliente_id IN (1,2,3,10,20)
              AND vw.conforme = 1 AND vw.dt_validate >= '2016-09-09'
        ) as conforme,
    
        (
            SELECT count(vw.tag_id) FROM view_relatorio vw
            WHERE vw.tag_id = t.id AND vw.cliente_id IN (1,2,3,10,20)
              AND vw.conforme = 0
        ) as naoConforme,
    
        (
            SELECT count(vw.tag_id) FROM view_relatorio vw
            WHERE vw.tag_id = t.id AND vw.cliente_id IN (1,2,3,10,20)
        ) as totalAtribuido
    
    FROM view_relatorio view
    JOIN tag t ON t.id = view.tag_id
    WHERE view.cliente_id IN (1,2,3,10,20)
    

Keep in mind that this is a small example; there are several other queries like it.
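Since the three sub-queries scan the same rows with only the filter changing, they can often be collapsed into a single pass using conditional aggregation (SUM over CASE expressions). Below is a minimal sketch of the idea in Python with an in-memory SQLite database and invented sample data; the table and column names mirror the question, but whether the rewrite is exactly equivalent for the real data (e.g. duplicate rows that the DISTINCT was hiding) would need to be verified:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tag (id INTEGER PRIMARY KEY, nome_tag TEXT);
CREATE TABLE view_relatorio (tag_id INTEGER, cliente_id INTEGER,
                             conforme INTEGER, dt_validate TEXT);
INSERT INTO tag VALUES (1, 'seguranca'), (2, 'qualidade');
INSERT INTO view_relatorio VALUES
    (1, 1, 1, '2016-10-01'),   -- conforming, still valid
    (1, 2, 0, '2016-01-01'),   -- non-conforming
    (1, 3, 1, '2016-01-01'),   -- conforming but expired
    (2, 1, 0, '2016-12-01');   -- non-conforming
""")

# One pass over view_relatorio instead of three correlated sub-queries:
# each SUM(CASE ...) counts a different subset of the same scanned rows.
rows = conn.execute("""
    SELECT t.nome_tag,
           SUM(CASE WHEN vw.conforme = 1
                     AND vw.dt_validate >= '2016-09-09'
                    THEN 1 ELSE 0 END) AS conforme,
           SUM(CASE WHEN vw.conforme = 0 THEN 1 ELSE 0 END) AS naoConforme,
           COUNT(*) AS totalAtribuido
    FROM view_relatorio vw
    JOIN tag t ON t.id = vw.tag_id
    WHERE vw.cliente_id IN (1, 2, 3, 10, 20)
    GROUP BY t.nome_tag
""").fetchall()

for nome_tag, conforme, nao_conforme, total in rows:
    print(nome_tag, conforme, nao_conforme, total)
```

The same SQL works on MySQL (which also accepts `SUM(condition)` directly, since boolean expressions evaluate to 0/1 there). The key point is that the table is scanned once, so any index on `cliente_id`/`tag_id` is used once rather than four times.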

I have built views to improve performance, as I read recommended in a few places, and I have also indexed the columns of the relevant tables. In short, on the database side I have already made several adjustments.

  • Create a routine that generates the result per day and stores it in a view in MySQL; then you only read that view, and processing will be much faster.

  • This was an option I also considered, and I have used it in another system, but in this case I want the data in real time. Thank you very much for the idea, though.

  • A suggestion: the first step is to measure how long the query takes. As a second step, I would run an experiment: replace these sub-queries with 3 separate queries and UNION them at the end, then measure again to see whether performance actually improves. Don't forget the ANALYZE and EXPLAIN commands; they can help identify bottlenecks in queries.

  • I may be mistaken, but it looks like you are fetching the same data repeatedly for each item of the inner loop, when you could resolve this in the JOIN and group the data in PHP.

  • I'm going to try the test @rray suggested, but from what I'm seeing in my other queries it doesn't seem to get much better. Still, it doesn't hurt to test.

  • I also need to test what @Guilhermenascimento suggested, as it may be a solution, although a bit more laborious for now; but it's an option.

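The measuring step suggested in the comments can start as simply as timing each variant against the same data set. A rough sketch, assuming Python with an in-memory SQLite database and synthetic data purely for illustration (on MySQL you would run EXPLAIN/ANALYZE against the real tables instead):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE view_relatorio "
             "(tag_id INTEGER, cliente_id INTEGER, conforme INTEGER)")
# Synthetic rows: 50 tags, 2000 rows each, conforme alternating 0/1.
conn.executemany("INSERT INTO view_relatorio VALUES (?, ?, ?)",
                 [(i % 50, i % 7, i % 2) for i in range(100_000)])

def timed(sql):
    """Run a query and return (rows, elapsed seconds)."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    return rows, time.perf_counter() - start

# Variant 1: correlated sub-query, re-scanning the table per group.
slow, t_slow = timed("""
    SELECT tag_id,
           (SELECT COUNT(*) FROM view_relatorio v2
             WHERE v2.tag_id = v1.tag_id AND v2.conforme = 1) AS conforme
    FROM view_relatorio v1 GROUP BY tag_id
""")

# Variant 2: single pass with aggregation.
fast, t_fast = timed("""
    SELECT tag_id, SUM(conforme) AS conforme
    FROM view_relatorio GROUP BY tag_id
""")

assert sorted(slow) == sorted(fast)  # same answer, different cost
print(f"sub-query variant: {t_slow:.4f}s  single-pass variant: {t_fast:.4f}s")
```

Only a comparison like this, run on realistic volumes, tells you whether a rewrite is worth it; intuition about query cost is often wrong.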

1 answer



The 3 sub-queries are practically identical (one counts the conforming rows, one the non-conforming rows, and one the total). COUNT() is an operation that is inherently expensive for the database because it has to scan the records, and you also have joins across 4 queries; the database cannot take advantage of all this repetition. In this case it is better to run a single query that fetches everything and accumulate the counts in an array. Another suggestion is to review the data model to see what can be improved (split it into more tables: for example [tag_id, cliente_id, as] or even just [tag_id, as]). To see how a query will be executed, take a look at the EXPLAIN command; it shows the execution plan.
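The fetch-everything-and-group approach described above can be sketched as follows; Python and a dictionary stand in for PHP and its associative arrays here, and the tiny data set is invented for illustration:

```python
import sqlite3
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE view_relatorio (tag_id INTEGER, cliente_id INTEGER, conforme INTEGER);
INSERT INTO view_relatorio VALUES (1, 1, 1), (1, 2, 0), (1, 3, 1), (2, 1, 0);
""")

# One query fetches the raw rows; all three figures per tag are then
# derived in application code from the same single pass over the rows.
totals = defaultdict(lambda: {"conforme": 0, "naoConforme": 0, "total": 0})
for tag_id, cliente_id, conforme in conn.execute(
        "SELECT tag_id, cliente_id, conforme FROM view_relatorio "
        "WHERE cliente_id IN (1, 2, 3, 10, 20)"):
    bucket = totals[tag_id]
    bucket["total"] += 1
    bucket["conforme" if conforme else "naoConforme"] += 1

print(dict(totals))
```

The trade-off is transferring more rows to the application; whether that beats SQL-side aggregation depends on row volume and network cost, so it is worth measuring both.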

  • This sounds like what @Guilhermenascimento suggested. I will have to run some tests and work on it; it won't be simple, but I hope it is an alternative that improves performance. Thank you.
