I have a system composed of several tables. One of them, called Publications,
contains 15 assorted fields, three of which are central to my question:
1. titulo - varchar(100)
2. subtitulo - varchar(200)
3. texto - text
Within roughly 18 months this table will have more than 1 million records, and we are concerned about the performance of record searches.
What is the best way to run queries on keywords entered by users in a form: LIKE, FULLTEXT, or something else?
We are concerned with searches for compound terms such as "house music", and above all with speed and query performance.
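A minimal sketch of the FULLTEXT option, assuming MySQL 5.6+ with InnoDB (which is when InnoDB gained FULLTEXT support) and the column names above; the `id` column is a hypothetical primary key:

```sql
-- Index the three searchable columns together:
ALTER TABLE Publications
  ADD FULLTEXT INDEX ft_pub (titulo, subtitulo, texto);

-- Natural-language search, relevance-ranked:
SELECT id, titulo
FROM Publications
WHERE MATCH(titulo, subtitulo, texto) AGAINST ('house music');

-- Boolean mode with a quoted phrase keeps "house music" together
-- as an exact phrase instead of two independent words:
SELECT id, titulo
FROM Publications
WHERE MATCH(titulo, subtitulo, texto)
      AGAINST ('"house music"' IN BOOLEAN MODE);
```

For comparison, `WHERE texto LIKE '%house music%'` cannot use a B-tree index (the leading wildcard forces a full table scan), so at a million rows the inverted FULLTEXT index is generally the faster choice.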
Where do I start? I'm used to simpler, lower-impact queries, and I'm not able to handle all the variables involved here.
I usually follow this when the going gets tough: 10 techniques for optimizing SQL statements
– Marconi
@Marconi the database engine and the language matter for this question, don't you agree?
– Flávio Kowalske
Because this is a historical series, partitioning is usually done by date, keeping only the most recent records in the main table and the older ones in secondary tables or partitions. In SQL Server I know there are tools that automate creating and loading these partitions. If those features don't exist in MySQL, you can do it manually: migrate the old records and split the queries, running the second query only if necessary.
– Alexandre Cavaloti
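The date-based partitioning the comment above describes could be sketched in MySQL like this; the `data_publicacao` column and the year boundaries are assumptions for illustration (MySQL requires the partitioning column to be part of every unique key, including the primary key):

```sql
CREATE TABLE Publications (
  id INT NOT NULL AUTO_INCREMENT,
  titulo VARCHAR(100),
  subtitulo VARCHAR(200),
  texto TEXT,
  data_publicacao DATE NOT NULL,
  PRIMARY KEY (id, data_publicacao)
)
PARTITION BY RANGE (YEAR(data_publicacao)) (
  PARTITION p2015 VALUES LESS THAN (2016),
  PARTITION p2016 VALUES LESS THAN (2017),
  -- catch-all partition for future dates:
  PARTITION pmax  VALUES LESS THAN MAXVALUE
);
```

One caveat worth weighing against the FULLTEXT approach: MySQL does not support FULLTEXT indexes on partitioned tables, so you would have to pick one strategy or combine partitioning with manual query splitting as suggested above.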
Related: How to improve reading performance of an SQL database?
– vinibrsl