Hello, I am designing a table schema that needs to be well optimized, because the tables will hold 3 to 5 million records. The big concern here is search: the infamous LIKE '%...%' is out of the question. Because of this, at insertion time up to 3 keywords will be created from the main field (the name of something), for example:
A dummy table "Cities", for illustration only, with the following fields:
id|name|visibility|key_1|key_2|key_3
So let's insert the city "São Paulo" as a record. In this case we get two keywords, which obviously have their accents removed (programmatically) and are fully lowercased:
key_1 = sao
key_2 = paulo
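The normalization step described above (strip accents, lowercase, split into keywords) could be sketched like this in Python; the function name `make_keywords` and the `max_keys` limit are my own assumptions, not part of the original scheme:

```python
import unicodedata

def make_keywords(name, max_keys=3):
    # Strip accents: decompose to NFD, then drop combining marks.
    decomposed = unicodedata.normalize("NFD", name)
    stripped = "".join(c for c in decomposed if not unicodedata.combining(c))
    # Lowercase and split on whitespace into at most max_keys keywords.
    return stripped.lower().split()[:max_keys]

print(make_keywords("São Paulo"))  # ['sao', 'paulo']
```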
That way, a search on a table with millions of records would look like this:
select name from cities
where visibility = 1
and (
key_1 = 'sao' or
key_1 = 'paulo' or
key_2 = 'sao' or
key_2 = 'paulo'
);
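To make the point concrete, here is a minimal runnable sketch of the scheme using SQLite (the index names are hypothetical, and the OR chain is rewritten as IN for brevity). The key property is that equality predicates on `key_1`/`key_2` can use ordinary B-tree indexes, which a leading-wildcard LIKE '%...%' cannot:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE cities (
    id INTEGER PRIMARY KEY, name TEXT, visibility INTEGER,
    key_1 TEXT, key_2 TEXT, key_3 TEXT)""")
# Indexes on the keyword columns allow fast equality lookups.
cur.execute("CREATE INDEX idx_cities_key_1 ON cities (key_1)")
cur.execute("CREATE INDEX idx_cities_key_2 ON cities (key_2)")
cur.execute("INSERT INTO cities VALUES (1, 'São Paulo', 1, 'sao', 'paulo', NULL)")

rows = cur.execute("""
    SELECT name FROM cities
    WHERE visibility = 1
      AND (key_1 IN ('sao', 'paulo') OR key_2 IN ('sao', 'paulo'))
""").fetchall()
print(rows)  # [('São Paulo',)]
```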
The question is: is this really faster than full-text search or LIKE with wildcards? Is it good practice?
I have seen large companies use this technique, but I would like your opinion.
Thanks in advance!