I have a MaxMind table with more than 3 million cities. I created the indexes for a simple SELECT and its performance is satisfactory.
The problem is that I am using an autocomplete, so my SELECT uses LIKE, but it always times out (maximum execution time of 30 seconds exceeded).
I chose MyISAM because the table has few or no changes.
From what I’ve been reading, 3 million rows is not an excessive amount, but I don’t know how I should proceed to optimize this search.
CREATE TABLE `cidades` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`pais` char(50) DEFAULT NULL,
`regiao` char(100) DEFAULT NULL,
`cidade` char(100) DEFAULT NULL,
`titulo` char(100) DEFAULT NULL,
`lat` char(50) DEFAULT NULL,
`lon` char(50) DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `ID` (`id`),
KEY `CITY` (`cidade`),
KEY `ID_2` (`id`,`cidade`,`titulo`)
) ENGINE=MyISAM AUTO_INCREMENT=3166036 DEFAULT CHARSET=utf8;
select cidades.titulo
from cidades
where cidades.titulo like '%Rio de Janeiro%'
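
A quick way to check whether MySQL can use an index for each pattern is EXPLAIN. The sketch below is only illustrative and reuses the table and column names from the schema above; the exact EXPLAIN output varies between MySQL versions:

-- Leading wildcard: a B-tree index cannot be used for a range scan,
-- so MySQL ends up reading the rows.
EXPLAIN SELECT titulo FROM cidades WHERE titulo LIKE '%Rio de Janeiro%';

-- Prefix-only pattern: a range scan is possible, but only if some
-- index has titulo as its leftmost column.
EXPLAIN SELECT titulo FROM cidades WHERE titulo LIKE 'Rio de Janeiro%';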
In fact, I had tested with like 'Rio de Janeiro%' and the response time was virtually identical to the original select. So would the problem be the leading %? Does the select make use of the indexes that way? – Papa Charlie