How to improve MongoDB query performance in collections with millions of records, each with many searchable attributes?

There is a collection with approximately 20 million records and each of them with about 200 searchable attributes.

Example:

{atrib001:"abc", atrib002:"123", atrib003:"1x3"... atrib200:"1zz"}

The search application allows the query to be built dynamically, according to the options chosen by the user.
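For illustration, this kind of dynamic query is usually assembled by turning the user's selections into a filter object and passing it to `find()`. A minimal sketch (the `buildFilter` helper and the `userSelections` shape are hypothetical, not from the question):

```javascript
// Build a MongoDB filter object from whatever attributes the user filled in.
// `userSelections` is a hypothetical map of attribute name -> chosen value.
function buildFilter(userSelections) {
  const filter = {};
  for (const [attr, value] of Object.entries(userSelections)) {
    // Skip attributes the user left empty, so they don't constrain the query.
    if (value !== undefined && value !== "") {
      filter[attr] = value;
    }
  }
  return filter; // e.g. { atrib002: "123", atrib003: "1x3" }
}

// Usage against a hypothetical collection:
//   db.items.find(buildFilter({ atrib002: "123", atrib003: "1x3" }))
```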

What is the recommended way to create indexes in MongoDB to improve the performance of this type of search? In addition, is it feasible to create an index for each attribute (200 indexes in this case) and rely on MongoDB to choose the best one?

2 answers


(Note: I am not a DBA. Test everything on a separate copy of the database before applying it in production.)

With regard to indexes, MongoDB is not so different from relational databases: its indexes are typically B-trees.

If the query is dynamic and the user can choose any combination of attributes, a single-field index on each attribute would let MongoDB pick the best one for a given search. Note, however, that MongoDB limits a collection to 64 indexes, so 200 separate indexes is not actually possible; for documents with many arbitrary searchable fields, a wildcard index (available since MongoDB 4.2) is the usual alternative.
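In mongosh, the two approaches would look roughly like this. This is a sketch against a hypothetical `items` collection; the wildcard form requires MongoDB 4.2 or later:

```javascript
// Single-field indexes, one per attribute (subject to the 64-index
// limit per collection, so this cannot cover all 200 attributes):
db.items.createIndex({ atrib001: 1 });
db.items.createIndex({ atrib002: 1 });

// Alternative for many arbitrary attributes: one wildcard index that
// indexes every field in the document (MongoDB 4.2+):
db.items.createIndex({ "$**": 1 });
```

A wildcard index trades some write and storage overhead for the ability to serve equality queries on any single field without a dedicated index per attribute.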

If there were a pattern, for example if the user always included attribute 1 and attribute 2 in the query, you could create a compound index covering both attributes at once. But since there isn't, the indexes have to be created separately.
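For completeness, a compound index for such a pattern would be sketched as follows (again against a hypothetical `items` collection):

```javascript
// Hypothetical pattern: users always filter on atrib001 and atrib002 together.
// One compound index serves queries on both fields, and, by the prefix rule,
// also serves queries on atrib001 alone (but not atrib002 alone):
db.items.createIndex({ atrib001: 1, atrib002: 1 });
```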
