Most likely HashMap, whose implementation is the simplest of the available alternatives: a plain hash table that is non-synchronized, non-concurrent, unordered (in the sense that there is no guarantee about the order in which its iterator returns elements) and non-navigable (in the sense of NavigableMap).
The documentation suggests an optimization when instantiating the HashMap:
This implementation provides constant-time performance for the basic
operations (get and put), assuming the hash function disperses the
elements properly among the buckets. Iteration over collection views
requires time proportional to the "capacity" of the HashMap instance
(the number of buckets) plus its size (the number of key-value
mappings). Thus, it's very important not to set the initial capacity
too high (or the load factor too low) if iteration performance is
important.
An instance of HashMap has two parameters that affect its performance:
initial capacity and load factor. The capacity is the number of
buckets in the hash table, and the initial capacity is simply the
capacity at the time the hash table is created. The load factor is a
measure of how full the hash table is allowed to get before its
capacity is automatically increased. When the number of entries in the
hash table exceeds the product of the load factor and the current
capacity, the hash table is rehashed (that is, internal data
structures are rebuilt) so that the hash table has approximately twice
the number of buckets.
As a general rule, the default load factor (.75) offers a good
tradeoff between time and space costs. Higher values decrease the
space overhead but increase the lookup cost (reflected in most of the
operations of the HashMap class, including get and put). The expected
number of entries in the map and its load factor should be taken into
account when setting its initial capacity, so as to minimize the
number of rehash operations. If the initial capacity is greater than
the maximum number of entries divided by the load factor, no rehash
operations will ever occur.
The idea is that, if memory is available, the costly rehash operations can be avoided by adopting an initial capacity greater than the quotient between the expected number of entries and the load factor. In your case that is 2 to 10 thousand elements, so it would be good to instantiate the HashMap with an initial capacity on the order of 2667 to 13334, since the standard and recommended load factor is 0.75.
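As a sketch of that sizing rule, taking the 10 thousand figure from the question as the upper bound (the class name and the sample values are illustrative, not from the original):

```java
import java.util.HashMap;
import java.util.Map;

public class PresizedMap {
    public static void main(String[] args) {
        int expectedEntries = 10_000;  // upper bound of the 2-10 thousand range
        float loadFactor = 0.75f;      // the documented default

        // A capacity greater than expectedEntries / loadFactor
        // guarantees that no rehash will ever occur.
        int initialCapacity = (int) Math.ceil(expectedEntries / loadFactor);

        Map<Integer, String> map = new HashMap<>(initialCapacity, loadFactor);
        for (int i = 0; i < expectedEntries; i++) {
            map.put(i, "value-" + i);
        }
        System.out.println(initialCapacity); // 13334
        System.out.println(map.size());      // 10000
    }
}
```

Sizing the map this way trades a little memory up front for never paying the cost of rebuilding the internal table while the entries are inserted.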
Note that if you are on Android, it is worth knowing the class android.util.SparseArray, which is a more efficient implementation than a HashMap for use with integer keys.
Great answer, the second scenario fits the situation perfectly. I will run tests with the Maps mentioned and a few others, such as TreeMap, and post the results! – David Schrammel