It has already been mentioned that hashmaps are O(n/m) on average, where n is the number of items and m is the number of buckets. It has also been mentioned that in principle the whole thing could collapse into a singly linked list with O(n) query time. (All of this assumes that computing the hash takes constant time.)
However, what isn't often mentioned is that with probability at least 1 - 1/n (so for 1000 items that's a 99.9% chance) the largest bucket won't hold more than O(log n) items! That matches the average complexity of binary search trees. (And the constant is good: a tighter bound is (log n)*(m/n) + O(1).)
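This isn't a proof, but a quick simulation gives a feel for the bound: throw n keys into m = n buckets using a uniformly random bucket choice (standing in for a good hash) and record the fullest bucket. The sizes and trial count below are arbitrary choices.

```python
import math
import random
from collections import Counter

def max_bucket_size(n, m, trials=10):
    """Largest bucket observed when n random keys land in m buckets."""
    worst = 0
    for _ in range(trials):
        counts = Counter(random.randrange(m) for _ in range(n))
        worst = max(worst, max(counts.values()))
    return worst

for n in (1_000, 10_000, 100_000):
    print(f"n = {n:>7}: max bucket = {max_bucket_size(n, m=n)}, "
          f"log2(n) ~ {math.log2(n):.1f}")
```

The fullest bucket stays well below log2(n) even as n grows by orders of magnitude.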
All that's required for this theoretical bound is that you use a reasonably good hash function (see Wikipedia: Universal Hashing; it can be as simple as a*x >> m), and of course that whoever supplies the values you hash doesn't know how you chose your random constants.
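As a sketch of the "as simple as a*x >> m" remark, here is a standard multiply-shift construction (one of the schemes described on the linked Wikipedia page): pick a random odd w-bit multiplier a once, then keep the top M bits of a*x mod 2^w, where m = 2^M is the bucket count. The names W, M, and a below are my own.

```python
import random

W = 64                                   # machine word size in bits (assumed)
M = 10                                   # m = 2**M buckets (illustrative choice)
a = random.getrandbits(W) | 1            # secret random odd multiplier

def multiply_shift(x: int) -> int:
    """Map a W-bit integer key x to a bucket index in [0, 2**M)."""
    return ((a * x) & ((1 << W) - 1)) >> (W - M)   # (a*x mod 2**W) >> (W - M)
```

Because a is drawn at random when the table is created, someone who only chooses the keys cannot deliberately steer them all into one bucket.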
TL;DR: With Very High Probability the worst case get/put complexity of a hashmap is O(log n).