Higher memory usage than necessary if \$K < N\$
If you are only interested in the \$K\$ most frequent elements, then you only need to maintain a list of size \$K\$. You can do that if you make sure the key and counter are stored in the map, not in the list, and only use the list to maintain references to map entries sorted by the counter value:
using fast_map = …;

struct item {
    const T val;
    std::size_t count{};
    std::list<fast_map<T, item>::iterator>::iterator it;  // position in by_count
};

fast_map<T, item> items;                          // all keys and their counters
std::list<fast_map<T, item>::iterator> by_count;  // at most K entries, sorted by count
Despite the seemingly circular type definition, this compiles. You can use by_count.end() as the value of the it member for items that are not currently part of the list.
You could also avoid the separate std::list entirely by making item part of an intrusive list, for example using Boost.Intrusive, although that could end up using more memory if \$N\$ is much larger than \$K\$.
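To make the bookkeeping concrete, here is a minimal sketch of how insert() could maintain such a top-\$K\$ list. It is not a drop-in replacement for your class: it uses std::unordered_map in place of fast_map, and it stores keys rather than map iterators in the list to keep the example self-contained, at the cost of a few extra hash lookups. The class name and the member k are made up for illustration:

#include <cstddef>
#include <iterator>
#include <list>
#include <unordered_map>

// Sketch only: std::unordered_map stands in for fast_map, and by_count
// stores keys instead of map iterators to sidestep the circular type.
template <typename T>
class top_k_counter {
    struct item {
        std::size_t count{};
        typename std::list<T>::iterator it;  // position in by_count, or by_count.end()
    };

    std::unordered_map<T, item> items;
    std::list<T> by_count;  // at most k keys, sorted by ascending count
    std::size_t k;

    std::size_t count_of(const T& key) const {
        return items.find(key)->second.count;
    }

public:
    explicit top_k_counter(std::size_t k) : k{k} {}

    void insert(const T& val) {
        auto [map_it, added] = items.try_emplace(val);
        item& entry = map_it->second;
        if (added)
            entry.it = by_count.end();
        ++entry.count;

        if (entry.it != by_count.end()) {
            // Already among the top k: shift it towards the back so the
            // list stays sorted by ascending count.
            auto pos = std::next(entry.it);
            while (pos != by_count.end() && count_of(*pos) < entry.count)
                ++pos;
            by_count.splice(pos, by_count, entry.it);
            return;
        }

        if (by_count.size() >= k) {
            // List is full: only enter it if we now beat the least
            // frequent entry, which then gets evicted.
            if (by_count.empty() || count_of(by_count.front()) >= entry.count)
                return;
            items.find(by_count.front())->second.it = by_count.end();
            by_count.pop_front();
        }

        // Not in the list yet: insert at the sorted position.
        auto pos = by_count.begin();
        while (pos != by_count.end() && count_of(*pos) < entry.count)
            ++pos;
        entry.it = by_count.insert(pos, val);
    }

    const std::list<T>& most_frequent() const { return by_count; }
};

Note that the linear scans in this sketch only ever walk the list of at most \$K\$ entries, so they cost \$O(K)\$ rather than \$O(N)\$.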
Don't use increment/decrement operators on variables used multiple times in one statement
This is confusing: it is unspecified whether list_it->val is evaluated before or after the decrement (and if the iterator were a raw pointer, this would even be undefined behaviour before C++17); see order of evaluation:
auto list_it = items.end();
by_value.emplace(list_it->val, --list_it);
Just move the decrement to its own line. Alternatively, you could reverse the order of the list so you can use items.begin() and not have to modify the iterator at all.
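For example, keeping your variable names:

auto list_it = items.end();
--list_it;  // decrement on its own line, sequenced before the next statement
by_value.emplace(list_it->val, list_it);

or, equivalently, auto list_it = std::prev(items.end()); (from <iterator>).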
Does not compile if T is small and not copyable
You are storing T twice by value if its size is equal to or smaller than that of a reference. However, it is possible to create a small type that is not copyable. You could check for this and use a std::reference_wrapper for non-copyable types regardless of their size.
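A sketch of such a check (the alias name is made up for illustration):

#include <functional>
#include <type_traits>

// Store the key by value only when it is both small and copyable;
// otherwise fall back to a std::reference_wrapper to the map's copy.
template <typename T>
using stored_key = std::conditional_t<
    std::is_copy_constructible_v<T> && sizeof(T) <= sizeof(void*),
    T,
    std::reference_wrapper<const T>>;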
Complexity
The insert() function operates in O(1) time, confirmed by varying the number of iterations of the test program's loop between 100,000 and 100,000,000.
That's the average-case complexity, but not the worst-case complexity. Consider that there is a potentially non-\$O(1)\$ operation in insert(): the call to std::ranges::lower_bound(). On a std::list, that is \$O(N)\$, since list iterators are not random access. This worst case can occur if all elements in the list so far have the same count, and another insert then has to increment the count of one of those existing elements.
You might then think that the amortized complexity of finding the \$K\$ most frequent elements of a vector is still \$O(N)\$, but I think that might not be the case for a worst-case input. Consider a vector containing the values \$1 \dots N/2\$ in that order, repeated twice (possibly reversed the second time). That might end up being \$O(N^2)\$.
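A hypothetical generator for such an input could look like this (the function name and n are mine):

#include <vector>

// Each value 1..n/2 appears twice. During the second half, every insert
// increments the count of a value whose count ties with many others, so
// the lower_bound() call may have to walk a large part of the list each
// time, which could make the total cost quadratic.
std::vector<int> make_worst_case(int n) {
    std::vector<int> values;
    values.reserve(n);
    for (int i = 1; i <= n / 2; ++i)
        values.push_back(i);   // first pass: every count becomes 1
    for (int i = n / 2; i >= 1; --i)
        values.push_back(i);   // second pass, reversed
    return values;
}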