A cache replacement policy which:

Counts how often an item is needed. Those that are used least often are discarded first. This works very similarly to LRU [LRU], except that instead of storing how recently a block was accessed, we store how many times it was accessed. While running an access sequence, we therefore replace the block in our cache that was used the fewest times. E.g., if A was used (accessed) 5 times, B was used 3 times, and C and D were used 10 times each, we will replace B.

(“Cache Replacement Policies” 2023)
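
The sketch below is a minimal, illustrative LFU cache in Python built from the description above; the class and method names (`LFUCache`, `get`, `put`) are my own, not taken from the cited article. It keeps a per-key access counter and evicts the key with the smallest count when the cache is full.

```python
from collections import defaultdict


class LFUCache:
    """Minimal LFU cache: evict the key with the smallest access count."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}                 # key -> cached value
        self.counts = defaultdict(int)   # key -> number of accesses

    def get(self, key):
        if key not in self.values:
            return None
        self.counts[key] += 1            # every hit bumps the frequency counter
        return self.values[key]

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict the least frequently used key (ties broken arbitrarily here).
            victim = min(self.counts, key=self.counts.get)
            del self.values[victim]
            del self.counts[victim]
        self.values[key] = value
        self.counts[key] += 1


# Reproducing the example from the quote: A used 5 times, B 3 times,
# C and D 10 times each -- B has the lowest count, so B gets replaced.
cache = LFUCache(capacity=4)
for key, hits in [("A", 5), ("B", 3), ("C", 10), ("D", 10)]:
    cache.put(key, key.lower())
    for _ in range(hits - 1):
        cache.get(key)
cache.put("E", "e")                      # cache is full, so the LFU key goes
print("B" in cache.values)               # False: B was evicted
```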

Problems

While the LFU method may seem like an intuitive approach to memory management, it is not without faults. Consider an item in memory which is referenced repeatedly for a short period of time and is not accessed again for an extended period of time. Because it was accessed so rapidly, its counter has increased drastically even though it will not be used again for a considerable amount of time. […]

Moreover, items that have just entered the cache are subject to being removed again very soon, because they start with a low counter even though they might be used very frequently after that. Due to major issues like these, an explicit LFU system is fairly uncommon; instead, there are hybrids that utilize LFU concepts.

(“Least Frequently Used” 2022)
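
One way such hybrids soften the first problem is to periodically decay the counters so that an old burst of accesses stops dominating eviction decisions. The following is a toy sketch of that idea, assuming a simple halving-based aging scheme; the function name `decay_counts` and the numbers are illustrative, not taken from the cited article.

```python
def decay_counts(counts, factor=2):
    """Halve every frequency counter so that stale access bursts
    gradually lose their weight in eviction decisions."""
    return {key: count // factor for key, count in counts.items()}


# Hypothetical counters: X was hammered long ago, Y is in active use now.
counts = {"X": 120, "Y": 8}
for _ in range(4):                 # apply decay at four successive intervals
    counts = decay_counts(counts)
    counts["Y"] += 5               # Y keeps receiving fresh accesses
print(counts)                      # {'X': 7, 'Y': 9}: X's stale lead has eroded
```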

Bibliography

“Cache Replacement Policies.” 2023. Wikipedia, February. https://en.wikipedia.org/w/index.php?title=Cache_replacement_policies&oldid=1141486190.
“Least Frequently Used.” 2022. Wikipedia, November. https://en.wikipedia.org/w/index.php?title=Least_frequently_used&oldid=1122344427.