[LeetCode] LRU Cache
The basic idea is to store the key-value pairs in some container. In the following code, we make three type definitions:
typedef list<int> LI;
typedef pair<int, LI::iterator> LII;
typedef unordered_map<int, LII> HIPII;
HIPII is the type of the whole container. It is essentially an unordered_map that uses the key as its key and a pair as its value. The first element of the pair is the stored value; the second is an iterator pointing to that key's position in a list of keys ordered by recency (keys nearer the front are more recently used). So each element of the HIPII container looks like {int key, {int value, iterator into the list}}.
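To make the map/list relationship concrete, here is a small standalone sketch (independent of the class below; the variable names and the single hard-coded key are only for illustration). It shows why storing a list iterator in the map works: the iterator stays valid while other keys come and go, so moving a key to the front of the list is O(1).

#include <list>
#include <unordered_map>
#include <utility>
using namespace std;

int main() {
    list<int> used;  // keys, most recently used at the front
    unordered_map<int, pair<int, list<int>::iterator>> cache;

    used.push_front(1);               // key 1 becomes the most recent
    cache[1] = {100, used.begin()};   // entry looks like {1, {100, iterator into used}}

    // "Touching" key 1 later: erase it from the list through the stored
    // iterator (O(1)), re-insert it at the front, and refresh the iterator.
    used.erase(cache[1].second);
    used.push_front(1);
    cache[1].second = used.begin();
    return 0;
}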
Now the implementation of get and set is relatively straightforward: each time we access a key, we touch it (move it to the front of the list) and then get or set its value. If the container is already at capacity and we need to add a new key, we simply remove the key corresponding to the last element of the list (the least recently used one).
The code is as follows.
#include <list>
#include <unordered_map>
#include <utility>
using namespace std;

class LRUCache {
public:
    LRUCache(int capacity) {
        _capacity = capacity;
    }

    int get(int key) {
        auto it = cache.find(key);
        if (it == cache.end()) return -1;
        touch(it);
        return it->second.first;
    }

    void set(int key, int value) {
        auto it = cache.find(key);
        if (it != cache.end()) touch(it);
        else {
            if (cache.size() == _capacity) {
                // Evict the least recently used key (back of the list).
                cache.erase(used.back());
                used.pop_back();
            }
            used.push_front(key);
        }
        cache[key] = {value, used.begin()};
    }

private:
    typedef list<int> LI;
    typedef pair<int, LI::iterator> LII;
    typedef unordered_map<int, LII> HIPII;

    int _capacity;
    LI used;     // keys ordered by recency; front = most recently used
    HIPII cache;

    // Move the accessed key to the front of the list and refresh its iterator.
    void touch(HIPII::iterator it) {
        int key = it->first;
        used.erase(it->second.second);
        used.push_front(key);
        it->second.second = used.begin();
    }
};
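For reference, here is how the class could be exercised. This driver is not part of the original solution; it follows the classic LeetCode scenario in which the least recently used key is evicted once capacity is exceeded.

#include <iostream>

int main() {
    LRUCache cache(2);             // capacity 2
    cache.set(1, 1);
    cache.set(2, 2);
    std::cout << cache.get(1);     // 1 (key 1 is now the most recently used)
    cache.set(3, 3);               // evicts key 2, the least recently used
    std::cout << cache.get(2);     // -1 (not found)
    cache.set(4, 4);               // evicts key 1
    std::cout << cache.get(1);     // -1 (not found)
    std::cout << cache.get(3);     // 3
    std::cout << cache.get(4);     // 4
    return 0;
}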