[LeetCode] LRU Cache
The basic idea is to store the key-value pairs in a container. In the following code, we make three type definitions:
typedef list<int> LI;
typedef pair<int, LI::iterator> LII;
typedef unordered_map<int, LII> HIPII;
HIPII is the ultimate type of the container. It is essentially an unordered_map, using the key as its key and a pair as its value. The first element of the pair is the value, and the second is an iterator pointing to the key's position in a list (elements nearer the front are more recently used). So each element of the HIPII container looks like {key, {value, iterator}}.
Now the implementation of get and set is relatively straightforward: each time we access a key, touch it (move it to the front of the list) and then get or set its value. If the container has reached its capacity and we need to add a new key, just remove the key corresponding to the last element of the list (the least recently used one).
The code is as follows.
class LRUCache {
public:
    LRUCache(int capacity) {
        _capacity = capacity;
    }

    int get(int key) {
        auto it = cache.find(key);
        if (it == cache.end()) return -1;
        touch(it);                          // mark as most recently used
        return it->second.first;
    }

    void set(int key, int value) {
        auto it = cache.find(key);
        if (it != cache.end()) touch(it);   // existing key: move to front
        else {
            if (cache.size() == _capacity) {
                cache.erase(used.back());   // evict the least recently used key
                used.pop_back();
            }
            used.push_front(key);
        }
        cache[key] = {value, used.begin()};
    }

private:
    typedef list<int> LI;
    typedef pair<int, LI::iterator> LII;
    typedef unordered_map<int, LII> HIPII;

    int _capacity;
    LI used;        // keys ordered by recency, most recent at the front
    HIPII cache;

    // Move the key referred to by it to the front of the recency list.
    void touch(HIPII::iterator it) {
        int key = it->first;
        used.erase(it->second.second);
        used.push_front(key);
        it->second.second = used.begin();
    }
};