ideone

useful pieces of code


Description

Implement an LRU (Least Recently Used) cache.

Definition

An LRU cache has a fixed capacity. It supports two types of operations: put and get.
We can store a key-value pair in the cache using the put method.
We can retrieve the value for a key using the get method (if it is present in the cache).
As the capacity of the cache is fixed, it can only store a limited number of unique key-value pairs.
Once the capacity of the cache is reached, the least recently used key is removed to make room for a new one.
As a cache needs to be fast, it has to support both get and put in O(1) time complexity.
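To make this concrete, below is a minimal sketch of the class shape the definition implies; the full implementation appears in the Code section. Keys and values are plain ints for simplicity, and get returns -1 for a missing key, mirroring the code below.

class LRUCache {
public:
    LRUCache(int capacity);       // fixed capacity, set once at construction
    int get(int key);             // O(1); returns -1 if the key is not cached
    void put(int key, int value); // O(1); evicts the least recently used key when full
};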

Solution

Under the hood, two data structures are used to achieve the functionality of the LRU cache.
The first is the obvious choice of a hashmap. Since we have to get and set the value of any key in O(1) time, a hashmap is the natural choice: it supports both of these operations in O(1) on average.
But we also have to remove the least recently used element once the capacity is reached. Hence, to keep track of how recently each key was used, we need one more data structure: a list ordered by usage. Whenever a key is used, its node moves to the front of the list, and once the capacity is exceeded we can remove the last key in the list from the hashmap.
A doubly linked list is the ideal choice for this kind of operation: inserting at the front is O(1), identifying the last element is O(1), and removing an element from the middle (given a pointer to it) is O(1).
Hence both operations, get and put, run in O(1) time complexity.
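For comparison, here is a minimal sketch of the same hashmap-plus-doubly-linked-list idea built on the standard library: std::list serves as the usage-ordered list and std::list::splice moves a node to the front in O(1). This is only an illustration of the design (it assumes capacity >= 1); the hand-rolled linked list version in the Code section below is the actual implementation discussed here.

#include <cstddef>
#include <list>
#include <unordered_map>

class LRUCacheSketch {
    std::size_t cap;                                       // assumed >= 1
    std::list<std::pair<int,int>> order;                   // front = most recently used, back = least
    std::unordered_map<int, std::list<std::pair<int,int>>::iterator> pos;
public:
    explicit LRUCacheSketch(std::size_t capacity) : cap(capacity) {}

    int get(int key) {
        auto it = pos.find(key);
        if (it == pos.end()) return -1;
        order.splice(order.begin(), order, it->second);    // move the node to the front in O(1)
        return it->second->second;
    }

    void put(int key, int value) {
        auto it = pos.find(key);
        if (it != pos.end()) {                             // key already cached: update and refresh
            it->second->second = value;
            order.splice(order.begin(), order, it->second);
            return;
        }
        if (order.size() == cap) {                         // full: evict the least recently used key
            pos.erase(order.back().first);
            order.pop_back();
        }
        order.emplace_front(key, value);                   // insert as most recently used
        pos[key] = order.begin();
    }
};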

Complexity

Time Complexity: O(1) per get/put operation
Space Complexity: O(n)
n: capacity of the cache

Code

#include <iostream>
#include <unordered_map>
using namespace std;

// Node of a doubly linked list; l points towards the head (more recently
// used), r points towards the tail (less recently used).
template <class T>
struct LinkedList{
    // Unlink this node from its neighbours and return it, so the call
    // can be chained with insert().
    LinkedList* remove(){
        if(l)   l->r = r;
        if(r)   r->l = l;
        return this;
    }
    // Insert this node at the front of the list and update the head pointer.
    void insert(LinkedList* &head){
        r = head;
        l = NULL;
        if(head)    head->l = this;
        head = this;
    }
    T x;
    LinkedList* l;
    LinkedList* r;
    LinkedList(T X, LinkedList* L=NULL, LinkedList* R=NULL) : x(X), l(L), r(R) {}
};

class LRUCache {
private:
    int n;                                              // capacity of the cache
    LinkedList<pair<int,int>> *head;                    // most recently used node
    LinkedList<pair<int,int>> *tail;                    // dummy node at the end; tail->l is the least recently used node
    unordered_map<int,LinkedList<pair<int,int>>*> h;    // key -> list node holding (key, value)
public:
    LRUCache(int capacity) {
        n = capacity;
        // The list starts with a single dummy node that always stays at the tail.
        head = tail = new LinkedList<pair<int,int>>(make_pair(0,0));
    }

    int get(int key) {
        if(h.find(key)==h.end())    return -1;
        // Move the accessed node to the front so it becomes the most recently used.
        if(head!=h[key])    h[key]->remove()->insert(head);
        return head->x.second;
    }

    void put(int key, int value) {
        // Create a new node for an unseen key, or update the value of an existing one.
        if(h.find(key)==h.end())    h[key] = new LinkedList<pair<int,int>>({key,value});
        else    h[key]->x.second = value;
        // Reuse get() to move the node to the front of the list.
        get(key);
        // Evict least recently used keys until the capacity constraint holds.
        while((int)h.size()>n){
            key = tail->l->remove()->x.first;
            delete h[key];
            h.erase(key);
        }
    }
};

int main() {
    LRUCache cache(2);
    cache.put(1, 1);
    cache.put(2, 2);
    cout<<cache.get(1)<<endl;   // returns 1
    cache.put(3, 3);            // evicts key 2
    cout<<cache.get(2)<<endl;   // returns -1 (not found)
    cache.put(4, 4);            // evicts key 1
    cout<<cache.get(1)<<endl;   // returns -1 (not found)
    cout<<cache.get(3)<<endl;   // returns 3
    cout<<cache.get(4)<<endl;   // returns 4
    return 0;
}

Ideone it!