
Java has LinkedHashMap, which gets you 99% of the way to an LRU cache.

Is there a Javascript implementation of an LRU cache, preferably from a reputable source, that is:

  1. understandable
  2. efficient (amortized O(1) get/put/delete)

? I've been searching the web but couldn't find one. I thought I'd found one in Ajax Design Patterns, but it glosses over the sendToTail() method and has O(n) performance (presumably, since the queue and the associative array are split up).

I suppose I could write my own, but I've learned the hard way that reinventing the wheel for core algorithms can be hazardous to one's health :/


9 Answers


Map lookups and inserts should be O(1) on average in most implementations. Since Map keeps insertion order, adding a bit of code around it will get you an LRU, and for most uses this should be plenty fast.

I needed a simple LRU cache for a small number of expensive operations (about 1 second each). I felt better about copy-pasting a small piece of code rather than introducing something external, but since I didn't find one, I wrote it:

class LRU {
    constructor(max = 10) {
        this.max = max;
        this.cache = new Map();
    }

    get(key) {
        let item = this.cache.get(key);
        if (item !== undefined) {
            // refresh key
            this.cache.delete(key);
            this.cache.set(key, item);
        }
        return item;
    }

    set(key, val) {
        // refresh key
        if (this.cache.has(key)) this.cache.delete(key);
        // evict oldest
        else if (this.cache.size === this.max) this.cache.delete(this.first());
        this.cache.set(key, val);
    }

    first() {
        return this.cache.keys().next().value;
    }
}

Usage:

> let cache = new LRU(3)
> [1, 2, 3, 4, 5].forEach(v => cache.set(v, 'v:'+v))
> cache.get(2)
undefined
> cache.get(3)
"v:3"
> cache.set(6, 6)
> cache.get(4)
undefined
> cache.get(3)
"v:3"

9 Comments

how is your this.first() working? Map doesn't provide first()
@TaranGoel The .first() is implemented right there in the code. Just see below the set().
Awesome @odinho-Velmont, it seems like Map already has many capabilities to help us implement an LRU cache. I was trying with a plain JS object (as a hash table) and a doubly linked list, and it was becoming quite cumbersome.
this fails the leetcode LRU problem. see: gist.github.com/Vandivier/ea2228f8f363417aa6392612e5399449
@JohnVandivier Why? I registered on that site and used your example, it says "Accepted" for me with runtime 95ms and all test cases passing.

This:

https://github.com/monsur/jscache

seems to fit your case, although setItem (i.e. put) is O(n) in the worst case; that happens when the cache is full on insertion. In that case the cache is searched to purge expired or least recently used items. getItem is O(1), and expiry is handled on the getItem operation (i.e. if the item being fetched has expired, it is removed and null is returned).

The code is compact enough to be easily understood.

P.S. It might be useful to add to the constructor an option to specify the fillFactor, which is fixed at 0.75 (meaning that when the cache is purged, its size is reduced to at most 3/4 of the maximum size).
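The lazy-expiry idea described above (checking expiration when an item is fetched, rather than with timers) can be sketched in a few lines. This is a hedged illustration of the technique, not jscache's actual code; the class and method names are made up for the example:

```javascript
// Minimal sketch of lazy expiry: each entry stores an absolute deadline,
// and getItem purges expired entries at read time, so no timers are needed.
class ExpiringCache {
    constructor() {
        this.store = new Map();
    }

    setItem(key, value, ttlMs) {
        // No TTL means the entry never expires.
        const expires = ttlMs ? Date.now() + ttlMs : Infinity;
        this.store.set(key, { value, expires });
    }

    getItem(key) {
        const entry = this.store.get(key);
        if (entry === undefined) return null;
        if (Date.now() > entry.expires) {
            // Expired: purge lazily and report a miss.
            this.store.delete(key);
            return null;
        }
        return entry.value;
    }
}
```

The trade-off is that expired entries linger in memory until they are read again (or until a bulk purge runs, which is where jscache's O(n) insertion path comes from).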

2 Comments

thanks, I did run across that. It seemed like it had too many bells & whistles for my application (not to mention the phrase ASP.NET which is a huge red flag in my mind), but maybe I should give it another look.
+1 The implementation has nothing to do with ASP.NET I think it is worth a look

This is worth a mention: https://github.com/rsms/js-lru

The core set of functions is O(1) and the code is heavily commented (with ASCII art, too!)

Comments


The monsur.com implementation is O(n) on insertion only because it has items which actually expire in real-world time. It is not a simple LRU. If you only care about maintaining the most recently used items without regard to real-world time, it can be done in O(1).

A queue, implemented as a doubly linked list, is O(1) for insertion or deletion from the end, and this is all you should need for a cache. For lookup, a hash map, which JavaScript makes pathetically easy, gives nearly O(1) lookup (assuming the JavaScript engine uses a good hash map; that's implementation-dependent, of course). So you have a linked list of items, with a hash map pointing to the items. Manipulate the ends of the linked list as needed to put new and requested items on one end and remove old items from the other end.
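The structure described above can be sketched as follows. This is a minimal illustration (the class and helper names are made up for the example), not a battle-tested library: a Map from key to linked-list node, plus a doubly linked list ordered from most- to least-recently used, with sentinel head/tail nodes so every splice is a fixed number of pointer writes:

```javascript
// Hash map for O(1) lookup + doubly linked list for O(1) reordering/eviction.
class LinkedLRU {
    constructor(capacity) {
        this.capacity = capacity;
        this.map = new Map();            // key -> list node
        this.head = {};                  // sentinel: most recently used end
        this.tail = {};                  // sentinel: least recently used end
        this.head.next = this.tail;
        this.tail.prev = this.head;
    }

    _unlink(node) {
        // O(1) removal from any position in the list.
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    _pushFront(node) {
        // Insert right after the head sentinel (MRU position).
        node.next = this.head.next;
        node.prev = this.head;
        this.head.next.prev = node;
        this.head.next = node;
    }

    get(key) {
        const node = this.map.get(key);
        if (node === undefined) return undefined;
        this._unlink(node);              // move to MRU position on access
        this._pushFront(node);
        return node.value;
    }

    set(key, value) {
        let node = this.map.get(key);
        if (node !== undefined) {
            node.value = value;          // update in place, then refresh
            this._unlink(node);
        } else {
            if (this.map.size === this.capacity) {
                const lru = this.tail.prev;   // evict least recently used
                this._unlink(lru);
                this.map.delete(lru.key);
            }
            node = { key, value };
            this.map.set(key, node);
        }
        this._pushFront(node);
    }
}
```

Note that each node stores its own key, so eviction from the tail can also delete the matching map entry in O(1).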

3 Comments

The linked list needs to be able to delete (but not insert) items from the middle, since an item removed from the LRU cache may be reinserted at the front. That's the hard part; your answer seems to gloss over it.
Redundantly, removing from the middle of a doubly linked list is O(n), which you would have to do to maintain the LRU invariant on access.
@Eloff, There is the additional hash map to get to any element anywhere in the list with O(1). But you and "Jason S" are right that manipulating the ends isn't enough, any item at any position in the list can be the next one that needs to go back to the front position, so while insertion is at one end removal can be from any position. Still, thanks to the hash map that can be done independent of the length of the list.

This library, runtime-memcache, implements LRU and a few other caching schemes in JavaScript.

It uses a modified doubly linked list to achieve O(1) for get, set and remove. You can check out the implementation, which is pretty simple.

1 Comment

interesting... could you comment on any testing / peer review that would help someone justify use of this library? (My question was from 2009 and I have no idea what I was doing at the time, but maybe your answer can help someone else)

I improved @odinho's answer to fit this LeetCode question:

I changed if (item) to if (item !== undefined) to handle the case where the value is 0.

The following is my code:

class LRUCache {
  constructor(max = 10) {
    this.max = max
    this.cache = new Map()
  }

  get(key) {
    let item = this.cache.get(key)
    if (item !== undefined) {
      // refresh key
      this.cache.delete(key)
      this.cache.set(key, item)
    }
    return item === undefined ? -1 : item
  }

  put(key, val) {
    // refresh key
    if (this.cache.has(key)) this.cache.delete(key)
    // evict oldest
    else if (this.cache.size === this.max) this.cache.delete(this.first())
    this.cache.set(key, val)
  }

  first() {
    return this.cache.keys().next().value
  }
}

Comments


It's not an LRU cache, but I've got my own linked map implementation. As it uses a JS object as its store, it'll have similar performance characteristics (the wrapper objects and the hash function impose a performance penalty).

Currently, documentation is basically non-existent, but there's a related SO answer.

The each() method will pass the current key, the current value and a boolean indicating if there are more elements as arguments to the callback function.

Alternatively, looping can be done manually via

for(var i = map.size; i--; map.next()) {
    var currentKey = map.key();
    var currentValue = map.value();
}

Comments


I am aware that this is an old question, but I'm adding a link for future reference: check out https://github.com/monmohan/dsjslib. It has an LRU cache implementation in addition to some other data structures. Such caches (this one included) maintain a doubly linked list of cache entries in LRU order, i.e. entries move to the head as they are accessed and are removed from the tail when they are reclaimed (say, by expiry or because the size limit was reached). It's O(1) since it only involves a constant number of pointer manipulations.
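The "constant number of pointer manipulations" can be made concrete. This is a hedged illustration of the move-to-head step on a doubly linked list with a head sentinel, not dsjslib's actual code (the function name is made up for the example):

```javascript
// Moving an entry to the head of a doubly linked list touches a fixed
// number of pointers regardless of list length, which is why LRU
// reordering is O(1).
function moveToHead(head, node) {
    // Detach the node from its current position (2 pointer writes).
    node.prev.next = node.next;
    node.next.prev = node.prev;
    // Splice it in right after the head sentinel (4 pointer writes).
    node.next = head.next;
    node.prev = head;
    head.next.prev = node;
    head.next = node;
}
```

Eviction from the tail is the mirror image: unlink `tail.prev`, again a constant number of pointer writes.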

Comments


Since we need read, write, update and delete operations in O(1), we use two data structures.

  1. An Object (or Map) in JavaScript provides retrieval in O(1).
  2. A doubly linked list (a custom data structure we create) makes the following operations O(1):
    • moving the most recently used element to the top
    • deleting the least recently used element when the cache limit is reached

A custom implementation of the doubly linked list and the least recently used cache, with a clear explanation, is given below:

https://medium.com/dsinjs/implementing-lru-cache-in-javascript-94ba6755cda9

2 Comments

Don't link to an external resource, inline your implementation in the answer.
If I'm not mistaken, this implementation will fail with multiple puts/writes to the same key -- every element in the cache will be for the same key, which is NOT how an LRU should work.
