Map metadata overhead


As part of a benchmark I am conducting, I am running a test that creates and then deletes a key from a map multiple times. From what I understand, Yjs uses a GC object to keep the metadata overhead small, but I'm finding it quite large: a document can be several KB even when the map is empty.

The results I’m getting are:

My code is:

console.log("num update-insert operations;size in bytes");
for (let n = 1; n <= 2**15; n = n * 2) {
    let doc = new Y.Doc()
    let ymap = doc.getMap('a')
    for (let i = 0; i < n; i++) {
        let key = `key-${i}`
        ymap.set(key, 0)
    }
    // Measure the size of the full document encoded as a single update
    let docSize = Y.encodeStateAsUpdateV2(doc).byteLength
    console.log(`${n};${docSize}`)
}

I would appreciate any insights into why the metadata grows so much.


Welcome to the discussion board @Arik,

The metadata only grows because you add a new key in each iteration. The current garbage collection approach doesn't work in this case. I have a design for Y.Map in mind that will allow garbage collection here as well. In any case, it makes sense to avoid creating too many distinct keys; instead, store each entry in a Y.Array. E.g. yarray.insert(i, [{ .. }]) instead of ymap.set(`key-${i}`, { .. }).


It’s tempting to use YMaps as collections to get that sweet O(1) lookup, but in practice I’ve found a YArray + a runtime lookup table works much better with respect to memory usage and YDoc parse time. YMaps are better suited to modeling domain objects with an upper bound on number of keys, rather than collections.

As dmonad has said, YMap keys take up lots of space, whereas YArray tombstones can be quite compact.
