
I'm receiving ArrayBuffer chunks of data and adding each chunk to an array in a web worker; when all the chunks have arrived, I convert that array of chunks into a Blob.

//worker.js

const chunks = []

self.onmessage = function (e) {
    if (e.data.done) {
        // All chunks received: assemble them into a single Blob.
        const blob = new Blob(chunks)
        self.postMessage(blob)
        chunks.length = 0 // drop the ArrayBuffer references
    } else {
        chunks.push(e.data.chunk)
    }
}
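For context, the main-thread side feeding this worker could look like the following sketch. This is not from the question: the fetch-style chunk source and the transfer of the buffers are assumptions for illustration.

```javascript
// main.js — a hypothetical producer for the worker above.
const worker = new Worker("worker.js");

worker.onmessage = (e) => {
  // e.data is the assembled Blob sent back by the worker.
  const url = URL.createObjectURL(e.data);
  console.log("download URL:", url);
};

// Stream a response body into the worker chunk by chunk.
async function send(response) {
  const reader = response.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Transferring the buffer avoids a structured-clone copy.
    worker.postMessage({ chunk: value.buffer }, [value.buffer]);
  }
  worker.postMessage({ done: true });
}
```

Transferring each chunk's buffer (the second `postMessage` argument) means the main thread gives up its copy instead of duplicating it, which matters at these sizes.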

My questions are:

  1. If the array reaches 2 GB, it will be stored in memory, right? That means I won't be able to fill that array with more data than my available memory?
  2. When I create a Blob from that array, will the Blob also take another 2 GB of memory?
    Yes, it's always in memory for at least as long as there's a reference to it. Yes, it's limited to available memory. And yes, the Blob will take more memory beyond what is already occupied by the original ArrayBuffers. Commented Dec 7, 2021 at 7:00

2 Answers


That depends...

Theoretically, yes: each ArrayBuffer will occupy its byteLength in memory, and yes, Blobs made from it will occupy new space in memory.

However, browsers may use some optimizations here; for instance, IIRC Chrome saves its Blobs on the user's disk instead of bloating the memory, and I believe similar tricks could be used for ArrayBuffers.

So when possible, it may be preferable to build a Blob per smaller chunk (since each "chunk" Blob would be saved on disk), but that's quite uncertain and relies on strong assumptions. If you need to be safe, better to assume it will all be in memory and that you are at risk of reaching a limit.

If you really need to handle huge files, you may want to consider Streams; for instance, the new File System Access API allows us to write to disk as streams. On systems where this is not available, you could try saving each chunk in IndexedDB.
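As a rough sketch of that streaming approach (note: `showSaveFilePicker` must be called from a user gesture and is currently only available in Chromium-based browsers; the chunk source here is a placeholder, not from the question):

```javascript
// Hypothetical: write incoming chunks straight to disk,
// never holding the whole file in memory at once.
async function saveChunks(chunkIterable) {
  // Must run inside a user gesture, e.g. a click handler.
  const handle = await window.showSaveFilePicker({
    suggestedName: "download.bin",
  });
  const writable = await handle.createWritable();
  for await (const chunk of chunkIterable) {
    await writable.write(chunk); // accepts ArrayBuffer, TypedArray, Blob…
  }
  await writable.close(); // the data reaches the file only after close()
}
```

Because each `write()` goes through a `FileSystemWritableFileStream`, memory usage stays bounded by the chunk size rather than the total file size.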


3 Comments

Well explained, thank you. But is it possible to create a Blob for each chunk and concatenate all of those Blobs into a single one at the end? I did not mention that the ArrayBuffer is a file, and I used Blobs in order to generate an object URL with URL.createObjectURL(blob) so I could download the actual file. Is there any way to save each chunk of the buffer directly to disk in the same file without using Blobs, something similar to Node's filesystem require('fs').writeFileSync('/path', Buffer.from(buffer)), except that writes the full buffer rather than chunks? @Kaiido
Yes, you can do new Blob([blob1, blob2, blob3]), and in this case no new data is created.
I just want to let you know that, as you suggested, the File System Access API is the perfect solution for this case. I did not even need to work with Blobs, just saved the buffers directly to my disk. I posted the code showing how I did it. Thank you.
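To illustrate the Blob-concatenation point from the comments, a minimal sketch (works in browsers and in Node 18+, where Blob is a global):

```javascript
// Build two small chunk Blobs, then concatenate them.
const chunk1 = new Blob([new Uint8Array([1, 2, 3])]);
const chunk2 = new Blob([new Uint8Array([4, 5])]);

// The new Blob references its parts; passing existing Blobs
// does not force an eager byte copy.
const combined = new Blob([chunk1, chunk2]);
console.log(combined.size); // 5
```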

You can play with this in the TypeScript playground at typescriptlang.org.

// Allocate a ~1 GB typed array.
const baseArray = new Uint8ClampedArray(1_000_000_000);
console.log(baseArray.length);

const start = performance.now();

function f() {
    const n = 10;
    for (let i = 1; i < n; ++i) {
        // Each Blob constructed from the array copies its bytes.
        const b = new Blob([baseArray]);
        console.log(b.size);
    }
}

f();

const end = performance.now();
console.log((end - start) / 1000); // elapsed time in seconds

The execution time is around 2 seconds due to the copying and extra allocations. If you set n to 1000, you get a crash. TL;DR: Blob copies the data from its arguments.

Comments
