Partial pieces are kept in memory leading to high RAM usage #1973
Comments
@feross Why don't we use IndexedDB or Cache Store instead?
I'm working on a new web storage API that seems to fit your needs pretty well! It's called NativeIO, and it's currently available behind a flag on Chrome. I suspect that you could avoid keeping partial pieces in memory by writing them to disk through NativeIO and directly updating them as more data arrives. We are very interested in getting feedback on it as we approach an origin trial, so please let me know if trying it out sounds interesting to you! I'm happy to chat or discuss it further in a separate issue.
Semi-related to this issue: @jhiesey and I just fixed an issue in […]. EDIT: Just published a new […].
Perhaps integrating a library such as StreamSaver.js into WebTorrent is viable so Blobs don't have to be created. |
That is unrelated to this issue, and it is already being worked on with store backpressure.
I'm confused by that last sentence. Which problem is being solved by backpressuring, and do both problems have an issue here? |
Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward? |
Hello, my swap space is fully used (4 GB real RAM + 2.4 GB swap). F.
I experience this as well. Some movies and shows work just fine to start with, but the GUI freezes after some time: "(node:107851) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [Torrent]. Use emitter.setMaxListeners() to increase limit"
This seems really easy to fix, considering we wrap all requests to our chunk stores in a cache layer, and the stores themselves support partial pieces.
We use more RAM than we need to for large files that are downloading quickly, because we keep pieces in memory until they're fully downloaded, at which point we write them to disk.
I believe there's a good chance that all the reports of high RAM usage on WebTorrent Desktop are due to this design decision. Example: webtorrent/webtorrent-desktop#1877 Can we explore how we might limit the number of partial pieces in memory or somehow not keep partial pieces in memory at all?