Partial pieces are kept in memory leading to high RAM usage #1973

Open
feross opened this issue Nov 25, 2020 · 10 comments
feross (Member) commented Nov 25, 2020

We use more RAM than necessary for large files that are downloading quickly, because we keep pieces in memory until they're fully downloaded, at which point we write them to disk.

I believe there's a good chance that all the reports of high RAM usage in WebTorrent Desktop are due to this design decision. Example: webtorrent/webtorrent-desktop#1877. Can we explore how we might limit the number of partial pieces in memory, or avoid keeping partial pieces in memory at all?
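One way to limit partial pieces in memory is a bounded cache that evicts the oldest partial piece to the chunk store once a budget is exceeded. The sketch below is purely illustrative (the names `PartialPieceCache` and `flushToStore` are hypothetical, not WebTorrent APIs), and it assumes the underlying store can accept partially-filled piece buffers:

```javascript
// Hypothetical sketch: cap how many partial pieces sit in RAM at once.
// Assumes the backing chunk store can accept a partially-filled buffer.
class PartialPieceCache {
  constructor (maxPieces, flushToStore) {
    this.maxPieces = maxPieces
    this.flushToStore = flushToStore // (index, buffer) => void, writes to disk
    this.pieces = new Map() // pieceIndex -> { buffer, received }
  }

  write (index, offset, chunk, pieceLength) {
    let piece = this.pieces.get(index)
    if (!piece) {
      piece = { buffer: Buffer.alloc(pieceLength), received: 0 }
      this.pieces.set(index, piece)
    }
    chunk.copy(piece.buffer, offset)
    piece.received += chunk.length

    if (piece.received >= pieceLength) {
      // Complete piece: hand it to the store and drop it from RAM.
      this.pieces.delete(index)
      this.flushToStore(index, piece.buffer)
    } else if (this.pieces.size > this.maxPieces) {
      // Over budget: evict the oldest partial piece (Map insertion order),
      // flushing what we have so far instead of keeping it in memory.
      const [oldestIndex, oldest] = this.pieces.entries().next().value
      this.pieces.delete(oldestIndex)
      this.flushToStore(oldestIndex, oldest.buffer)
    }
  }
}
```

With a real store, the evicted piece's remaining blocks would later be written at the right offsets on disk rather than reassembled in RAM.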

datnguyencse commented Dec 30, 2020

@feross Why don't we use IndexedDB or the Cache Storage API instead of https://github.com/mafintosh/memory-chunk-store?
Take a look: https://github.com/datnguyencse/cache-storage-chunk-store
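Both memory-chunk-store and cache-storage-chunk-store follow the abstract-chunk-store interface, so swapping backends mainly means implementing `put`/`get`/`close`/`destroy`. Here is a minimal, simplified in-memory sketch of that interface (it ignores the special case of a shorter final chunk that real stores handle):

```javascript
// Simplified sketch of the abstract-chunk-store interface that chunk-store
// modules implement. Any backend (IndexedDB, Cache Storage, disk) needs
// the same four methods; only the storage medium changes.
class SimpleChunkStore {
  constructor (chunkLength) {
    this.chunkLength = chunkLength
    this.chunks = []
  }

  put (index, buf, cb = () => {}) {
    if (buf.length !== this.chunkLength) {
      return process.nextTick(cb, new Error('Chunk length must be ' + this.chunkLength))
    }
    this.chunks[index] = buf
    process.nextTick(cb, null)
  }

  get (index, opts, cb) {
    if (typeof opts === 'function') { cb = opts; opts = {} }
    const buf = this.chunks[index]
    if (!buf) return process.nextTick(cb, new Error('Chunk not found'))
    const offset = opts.offset || 0
    const length = opts.length || (buf.length - offset)
    process.nextTick(cb, null, buf.slice(offset, offset + length))
  }

  close (cb = () => {}) { process.nextTick(cb, null) }
  destroy (cb = () => {}) { this.chunks = []; process.nextTick(cb, null) }
}
```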

@fivedots

I'm working on a new web storage API that seems to fit your needs pretty well! It's called NativeIO, and it's currently available behind a flag in Chrome. I suspect you could avoid keeping partial pieces in memory by writing them to disk through NativeIO and updating them directly as more data arrives.

We are very interested in getting feedback as we approach an origin trial, so please let me know if trying it out sounds interesting to you! I'm happy to chat or discuss it further in a separate issue.

feross (Member, Author) commented Feb 4, 2021

Semi-related to this issue: @jhiesey and I just fixed an issue in block-stream2 that significantly reduced memory usage on https://instant.io, almost by half. See: https://github.com/substack/block-stream2/pull/4

EDIT: Just published a new webtorrent release with the latest block-stream2 fix: webtorrent 0.112.4.

redbrain commented Mar 2, 2021

Perhaps integrating a library such as StreamSaver.js into WebTorrent is viable, so that Blobs don't have to be created.

@ThaUnknown (Member)


That is an issue unrelated to this one, and this one is already being worked on via store backpressure.

redbrain commented Mar 3, 2021

I'm confused by that last sentence. Which problem is being solved by backpressuring, and do both problems have an issue here?

github-actions bot commented Sep 6, 2021

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

Franck78 commented Dec 4, 2021

Hello,
I don't think it will help a lot, but under Linux, some torrents can completely bring my machine down.

Swap space fully used (4 GB real RAM + 2.4 GB swap).
Popcorn closed the case because they can't do much about it.

F.

Umeaboy commented Dec 23, 2021

I experience this as well. Some movies and shows work just fine to start with, but the GUI freezes after some time.
I see an error about a memory leak relating to maximum listeners; here's the complete error:

"(node:107851) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [Torrent]. Use emitter.setMaxListeners() to increase limit"

@ThaUnknown (Member)


This seems really easy to fix, considering we wrap all requests to our chunk stores in a cache layer, and the stores themselves support partial pieces.
