
memory consumption blowups in downstream projects using Lwt #972

Open
gasche opened this issue Nov 13, 2022 · 5 comments

Comments

@gasche
Contributor

gasche commented Nov 13, 2022

Hi Lwt people,

I randomly ended up on the following cohttp issue, which points at an Lwt usage pattern that greatly increases memory consumption in unexpected ways. (Sounds like a bug to me, at least a usability bug.)

mirage/ocaml-cohttp#545

The issue is unfortunately not so clear -- I don't understand if Lwt_io is the culprit or not. There is a pure-Lwt repro case at

mirage/ocaml-cohttp#545 (comment)

The issue seems serious enough that some users are considering avoiding Lwt because of this (not clearly understood) behavior. Probably worth investigating.

@raphael-proust
Collaborator

@rgrinberg In the cohttp issue discussion you mention

The issue was successfully worked around in the new cohttp client and servers though.

Do you have pointers to the fix? Maybe a PR number?

@raphael-proust
Collaborator

I haven't really worked on the Lwt_bytes/Lwt_io part of Lwt. I'll try to have a look some time, but that will be a big context switch so I can't do it right now.

As a low-hanging fruit (hopefully) I'll try to understand the patterns that can be problematic and update the documentation accordingly. An actual fix may come later.

@mseri

mseri commented Nov 14, 2022

Concerning Lwt_io, the "fix" is only in the new cohttp-server-lwt-unix, where we remove the dependency on conduit and move to Lwt_io.direct_access, see mirage/ocaml-cohttp#898 and mirage/ocaml-cohttp#907 (the package was introduced in mirage/ocaml-cohttp#838).

This goes hand in hand with the new very fast parser introduced in mirage/ocaml-cohttp#819 and later improved and moved to the general http package. The PR that introduces it also includes some

@rgrinberg @anuragsoni @kandu please correct me or integrate with anything I may be missing or misremember.
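For readers unfamiliar with the `Lwt_io.direct_access` API mentioned above: the sketch below (assuming the `lwt` and `lwt.unix` packages) shows the general shape of reading through the channel's internal buffer instead of allocating a fresh string per read call. The function name `read_available` is made up here for illustration; it is not cohttp's actual fix.

```ocaml
open Lwt.Infix

(* Read whatever bytes are currently buffered in [ic], using
   Lwt_io.direct_access to work on the channel's internal buffer.
   The [da] record exposes the buffer plus da_ptr/da_max cursors;
   da_perform refills the buffer when it is empty. *)
let read_available (ic : Lwt_io.input_channel) : string Lwt.t =
  Lwt_io.direct_access ic (fun da ->
    (if da.Lwt_io.da_ptr >= da.Lwt_io.da_max
     then da.Lwt_io.da_perform ()  (* refill; returns bytes available *)
     else Lwt.return (da.Lwt_io.da_max - da.Lwt_io.da_ptr))
    >>= fun _available ->
    let len = da.Lwt_io.da_max - da.Lwt_io.da_ptr in
    let s =
      Lwt_bytes.to_string
        (Lwt_bytes.extract da.Lwt_io.da_buffer da.Lwt_io.da_ptr len)
    in
    (* Advance da_ptr to mark the bytes as consumed. *)
    da.Lwt_io.da_ptr <- da.Lwt_io.da_ptr + len;
    Lwt.return s)
```

The point of the pattern is that the parser inspects bytes in place and only advances `da_ptr`, rather than copying data through intermediate `Lwt_io.read*` allocations.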

@anuragsoni
Contributor

@mseri I agree with the assessment that the new cohttp-server-lwt-unix is where this issue is probably "resolved" for cohttp. It should still be tested with the example from the original issue to confirm this, though. For the cohttp-lwt packages there has been no change in their use of Lwt_io, so unless things have changed within Lwt, I'd think the issue still persists for cohttp + lwt.

@hansole

hansole commented Dec 4, 2022

I have tried to experiment a little more to get a better understanding of memory consumption in the Ocsigen stack, but I'm getting more confused. I have a small test app that just sends notifications from server to client. When it sends notifications at a rate of about 50/s, the application slowly increases its memory consumption. If I increase the rate to 100/s, memory consumption increases further. However, if I increase the rate to 200/s (or more), then memory consumption stays low, at around 30MB.
(attached graph: MemUsage)

Similarly, if other applications are using most of the available memory before I start my application, then memory usage stays low. I have also seen that in some cases the application will give memory back to the OS if other applications start using a lot of memory after the "Ocsigen" application has begun consuming it.

From my observations, it does not look like a memory leak, but rather like a memory allocation/caching strategy that does not always work the way I would like. I guess this is more or less similar to what others have reported.

Is there no way for application writers to change/control the memory consumption behavior?

mysite.eliom.txt
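On the last question: the OCaml runtime does expose knobs for this, independently of Lwt. A minimal sketch using only the standard library's `Gc` module; the particular values below are illustrative, not recommendations from this thread, and whether they help depends on whether the retained memory is really free heap space.

```ocaml
(* Tune the OCaml GC so free major-heap memory is reclaimed and
   compacted more eagerly. The numbers here are for illustration. *)
let () =
  let ctrl = Gc.get () in
  Gc.set { ctrl with
           (* default 120; lower values make the major GC work harder *)
           Gc.space_overhead = 80;
           (* default 500; trigger compaction at lower free-memory ratios *)
           Gc.max_overhead = 30 };
  (* An explicit compaction returns unused heap chunks to the OS;
     it can also be called from the application at a quiet moment. *)
  Gc.compact ()
```

The same settings can be applied without recompiling via the `OCAMLRUNPARAM` environment variable (e.g. `OCAMLRUNPARAM=o=80`), which may be the easier way to experiment with the test app attached above.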
