bug(dev-cli): Unexpected wasm memory growth #915

Open

PeterFarber opened this issue Jul 25, 2024 · 3 comments
Labels: bug (Something isn't working), loader (ao Loader)

Comments

@PeterFarber (Contributor)

Description

When running wasm64 (WebAssembly 64-bit) modules in our application, we've encountered an issue where memory usage steadily increases over time. This behavior occurs unpredictably and seems to be independent of specific module functions or operations.

Expected Behavior

Memory usage should remain stable or increase within expected limits as defined by the application's memory management and garbage collection routines.

Actual Behavior

Memory consumption grows continuously, eventually leading to performance degradation and potential crashes due to memory exhaustion.

Steps to Reproduce (uncertain)

Unfortunately, the issue of unexpected wasm64 memory growth appears sporadic, making it challenging to provide precise steps for reproduction.
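In lieu of exact steps, a rough monitoring harness along the lines below can at least make the growth observable over many evaluations. This is only a sketch, assuming the @permaweb/ao-loader interface (AoLoader(binary, options) resolving to an async handle(memory, message, environment)); the module format string, message fields, and IDs are placeholders rather than values from a known failing setup.

```js
// Sketch of a memory-growth monitor, assuming the @permaweb/ao-loader API.
// All IDs, tags, and the format string below are placeholders.
import { readFileSync } from 'node:fs'
import AoLoader from '@permaweb/ao-loader'

const wasm = readFileSync('./process.wasm')
const handle = await AoLoader(wasm, { format: 'wasm64-unknown-emscripten-draft_2024_02_15' })

const env = {
  Process: { Id: 'PROCESS_ID', Owner: 'OWNER', Tags: [] },
  Module: { Id: 'MODULE_ID', Tags: [] },
}

let memory = null // null boots a fresh instance; afterwards the heap is carried forward
for (let i = 0; i < 1000; i++) {
  const result = await handle(memory, {
    Id: `MESSAGE-${i}`,
    Target: 'PROCESS_ID',
    Owner: 'OWNER',
    From: 'OWNER',
    Tags: [{ name: 'Action', value: 'Eval' }],
    Data: 'return "ok"',
    'Block-Height': '1',
    Timestamp: Date.now(),
  }, env)

  memory = result.Memory
  // If the wasm64 heap only ever grows, this number climbs monotonically.
  console.log(`message ${i}: memory ${(memory.byteLength / 1024 / 1024).toFixed(1)} MiB`)
}
```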

Impact

This issue severely impacts the stability and performance of our application when running wasm64 modules, especially in long-running sessions.

Proposed Solution

Investigate and address the root cause of the memory growth in wasm64 modules. This may involve:

  • Modifying the Emscripten build memory configuration flags (e.g. ALLOW_MEMORY_GROWTH, INITIAL_MEMORY, MAXIMUM_MEMORY).
  • Reviewing memory management strategies within the wasm64 module.
  • Checking for potential memory leaks or inefficient memory handling (see the sketch after this list).
  • Ensuring compatibility with the WebAssembly runtime.
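For the leak-check item, one minimal heuristic (using the same assumed handle(memory, message, env) interface as in the sketch under "Steps to Reproduce") is to replay a single, identical message many times: if the returned memory keeps growing even though the input never changes, the module or loader is retaining something per message.

```js
// Hypothetical leak heuristic: same message in, memory size out, many rounds.
async function measureGrowth (handle, memory, message, env, rounds = 100) {
  const sizes = []
  for (let i = 0; i < rounds; i++) {
    const result = await handle(memory, message, env)
    memory = result.Memory          // carry the heap forward, as a CU would
    sizes.push(memory.byteLength)
  }
  return {
    memory,
    grewBytes: sizes[sizes.length - 1] - sizes[0], // consistently > 0 suggests a per-message leak
    sizes,
  }
}
```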
PeterFarber added the bug and loader labels on Jul 25, 2024
@ppedziwiatr (Collaborator) commented Jul 30, 2024

I was recently analysing the memory dump of our token process and was somewhat surprised that it is almost 90 MB, even though it effectively holds little more than a map of ~100 balances. BUT - we've changed its source code several times (if I understand correctly, aos load sends an Eval interaction to the AOS process, which causes Lua's load to interpret the new code) - so maybe it has something to do with loading/updating the process's code?

Another example - our oracle process - https://cu.ao-testnet.xyz/state/fev8nSrdplynxom78XaQ65jSo7-88RxVVVPwHG8ffZk - is about 50 MB.

This is NOT an AOS process, but a process built with dev-cli from this source code: https://github.com/warp-contracts/ao-redstone-oracle/blob/main/redstone-oracle-process/process.lua

Quickly reviewing the contents of the dump, it seems that roughly 60% is source code and the rest is the state of the process (in this case, some JSONs stored in a Lua table).
[Screenshot attached: Zrzut ekranu 2024-07-30 o 17 17 03]
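One quick way to sanity-check that observation without the screenshot is to pull the state dump from the CU endpoint linked above and count how many copies of a distinctive line from the process source survive in the heap. The marker string below is a placeholder; any line that appears verbatim in process.lua would do.

```js
// Sketch: download the state dump and look for duplicated source code.
// Requires Node 18+ (global fetch); the marker string is a placeholder.
const url = 'https://cu.ao-testnet.xyz/state/fev8nSrdplynxom78XaQ65jSo7-88RxVVVPwHG8ffZk'

const dump = Buffer.from(await (await fetch(url)).arrayBuffer())
console.log(`dump size: ${(dump.length / 1024 / 1024).toFixed(1)} MiB`)

const marker = 'Handlers.add('  // placeholder: pick any line that appears verbatim in process.lua
let count = 0
for (let i = dump.indexOf(marker); i !== -1; i = dump.indexOf(marker, i + 1)) count++
console.log(`copies of the marker found in the dump: ${count}`)
```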

@PeterFarber (Contributor, Author) commented Jul 31, 2024

(Quoting @ppedziwiatr's comment above.)

Yeah, it looks like most of the dump is the returned memory that is never being freed. Do you have Discord? I would like to review your findings. @ppedziwiatr

@ppedziwiatr (Collaborator)

We have a dedicated Slack channel with you guys - https://redstone-ujk1058.slack.com/archives/C06NG8C2CNR - so maybe drop us a message there?
