
tfjs-tflite: error after page refresh #8166

Open
Jove125 opened this issue Feb 1, 2024 · 4 comments
Labels
comp:tfjs-tflite type:bug Something isn't working

Comments


Jove125 commented Feb 1, 2024

Hi,
The error consistently occurs when performing a certain sequence of actions and only on some devices.

  1. Open the page and initialize/execute two models (roughly as in the sketch below).
  2. Refresh the page (without closing it) and repeat the same steps from point 1.
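
Model initialization and execution in step 1 look roughly like this (a minimal sketch with placeholder model URLs and input shape; the real application does more work per inference):

import * as tf from '@tensorflow/tfjs';
import * as tflite from '@tensorflow/tfjs-tflite';

async function initAndRun() {
  // Placeholder model files; the real app loads two different models.
  const modelA = await tflite.loadTFLiteModel('modelA.tflite');
  const modelB = await tflite.loadTFLiteModel('modelB.tflite');

  const input = tf.zeros([1, 224, 224, 3]);
  const outA = modelA.predict(input);
  const outB = modelB.predict(input);

  // ...use the outputs, then release them.
  tf.dispose([input, outA, outB]);
}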

The error occurs during the second execution of one of the initialized models (the first execution completes without errors, but it takes place before the second model has been initialized).

Uncaught (in promise) RuntimeError: memory access out of bounds
    at tflite_web_api_cc_simd_threaded.wasm:0x84782
    at tflite_web_api_cc_simd_threaded.wasm:0x2f41d
    at tflite_web_api_cc_simd_threaded.wasm:0x6720
    at tflite_web_api_cc_simd_threaded.wasm:0x5d1d4
    at tflite_web_api_cc_simd_threaded.wasm:0x32f1ed
    at tflite_web_api_cc_simd_threaded.wasm:0x68e48
    at tflite_web_api_cc_simd_threaded.wasm:0x337a6e
    at tflite_web_api_cc_simd_threaded.wasm:0x421bb
    at Function.TFLiteWebModelRunner$CreateFromBufferAndOptions [as CreateFromBufferAndOptions] (eval at new_ (tflite_web_api_cc_simd_threaded.js:9:42857), <anonymous>:10:10)
exception thrown: RuntimeError: memory access out of bounds,RuntimeError: memory access out of bounds
    at tensorflow/@tensorflow/tfjs-tflite/dist/tflite_web_api_cc_simd_threaded.wasm:wasm-function[2485]:0x17db61
    at tensorflow/@tensorflow/tfjs-tflite/dist/tflite_web_api_cc_simd_threaded.wasm:wasm-function[4702]:0x32f82d
    at tensorflow/@tensorflow/tfjs-tflite/dist/tflite_web_api_cc_simd_threaded.wasm:wasm-function[2504]:0x1806d4
    at TFLiteWebModelRunner.TFLiteWebModelRunner$GetInputs [as GetInputs] (eval at new_ (tensorflow/@tensorflow/tfjs-tflite/dist/tflite_web_api_cc_simd_threaded.js:9:42857), <anonymous>:8:10)
    at module$exports$google3$third_party$tensorflow_lite_support$web$tflite_web_api_client.TFLiteWebModelRunner.getInputs (tensorflow/@tensorflow/tfjs-tflite/dist/tf-tflite.min.js:17:1201308)

The error occurs on the Honor 10 Lite and Redmi Note 7, and does not occur on the Google Pixel 4a 5G (possibly because it has 6 GB of RAM rather than 3 GB).
If, after the first run, you close the browser tab and open a new one instead of refreshing the page, the error does not occur.

Question: is there any way to completely clear the data/cache before initializing tfjs-tflite models?


Jove125 commented Feb 1, 2024

Is there any way to release the tfjs-tflite model's memory, like dispose() on GraphModel?
Then I could call it from a window.onbeforeunload handler (which is called when the browser page is refreshed or closed).


Jove125 commented Feb 2, 2024

I solved this issue with the following implementation:

window.onbeforeunload = function()
{
	// Release the TFLite WASM resources before the page is refreshed or closed.
	model.modelRunner.cleanUp();
	model.modelRunner = undefined;
};

Please add this to the documentation.
There is no information about how to release memory at all, especially after a page refresh:
https://js.tensorflow.org/api_tflite/0.0.1-alpha.10/
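
For completeness, this is roughly how it fits together (a sketch; modelRunner and cleanUp() are internal fields of the loaded TFLiteModel rather than part of the documented API, so this may change in future versions):

import * as tflite from '@tensorflow/tfjs-tflite';

let model;

async function init() {
  // Placeholder URL; the real application loads two models this way.
  model = await tflite.loadTFLiteModel('model.tflite');
}

// Release the WASM-side resources before the page is refreshed or closed.
window.onbeforeunload = function () {
  if (model && model.modelRunner) {
    model.modelRunner.cleanUp();
    model.modelRunner = undefined;
  }
};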


gaikwadrahul8 commented Feb 2, 2024

Hi, @Jove125

I apologize for the delayed response, and it's good to hear that you were able to resolve your issue. As far as I understand, tfjs-tflite doesn't have a direct dispose() method like GraphModel, but here are some effective ways to manage model memory and release it when needed:

1. Dispose of Tensors: Explicitly release memory used by tensors after you're done with them:

const inputTensor = tf.zeros([1, 10]);
const outputTensor = model.predict(inputTensor);

// perform operations with outputTensor

inputTensor.dispose();  // Release tensor memory
outputTensor.dispose();

2. Wrap Code in tf.tidy(): it will automatically dispose of all tensors created within the block, as mentioned in this tf.tidy example and sketched below the note.

NOTE: Variables do not get cleaned up when inside a tidy(). If you want to dispose variables, please use tf.disposeVariables() or call dispose() directly on variables.
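
For example, a minimal tf.tidy() sketch (hypothetical preprocessing; only the tensor returned from the callback survives the block):

const output = tf.tidy(() => {
  // Intermediate tensors created here are disposed automatically
  // when the callback returns.
  const input = tf.zeros([1, 10]);
  const normalized = input.div(255);
  return model.predict(normalized); // the returned tensor is kept
});

// ...use output, then dispose it explicitly.
output.dispose();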

3. Use window.onbeforeunload to release memory before a page refresh/unload, as you mentioned in your solution.

Thank you for your valuable suggestion and solution. Since, as you mentioned in your issue, this happens only on some devices, I'll discuss it in our internal meeting and will update you soon about adding this solution to our official documentation.

If possible, could you please share your GitHub repo or a code snippet (along with the model file) and the complete steps you followed before encountering the error, so that I can replicate the same behavior on my end?

Thank you for your cooperation!


Jove125 commented Feb 3, 2024

Hi, @gaikwadrahul8,

I hit this error in an application that allocates memory not only for the models, but also for textures, opencv.js, and more. It is not easy to reproduce the crash with model initialization alone (or the model has to be "heavy"). But I specifically verified that the problem appears only after initializing the models (there was no error without loading the models, no matter how many times I refreshed the page), and it disappeared after implementing the window.onbeforeunload function.

The memory used by the whole tab (not just the application) should be monitored.
I assume that when multithreading is enabled, the memory allocated in the separate threads is not released when the page is refreshed or another page is opened in the same tab (or it is released, but not immediately).
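
As a rough way to watch this while reproducing it (a debugging sketch; tf.memory() only covers tensors managed by tfjs, and performance.memory is a non-standard, Chrome-only API, so the browser's task manager is still needed for the full tab footprint):

setInterval(() => {
  // Tensor memory tracked by tfjs (does not include the TFLite WASM heap).
  const { numTensors, numBytes } = tf.memory();
  console.log('tfjs tensors:', numTensors, 'bytes:', numBytes);

  // JS heap of the tab (non-standard, Chrome-only).
  if (performance.memory) {
    console.log('JS heap used:', performance.memory.usedJSHeapSize);
  }
}, 5000);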
