WASM backend for tensorflowjs throws “Unhandled Rejection (RuntimeError): index out of bounds” error in Reactjs

Submitted on 2021-01-28 20:40:49

Question


I am trying to set up the WASM backend for the BlazeFace face-detection model in a React app. The vanilla JS demo can run for hours without any error, but in React it throws "Unhandled Rejection (RuntimeError): index out of bounds" after the cam has been open for more than 3-5 minutes.

The entire app crashes with this error. Judging from the log below, it may be related to the disposeData() or disposeTensor() functions, which I guess are involved in garbage collection. But I don't know whether it is a bug in the WASM lib itself. Do you have any idea why this might happen?

Below I also provide my render-prediction function.

  renderPrediction = async () => {
    const model = await blazeface.load({ maxFaces: 1, scoreThreshold: 0.95 });
    if (this.play) {
      const canvas = this.refCanvas.current;
      const ctx = canvas.getContext("2d");
      const returnTensors = false;
      const flipHorizontal = true;
      const annotateBoxes = true;
      const predictions = await model.estimateFaces(
        this.refVideo.current,
        returnTensors,
        flipHorizontal,
        annotateBoxes
      );

      if (predictions.length > 0) {
        ctx.clearRect(0, 0, canvas.width, canvas.height);
        for (let i = 0; i < predictions.length; i++) {
          if (returnTensors) {
            predictions[i].topLeft = predictions[i].topLeft.arraySync();
            predictions[i].bottomRight = predictions[i].bottomRight.arraySync();
            if (annotateBoxes) {
              predictions[i].landmarks = predictions[i].landmarks.arraySync();
            }
          }
          const start = predictions[i].topLeft;
          const end = predictions[i].bottomRight;
          const size = [end[0] - start[0], end[1] - start[1]];

          if (annotateBoxes) {
            const landmarks = predictions[i].landmarks;

            ctx.fillStyle = "blue";
            for (let j = 0; j < landmarks.length; j++) {
              const x = landmarks[j][0];
              //console.log(typeof x) // number
              const y = landmarks[j][1];
              ctx.fillRect(x, y, 5, 5);
            }
          }
        }
      }
      requestAnimationFrame(this.renderPrediction);
    }
  };

Full log of the error:

Unhandled Rejection (RuntimeError): index out of bounds
(anonymous function)
unknown
./node_modules/@tensorflow/tfjs-backend-wasm/dist/tf-backend-wasm.esm.js/</tt</r</r._dispose_data
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/wasm-out/tfjs-backend-wasm.js:9



disposeData
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/backend_wasm.ts:115

  112 | 
  113 | disposeData(dataId: DataId) {
  114 |   const data = this.dataIdMap.get(dataId);
> 115 |   this.wasm._free(data.memoryOffset);
      | ^  116 |   this.wasm.tfjs.disposeData(data.id);
  117 |   this.dataIdMap.delete(dataId);
  118 | }

disposeTensor
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/engine.ts:838

  835 |     'tensors');
  836 | let res;
  837 | const inputMap = {};
> 838 | inputs.forEach((input, i) => {
      | ^  839 |     inputMap[i] = input;
  840 | });
  841 | return this.runKernelFunc((_, save) => {

dispose
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/tensor.ts:388
endScope
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/engine.ts:983
tidy/<
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/engine.ts:431

  428 | if (kernel != null) {
  429 |     kernelFunc = () => {
  430 |         const numDataIdsBefore = this.backend.numDataIds();
> 431 |         out = kernel.kernelFunc({ inputs, attrs, backend: this.backend });
      | ^  432 |         const outInfos = Array.isArray(out) ? out : [out];
  433 |         if (this.shouldCheckForMemLeaks()) {
  434 |             this.checkKernelForMemLeak(kernelName, numDataIdsBefore, outInfos);

scopedRun
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/engine.ts:448

  445 | // inputsToSave and outputsToSave. Currently this is the set of ops
  446 | // with kernel support in the WASM backend. Once those ops and
  447 | // respective gradients are modularised we can remove this path.
> 448 | if (outputsToSave == null) {
      | ^  449 |     outputsToSave = [];
  450 | }
  451 | const outsToSave = outTensors.filter((_, i) => outputsToSave[i]);

tidy
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/engine.ts:431

  428 | if (kernel != null) {
  429 |     kernelFunc = () => {
  430 |         const numDataIdsBefore = this.backend.numDataIds();
> 431 |         out = kernel.kernelFunc({ inputs, attrs, backend: this.backend });
      | ^  432 |         const outInfos = Array.isArray(out) ? out : [out];
  433 |         if (this.shouldCheckForMemLeaks()) {
  434 |             this.checkKernelForMemLeak(kernelName, numDataIdsBefore, outInfos);

tidy
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/src/globals.ts:190

  187 |     const tensors = getTensorsInContainer(container);
  188 |     tensors.forEach(tensor => tensor.dispose());
  189 | }
> 190 | /**
  191 |  * Keeps a `tf.Tensor` generated inside a `tf.tidy` from being disposed
  192 |  * automatically.
  193 |  */

estimateFaces
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/blazeface_reactjs/node_modules/@tensorflow-models/blazeface/dist/blazeface.esm.js:17
Camera/this.renderPrediction
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/blazeface_reactjs/src/Camera.js:148

  145 | const returnTensors = false;
  146 | const flipHorizontal = true;
  147 | const annotateBoxes = true;
> 148 | const predictions = await model.estimateFaces(
      | ^  149 |   this.refVideo.current,
  150 |   returnTensors,
  151 |   flipHorizontal,

async*Camera/this.renderPrediction
C:/Users/osman.cakir/Documents/osmancakirio/deepLearning/blazeface_reactjs/src/Camera.js:399

  396 |         // }
  397 |       }
  398 |     }
> 399 |     requestAnimationFrame(this.renderPrediction);
      | ^  400 |   }
  401 | };
  402 | 




Answer 1:


After using a tensor to make predictions, you need to free it from the device's memory; otherwise allocations build up and can cause the error you are seeing. You can do this with tf.dispose(), which lets you specify exactly where the tensors should be disposed: call it right after making predictions.

const predictions = await model.estimateFaces(
  this.refVideo.current,
  returnTensors,
  flipHorizontal,
  annotateBoxes
);

tf.dispose(this.refVideo.current);

You can also use tf.tidy(), which does this automatically: you wrap the function where you handle the image tensors before making predictions. This question on GitHub goes through it quite well, but I am not too sure about the implementation here, because tf.tidy() can only be used with synchronous function calls.

Alternatively, you can wrap the code that handles the image tensors in the following, which will also clean up any unused tensors:

tf.engine().startScope()
// handling image tensors function
tf.engine().endScope()


Source: https://stackoverflow.com/questions/64499856/wasm-backend-for-tensorflowjs-throws-unhandled-rejection-runtimeerror-index
