How to interpret the output of an object detection model in TensorFlow.js

Submitted by ☆樱花仙子☆ on 2020-03-04 04:40:11

Question


I am trying to run a custom object detection TensorFlow.js model in a browser. I was able to convert the TensorFlow model to a TensorFlow.js model (in Google Colab) using the following command:

!tensorflowjs_converter \
--input_format=tf_frozen_model \
--output_node_names='detection_boxes,detection_scores,detection_classes,num_detections' \
/content/frozen_inference_graph.pb \
/content/web_model
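
The output node names passed here are the tensors the converted graph model will expose in the browser. As a sanity check, the node names can be listed after loading the model; this is a minimal sketch, assuming the converted web_model folder is served at the same http://127.0.0.1:8887/model/ path used in index.js below:

async function inspectModel() {
    // Minimal sketch (assumption): the web_model folder is served at
    // http://127.0.0.1:8887/model/, matching the URL used in index.js below.
    const model = await tf.loadGraphModel('http://127.0.0.1:8887/model/model.json');
    console.log(model.inputNodes);   // names of the graph's input node(s)
    console.log(model.outputNodes);  // should list the four detection output nodes
}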

I am sharing the code snippet of the inference.html file:

<html>
<head>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest"> </script>
<script src="webcam.js"></script>
</head>
<body>
    <div>
        <div>
            <video autoplay playsinline muted id="wc" width="224" height="224"></video>
        </div>
    </div>
    <button type="button" id="startPredicting" onclick="startPredicting()" >Start Predicting</button>
    <button type="button" id="stopPredicting" onclick="stopPredicting()" >Stop Predicting</button>
    <div id="prediction"></div>
</body>

<script src="index.js"></script>
</html>

The code snippet of the index.js file is as follows:

let model;
const webcam = new Webcam(document.getElementById('wc'));
let isPredicting = false;


async function init(){
        try {
            await webcam.setup();
            model = await tf.loadGraphModel('http://127.0.0.1:8887/model/model.json');
        } catch (err) {
            console.log(err);
        }
}

async function predict() {
    const img = webcam.capture();
    console.log("executing model");
    const cat = document.getElementById('image');
    output = await model.executeAsync(img);
    output.forEach(t => t.print) // log out the data of all tensors
    const data = []
    for (let i = 0; i < output.length; i++){
        data.push(output.dataSync())
    }
    console.log(data);
}

init()


function startPredicting(){
    isPredicting = true;
    predict();
}

function stopPredicting(){
    isPredicting = false;
    predict();
}

When I run the above inference.html file using a web server, it returns the following output:

(4) [t, t, t, t]
0: t {kept: false, isDisposedInternal: false, shape: Array(3), dtype: "float32", size: 400, …}
1: t {kept: false, isDisposedInternal: false, shape: Array(2), dtype: "float32", size: 100, …}
2: t {kept: false, isDisposedInternal: false, shape: Array(2), dtype: "float32", size: 100, …}
3: t {kept: false, isDisposedInternal: false, shape: Array(1), dtype: "float32", size: 1, …}
length: 4
__proto__: Array(0)

The problem is that the output seems irrelevant, or I can't understand it. Am I missing something? Please give me your suggestions. I am sorry for the long post, but I am a beginner in TensorFlow.js.


Answer 1:


output is an array of tf.Tensor objects. When you call console.log(output), the browser stringifies each tensor object and prints its properties rather than the data it holds.

Each tensor also has a print method to log out its data.

To get the data out of a tensor as a JavaScript array, methods such as data (respectively dataSync) and array (respectively arraySync) can be called to retrieve the data asynchronously (respectively synchronously). data returns a flat TypedArray, while array returns a nested JavaScript array.

output = await model.executeAsync(img);
// output is an array of tf.tensor.
output.forEach(t => t.print()) // log out the data of all tensors
const data = []
for (let i = 0; i < output.length; i++)
  data.push(output[i].dataSync())  // get the data
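
With the data extracted, the four tensors can be mapped back to the output nodes named during conversion. The following is only a sketch: it assumes the usual TensorFlow Object Detection API layout suggested by the logged shapes (detection_boxes as [1, 100, 4], detection_scores and detection_classes as [1, 100], num_detections as [1]); the actual ordering can differ per model, so verify it against model.outputNodes or the tensor shapes.

// Sketch only: the index order below is an assumption based on the shapes
// logged in the question ([1,100,4], [1,100], [1,100], [1]).
const [boxes, scores, classes, numDetections] = output;
const boxData = boxes.dataSync();      // normalized [ymin, xmin, ymax, xmax] per detection
const scoreData = scores.dataSync();   // confidence score per detection
const classData = classes.dataSync();  // class id per detection
const n = numDetections.dataSync()[0]; // number of valid detections

for (let i = 0; i < n; i++) {
  if (scoreData[i] > 0.5) {            // assumed confidence threshold
    const [ymin, xmin, ymax, xmax] = boxData.slice(i * 4, i * 4 + 4);
    console.log(`class ${classData[i]}, score ${scoreData[i].toFixed(2)}, box [${ymin}, ${xmin}, ${ymax}, ${xmax}]`);
  }
}

Once the data has been read, the tensors can be disposed with tf.dispose(output) to free the underlying memory.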


Source: https://stackoverflow.com/questions/59575812/how-to-interpret-the-output-of-object-detection-model-in-tensorflow-js
