Question
I am running the following in a browser:
INDEX.HTML (BODY)
<script src="https://unpkg.com/@tensorflow/tfjs"></script>
<script src="https://unpkg.com/@tensorflow/tfjs-automl"></script>
<img
  id="daisy"
  crossorigin="anonymous"
  src="https://storage.googleapis.com/tfjs-testing/tfjs-automl/img_classification/daisy.jpg"
/>
<script>
  async function run() {
    const model = await tf.automl.loadImageClassification("model.json");
    const image = document.getElementById("daisy");
    const predictions = await model.classify(image);
    const pre = document.createElement("pre");
    pre.textContent = JSON.stringify(predictions, null, 2);
    document.body.append(pre);
  }
  run();
</script>
What I am trying to do is convert the script to something I can run in Node.js, like this:
INDEX.JS (IMPORT/ESM)
import * as tf from "@tensorflow/tfjs";
import * as automl from "@tensorflow/tfjs-automl";
async function run() {
  const model = await tf.automl.loadImageClassification("model.json");
  const image = document.createElement("img");
  image.src =
    "https://storage.googleapis.com/tfjs-testing/tfjs-automl/img_classification/daisy.jpg";
  const predictions = await model.classify(image);
  console.log(predictions);
}
run();
I then run the script with node --experimental-modules index.js
and it fails with:
(node:24163) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'loadImageClassification' of undefined
I also tried require:
INDEX.JS (REQUIRE/COMMON WITH CONST)
const tf = require("@tensorflow/tfjs");
const automl = require("@tensorflow/tfjs-automl");
async function run() {
  const model = await tf.automl.loadImageClassification("model.json");
  const image = document.createElement("img");
  image.src =
    "https://storage.googleapis.com/tfjs-testing/tfjs-automl/img_classification/daisy.jpg";
  const predictions = await model.classify(image);
  console.log(predictions);
}
run();
I had to remove "type": "module" from package.json and run with node index.js. It gave the same error.
I also tried not capturing the require:
INDEX.JS (REQUIRE/COMMON)
require("@tensorflow/tfjs");
require("@tensorflow/tfjs-automl");
async function run() {
  const model = await tf.automl.loadImageClassification("model.json");
  const image = document.createElement("img");
  image.src =
    "https://storage.googleapis.com/tfjs-testing/tfjs-automl/img_classification/daisy.jpg";
  const predictions = await model.classify(image);
  console.log(predictions);
}
run();
When I run this, I get the error: (node:24211) UnhandledPromiseRejectionWarning: ReferenceError: tf is not defined.
This seems like it should be obvious, but is there a way to do in Node what <script src=...> does in the browser, i.e. bring in an external script so my code can see and use the variables and methods it defines?
Answer 1:
For anyone else who wants to run TensorFlow predictions in Node:

const tf = require("@tensorflow/tfjs-node");
const automl = require("@tensorflow/tfjs-automl");
const fs = require("fs");

const model_url = "<your-model-url>";

// The path to the image is passed as the first command-line argument.
const image_path = process.argv.slice(2)[0];
if (!image_path) {
  throw new Error("missing argument: path to image");
}

// Read the image from disk and decode it into a tensor with tfjs-node.
const image = fs.readFileSync(image_path);
const decoded_image = tf.node.decodeJpeg(image);

async function run() {
  const model = await automl.loadImageClassification(model_url);
  const predictions = await model.classify(decoded_image);
  console.log(predictions);
}
run().catch(console.error);
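To try it out (assuming both packages are installed and <your-model-url> points at a reachable model.json), the script takes the image path as its only argument, so the invocation would look something like:

npm install @tensorflow/tfjs-node @tensorflow/tfjs-automl
node index.js ./daisy.jpg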
Answer 2:
I initially misunderstood your question. Based on what I know now, you need to use tfjs-node rather than tfjs, as stated in a GitHub issue on the topic.
The reason is that tfjs is designed by default to run in the browser. I still think the info below might be helpful for some people in the future.
Here's an official Node.js resource on the differences between Node.js and the browser: https://nodejs.dev/differences-between-nodejs-and-the-browser
Your script might be failing for a number of reasons. Here are my thoughts:
- You are trying to import a module in your app that Node doesn't know how to load. This might be because you have not installed the modules using:
  npm install --save @tensorflow/tfjs-automl @tensorflow/tfjs
- tfjs might need to be bundled before being used in the browser. This means a plain Node.js script is not enough; you need a small build step that bundles your modules and your script into a single file the browser can load (see the sketch after this list).
- You are indeed using your modules properly, but your browser doesn't know how to load them, perhaps because it doesn't understand the import syntax. Although unlikely, it's just a thought I have in mind.
- Your setup is fine, but you're not using the TensorFlow library properly. From what you told me in the comments, I get the feeling that either the loadImageClassification method or the classify method is not receiving the arguments it expects. Keep in mind that loadImageClassification expects a URL to the JSON file that TensorFlow will use during classification. I also found that tfjs has a couple of troubles with dynamically created images if you don't set their width and height, as discussed here: Requested texture size [0x0] is invalid. error when i am loading image in browser (see the browser-side sketch after this list).
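On the bundling point, a minimal sketch of such a setup, assuming esbuild as the bundler and bundle.js as the output name (any bundler would do):

npm install --save-dev esbuild
npx esbuild index.js --bundle --outfile=bundle.js

You would then load the single bundled file in the page with <script src="bundle.js"></script> instead of the individual unpkg scripts.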
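On the texture-size point, here is a minimal browser-side sketch (not Node). The helper name classifyFromUrl is made up, the 224x224 size is arbitrary, and model.json is the same placeholder as in the question; the idea is simply to give the dynamically created image explicit dimensions and wait for it to load before classifying:

async function classifyFromUrl(url) {
  // Create the image programmatically and give it explicit dimensions;
  // without them tfjs can report "Requested texture size [0x0] is invalid".
  const image = document.createElement("img");
  image.crossOrigin = "anonymous";
  image.width = 224;
  image.height = 224;
  image.src = url;
  // Wait until the pixels are actually loaded before classifying.
  await new Promise((resolve, reject) => {
    image.onload = resolve;
    image.onerror = reject;
  });
  const model = await tf.automl.loadImageClassification("model.json");
  return model.classify(image);
}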
I went through the tfjs examples and found one that looks very much like what you're posting here: https://github.com/tensorflow/tfjs/tree/master/tfjs-automl/demo/img_classification
I also did a similar setup myself in my GitHub profile without much trouble: https://github.com/taro-0/tsfjs-sample
Source: https://stackoverflow.com/questions/61804137/is-there-a-way-to-replace-script-tag-src-with-require-and-run-the-same-script-on