Having seen this GitHub issue and this Stack Overflow post, I had hoped this would simply work.
It seems as though passing in the environment variable MODEL_CONFIG_FILE has no effect. I am running this through docker-compose, but I get the same issue using docker run.
The error:
I tensorflow_serving/model_servers/server.cc:82] Building single TensorFlow model file config: model_name: model model_base_path: /models/model
I tensorflow_serving/model_servers/server_core.cc:461] Adding/updating models.
I tensorflow_serving/model_servers/server_core.cc:558] (Re-)adding model: model
E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /models/model for servable model
The Dockerfile:
FROM tensorflow/serving:nightly
COPY ./models/first/ /models/first
COPY ./models/second/ /models/second
COPY ./config.conf /config/config.conf
ENV MODEL_CONFIG_FILE=/config/config.conf
The compose file:
version: '3'
services:
  serving:
    build: .
    image: testing-models
    container_name: tf
The config file:
model_config_list: {
  config: {
    name: "first",
    base_path: "/models/first",
    model_platform: "tensorflow",
    model_version_policy: {
      all: {}
    }
  },
  config: {
    name: "second",
    base_path: "/models/second",
    model_platform: "tensorflow",
    model_version_policy: {
      all: {}
    }
  }
}
There is no Docker environment variable named "MODEL_CONFIG_FILE" that the image recognizes (that is a tensorflow/serving flag; see the Docker image documentation), so the image only uses its default environment variables ("MODEL_NAME=model" and "MODEL_BASE_PATH=/models") and tries to serve the model at "/models/model" when the container starts. "config.conf" has to be passed to tensorflow/serving as a command-line argument at startup. Try running something like this instead:
docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source=/path/to/models/first/,target=/models/first \
  --mount type=bind,source=/path/to/models/second/,target=/models/second \
  --mount type=bind,source=/path/to/config/config.conf,target=/config/config.conf \
  -t tensorflow/serving --model_config_file=/config/config.conf
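For context, the tensorflow/serving image launches the server through a small entrypoint script that expands those default environment variables; a rough sketch of that behaviour (an assumption based on the image's documented defaults, not the exact script shipped in the image):

#!/bin/bash
# Sketch of the image's startup behaviour (assumed, not copied from the image):
# the defaults MODEL_NAME=model and MODEL_BASE_PATH=/models are expanded here,
# which is why the server looks for /models/model when no flags are passed.
# Extra arguments such as --model_config_file are appended via "$@".
tensorflow_model_server --port=8500 --rest_api_port=8501 \
  --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME} "$@"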
I ran into this double slash issue with the mount paths when using Git Bash on Windows. Because of that, I am instead passing the argument mentioned by @KrisR89 via the command key in docker-compose. The new docker-compose file looks like this and works with the Dockerfile supplied above:
version: '3'
services:
  serving:
    build: .
    image: testing-models
    container_name: tf
    command: --model_config_file=/config/config.conf
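To confirm that both models actually load, the REST model-status endpoint can be queried once the container is running. This assumes port 8501 is also published to the host, for example by adding a ports: entry such as "8501:8501" to the service; the model names match the config file above:

# Check the status of each model defined in config.conf (assumes 8501 is published)
curl http://localhost:8501/v1/models/first
curl http://localhost:8501/v1/models/second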
The error occurs because Serving couldn't find your model:
I tensorflow_serving/model_servers/server.cc:82] Building single TensorFlow model file config: model_name: model model_base_path: /models/model
I tensorflow_serving/model_servers/server_core.cc:461] Adding/updating models.
I tensorflow_serving/model_servers/server_core.cc:558] (Re-)adding model: model
E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /models/model for servable model
Your docker-compose file didn't mount your model files into the container, so Serving couldn't find your models:
version: '3'
services:
  serving:
    build: .
    image: testing-models
    container_name: tf
Mount your model files from the host into the container. I think you could do this:
version: "3"
services:
serving:
image: tensorflow/serving:latest
restart: unless-stopped
ports:
- 8501:8501
volumes:
- ${FIRST_MODEL_PATH <HOST>}:/models/${FIRST_MODEL_NAME}
- ${SECOND_MODEL_PATH <HOST>}:/models/${SECOND_MODEL_NAME}
- <HOST PATH>/models.config:/models/models.config
command: --model_config_file=/models/models.config
Replace {PATH} and {MODEL_NAME} with your own paths and model names.
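Since the compose file references variables such as ${FIRST_MODEL_PATH} and ${FIRST_MODEL_NAME}, docker-compose can read them from an .env file placed next to docker-compose.yml once the <HOST> annotations are replaced by plain variable references; a minimal sketch with hypothetical values:

# .env (hypothetical host paths and model names; adjust to your setup)
FIRST_MODEL_PATH=/home/user/models/first
FIRST_MODEL_NAME=first
SECOND_MODEL_PATH=/home/user/models/second
SECOND_MODEL_NAME=second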
In the models.config file, the versions to load should be set explicitly:
model_config_list: {
  config: {
    name: "first",
    base_path: "/models/first",
    model_platform: "tensorflow",
    model_version_policy: {
      specific: {
        versions: 1
        versions: 2
      }
    }
  },
  config: {
    name: "second",
    base_path: "/models/second",
    model_platform: "tensorflow",
    model_version_policy: {
      specific: {
        versions: 1
        versions: 2
      }
    }
  }
}
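For a specific-versions policy like this to load anything, each base_path must contain numeric version subdirectories holding the exported SavedModel files. The expected layout on the host looks roughly like this (directory names are illustrative):

models/
  first/
    1/
      saved_model.pb
      variables/
    2/
      saved_model.pb
      variables/
  second/
    1/
      saved_model.pb
      variables/
    2/
      saved_model.pb
      variables/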
For more details, see the official TensorFlow Serving documentation.
Source: https://stackoverflow.com/questions/53035896/serving-multiple-tensorflow-models-using-docker