Serving multiple TensorFlow models using Docker

Submitted by 冷暖自知 on 2019-12-04 07:25:40

There is no Docker environment variable named "MODEL_CONFIG_FILE" (that is a tensorflow/serving flag, see the Docker image link), so the image falls back to its default environment variables ("MODEL_NAME=model" and "MODEL_BASE_PATH=/models") and serves the single model at "/models/model" on startup. Instead, "config.conf" should be passed to tensorflow/serving at startup via the --model_config_file flag. Try running something like this:

docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source=/path/to/models/first/,target=/models/first \
  --mount type=bind,source=/path/to/models/second/,target=/models/second \
  --mount type=bind,source=/path/to/config/config.conf,target=/config/config.conf \
  -t tensorflow/serving --model_config_file=/config/config.conf
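The command above assumes a config.conf next to the models; the thread does not show its contents, but a minimal model-config file matching the two mount targets would look roughly like this (the names "first" and "second" are assumptions that line up with the --mount targets above):

```
model_config_list: {
  config: {
    name: "first",
    base_path: "/models/first",
    model_platform: "tensorflow"
  },
  config: {
    name: "second",
    base_path: "/models/second",
    model_platform: "tensorflow"
  }
}
```

A fuller variant with explicit version policies appears later in this thread.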

I ran into this double-slash issue with Git Bash on Windows.

To work around it, I pass the argument mentioned by @KrisR89 in via command in the docker-compose file.

The new docker-compose looks like this and works with the supplied Dockerfile:

version: '3'

services:
  serving:
    build: .
    image: testing-models
    container_name: tf
    command: --model_config_file=/config/config.conf

The error is because Serving couldn't find your model:

I tensorflow_serving/model_servers/server.cc:82] Building single TensorFlow model file config:  model_name: model model_base_path: /models/model
I tensorflow_serving/model_servers/server_core.cc:461] Adding/updating models.
I tensorflow_serving/model_servers/server_core.cc:558]  (Re-)adding model: model
E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /models/model for servable model

Your docker-compose file didn't mount your model files into the container, so Serving couldn't find your models:

version: '3'
services:
  serving:
    build: .
    image: testing-models
    container_name: tf

Mount your model files from the host into the container. I think you could do something like this:

version: "3"
services:
  serving:
    image: tensorflow/serving:latest
    restart: unless-stopped
    ports:
      - 8501:8501
    volumes:
      - ${FIRST_MODEL_PATH}:/models/${FIRST_MODEL_NAME}    # host path to first model
      - ${SECOND_MODEL_PATH}:/models/${SECOND_MODEL_NAME}  # host path to second model
      - ${HOST_PATH}/models.config:/models/models.config
    command: --model_config_file=/models/models.config

Replace the ${...PATH} and ${...MODEL_NAME} placeholders with your own paths and model names.

In the models.config file, the versions keys should be set:

model_config_list: {
  config: {
    name:  "first",
    base_path:  "/models/first",
    model_platform: "tensorflow",
    model_version_policy: {
        versions: 1
        versions: 2
    }
  },
  config: {
    name:  "second",
    base_path:  "/models/second",
    model_platform: "tensorflow",
    model_version_policy: {
        versions: 1
        versions: 2
    }
  }
}
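Once both models load, each one is addressed by its configured name on the REST port (8501 in the compose file above). The standard TensorFlow Serving REST endpoints look like this (using the assumed model name "first"):

```
GET  http://localhost:8501/v1/models/first             # model status
GET  http://localhost:8501/v1/models/first/metadata    # model signatures/metadata
POST http://localhost:8501/v1/models/first:predict     # inference request
GET  http://localhost:8501/v1/models/first/versions/2  # status of a specific version
```

The gRPC API on port 8500 addresses models the same way, via the model name in the request spec.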

And you can see the official TensorFlow Serving documentation for more details.
