Question
I have the following docker-compose file:
version: "3"
services:
scraper-api:
build: ./ATPScraper
volumes:
- ./ATPScraper:/usr/src/app
ports:
- "5000:80"
test-app:
build: ./test-app
volumes:
- "./test-app:/app"
- "/app/node_modules"
ports:
- "3001:3000"
environment:
- NODE_ENV=development
depends_on:
- scraper-api
which builds the following Dockerfiles:

scraper-api (a Python Flask application):
FROM python:3.7.3-alpine
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "./app.py"]
test-app (a test React application for the API):
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:/app/src/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /app/package.json
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g --silent
RUN npm install axios -g
# start app
CMD ["npm", "start"]
Admittedly, I'm a newbie when it comes to Docker networking, but I am trying to get the React app to communicate with the scraper-api. For example, the scraper-api has the following endpoint: /api/top_10. I have tried various permutations of the following URL: http://scraper-api:80/api/test_api. None of them have been working for me.
I've been scouring the internet and I can't really find a solution.
Answer 1:
The React application runs in the end user's browser, which has no idea this "Docker" thing exists at all and doesn't know about any of the Docker Compose networking setup. For browser apps that happen to be hosted out of Docker, they need to be configured to use the host's DNS name or IP address, and the published port of the back-end service.
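For example, one hypothetical way to wire this up with create-react-app is to pass the browser-facing base URL into the React dev server through an environment variable; the variable name and the localhost:5000 address below are assumptions based on the published port in the compose file, not something from the question:

# docker-compose.yml sketch: this URL is what the *browser* will call,
# so it must use the host's name and the published port, not the service name
test-app:
  environment:
    - NODE_ENV=development
    - REACT_APP_API_URL=http://localhost:5000

In the React code the request would then look something like axios.get(`${process.env.REACT_APP_API_URL}/api/top_10`).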
A common setup (Docker or otherwise) is to put both the browser apps and the back-end application behind a reverse proxy. In that case you can use relative URLs without host names, like /api/..., and they will be interpreted as "the same host and port", which bypasses this problem entirely.
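As an illustration only, a minimal nginx reverse-proxy configuration along those lines could look like the sketch below; the static-file path and the assumption that the Flask container listens on port 5000 are mine, not from the question:

# nginx.conf sketch: serve the built React app and proxy /api/ to the Flask service
server {
    listen 80;

    # built React assets
    location / {
        root /usr/share/nginx/html;
        try_files $uri /index.html;
    }

    # forwarded over the Compose network using the service name
    location /api/ {
        proxy_pass http://scraper-api:5000;
    }
}

With this in place the browser only ever talks to the proxy, and the proxy resolves scraper-api via Docker's internal DNS.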
Answer 2:
As a side note: when no network is specified inside docker-compose.yml, a default network will be created for you with the name [dir location of docker-compose.yml]_default. For example, if docker-compose.yml is in an app folder, the network will be named app_default.
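You can confirm the generated network from the host; the app_default name below just follows the example directory name:

docker network ls                    # lists the networks Compose created
docker network inspect app_default   # shows which containers are attached to it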
Now, inside this network, containers are reachable by their service names, so the scraper-api host name should resolve to the right container.
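To verify the name resolution itself, you can run a quick lookup from inside the client container (a sketch, relying on the busybox tools included in the Alpine image):

docker-compose exec test-app ping -c 1 scraper-api
docker-compose exec test-app nslookup scraper-api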
It could be that you are using the wrong endpoint URL. In the question, you mentioned /api/top_10 as an endpoint, but the URL you tested was http://scraper-api:80/api/test_api, which is inconsistent.
Also, it could be that you confused the order of the ports in docker-compose.yml for the scraper-api service:
ports:
  - "5000:80"
5000 is the port exposed on the host where Docker is running; 80 is the internal app port. Flask apps normally listen on 5000, so I thought you might have meant to say:
ports:
  - "80:5000"
In that case, between containers you have to use :5000 as the destination port in URLs, e.g. http://scraper-api:5000 (plus the endpoint suffix, of course).
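Put together (hypothetically, assuming the "80:5000" mapping and the /api/top_10 endpoint from the question), the two views of the service would be:

curl http://localhost/api/top_10          # from the host, via the published port 80
curl http://scraper-api:5000/api/top_10   # from another container on the Compose network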
To check connectivity, you might want to shell into the client container and see if things connect (the Alpine-based node image typically ships sh rather than bash):
docker-compose exec test-app sh
wget http://scraper-api
wget http://scraper-api:5000
etc.
If you get a response, then you have connectivity; you just need to figure out the correct endpoint URL.
Source: https://stackoverflow.com/questions/56965489/having-trouble-communicating-between-docker-compose-services