Creating spark cluster with drone.yml not working

Submitted by 六月ゝ 毕业季﹏ on 2019-12-22 01:31:44

Question


I have a docker-compose.yml with the image and configuration below:

version: '3'
services:
  spark-master:
    image: bde2020/spark-master:2.4.4-hadoop2.7
    container_name: spark-master
    ports:
      - "8080:8080"
      - "7077:7077"
    environment:
      - INIT_DAEMON_STEP=setup_spark
  spark-worker-1:
    image: bde2020/spark-worker:2.4.4-hadoop2.7
    container_name: spark-worker-1
    depends_on:
      - spark-master
    ports:
      - "8081:8081"
    environment:
      - "SPARK_MASTER=spark://spark-master:7077"

Here is the docker-compose up log ---> https://jpst.it/1Xc4K

The containers come up and run fine: the Spark worker connects to the Spark master without any issues. The problem is that I then created a drone.yml and added a services section with:

services:
  jce-cassandra:
    image: cassandra:3.0
    ports:
      - "9042:9042"

  jce-elastic:
    image: elasticsearch:5.6.16-alpine
    ports:
      - "9200:9200"
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"

  janusgraph:
    image: janusgraph/janusgraph:latest
    ports:
      - "8182:8182"
    environment:
      JANUS_PROPS_TEMPLATE: cassandra-es
      janusgraph.storage.backend: cql
      janusgraph.storage.hostname: jce-cassandra
      janusgraph.index.search.backend: elasticsearch
      janusgraph.index.search.hostname: jce-elastic
    depends_on:
      - jce-elastic
      - jce-cassandra

  spark-master:
    image: bde2020/spark-master:2.4.4-hadoop2.7
    container_name: spark-master
    ports:
      - "8080:8080"
      - "7077:7077"
    environment:
      - INIT_DAEMON_STEP=setup_spark

  spark-worker-1:
    image: bde2020/spark-worker:2.4.4-hadoop2.7
    container_name: spark-worker-1
    depends_on:
      - spark-master
    ports:
      - "8081:8081"
    environment:
      - "SPARK_MASTER=spark://spark-master:7077"

But here the Spark worker does not connect to the Spark master and throws exceptions. Here are the exception log details. Can someone please guide me on why I am facing this issue?

Note: I am trying to create these services in drone.yml for my integration testing.


Answer 1:


Answering as an answer for better formatting. The comments suggest sleeping before starting the worker. Assuming this is the image's Dockerfile (https://hub.docker.com/r/bde2020/spark-worker/dockerfile), you could add the sleep by overriding the command:

  spark-worker-1:
    image: bde2020/spark-worker:2.4.4-hadoop2.7
    container_name: spark-worker-1
    # note: `command` is not run through a shell, so wrap it in `bash -c`
    # for the `&&` to work
    command: bash -c "sleep 10 && /bin/bash /worker.sh"
    depends_on:
      - spark-master
    ports:
      - "8081:8081"
    environment:
      - "SPARK_MASTER=spark://spark-master:7077"

Although sleep 10 is probably excessive; if this works, try sleep 5 or sleep 2 instead.
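A fixed sleep works but is racy: too short and the worker still beats the master, too long and every run wastes time. A more robust sketch (my own, not shipped with the bde2020 images; the `wait_for` helper name and its arguments are hypothetical) polls the master's port until it accepts TCP connections, then hands off to the worker's entrypoint:

```shell
#!/usr/bin/env bash
# Hypothetical wait-for helper -- NOT part of the bde2020 images.
# Polls host:port until a TCP connect succeeds, up to `tries` attempts,
# one second apart. Returns 0 on success, 1 on timeout.
wait_for() {
  local host="$1" port="$2" tries="${3:-30}" i=0
  while [ "$i" -lt "$tries" ]; do
    # bash's /dev/tcp pseudo-device attempts a TCP connection
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# In the worker service this would replace the fixed sleep, e.g.:
#   command: bash -c 'wait_for spark-master 7077 && exec /bin/bash /worker.sh'
# Demo against a port that is almost certainly closed locally:
if wait_for 127.0.0.1 1 2; then
  echo "port is open"
else
  echo "gave up waiting"
fi
```

Since the answer above already invokes /bin/bash inside the container, bash (and its /dev/tcp support) should be available there without installing anything extra.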



Source: https://stackoverflow.com/questions/59232036/creating-spark-cluster-with-drone-yml-not-working
