Hadoop's HDFS with Spark

有刺的猬 2020-12-21 01:07

I am new to cluster computing and I am trying to set up a minimal 2-node Spark cluster. What I am still a bit confused about: do I have to set up a full Hadoop installation first?

1 Answer
  •  囚心锁ツ
    2020-12-21 01:30

    Apache Spark is independent of Hadoop. Spark lets you read from different data sources (including HDFS) and can run either as a standalone cluster or on top of an existing resource-management framework (e.g. YARN or Mesos).
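
    As a minimal sketch of that flexibility (the file paths and the HDFS namenode hostname below are hypothetical), the same Spark code can read from the local filesystem or from HDFS just by changing the URI scheme:

    ```scala
    import org.apache.spark.sql.SparkSession

    object DataSourceDemo {
      def main(args: Array[String]): Unit = {
        // local[*] runs Spark entirely in-process; no Hadoop installation needed
        val spark = SparkSession.builder()
          .appName("DataSourceDemo")
          .master("local[*]")
          .getOrCreate()

        // Read from the local filesystem...
        val lines = spark.read.textFile("file:///tmp/input.txt")
        println(s"Line count: ${lines.count()}")

        // ...or, with an HDFS cluster available, just swap the URI
        // (hostname and port here are hypothetical):
        // val hdfsLines = spark.read.textFile("hdfs://namenode:9000/data/input.txt")

        spark.stop()
      }
    }
    ```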

    So if you're only interested in Spark, there is no need to install Hadoop.
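
    For your 2-node setup, standalone mode ships with the Spark distribution itself (see the sbin/start-master.sh and sbin/start-worker.sh scripts in recent releases). A sketch of pointing an application at a standalone master, assuming a hypothetical hostname "spark-master" and the default port 7077:

    ```scala
    import org.apache.spark.sql.SparkSession

    object StandaloneDemo {
      def main(args: Array[String]): Unit = {
        // Connect to the standalone master instead of running in local mode.
        // "spark-master" is a hypothetical hostname; 7077 is the default port.
        val spark = SparkSession.builder()
          .appName("StandaloneDemo")
          .master("spark://spark-master:7077")
          .getOrCreate()

        // Trivial job to confirm that executors on both worker nodes respond
        val sum = spark.sparkContext.parallelize(1L to 1000L).sum()
        println(s"Sum = $sum")

        spark.stop()
      }
    }
    ```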
