Can I run Spark unit tests within Eclipse

Submitted by 大憨熊 on 2019-12-11 12:16:39

Question


Recently we moved from Scalding to Spark. I used Eclipse and the Scala IDE for Eclipse to write code and tests. The tests ran fine with Twitter's JobTest class: any class using JobTest was automatically available to run as a Scala unit test within Eclipse. I've lost that ability now. The Spark test cases run fine under sbt, but the run configuration in Eclipse for these tests lists 'none applicable'.

Is there a way to run Spark unit tests within Eclipse?


Answer 1:


I think this same approach, which uses Java, would also work in Scala: create a SparkContext with the master set to "local", then build and run the unit tests as normal. Be sure to stop the SparkContext when the tests finish, so later tests can create their own context.

I have this working with Spark 1.0.0; I haven't verified it against newer versions.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class Test123 {
  static JavaSparkContext sparkCtx;

  @BeforeClass
  public static void sparkSetup() {
    // Use the in-process "local" master so the test needs no cluster
    SparkConf conf = new SparkConf();
    sparkCtx = new JavaSparkContext("local", "test", conf);
  }

  @AfterClass
  public static void sparkTeardown() {
    // Stop the context so other tests can create their own
    sparkCtx.stop();
  }

  @Test
  public void integrationTest() {
    JavaRDD<String> logRawInput = sparkCtx.parallelize(Arrays.asList(
        "data1",
        "data2",
        "garbage",
        "data3"));
  }
}
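Since the question asks about Scala, here is a rough Scala equivalent of the same idea as a sketch, assuming ScalaTest (the FunSuite/BeforeAndAfterAll style of that era) and Spark 1.x are on the classpath; the class and test names are illustrative, not from the original post:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class SparkLocalSpec extends FunSuite with BeforeAndAfterAll {
  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // In-process "local" master, so the test needs no cluster
    val conf = new SparkConf().setMaster("local").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Stop the context so later suites can create their own
    sc.stop()
  }

  test("filters out garbage records") {
    val input = sc.parallelize(Seq("data1", "data2", "garbage", "data3"))
    assert(input.filter(_ != "garbage").count() === 3)
  }
}
```

Because the suite is a plain ScalaTest class, the Scala IDE for Eclipse should pick it up through its ScalaTest run configuration, just as sbt does.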


Source: https://stackoverflow.com/questions/30363170/can-i-run-spark-unit-tests-within-eclipse
