Running Spark unit tests on Windows

hwazgwia · asked 2021-06-03 · Hadoop

I am trying to run some Spark transformations. They work fine on the cluster (YARN, Linux machines), but when I try to run them in a unit test on my local machine (Windows 7), I get the following error:

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)

My code looks like this:

import org.apache.spark.{SparkConf, SparkContext}
import org.junit.{Assert, Test}

@Test
def testETL() = {
    val conf = new SparkConf()
    val sc = new SparkContext("local", "test", conf)
    try {
        val etl = new IxtoolsDailyAgg() // empty constructor

        val data = sc.parallelize(List("in1", "in2", "in3"))

        etl.etl(data) // rdd transformation, no access to SparkContext or Hadoop
        Assert.assertTrue(true)
    } finally {
        if (sc != null)
            sc.stop()
    }
}

Why is it trying to access Hadoop at all, and how can I fix this? Thanks in advance.


db2dz4w8 (answer 1)

I solved it myself: http://simpletoad.blogspot.com/2014/07/runing-spark-unit-test-on-windows-7.html
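For reference, the commonly cited fix for this error is to make winutils.exe available locally and point hadoop.home.dir at its parent directory before the first SparkContext (and hence Hadoop's Shell class) is initialized. A minimal sketch of that approach, assuming winutils.exe has been downloaded into C:\hadoop\bin (the path is just an example, not a requirement):

import org.apache.spark.{SparkConf, SparkContext}
import org.junit.Test

class WinutilsFixTest {
  @Test
  def testWithWinutils(): Unit = {
    // Tell Hadoop where bin\winutils.exe lives before any Hadoop class is loaded.
    // C:\hadoop is an assumed example location.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val sc = new SparkContext("local", "test", new SparkConf())
    try {
      // Trivial job just to confirm the local SparkContext works on Windows.
      val sum = sc.parallelize(Seq(1, 2, 3)).reduce(_ + _)
      assert(sum == 6)
    } finally {
      sc.stop()
    }
  }
}

Setting the HADOOP_HOME environment variable to the same directory works as well; either way, the value must be in place before Hadoop's Shell class is first loaded by the test JVM.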
