Not sure how to avoid the following type error: found: Any required: String

bpzcxfmw  published 2021-07-14  in  Spark
Follow (0) | Answers (2) | Views (471)

My data looks like this:

configStr: String =
"
{
  "validation": {
    "target_feed": "tables.validation",
    "data_validations":
        [
            {"program": "program1",
            "test_description": "Checking if column1 are distinct",
            "input_column": "column1",
            "test": "distinctness",
            "query": "select * from table1",
            "condition": "None"},
            {"program": "program12",
            "test_description": "Checking if column2 are distinct",
            "input_column": "column2",
            "test": "Anomaly",
            "query": "select * from table2",
            "condition": "None"}
        ]
  }
}"

I need to iterate over the data_validations and make use of each of these fields. I planned to do it like this:

val resultsAsDf = conf("test")
.asInstanceOf[Map[String, Any]]("data_validations")
.asInstanceOf[Seq[Map[String, Any]]]
.map{ dv => someFunc(dv) }
.reduce(_.unionAll(_))

Now, to create someFunc, which will handle the logic. I built something like this:

def someFunc(testCase: Map[String, Any]): Unit = {

    if (testCase("test") == "distinctness") {

        val tempDF = spark.sql(testCase("query"))

        val verificationResults: VerificationResult = { VerificationSuite()
        .onData(tempDF)                                                    
        .addCheck(
        Check(CheckLevel.Error, testCase("program"))
        .hasDistinctness(testCase("column"), Check.IsOne))
        .run()
        }                                                           
    }
    else{
        println("Nothing")
    }
}

Now, I get the following errors:

<console>:54: error: type mismatch;
 found   : Any
 required: String
               val tempDF = spark.sql(testCase("query"))
                                              ^
<console>:60: error: type mismatch;
 found   : Any
 required: String
               Check(CheckLevel.Error, testCase("program"))

The problem is that I need the Map's values to be of various types, which is why I chose Any. Is there a way around this, or am I doing something wrong?


ghhaqwfi1#

The other answer, by the way, is almost correct but confuses which type needs to change. `testCase("query")` returns `Any`, while `spark.sql(_)` expects a `String`. The problem is not the return type of `someFunc`; it is what comes back out of the `Map`.

def someFunc(testCase: Map[String, Any]): Unit = {}
val tempDF = spark.sql(testCase("query"))
// Spark 3.1.1 ScalaDoc
def sql(sqlText: String): DataFrame

So when you look a `String` key up in the Map, you get back an `Any`, which is not the `String` that `spark.sql` expects:

testCase: Map[String, Any]

testCase("string") // Result Any
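
A minimal, Spark-free sketch of the same situation (the map contents are illustrative): the lookup is statically typed `Any` even when the stored value is a `String`, and a cast recovers the `String`:

```scala
// A Map[String, Any] like the question's testCase.
val testCase: Map[String, Any] = Map(
  "program" -> "program1",
  "query"   -> "select * from table1"
)

val anyValue: Any = testCase("query")                         // static type: Any
val strValue: String = testCase("query").asInstanceOf[String] // cast recovers String
```

Applied to the question's code, that means calling e.g. `spark.sql(testCase("query").asInstanceOf[String])`.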

Reference: SparkSession ScalaDoc
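
If the casts feel brittle, a small helper (hypothetical name `str`, not from the question) that pattern-matches instead of casting fails with a clearer message when a value has the wrong type:

```scala
// Hypothetical helper: extract a String from a Map[String, Any],
// failing with a descriptive error instead of a ClassCastException.
def str(m: Map[String, Any], key: String): String = m(key) match {
  case s: String => s
  case other     => sys.error(s"expected a String at '$key', found: $other")
}

// usage, e.g.: spark.sql(str(testCase, "query"))
```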


uxhixvfz2#

someFunc returns Unit, which is similar to Java's "void" type, but spark.sql needs a String. You need to modify testCase so it yields the String containing the query.

def someFunc(testCase: Map[String, Any]): String = { ... }
