Spark returns RDD[Nothing] instead of RDD[(Long, String, …)]

mzmfm0qo  asked on 2021-05-27  in Spark
Follow (0) | Answers (0) | Views (409)

I am trying to map my DataFrame to an RDD of tuples. It works fine up to a point, but as soon as I add more `row.getAs` calls it suddenly returns `RDD[Nothing]`.

For example:

df.rdd.map(row => {
      (row.getAs[Long]("created_at"),
        row.getAs[String]("elem1"),
        row.getAs[String]("elem2"),
        row.getAs[String]("elem3"),
        row.getAs[String]("elem4"),
        row.getAs[String]("elem5"),
        row.getAs[String]("elem6"),
        row.getAs[String]("elem7"),
        row.getAs[String]("elem8"),
        row.getAs[String]("elem9"),
        row.getAs[String]("elem10"),
        row.getAs[String]("elem11"),
        row.getAs[String]("elem12"),
        row.getAs[String]("elem13"),
        row.getAs[String]("elem14"),
        row.getAs[String]("elem15"),
        row.getAs[String]("elem16"),
        row.getAs[String]("elem17"),
        row.getAs[String]("elem18"),
        row.getAs[String]("elem19"),
        row.getAs[String]("elem20"),
        row.getAs[String]("elem21"))
    })

returns `RDD[(Long, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String, String)]`, after which I can easily call `.groupBy(x => x._1)`. But as soon as I add one more `row.getAs[String]`, in this case `row.getAs[String]("elem22")`:
    df.rdd.map(row => {
      (row.getAs[Long]("created_at"),
        row.getAs[String]("elem1"),
        row.getAs[String]("elem2"),
        row.getAs[String]("elem3"),
        row.getAs[String]("elem4"),
        row.getAs[String]("elem5"),
        row.getAs[String]("elem6"),
        row.getAs[String]("elem7"),
        row.getAs[String]("elem8"),
        row.getAs[String]("elem9"),
        row.getAs[String]("elem10"),
        row.getAs[String]("elem11"),
        row.getAs[String]("elem12"),
        row.getAs[String]("elem13"),
        row.getAs[String]("elem14"),
        row.getAs[String]("elem15"),
        row.getAs[String]("elem16"),
        row.getAs[String]("elem17"),
        row.getAs[String]("elem18"),
        row.getAs[String]("elem19"),
        row.getAs[String]("elem20"),
        row.getAs[String]("elem21"),
        row.getAs[String]("elem22"))
    })

it returns `RDD[Nothing]` and no longer lets me call `.groupBy(x => x._1)`; the compiler reports:
Cannot resolve symbol _1
I can't find anything in the documentation about a limit when mapping. Am I doing something wrong?
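For context on the question above: Scala 2 only defines tuple classes up to `Tuple22`, so a tuple literal with 23 elements has no valid type and type inference can collapse to `Nothing`. This may be what is happening here rather than a Spark restriction. A minimal, Spark-free sketch of the limit (the case-class field names are illustrative):

```scala
object TupleLimit {
  def main(args: Array[String]): Unit = {
    // Scala 2 ships Tuple1 through Tuple22; 22 elements is the maximum arity.
    val t22 = (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
               12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22)
    println(t22.productArity) // prints 22

    // A 23-element tuple literal does not compile in Scala 2:
    //   val t23 = (1, 2, /* ... */ 23)
    //   error: too many elements for tuple: 23, allowed: 22

    // A common workaround is a case class; since Scala 2.11 case classes
    // may have more than 22 fields:
    //   case class Record(createdAt: Long, elem1: String, /* ... */ elem22: String)
  }
}
```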

No answers yet!

