How to dynamically parse a JSON column in a Spark DataFrame in Scala without knowing its schema

Asked by ijxebb2r on 2021-05-27 in Spark

Given a Spark DataFrame, one of its columns may or may not contain nested JSON, and this nested JSON is dynamic. The final requirement is to break the JSON apart and produce a new DataFrame with one new column per key in the nested JSON. Since the JSON is dynamic, the resulting table is dynamic as well. Also note that the DataFrame consists of more than 100 million records.
For example:

Input:

+---+----+----+-----+
|id |key |type|value|
+---+----+----+-----+
|f9f|BUSI|off |false|
|f96|NAME|50  |true |
|f9z|BANK|off |{"Name":"United School","admNumber":"197108","details":{"code":"WEREFFW32","studentName":"Abhishek kumar","doc":"certificate","admId":"3424325328","stat":0,"studentDetails":false}}|
+---+----+----+-----+

Output:

+---+----+----+-----+-------------+---------+---------+--------------+-----------+----------+----+--------------+
|id |key |type|value|Name         |admNumber|code     |studentName   |doc        |admId     |stat|studentDetails|
+---+----+----+-----+-------------+---------+---------+--------------+-----------+----------+----+--------------+
|f9f|BUSI|off |false|NULL         |NULL     |NULL     |NULL          |NULL       |NULL      |NULL|NULL          |
|f96|NAME|50  |true |NULL         |NULL     |NULL     |NULL          |NULL       |NULL      |NULL|NULL          |
|f9z|BANK|off |NULL |United School|197108   |WEREFFW32|Abhishek kumar|certificate|3424325328|0   |false         |
+---+----+----+-----+-------------+---------+---------+--------------+-----------+----------+----+--------------+

pbwdgjma 1#

Infer one schema from all the JSON strings with spark.read.json, then apply it with from_json and expand the resulting struct into columns:

// Assumes a SparkSession in scope named `spark` (as in spark-shell).
import org.apache.spark.sql.functions.from_json
import spark.implicits._ // toDF, $"..."/'symbol column syntax, String encoder

val data = Seq(
  (77, "email1", """{"key1":38,"key3":39}"""),
  (78, "email2", """{"key1":38,"key4":39}"""),
  (178, "email21", """{"key1":"when string","key4":36, "key6":"test", "key10":false }"""),
  (179, "email8", """{"sub1":"qwerty","sub2":["42"]}"""),
  (180, "email8", """{"sub1":"qwerty","sub2":["42", "56", "test"]}""")
).toDF("id", "name", "colJson")

data.show(false)
// +---+-------+-----------------------------------------------------------------+
// |id |name   |colJson                                                          |
// +---+-------+-----------------------------------------------------------------+
// |77 |email1 |{"key1":38,"key3":39}                                            |
// |78 |email2 |{"key1":38,"key4":39}                                            |
// |178|email21|{"key1":"when string","key4":36, "key6":"test", "key10":false }  |
// |179|email8 |{"sub1":"qwerty","sub2":["42"]}                                   |
// |180|email8 |{"sub1":"qwerty","sub2":["42", "56", "test"]}                     |
// +---+-------+-----------------------------------------------------------------+

// Infer one schema that covers every key seen across all the JSON strings,
// then parse each row with it and promote the struct fields to top-level columns.
val schema = spark.read.json(data.select("colJson").as[String]).schema
val res = data.select($"id", $"name", from_json($"colJson", schema).as("s")).select("id", "name", "s.*")
res.show(false)
// +---+-------+-----------+-----+----+----+----+------+--------------+
// |id |name   |key1       |key10|key3|key4|key6|sub1  |sub2          |
// +---+-------+-----------+-----+----+----+----+------+--------------+
// |77 |email1 |38         |null |39  |null|null|null  |null          |
// |78 |email2 |38         |null |null|39  |null|null  |null          |
// |178|email21|when string|false|null|36  |test|null  |null          |
// |179|email8 |null       |null |null|null|null|qwerty|[42]          |
// |180|email8 |null       |null |null|null|null|qwerty|[42, 56, test]|
// +---+-------+-----------+-----+----+----+----+------+--------------+
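
On very large inputs (the question mentions 100M+ records) the schema inference above scans the entire column. A hedged variant is to infer from a sample instead, under the assumption that the sample still contains every key that occurs in the data, which Spark cannot guarantee:

// Sketch: infer the schema from roughly 1% of the rows instead of all of them.
// Any key that only appears outside the sample will be missing from the schema.
val sampledSchema = spark.read.json(data.select("colJson").as[String].sample(0.01)).schema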

// The promoted columns behave like any other column, e.g. for filtering:
val df1 = res.filter('sub1.equalTo("qwerty"))
df1.show(false)
// +---+------+----+-----+----+----+----+------+--------------+
// |id |name  |key1|key10|key3|key4|key6|sub1  |sub2          |
// +---+------+----+-----+----+----+----+------+--------------+
// |179|email8|null|null |null|null|null|qwerty|[42]          |
// |180|email8|null|null |null|null|null|qwerty|[42, 56, test]|
// +---+------+----+-----+----+----+----+------+--------------+
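
The same pattern carries over to the table in the question, where value mixes plain scalars with JSON. Below is a minimal, untested sketch; the name input is hypothetical, and it assumes the JSON rows can be recognized by a leading "{":

// Infer the schema only from rows that actually hold JSON. from_json then
// returns a null struct for the non-JSON rows ("false", "true", ...), which
// surfaces as NULLs in the new columns, as in the desired output.
val jsonRows = input.filter($"value".startsWith("{")).select("value").as[String] // `input` is a hypothetical DataFrame
val valueSchema = spark.read.json(jsonRows).schema

val result = input
  .withColumn("parsed", from_json($"value", valueSchema))
  .select($"id", $"key", $"type", $"value", $"parsed.*")

Note that nested objects such as details come back as struct columns; a further select (e.g. $"details.*" on the expanded result) flattens a known level, while fully dynamic flattening requires a recursive walk over result.schema.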
