Dropping rows that do not conform to a schema in Spark

pod7payv · posted 2021-05-27 in Spark

Currently, the schema of my table is:

root
 |-- product_id: integer (nullable = true)
 |-- product_name: string (nullable = true)
 |-- aisle_id: string (nullable = true)
 |-- department_id: string (nullable = true)

I want to apply the schema below to this table and drop every row that does not conform to it:

import org.apache.spark.sql.types._

val productsSchema = StructType(Seq(
    StructField("product_id", IntegerType, nullable = true),
    StructField("product_name", StringType, nullable = true),
    StructField("aisle_id", IntegerType, nullable = true),
    StructField("department_id", IntegerType, nullable = true)
  ))

lxkprmvk · answer 1

Do check out the na.drop functions on the data frame: they can drop rows that contain any null, rows with fewer than a minimum number of non-null values, and rows that have a null in specific columns.

scala> sc.parallelize(Seq((1,"a","a"),(1,"a","a"),(2,"b","b"),(3,"c","c"),(4,"d","d"),(4,"d",null))).toDF
res7: org.apache.spark.sql.DataFrame = [_1: int, _2: string ... 1 more field]

scala> res7.show()
+---+---+----+
| _1| _2|  _3|
+---+---+----+
|  1|  a|   a|
|  1|  a|   a|
|  2|  b|   b|
|  3|  c|   c|
|  4|  d|   d|
|  4|  d|null|
+---+---+----+

//dropping row if a null is found
scala> res7.na.drop.show()
+---+---+---+
| _1| _2| _3|
+---+---+---+
|  1|  a|  a|
|  1|  a|  a|
|  2|  b|  b|
|  3|  c|  c|
|  4|  d|  d|
+---+---+---+

//drops rows that have fewer than 3 non-null values (minNonNulls = 3)
scala> res7.na.drop(minNonNulls = 3).show()
+---+---+---+
| _1| _2| _3|
+---+---+---+
|  1|  a|  a|
|  1|  a|  a|
|  2|  b|  b|
|  3|  c|  c|
|  4|  d|  d|
+---+---+---+

//drops nothing: every row already has at least 2 non-null values
scala> res7.na.drop(minNonNulls = 2).show()
+---+---+----+
| _1| _2|  _3|
+---+---+----+
|  1|  a|   a|
|  1|  a|   a|
|  2|  b|   b|
|  3|  c|   c|
|  4|  d|   d|
|  4|  d|null|
+---+---+----+

//drops rows that have a null in the `_3` column
scala> res7.na.drop(Seq("_3")).show()
+---+---+---+
| _1| _2| _3|
+---+---+---+
|  1|  a|  a|
|  1|  a|  a|
|  2|  b|  b|
|  3|  c|  c|
|  4|  d|  d|
+---+---+---+
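
Applied to the table in the question, a minimal sketch of the same idea (assuming the DataFrame is named products; that name is not from the original post) is to cast the string columns to IntegerType, which turns every value that is not a valid integer into null, and then drop those rows with na.drop:

import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.IntegerType

// Casting a string column to IntegerType turns any non-integer value into null
val casted = products
  .withColumn("aisle_id", col("aisle_id").cast(IntegerType))
  .withColumn("department_id", col("department_id").cast(IntegerType))

// Drop the rows that picked up a null during the cast
val conforming = casted.na.drop(Seq("aisle_id", "department_id"))

Note that this also removes rows that already had a null in those columns; if such rows should be kept, check the original values before casting.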

6ju8rftf · answer 2

Use the option "DROPMALFORMED" while loading the data to ignore corrupt records.

spark.read.format("json")
  .option("mode", "DROPMALFORMED")
  .option("header", "true")
  .schema(productsSchema)
  .load("sample.json")

ubby3x7f · answer 3

If the data does not match the schema, Spark puts null as the value in the affected columns. A record that matches nothing therefore ends up with null in every column, so we only need to filter out the rows whose columns are all null.
Use filter to drop the rows in which every column is null.

scala> "cat /tmp/sample.json".! // JSON File Data, one row is not matching with schema.
{"product_id":1,"product_name":"sampleA","aisle_id":"AA","department_id":"AAD"}
{"product_id":2,"product_name":"sampleBB","aisle_id":"AAB","department_id":"AADB"}
{"product_id":3,"product_name":"sampleCC","aisle_id":"CC","department_id":"CCC"}
{"product_id":3,"product_name":"sampledd","aisle_id":"dd","departmentId":"ddd"}
{"name","srinivas","age":29}
res100: Int = 0

scala> schema.printTreeString
root
 |-- aisle_id: string (nullable = true)
 |-- department_id: string (nullable = true)
 |-- product_id: long (nullable = true)
 |-- product_name: string (nullable = true)

scala> val df = spark.read.schema(schema).option("badRecordsPath", "/tmp/badRecordsPath").format("json").load("/tmp/sample.json") // Load the JSON data; rows that do not match the schema come back with null in every column.
df: org.apache.spark.sql.DataFrame = [aisle_id: string, department_id: string ... 2 more fields]

scala> df.show(false)
+--------+-------------+----------+------------+
|aisle_id|department_id|product_id|product_name|
+--------+-------------+----------+------------+
|AA      |AAD          |1         |sampleA     |
|AAB     |AADB         |2         |sampleBB    |
|CC      |CCC          |3         |sampleCC    |
|dd      |null         |3         |sampledd    |
|null    |null         |null      |null        |
+--------+-------------+----------+------------+

scala> df.filter(df.columns.map(c => s"${c} is not null").mkString(" or ")).show(false) // Keep rows where at least one column is not null.
+--------+-------------+----------+------------+
|aisle_id|department_id|product_id|product_name|
+--------+-------------+----------+------------+
|AA      |AAD          |1         |sampleA     |
|AAB     |AADB         |2         |sampleBB    |
|CC      |CCC          |3         |sampleCC    |
|dd      |null         |3         |sampledd    |
+--------+-------------+----------+------------+

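The same condition can be built from Column expressions instead of a SQL string, which keeps the filter type-checked; a small sketch of that alternative, reusing the df loaded above:

import org.apache.spark.sql.functions.col

// Keep a row if at least one of its columns is not null
val keepRow = df.columns.map(c => col(c).isNotNull).reduce(_ || _)
df.filter(keepRow).show(false)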
