I have a Spark (Scala) DataFrame loaded from a large CSV file.
The DataFrame looks like this:
key | col1 | timestamp
----+------+-----------------------
1   | aa   | 2019-01-01 08:02:05.1
1   | aa   | 2019-09-02 08:02:05.2
1   | cc   | 2019-12-24 08:02:05.3
2   | dd   | 2013-01-22 08:02:05.4
I need to add two columns, start date and end date, like this:
key | col1 | timestamp             | start date            | end date
----+------+-----------------------+-----------------------+-----------------------
1   | aa   | 2019-01-01 08:02:05.1 | 2019-01-01 08:02:05.1 | 2019-09-02 08:02:05.2
1   | aa   | 2019-09-02 08:02:05.2 | 2019-09-02 08:02:05.2 | 2019-12-24 08:02:05.3
1   | cc   | 2019-12-24 08:02:05.3 | 2019-12-24 08:02:05.3 | NULL
2   | dd   | 2013-01-22 08:02:05.4 | 2013-01-22 08:02:05.4 | NULL
Here, for each "key", end_date is the next timestamp of the same key, and end_date of the row with the latest timestamp should be NULL.
What I have tried so far:
I tried to use a window function to compute a rank per partition, something like this:
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

var df = read_csv()

// Copy timestamp into start_date.
df = df.withColumn("start_date", df.col("timestamp"))

// Initialize end_date with nulls.
df = df.withColumn("end_date", typedLit[Option[String]](None))

val windowSpec = Window.partitionBy("merge_key_column").orderBy("start_date")

df
  .withColumn("rank", dense_rank().over(windowSpec))
  .withColumn("max", max("rank").over(Window.partitionBy("merge_key_column")))
So far, I have not been able to produce the desired output.
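For illustration only, here is a hedged sketch of how this rank-based idea could in principle be completed with a self-join, assuming the partition column is really `key` (the snippet above uses `merge_key_column`) and using `row_number` instead of `dense_rank` so duplicate timestamps within a key do not collide. The answer below takes a simpler route with `lead`.
```
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

val w = Window.partitionBy("key").orderBy("timestamp")

// Rank the rows of each key chronologically.
val ranked = df
  .withColumn("start_date", col("timestamp"))
  .withColumn("rn", row_number().over(w))

// Each row's end_date is the start_date of the row with rn + 1 in the same key;
// rows without a successor keep a null end_date thanks to the left join.
val successors = ranked.select(
  col("key").as("s_key"),
  (col("rn") - 1).as("s_rn"),
  col("start_date").as("end_date"))

val result = ranked
  .join(successors,
    ranked("key") === successors("s_key") && ranked("rn") === successors("s_rn"),
    "left")
  .drop("s_key", "s_rn", "rn")
```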
1 Answer
Use the window lead function for this case. Example:
```
import org.apache.spark.sql.expressions._
import org.apache.spark.sql.functions._
import org.apache.spark.sql._

val df = Seq(
  (1, "aa", "2019-01-01 08:02:05.1"),
  (1, "aa", "2019-09-02 08:02:05.2"),
  (1, "cc", "2019-12-24 08:02:05.3"),
  (2, "dd", "2013-01-22 08:02:05.4")
).toDF("key", "col1", "timestamp")

val df1 = df.withColumn("start_date", col("timestamp"))
val windowSpec = Window.partitionBy("key").orderBy("start_date")
df1.withColumn("end_date",lead(col("start_date"),1).over(windowSpec)).show(10,false)
//+---+----+---------------------+---------------------+---------------------+
//|key|col1|timestamp |start_date |end_date |
//+---+----+---------------------+---------------------+---------------------+
//|1 |aa |2019-01-01 08:02:05.1|2019-01-01 08:02:05.1|2019-09-02 08:02:05.2|
//|1 |aa |2019-09-02 08:02:05.2|2019-09-02 08:02:05.2|2019-12-24 08:02:05.3|
//|1 |cc |2019-12-24 08:02:05.3|2019-12-24 08:02:05.3|null |
//|2 |dd |2013-01-22 08:02:05.4|2013-01-22 08:02:05.4|null |
//+---+----+---------------------+---------------------+---------------------+
```
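`lead` returns null for the last row in each window partition, which is exactly the NULL end_date required for the latest timestamp of each key. As a small, hedged variation (assuming `timestamp` comes out of the CSV as a string), casting it to a real timestamp keeps the window ordering from depending on lexicographic string order:
```
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Cast the string column to TimestampType so the ordering uses real timestamps.
val withTs = df.withColumn("ts", col("timestamp").cast("timestamp"))
val byKey  = Window.partitionBy("key").orderBy("ts")

withTs
  .withColumn("start_date", col("ts"))
  .withColumn("end_date", lead(col("ts"), 1).over(byKey))
  .drop("ts")
  .show(10, false)
```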