How to pivot data based on another column's values

z8dt9xmd · posted 2021-06-24 in Hive

I'm trying to pivot/transpose my column values and get the corresponding datetimes.
Table 1:

User  Status     LogTime
----------------------------------------
Tom   Active     2019-09-06 17:36:08.233
Tom   Active     2019-09-06 18:37:08.244
Tom   Active     2019-09-06 20:46:08.133
Tom   InActive   2019-09-06 23:46:08.133
Tom   Active     2019-09-07 12:37:08.244
Tom   Active     2019-09-08 10:46:08.133
Tom   InActive   2019-09-08 11:46:08.133

I'm trying to get data like the following:

User  Active                     InActive
------------------------------------------------------
Tom  2019-09-06 20:46:08.133   2019-09-06 23:46:08.133
Tom  2019-09-08 10:46:08.133   2019-09-08 11:46:08.133

That is, I want to pivot on the Status column, taking the last Active LogTime together with the InActive LogTime that comes right after it.
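
To make the answers below reproducible, here is a minimal sketch of the sample data as a pandas DataFrame (my setup code, not part of the original question; the column names follow the table above):

import pandas as pd

# Sample data from the question, with LogTime parsed as datetime
df = pd.DataFrame({
    'User':   ['Tom'] * 7,
    'Status': ['Active', 'Active', 'Active', 'InActive',
               'Active', 'Active', 'InActive'],
    'LogTime': pd.to_datetime([
        '2019-09-06 17:36:08.233', '2019-09-06 18:37:08.244',
        '2019-09-06 20:46:08.133', '2019-09-06 23:46:08.133',
        '2019-09-07 12:37:08.244', '2019-09-08 10:46:08.133',
        '2019-09-08 11:46:08.133',
    ]),
})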


qc6wkl3g · Answer 1

You can try shift() to grab the row right before each InActive, then split every 2 rows into one group and unstack():

# keep each InActive row plus the row immediately before it
m = df[df.Status.eq('InActive') | df.Status.eq('InActive').shift(-1)].reset_index(drop=True)
# pair rows two at a time (Active, InActive) and unstack Status into columns
m.assign(k=m.groupby(m.index//2).ngroup()).set_index(['User','Status','k']).unstack(1)
LogTime                         
Status                   Active                 InActive
User k                                                  
Tom  0  2019-09-06 20:46:08.133  2019-09-06 23:46:08.133
     1  2019-09-08 10:46:08.133  2019-09-08 11:46:08.133

Or, using the same m with pivot_table():

m.assign(k=m.groupby(m.index//2).ngroup()).pivot_table(index=['User','k'],
          columns='Status', values='LogTime', aggfunc='first').rename_axis(None, axis=1)
Active                 InActive
User k                                                  
Tom  0  2019-09-06 20:46:08.133  2019-09-06 23:46:08.133
     1  2019-09-08 10:46:08.133  2019-09-08 11:46:08.133
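
A note on the pairing (my observation, not part of the answer): m.index//2 assumes the mask always yields strict (Active, InActive) pairs, which holds for this data; aggfunc='first' is safe for the same reason, since each (User, k, Status) cell holds exactly one row. A sketch of an equivalent pairing that avoids positional grouping:

# hypothetical variant: align each InActive row with the row just before it
inact = df[df.Status.eq('InActive')]
prev = df.shift(1).loc[inact.index]   # values of the row preceding each InActive
pd.DataFrame({'User': inact.User.values,
              'Active': prev.LogTime.values,
              'InActive': inact.LogTime.values})

This still assumes the row before each InActive is the last Active, exactly as the answer's mask does.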

nfeuvbwi · Answer 2

This query works in Hive with the given dataset. I tried to account for possible boundary conditions, such as when a user's log contains no InActive or no Active status at all; of course, the logic should be validated and tuned against the real dataset.
Demo:

with data as (
select stack(7,
'Tom','Active',   '2019-09-06 17:36:08.233',
'Tom','Active',   '2019-09-06 18:37:08.244',
'Tom','Active',   '2019-09-06 20:46:08.133',
'Tom','InActive', '2019-09-06 23:46:08.133',
'Tom','Active',   '2019-09-07 12:37:08.244',
'Tom','Active',   '2019-09-08 10:46:08.133',
'Tom','InActive', '2019-09-08 11:46:08.133'
) as(User,Status,LogTime)
) --use your_table instead of this

select User, Active, InActive
from
(
select User,MaxInActive,MaxActive,--Status,LogTime,nextStatus,
       case when (prevStatus='Active' and Status='InActive')  --the last Active LogTime
                 then prevLogTime
            when (Status='Active' and nextStatus is NULL) --boundary condition, Active is the last status, take current
                 OR (LogTime=MaxActive  and MaxInActive is NULL) --No InActive, take current
                 then LogTime             
       end as Active,

       case when (prevStatus='Active' and Status='InActive') --InActive LogTime after the last Active
                 OR (LogTime=MaxInActive and MaxActive is NULL) --No Active exists, take current
                 then LogTime
       end as InActive

from       
(
select User,Status,LogTime,
       max(case when Status='InActive' then LogTime end) over(partition by User) as MaxInActive ,
       max(case when Status='Active' then LogTime end) over(partition by User) as MaxActive,
       lead(Status) over(partition by User order by LogTime) nextStatus,
       lag(Status) over(partition by User order by LogTime) prevStatus,
       lag(LogTime) over(partition by User order by LogTime) prevLogTime
  from data
)s
)s
where (Active is not NULL and InActive is not NULL)
      or (MaxInActive is NULL and Active is not NULL) --only active records exist
      or (MaxActive is NULL and MaxInActive is not NULL) --only inactive exists
 ;

Result:

OK
user    active  inactive
Tom     2019-09-06 20:46:08.133 2019-09-06 23:46:08.133
Tom     2019-09-08 10:46:08.133 2019-09-08 11:46:08.133
Time taken: 100.645 seconds, Fetched: 2 row(s)
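
For readers working in pandas rather than Hive, the query's core case, an InActive row whose previous row for the same user is Active, maps onto lag logic roughly like this (my sketch; the boundary conditions handled by the Hive version are deliberately left out):

s = df.sort_values(['User', 'LogTime'])
prev_status = s.groupby('User').Status.shift(1)   # lag(Status)  over(partition by User order by LogTime)
prev_time   = s.groupby('User').LogTime.shift(1)  # lag(LogTime) over(partition by User order by LogTime)
mask = (s.Status == 'InActive') & (prev_status == 'Active')
pd.DataFrame({'User': s.User[mask].values,
              'Active': prev_time[mask].values,
              'InActive': s.LogTime[mask].values})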

f5emj3cl · Answer 3

Try a groupby on 'User', 'Status' and the date part of 'LogTime', and take the last 'LogTime' in each group. Then unstack, reset the index into columns, drop the unneeded column, and dropna():

df1 = (df.groupby(['User', 'Status', df.LogTime.dt.date]).LogTime.last()
        .unstack(1).reset_index().drop(columns='LogTime').dropna())

Out[890]:
Status User                  Active                InActive
0       Tom 2019-09-06 20:46:08.133 2019-09-06 23:46:08.133
2       Tom 2019-09-08 10:46:08.133 2019-09-08 11:46:08.133
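
Two caveats on this approach (mine, not the answerer's): df.LogTime.dt.date requires LogTime to be a datetime column, and grouping by calendar date splits any Active/InActive pair that crosses midnight into separate rows. If LogTime was read as strings, convert it first:

# prerequisite for the .dt accessor: ensure datetime dtype
df['LogTime'] = pd.to_datetime(df['LogTime'])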
