Python DataFrame: transpose one column into multiple columns

Submitted by 霸气de小男生 on 2020-12-12 14:15:06

Question


I have a dataframe like below:

df = pd.DataFrame({'month':['2017-09-27','2017-09-27','2017-09-28','2017-09-29'],'Cost':[100,500,200,300]})

How can I get a df like this:

2017-09-27   2017-09-28    2017-09-29
  100            200          300
  500            NULL         NULL  

Thanks in advance!


Answer 1:


Use cumcount to compute a "cumulative count" of the items within each group. We'll use these values (below) as index labels.

In [97]: df['index'] = df.groupby('month').cumcount()

In [98]: df
Out[98]: 
   Cost       month  index
0   100  2017-09-27      0
1   500  2017-09-27      1
2   200  2017-09-28      0
3   300  2017-09-29      0

Then the desired result can be obtained by pivoting:

In [99]: df.pivot(index='index', columns='month', values='Cost')
Out[99]: 
month  2017-09-27  2017-09-28  2017-09-29
index                                    
0           100.0       200.0       300.0
1           500.0         NaN         NaN

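Putting the two steps together, a minimal end-to-end version of this answer (using the frame from the question) might look like:

```python
import pandas as pd

# Sample data from the question
df = pd.DataFrame({'month': ['2017-09-27', '2017-09-27', '2017-09-28', '2017-09-29'],
                   'Cost': [100, 500, 200, 300]})

# Number the rows within each month group (0, 1, ...) to use as the new index
df['index'] = df.groupby('month').cumcount()

# Pivot: one column per month, rows aligned by the within-group counter
result = df.pivot(index='index', columns='month', values='Cost')
```

The counter is what makes `pivot` safe here: without it, the two `2017-09-27` rows would collide on the same (index, column) cell and `pivot` would raise.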


Answer 2:


Option 1
zip_longest

from itertools import zip_longest

s = df.groupby('month').Cost.apply(list)
pd.DataFrame(list(zip_longest(*s)), columns=s.index)

month  2017-09-27  2017-09-28  2017-09-29
0             100       200.0       300.0
1             500         NaN         NaN
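Run end to end, Option 1 looks like this (a self-contained sketch; `zip_longest` pads the shorter groups with `None`, which pandas stores as `NaN`):

```python
from itertools import zip_longest

import pandas as pd

df = pd.DataFrame({'month': ['2017-09-27', '2017-09-27', '2017-09-28', '2017-09-29'],
                   'Cost': [100, 500, 200, 300]})

# One list of costs per month, in sorted month order
s = df.groupby('month').Cost.apply(list)

# zip_longest transposes the ragged lists, padding short ones with None
wide = pd.DataFrame(list(zip_longest(*s)), columns=s.index)
```

Note the mixed dtypes in the output above: the first column has no padding, so it stays integer, while the padded columns become float.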

Option 2
pd.concat

pd.concat(
    {k: g.reset_index(drop=True) for k, g in df.groupby('month').Cost},
    axis=1
)

   2017-09-27  2017-09-28  2017-09-29
0         100       200.0       300.0
1         500         NaN         NaN
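Spelled out, Option 2 builds one re-indexed Series per month and concatenates them side by side (a self-contained sketch):

```python
import pandas as pd

df = pd.DataFrame({'month': ['2017-09-27', '2017-09-27', '2017-09-28', '2017-09-29'],
                   'Cost': [100, 500, 200, 300]})

# For each month, drop the original row labels so every group restarts at 0,
# then line the groups up as columns keyed by month
wide = pd.concat(
    {k: g.reset_index(drop=True) for k, g in df.groupby('month').Cost},
    axis=1
)
```

The `reset_index(drop=True)` is essential: without it, `concat` would align the groups on their original row labels (0–3) instead of stacking each group from row 0.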

Option 3
Similar to @unutbu in that it uses cumcount. However, I use set_index and unstack to do the pivoting.

df.set_index([df.groupby('month').cumcount(), 'month']).Cost.unstack()

month  2017-09-27  2017-09-28  2017-09-29
0           100.0       200.0       300.0
1           500.0         NaN         NaN
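All three options yield the same table. A self-contained version of Option 3, for completeness:

```python
import pandas as pd

df = pd.DataFrame({'month': ['2017-09-27', '2017-09-27', '2017-09-28', '2017-09-29'],
                   'Cost': [100, 500, 200, 300]})

# Build a MultiIndex of (within-group counter, month), then unstack the
# month level into columns
out = df.set_index([df.groupby('month').cumcount(), 'month']).Cost.unstack()
```

`unstack` moves the innermost index level (`month`) into columns, which is exactly the pivot that Answer 1 performs explicitly.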


Source: https://stackoverflow.com/questions/46457655/python-dataframe-transpose-one-column-into-multiple-column
