get datatype of column using pyspark

Asked by 野的像风 on 2021-01-31 15:58

We are reading data from a MongoDB collection. A column in the collection has values of two different types (e.g. (bson.Int64, int) and (int, float)).

I am trying to get the data type of a column using PySpark.

6 Answers
  •  Answered by 不知归路, 2021-01-31 16:28

    Your question is broad, thus my answer will also be broad.

    To get the data types of your DataFrame columns, you can use the dtypes attribute, i.e.:

    >>> df.dtypes
    [('age', 'int'), ('name', 'string')]
    

    This means the column age is of type int and the column name is of type string.
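    If you want the type of one specific column rather than the whole list, a minimal sketch (using a small made-up DataFrame, since the original data isn't shown) is to turn df.dtypes into a dict, or to look the field up in df.schema:

    ```python
    from pyspark.sql import SparkSession

    # Local session just for demonstration purposes.
    spark = SparkSession.builder.master("local[1]").appName("dtype-demo").getOrCreate()

    # Hypothetical data standing in for the MongoDB collection.
    df = spark.createDataFrame([(25, "Alice"), (30, "Bob")], ["age", "name"])

    # df.dtypes is a list of (column_name, type_string) pairs;
    # converting it to a dict allows a direct lookup by column name.
    age_type = dict(df.dtypes)["age"]

    # df.schema["age"].dataType returns the typed object (e.g. LongType())
    # instead of a plain string, which is handy for programmatic checks.
    age_field_type = df.schema["age"].dataType

    spark.stop()
    ```

    Note that Python ints are inferred as bigint (LongType) by createDataFrame, so the string you get back may be "bigint" rather than "int".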
