TIMESTAMP not behaving as intended with parquet in hive

清歌不尽 2021-01-16 12:38

I have parquet data whose TIMESTAMP column reads perfectly fine when loaded with Spark. Below are the sample records:

scala>
1 Answer
    傲寒 (OP)
    2021-01-16 12:50

    I figured out an alternative to my own problem. I changed the type of the TIMESTAMP column to STRING, and when fetching the data I used the from_unixtime method to cast that column back to the intended date format.
    The problem here was that if my date value was 2020-02-27 15:40:22, fetching this column via Hive returned an epoch value, i.e. 15340232000000 (microseconds, which is why the query below divides by 1,000,000).
    I solved this in Hive with the query below:

    select *, from_unixtime(cast(SOURCE_LOAD_DATE as BIGINT) DIV 1000000) as SOURCE_LOAD_DATE from table_name;   
    

    Using the above query, I was able to get the proper date and timestamp value.
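    The conversion the Hive query performs can be sketched outside Hive as well. This is a minimal Java sketch, assuming (as the query above does) that the stored STRING value is an epoch time in microseconds; the class name, method name, and sample value are hypothetical stand-ins for the SOURCE_LOAD_DATE column:

    ```java
    import java.time.Instant;
    import java.time.ZoneOffset;
    import java.time.format.DateTimeFormatter;

    public class EpochMicrosDemo {
        // Mirrors: from_unixtime(cast(col as BIGINT) DIV 1000000)
        // Assumption: input is epoch *microseconds*; output is UTC.
        static String microsToTimestamp(long micros) {
            long seconds = micros / 1_000_000L;  // integer division, like DIV
            return Instant.ofEpochSecond(seconds)
                    .atZone(ZoneOffset.UTC)
                    .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
        }

        public static void main(String[] args) {
            // Hypothetical sample: 1582818022000000 µs = 2020-02-27 15:40:22 UTC
            System.out.println(microsToTimestamp(1582818022000000L));
        }
    }
    ```

    Note that Hive's from_unixtime formats in the session time zone, whereas this sketch pins the output to UTC for determinism.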

    Note: you will need to cast every column that holds timestamp data this way.

    This is the only trick I could think of. I hope it helps you or others!
