Failed with exception java.io.IOException:org.apache.avro.AvroTypeException: Found long, expecting union in Hive

梦毁少年i · 2021-01-05 19:28

Need help!!!

I am streaming Twitter feeds into HDFS using Flume and loading them into Hive for analysis.
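For context, an Avro-backed Hive table over the Flume sink directory is typically declared like this (a sketch only; the table name, HDFS paths, and schema file location are illustrative assumptions):

    -- External table over the HDFS directory Flume writes to;
    -- the Avro schema is supplied from a .avsc file in HDFS.
    CREATE EXTERNAL TABLE tweets
    STORED AS AVRO
    LOCATION '/user/flume/tweets'
    TBLPROPERTIES ('avro.schema.url' = 'hdfs:///user/flume/schemas/twitter.avsc');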

The steps are as follows:

1 Answer
  • 2021-01-05 19:57

    I was facing the exact same issue. The problem was in the timestamp field (the "created_at" column in your case), which I was trying to insert as a string into my new table. My assumption was that this data would be in ["null", "string"] format in my source. I analyzed the source Avro schema that was generated by the sqoop import --as-avrodatafile process. The generated schema had the following signature for the timestamp column:
    { "name" : "order_date", "type" : [ "null", "long" ], "default" : null, "columnName" : "order_date", "sqlType" : "93" },

    SqlType 93 is the JDBC code for the TIMESTAMP data type. So in my target table's Avro schema file I changed the column's type to 'long', and that solved the issue. My guess is that there is a similar data-type mismatch in one of your columns.
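    Concretely, the edit looks like this (a sketch based on the order_date field above; my target schema originally declared the field as a string, which is what triggers "Found long, expecting union"):

        Before (target expects a string, source writes a long -> AvroTypeException):
        { "name" : "order_date", "type" : [ "null", "string" ], "default" : null, "columnName" : "order_date", "sqlType" : "93" }

        After (target union matches the long written by the source):
        { "name" : "order_date", "type" : [ "null", "long" ], "default" : null, "columnName" : "order_date", "sqlType" : "93" }

    To verify what the source side actually wrote, you can dump the schema embedded in an Avro data file with avro-tools, e.g. java -jar avro-tools.jar getschema part-m-00000.avro (the jar and file names here are placeholders).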
