Avro file error while loading decimal field into Redshift table using Databricks

Submitted by 自作多情 on 2020-01-06 07:02:10

Question


I have a DataFrame in Databricks with a number of columns, including a decimal(15,2) field. If I exclude the decimal field, I can insert the data into the Redshift table, but when the decimal field is included I get the following error:
"Cannot init avro reader from s3 file Cannot parse file header: Cannot save fixed schema"
Any thoughts?


Answer 1:


Try using plain decimal without specifying precision and scale, or cast the existing column to decimal. Also try a different tempformat; in my experience, CSV GZIP is faster.
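The two suggestions above can be sketched in PySpark as follows. This is a minimal, hedged example, not the asker's actual code: the column name `amount`, the JDBC URL, table name, and S3 tempdir are all placeholders, and it assumes the Databricks `spark-redshift` connector is available on the cluster.

```python
# Sketch of the suggested fixes (placeholders throughout; not the asker's code).
from pyspark.sql.functions import col

# 1) Cast the problematic column to plain decimal. Note that Spark's bare
#    "decimal" defaults to decimal(10,0), so widen it explicitly if you need
#    to keep the fractional part, e.g. .cast("decimal(38,18)").
df_cast = df.withColumn("amount", col("amount").cast("decimal"))

# 2) Stage the data as CSV GZIP instead of the default Avro tempformat,
#    which sidesteps the Avro fixed-schema error entirely.
(df_cast.write
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://host:5439/db")   # placeholder JDBC URL
    .option("dbtable", "my_table")                   # placeholder target table
    .option("tempdir", "s3a://my-bucket/tmp")        # placeholder S3 staging path
    .option("tempformat", "CSV GZIP")
    .mode("append")
    .save())
```

`tempformat` accepts `AVRO` (the default), `CSV`, and `CSV GZIP` in the spark-redshift connector; switching away from Avro is what avoids the reader error here.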



Source: https://stackoverflow.com/questions/57089896/avro-file-error-while-loading-decimal-field-into-redshift-table-using-databricks
