Question
I have a dataframe in Databricks with a bunch of columns, including a decimal(15,2) field. If I exclude the decimal field, I can insert the data into the Redshift table, but when the decimal field is included I get the following error:
"Cannot init avro reader from s3 file Cannot parse file header: Cannot save fixed schema"
Any thoughts?
Answer 1:
Try using plain decimal without a precision/scale, or cast the existing column to a different decimal type. Also try a different tempformat; in my experience CSV GZIP is faster than the default Avro.
Source: https://stackoverflow.com/questions/57089896/avro-file-error-while-loading-decimal-field-into-redshift-table-using-databricks