Question
I have a Spark DataFrame, let's say 'df'.
I do the following simple aggregation on this DataFrame:
df.groupBy().sum()
Upon doing so, I get the following exception:
java.lang.IllegalArgumentException: requirement failed: Decimal precision 39 exceeds max precision 38
Is there any way I can fix this?
My guess is that if I could decrease the decimal precision of all the double-type columns in df, it would solve the problem.
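
For reference, a minimal sketch of that idea, assuming the offending columns are actually DecimalType (the exception message mentions decimal precision rather than doubles). In Spark's decimal arithmetic, sum over decimal(p, s) yields decimal(min(38, p + 10), s), so casting the inputs to precision 28 or lower keeps the aggregated result within the 38-digit cap. The helper name lowerDecimalPrecision and the target DecimalType(28, 8) are illustrative, not from the original post:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

// Cast every decimal column down to DecimalType(28, 8) so that sum's
// precision bump of +10 stays within Spark's 38-digit maximum.
// (Precision and scale here are illustrative; pick values that fit your data.)
def lowerDecimalPrecision(df: DataFrame): DataFrame =
  df.schema.fields.foldLeft(df) { (acc, field) =>
    field.dataType match {
      case _: DecimalType =>
        acc.withColumn(field.name, col(field.name).cast(DecimalType(28, 8)))
      case _ => acc
    }
  }

// Usage: lowerDecimalPrecision(df).groupBy().sum()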
Source: https://stackoverflow.com/questions/46462377/change-decimal-precision-of-all-double-type-columns-in-a-spark-dataframe