Question
My database has a numeric column whose values can be as large as a 256-bit unsigned integer. However, Spark's DecimalType is capped at Decimal(38, 18).
When I try to do calculations on the column, exceptions like this are thrown:

    java.lang.IllegalArgumentException: requirement failed: Decimal precision 39 exceeds max precision 38
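For context, 2^256 − 1 is a 78-digit number, so no Spark DecimalType can hold the full range of a 256-bit unsigned integer. Here is a minimal sketch that trips the same internal precision check (the object name DecimalPrecisionRepro is just illustrative, and the exact failure mode can vary across Spark versions):

```scala
import org.apache.spark.sql.types.Decimal

object DecimalPrecisionRepro {
  def main(args: Array[String]): Unit = {
    // 2^256 - 1 has 78 decimal digits, roughly double Spark's cap of 38.
    val max256 = BigInt(2).pow(256) - 1
    println(s"decimal digits in 2^256 - 1: ${max256.toString.length}") // 78

    // Asking Spark to store a 39-digit value at precision 38 trips the
    // same internal check that produces the error above:
    //   java.lang.IllegalArgumentException: requirement failed:
    //   Decimal precision 39 exceeds max precision 38
    val tooWide = BigDecimal("9" * 39)
    Decimal(tooWide, 38, 0) // precision = 38, scale = 0
  }
}
```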
Is there a third-party library or workaround that solves this issue? Or is Spark designed only for numbers that fit in Decimal(38, 18)?
Source: https://stackoverflow.com/questions/53074721/how-to-use-spark-with-large-decimal-numbers