Question
Could someone help me understand what data type or format I need to pass to Spark's from_unixtime() function for it to work?
When I try the following it runs, but it returns the unevaluated column expression rather than the current timestamp.
from_unixtime(current_timestamp())
The response is below:
fromunixtime(currenttimestamp(),yyyy-MM-dd HH:mm:ss)
When I try to input
from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")
The above simply fails with a type mismatch:
error: type mismatch;
 found   : Int(1392394861)
 required: org.apache.spark.sql.Column
       from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")
What am I missing? I've tried a number of different things and read the documentation on using dates/times in Spark, but every example I try fails with a type mismatch.
Answer 1:
Use lit() to create a Column from a literal value, like this:
from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS")
or, as zero323 mentioned:
from_unixtime(current_timestamp().cast("long"))
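Putting the two fixes together, here is a minimal sketch, assuming a Spark shell or an existing SparkSession named spark (the range(1) DataFrame is only there to give the expressions a row to evaluate against):

```scala
import org.apache.spark.sql.functions.{current_timestamp, from_unixtime, lit}

// The first argument to from_unixtime must be a Column, not a plain Scala
// value; lit() wraps the Int literal. The format pattern is a plain String.
val fromLiteral = from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS")

// current_timestamp() yields a TimestampType column, while from_unixtime
// expects seconds since the Unix epoch, hence the cast to long.
val fromNow = from_unixtime(current_timestamp().cast("long"))

// Evaluate both expressions on a one-row DataFrame.
spark.range(1).select(fromLiteral, fromNow).show(truncate = false)
```

Note that both versions return a string column formatted with the given (or default yyyy-MM-dd HH:mm:ss) pattern, not a timestamp column.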
Source: https://stackoverflow.com/questions/39281152/spark-unix-timestamp-data-type-mismatch