How to get day of week in SparkSQL?

半阙折子戏 2020-12-03 14:19

I am trying to select all records recorded on Sunday through Spark SQL. I have tried the following, but in vain:

SELECT * FROM mytable WHERE DATEPART(WEEKDAY, cre

3 Answers
  • 2020-12-03 14:57

    This works for me:

    spark.sql("select dayofweek(time) as dow from some_table")
    

    where time needs to be a date or timestamp column.
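
    As a hedged sketch, here is how this answer could be applied to the original question of filtering Sundays. The table and column names (mytable, create_time) are placeholders; in Spark SQL, dayofweek() returns 1 for Sunday through 7 for Saturday and is available from Spark 2.3 onward:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("weekday-filter").getOrCreate()

    # dayofweek() returns 1 (Sunday) .. 7 (Saturday) in Spark SQL;
    # mytable / create_time are hypothetical names
    sundays = spark.sql("""
        SELECT *
        FROM mytable
        WHERE dayofweek(create_time) = 1
    """)
    sundays.show()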

  • 2020-12-03 15:08

    Spark 1.5.0 has a date_format function that accepts a format string as an argument. With the 'EEEE' pattern it returns the name of the weekday from a timestamp:

    select date_format(my_timestamp, 'EEEE') from ....

    Result: e.g. 'Tuesday'
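
    For context, a minimal sketch of the same pattern through the DataFrame API; the column name my_timestamp and the sample rows are made up for illustration:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, date_format

    spark = SparkSession.builder.appName("weekday-name").getOrCreate()

    # Hypothetical data with a timestamp column named my_timestamp
    df = spark.createDataFrame(
        [("2020-12-01 14:19:00",), ("2020-12-06 09:00:00",)],
        ["my_timestamp"],
    ).withColumn("my_timestamp", col("my_timestamp").cast("timestamp"))

    # 'EEEE' yields the full weekday name, e.g. 'Tuesday'
    df.select(date_format("my_timestamp", "EEEE").alias("weekday")).show()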

  • 2020-12-03 15:12

    If the creation time is stored as a Unix timestamp (UTC epoch seconds), you can use the following to filter out specific days in Spark SQL. I used Spark 1.6.1:

    select id, date_format(from_unixtime(created_utc), 'EEEE') from testTable where date_format(from_unixtime(created_utc), 'EEEE') = 'Wednesday'
    

    If you specify 'EEEE', the day of the week is spelled out completely. You can use 'E' for the abbreviated version, e.g. Wed. You can find more info here: http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame and http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
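
    As a small follow-up sketch, the same filter with the abbreviated 'E' pattern, reusing the testTable / created_utc names from the query above and assuming an active SparkSession named spark:

    # 'E' yields the abbreviated weekday name, e.g. 'Wed'
    spark.sql("""
        SELECT id, date_format(from_unixtime(created_utc), 'E') AS dow
        FROM testTable
        WHERE date_format(from_unixtime(created_utc), 'E') = 'Wed'
    """).show()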
