How to check the Spark version

Asked by 时光取名叫无心 on 2020-12-29 17:29

As the title says, how do I find out which version of Spark is installed on CentOS?

The system currently has CDH 5.1.0 installed.

15 answers
  • 2020-12-29 18:05

    If you are on a Zeppelin notebook you can run:

    sc.version 
    

    To check the Scala version as well, you can run:

    util.Properties.versionString
    
  • 2020-12-29 18:06

    If you are using Databricks and talking to a notebook, just run:

    spark.version
    
  • 2020-12-29 18:08

    If you use spark-shell, the version appears in the banner at startup.

    Programmatically, SparkContext.version can be used.

  • 2020-12-29 18:11

    Use the following to get the Spark version:

    spark-submit --version
    
  • 2020-12-29 18:14

    In a Spark 2.x program or shell, use:

    spark.version
    

    where the spark variable is a SparkSession object.

    The version also shows in the console logs at the start of spark-shell:

    [root@bdhost001 ~]$ spark-shell
    Setting the default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
          /_/
    

    Without entering the shell at all:

    spark-shell --version

    [root@bdhost001 ~]$ spark-shell --version
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
          /_/
                            
    Type --help for more information.
    

    spark-submit --version

    [root@bdhost001 ~]$ spark-submit --version
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
          /_/
                            
    Type --help for more information.
    
  • 2020-12-29 18:14

    To print Spark's version on the shell, the following solution works:

    SPARK_VERSION=$(spark-shell --version &> tmp.data ; grep version tmp.data | head -1 | awk '{print $NF}';rm tmp.data)
    echo $SPARK_VERSION
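
    The temp file can also be avoided by piping the banner straight through grep and awk. A minimal sketch; the `banner` variable here is a hard-coded sample of the version line for illustration (in real use, pipe `spark-submit --version 2>&1` instead, since the banner goes to stderr):

    ```shell
    # Hard-coded sample of the version line from the spark-submit banner;
    # normally you would pipe `spark-submit --version 2>&1` here instead.
    banner='   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0'

    # Match "version X.Y.Z" and print the second field (the number itself).
    SPARK_VERSION=$(printf '%s\n' "$banner" | grep -o 'version [0-9.]*' | awk '{print $2}')
    echo "$SPARK_VERSION"   # → 2.2.0
    ```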
    