How to check the Spark version

时光取名叫无心 2020-12-29 17:29

As titled: how do I know which version of Spark is installed on CentOS?

The system currently has CDH 5.1.0 installed.

15 Answers
  •  一生所求
    2020-12-29 18:27

    If you are using pyspark, the Spark version being used is shown beside the Spark logo when the shell starts, as in the transcript below:

    manoj@hadoop-host:~$ pyspark
    Python 2.7.6 (default, Jun 22 2015, 17:58:13)
    [GCC 4.8.2] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 1.6.0
          /_/
    
    Using Python version 2.7.6 (default, Jun 22 2015 17:58:13)
    SparkContext available as sc, HiveContext available as sqlContext.
    >>>
    
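    If you just want the version without starting an interactive session, the launcher scripts also accept a --version flag and print the same banner. A minimal sketch; the prompt and host name are illustrative, and on a CDH install the script may live under the Cloudera parcel directory rather than on the default PATH:

    manoj@hadoop-host:~$ spark-submit --version

    spark-shell --version behaves the same way; both print the ASCII logo and version line shown above and then exit.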

    If you want to get the Spark version explicitly, you can use the version attribute of the SparkContext, as shown below:

    >>>
    >>> sc.version
    u'1.6.0'
    >>>
    
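    On Spark 2.x and later, the interactive shells also create a SparkSession named spark, which exposes the same attribute. A minimal sketch; the version string in the output is illustrative:

    >>> spark.version
    u'2.4.0'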
