I would like to run spark-shell with an external package from behind a corporate proxy. Unfortunately, external packages passed via the --packages option are not resolved.
Found the correct settings:
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>" --packages <somePackage>
Both http and https proxies have to be set as extra driver options. JAVA_OPTS does not seem to do anything.
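For example, with a hypothetical proxy at proxy.example.com:8080 and spark-csv as the package (the host, port, and package coordinates are placeholders; substitute your own):
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080" --packages com.databricks:spark-csv_2.11:1.5.0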
Adding
spark.driver.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>
to $SPARK_HOME/conf/spark-defaults.conf works for me.
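Note that spark-defaults.conf also accepts whitespace between the key and the value, so an equivalent entry (the proxy host and port below are placeholders) would be:
spark.driver.extraJavaOptions -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080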
This worked for me in Spark 1.6.1:
bin\spark-shell --driver-java-options "-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>" --packages <package>
If you need authentication to use the proxy, you can add the following to the default conf file:
spark.driver.extraJavaOptions -Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttps.proxyHost= -Dhttps.proxyPort= -Dhttp.proxyUsername= -Dhttp.proxyPassword= -Dhttps.proxyUsername= -Dhttps.proxyPassword=
If the proxy is correctly configured on your OS, you can use the Java property java.net.useSystemProxies:
--conf "spark.driver.extraJavaOptions=-Djava.net.useSystemProxies=true"
so the proxy host/port and non-proxy hosts will be picked up from the system configuration.
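As a full command this composes with --packages as in the sketch below (the package coordinates are only an example):
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Djava.net.useSystemProxies=true" --packages com.databricks:spark-csv_2.11:1.5.0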
I was struggling with pyspark until I found this:
Adding on to @Tao Huang's answer:
bin/pyspark --driver-java-options="-Dhttp.proxyUser=user -Dhttp.proxyPassword=password -Dhttps.proxyUser=user -Dhttps.proxyPassword=password -Dhttp.proxyHost=proxy -Dhttp.proxyPort=port -Dhttps.proxyHost=proxy -Dhttps.proxyPort=port" --packages [groupId:artifactId]
I.e., it should be -Dhttp(s).proxyUser instead of ...proxyUsername.
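Putting it all together with hypothetical values (the host, port, credentials, and package coordinates below are placeholders to replace with your own):
bin/pyspark --driver-java-options="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080 -Dhttp.proxyUser=jdoe -Dhttp.proxyPassword=secret -Dhttps.proxyUser=jdoe -Dhttps.proxyPassword=secret" --packages com.databricks:spark-csv_2.11:1.5.0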