How to add Spark packages to a SparkR notebook on DSX?
Question:

The Spark documentation shows how a Spark package can be added when the session is initialised:

```r
sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
```

I believe this can only be used when initialising the session. How can we add Spark packages for SparkR in a notebook on DSX?

Answer 1:

Use the PixieDust package manager to install the avro package. Run the install from a Python notebook:

```python
import pixiedust

pixiedust.installPackage("com.databricks:spark-avro_2.11:3.0.0")
```

See http://datascience.ibm.com/docs/content/analyze-data/Package-Manager.html
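For reference, here is a minimal sketch of how the installed package could then be used from a SparkR notebook. It assumes the notebook kernel was restarted after the PixieDust install (so the JAR is on the Spark classpath), and the file path is hypothetical:

```r
library(SparkR)

# The session should pick up packages installed via PixieDust
# once the kernel has been restarted.
sparkR.session()

# Hypothetical path; point this at a real Avro file.
df <- read.df("users.avro", source = "com.databricks.spark.avro")
head(df)
```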