ModuleNotFoundError: No module named 'pyspark.dbutils'

Question

I am running PySpark from an Azure Machine Learning notebook and am trying to move a file using the `dbutils` module:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def get_dbutils(spark):
    # On Databricks, construct DBUtils directly; otherwise fall back to
    # the `dbutils` object injected into the IPython user namespace.
    try:
        from pyspark.dbutils import DBUtils
        dbutils = DBUtils(spark)
    except ImportError:
        import IPython
        dbutils = IPython.get_ipython().user_ns["dbutils"]
    return dbutils

dbutils = get_dbutils(spark)
dbutils.fs.cp("file:source", "dbfs:destination")
```

I got this error:

```
ModuleNotFoundError: No module named 'pyspark.dbutils'
```
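For context, the failing import can be reproduced on its own. The sketch below is a minimal check (assuming a stock open-source `pyspark` install, since `pyspark.dbutils` ships only with the Databricks runtime) showing why the `try` branch cannot succeed on an Azure Machine Learning compute:

```python
import importlib.util

# On a plain PySpark install (e.g. an Azure Machine Learning compute),
# the pyspark package imports fine but has no dbutils submodule, so
# find_spec returns None and a direct import raises ModuleNotFoundError.
print(importlib.util.find_spec("pyspark.dbutils"))  # None outside Databricks

try:
    from pyspark.dbutils import DBUtils
except ModuleNotFoundError as exc:
    print(exc)  # No module named 'pyspark.dbutils'
```

Note that `ModuleNotFoundError` is a subclass of `ImportError`, so the `except ImportError` fallback in `get_dbutils` does run; presumably it also fails, because an Azure ML notebook does not inject a `dbutils` object into the IPython user namespace the way a Databricks notebook does.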