Azure Databricks to Azure SQL DW: Long text columns
Question

I would like to populate an Azure SQL DW from an Azure Databricks notebook environment. I am using the built-in connector with pyspark:

    sdf.write \
        .format("com.databricks.spark.sqldw") \
        .option("forwardSparkAzureStorageCredentials", "true") \
        .option("dbTable", "test_table") \
        .option("url", url) \
        .option("tempDir", temp_dir) \
        .save()

This works fine, but it fails when I include a string column with sufficiently long content. I get the following error:

    Py4JJavaError: An error
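For context, here is a minimal, self-contained sketch of the same write path that reproduces the scenario with a deliberately long string value. The JDBC url, storage tempDir, and sample DataFrame are placeholder assumptions, not taken from the question; only the write options shown above come from the original snippet.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical connection settings -- replace with real values.
    url = (
        "jdbc:sqlserver://<server>.database.windows.net:1433;"
        "database=<db>;user=<user>;password=<password>"
    )
    temp_dir = "wasbs://<container>@<storage-account>.blob.core.windows.net/tmp"

    # DataFrame with one deliberately long string column to trigger the issue.
    sdf = spark.createDataFrame(
        [(1, "x" * 1000)],  # 1000 characters in the string column
        ["id", "long_text"],
    )

    (sdf.write
        .format("com.databricks.spark.sqldw")
        .option("forwardSparkAzureStorageCredentials", "true")
        .option("dbTable", "test_table")
        .option("url", url)
        .option("tempDir", temp_dir)
        # The connector also documents a maxStrLength option that controls the
        # NVARCHAR length used for string columns; it is not part of the
        # original snippet and is shown here only as a note.
        # .option("maxStrLength", 4000)
        .save())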