How to execute a stored procedure in Azure Databricks PySpark?
I am able to execute a simple SQL statement using PySpark in Azure Databricks, but I want to execute a stored procedure instead. Below is the PySpark code I tried.

```python
# initialize pyspark
import findspark
findspark.init('C:\Spark\spark-2.4.5-bin-hadoop2.7')

# import required modules
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession
from pyspark.sql import *
import pandas as pd

# Create spark configuration object
conf = SparkConf()
conf.setMaster("local").setAppName("My
```
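For context, Spark's built-in JDBC data source can only read tables or SELECT queries, so a common workaround is to open a raw JDBC connection through the JVM gateway (py4j) and use JDBC's `{call ...}` escape syntax for stored procedures. Below is a hedged sketch of that approach; it has not been run against a live database, and names like `jdbc_url` and `dbo.usp_refresh_stats` are placeholders, not real endpoints. It assumes a SQL Server JDBC driver is on the cluster classpath, as it is on Azure Databricks.

```python
def callable_statement_sql(proc_name, n_params=0):
    """Build the JDBC escape syntax for invoking a stored procedure,
    e.g. "{call dbo.usp_refresh_stats(?, ?)}"."""
    placeholders = ", ".join(["?"] * n_params)
    if n_params:
        return f"{{call {proc_name}({placeholders})}}"
    return f"{{call {proc_name}}}"


def exec_stored_procedure(spark, jdbc_url, user, password, proc_name, params=()):
    """Execute a stored procedure over a raw JDBC connection obtained
    via the py4j gateway (sketch; parameter types are simplistically
    passed as strings)."""
    jvm = spark.sparkContext._gateway.jvm
    conn = jvm.java.sql.DriverManager.getConnection(jdbc_url, user, password)
    try:
        stmt = conn.prepareCall(callable_statement_sql(proc_name, len(params)))
        for i, value in enumerate(params, start=1):
            stmt.setString(i, str(value))  # bind each parameter by 1-based index
        stmt.execute()
    finally:
        conn.close()
```

A call would then look like `exec_stored_procedure(spark, jdbc_url, user, password, "dbo.usp_refresh_stats", ["2020-01-01"])`. This runs on the driver node, not as a distributed job, which is usually what you want for a side-effecting procedure.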