How to call an Oracle stored procedure in Spark?
Question

In my Spark project I am using spark-sql-2.4.1v. As part of my code I need to call Oracle stored procedures from my Spark job. How can I call Oracle stored procedures?

Answer 1:

You can try doing something like this, though I have never tried it personally in any implementation:

```
query = "exec SP_NAME"

empDF = spark.read \
    .format("jdbc") \
    .option("url", "jdbc:oracle:thin:username/password@//hostname:portnumber/SID") \
    .option("dbtable", query) \
    .option("user", "db_user_name") \
    .option("password", "db_password") \
    .load()
```
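For completeness, and as a caveat not stated in the answer above: Spark's JDBC reader treats the `dbtable` value as a table or subquery to select from, and `exec SP_NAME` is SQL Server / SQL*Plus style rather than something an Oracle JDBC connection understands, so the snippet above may fail against Oracle. A common alternative is to open a plain JDBC connection on the driver and invoke the procedure with a `CallableStatement`. The sketch below shows that idea from PySpark via py4j; it assumes the Oracle JDBC driver (an ojdbc jar) is on the driver classpath, and the URL, credentials, and procedure name MY_PROC are placeholders.

```
# Sketch: call an Oracle stored procedure from the Spark driver over plain JDBC.
# Assumes the ojdbc jar is available (e.g. via --jars or spark.driver.extraClassPath);
# jdbc_url, credentials, and MY_PROC are placeholders for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("call-oracle-proc").getOrCreate()

jdbc_url = "jdbc:oracle:thin:@//hostname:portnumber/SID"

# Reach into the JVM through py4j and use java.sql directly on the driver.
jvm = spark.sparkContext._gateway.jvm
conn = jvm.java.sql.DriverManager.getConnection(jdbc_url, "db_user_name", "db_password")
try:
    # "{call MY_PROC}" is the standard JDBC escape syntax for stored procedures;
    # use e.g. "{call MY_PROC(?, ?)}" and setXxx(...) if the procedure takes arguments.
    stmt = conn.prepareCall("{call MY_PROC}")
    stmt.execute()
    stmt.close()
finally:
    conn.close()
```

Note that this runs the procedure once on the driver, not per partition; if the procedure must run for each row or partition of a DataFrame, the same JDBC call pattern would have to be moved into a `foreachPartition` function on the executors.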