I have a Spark application that I am submitting to the Bluemix Spark Cluster. It reads from a DASHDB database and writes the results to Cloudant. The code accesses the DAS
I think the JDBC driver will always need a username and password to connect to the database, so connecting without credentials is out of the question, and you are in a multi-tenant environment on Bluemix.
As for having spark-submit.sh read the arguments securely, that option is not available yet.
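For context, here is a minimal sketch of the kind of dashDB JDBC read that always requires explicit user and password options; the hostname, table name, and credential placeholders are mine, not from the question:

```scala
import org.apache.spark.sql.SparkSession

object DashdbRead {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dashdb-to-cloudant").getOrCreate()

    // Placeholder credentials -- the DB2/dashDB JDBC driver will not connect without them.
    val dashdbUser = "<username>"
    val dashdbPassword = "<password>"

    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:db2://<dashdb-host>:50000/BLUDB")
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .option("dbtable", "MYSCHEMA.MYTABLE")
      .option("user", dashdbUser)
      .option("password", dashdbPassword)
      .load()

    df.show(5)
    spark.stop()
  }
}
```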
Thanks, Charles.
Based on the answer here, my preference would be to pass a properties file that holds the credentials. Other tenants will not be able to read the properties file, but you will be able to read it from your Spark application, e.g. as a DataFrame from which you can access the parameters.
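A minimal sketch of that approach, assuming the credentials live in a simple key=value text file shipped alongside the application (the file name and property keys below are placeholders I chose for illustration):

```scala
import org.apache.spark.sql.SparkSession

object SecureCredentials {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("read-credentials").getOrCreate()

    // Assumption: credentials.properties contains lines like dashdb.user=...
    // Read it through Spark, then collect the key=value pairs into a Map.
    val props: Map[String, String] = spark.read
      .textFile("credentials.properties")
      .collect()
      .filter(line => line.contains("=") && !line.trim.startsWith("#"))
      .map { line =>
        val Array(k, v) = line.split("=", 2)
        k.trim -> v.trim
      }
      .toMap

    // Use the parsed credentials for the dashDB JDBC connection.
    val df = spark.read
      .format("jdbc")
      .option("url", props("dashdb.url"))
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .option("dbtable", props("dashdb.table"))
      .option("user", props("dashdb.user"))
      .option("password", props("dashdb.password"))
      .load()

    df.show(5)
    spark.stop()
  }
}
```

The same Map can then supply the Cloudant credentials when writing out the results, so no secret ever appears on the spark-submit.sh command line.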