Is there a way to have the output from Dataproc Spark jobs sent to Google Cloud Logging? As explained in the Dataproc docs, the output from the job driver (the master for a Spark job) is available in the console under the job, but I would also like to have it collected in Cloud Logging.
You can use the Dataproc initialization action for Stackdriver for this:
# <cluster-name> and <bucket> are placeholders: substitute your own cluster
# name and the GCS bucket where you have staged the stackdriver.sh script.
gcloud dataproc clusters create <cluster-name> \
    --initialization-actions gs://<bucket>/stackdriver.sh \
    --scopes https://www.googleapis.com/auth/monitoring.write
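Once the cluster is up and the agent is running, you can check whether entries are actually arriving in Cloud Logging by querying for the Dataproc resource type. A minimal sketch, where the cluster name "my-cluster" and the exact filter fields are illustrative assumptions rather than something from the original setup:

# Read the 20 most recent log entries attributed to the Dataproc cluster
# "my-cluster" (placeholder name) from Cloud Logging.
gcloud logging read \
    'resource.type="cloud_dataproc_cluster" AND resource.labels.cluster_name="my-cluster"' \
    --limit 20

If entries show up here, the same logs should also be browsable in the Logs Explorer in the Cloud Console under the same resource type.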