Question
We could use some help with sending Spark driver and worker logs to a destination outside Azure Databricks, e.g. Azure Blob Storage or Elasticsearch via Elastic Beats.
When configuring a new cluster, the only option for the log delivery destination is DBFS, see
https://docs.azuredatabricks.net/user-guide/clusters/log-delivery.html.
Any input much appreciated, thanks!
Answer 1:
Maybe the following could be helpful:
First, specify a DBFS location for your Spark driver and worker logs:
https://docs.databricks.com/user-guide/clusters/log-delivery.html
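As a sketch, the same log destination can also be set programmatically when creating a cluster via the Databricks Clusters API, using a `cluster_log_conf` block (the path and cluster name below are placeholders, not from the question):

```json
{
  "cluster_name": "my-logging-cluster",
  "spark_version": "5.2.x-scala2.11",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 2,
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-logs"
    }
  }
}
```

With this in place, driver and worker logs are delivered to `dbfs:/cluster-logs/<cluster-id>/` every few minutes.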
Then, create a mount point that links that DBFS folder to a Blob Storage container: https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html#mount-azure-blob-storage-containers-with-dbfs
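A minimal sketch of that mount step, runnable only inside a Databricks notebook (the storage account, container, secret scope, and mount point names are placeholders you would substitute):

```python
# Mount an Azure Blob Storage container at a DBFS path using dbutils
# (only available in the Databricks runtime). Placeholder names throughout.
storage_account = "mystorageaccount"   # your Azure storage account
container = "cluster-logs"             # target Blob container
mount_point = "/mnt/cluster-logs"      # DBFS path to mount it at

dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point=mount_point,
    extra_configs={
        # Fetch the account key from a Databricks secret scope rather
        # than hard-coding it in the notebook.
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-account-key")
    },
)
```

Note that a mount makes the Blob container visible at the DBFS path; to have the cluster logs land in Blob Storage, point the cluster's log delivery destination at a folder under the mount (e.g. `dbfs:/mnt/cluster-logs`).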
Hope this helps!
Source: https://stackoverflow.com/questions/54962314/how-to-re-direct-logs-from-azure-databricks-to-another-destination