How to re-direct logs from Azure Databricks to another destination?

Submitted by 本小妞迷上赌 on 2019-12-24 03:23:49

Question


We could use some help on how to send Spark driver and worker logs to a destination outside Azure Databricks, such as Azure Blob Storage or Elasticsearch using Elastic Beats.

When configuring a new cluster, the only option offered for the log delivery destination is DBFS, see

https://docs.azuredatabricks.net/user-guide/clusters/log-delivery.html.

Any input much appreciated, thanks!


Answer 1:


Maybe the following could be helpful:

First, you specify a DBFS location for your Spark driver and worker logs:
https://docs.databricks.com/user-guide/clusters/log-delivery.html
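
For example, here is a minimal sketch of doing that through the Clusters REST API rather than the UI. The workspace URL, token, node type, Spark version, and the dbfs:/mnt/cluster-logs path are placeholders you would adapt; the key part is the cluster_log_conf block.

```python
import requests

# Placeholders: fill in your workspace URL and a personal access token.
DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

cluster_spec = {
    "cluster_name": "logging-example",
    "spark_version": "5.3.x-scala2.11",   # pick a runtime available in your workspace
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    # Driver and worker logs get delivered to this DBFS path periodically.
    # Here it points under /mnt so it can be backed by the Blob mount created in the next step.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/mnt/cluster-logs"}
    },
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the new cluster_id
```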

Then, you create a mount point that links your DBFS folder to a Blob Storage container: https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html#mount-azure-blob-storage-containers-with-dbfs
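
A rough sketch of that mount step, run once from a notebook. The container name, storage account name, and secret scope/key names below are placeholders, and the storage account key is assumed to be stored in a Databricks secret scope.

```python
# Run once in a Databricks notebook; placeholders need to be replaced.
dbutils.fs.mount(
    source="wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
    mount_point="/mnt/cluster-logs",
    extra_configs={
        "fs.azure.account.key.<storage-account-name>.blob.core.windows.net":
            dbutils.secrets.get(scope="<scope-name>", key="<storage-key-name>")
    },
)
```

With the mount in place, anything the cluster writes under dbfs:/mnt/cluster-logs (including the delivered driver and worker logs) ends up in the Blob Storage container, where other tools can pick it up.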

Hope this helps!



Source: https://stackoverflow.com/questions/54962314/how-to-re-direct-logs-from-azure-databricks-to-another-destination
