We could use some help on how to send Spark driver and worker logs to a destination outside Azure Databricks, e.g. Azure Blob Storage or Elasticsearch (via Elastic Beats).
When configuring a new cluster, the only option we get for the cluster log delivery destination is DBFS, see
https://docs.azuredatabricks.net/user-guide/clusters/log-delivery.html.
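For context, this is roughly all we can express today, whether through the UI or the Clusters API cluster_log_conf field: a DBFS path and nothing else. A minimal sketch (the dbfs:/cluster-logs path is just a placeholder, not our real path):

```python
import json

# Sketch of the cluster spec fragment we end up with today.
# Field names follow the Clusters API "cluster_log_conf" shape;
# the destination path below is only an example.
cluster_log_conf = {
    "cluster_log_conf": {
        "dbfs": {
            # DBFS is currently the only destination type offered for
            # Azure Databricks clusters -- no Blob Storage, no Elasticsearch.
            "destination": "dbfs:/cluster-logs"
        }
    }
}

print(json.dumps(cluster_log_conf, indent=2))
```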
Any input much appreciated, thanks!