
We've configured Azure Data Lake Store diagnostic logs to push data to Log Analytics. This data is later used for auditing purposes.
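For context, the diagnostic setting was created along these lines (a minimal sketch using the Python management SDK; the resource IDs, setting name, and log categories such as "Audit"/"Requests" are placeholders and assumptions, not our exact values):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

# Placeholder IDs -- substitute your own subscription, account and workspace.
subscription_id = "<subscription-id>"
adls_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.DataLakeStore/accounts/<adls-account>"
)
workspace_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# Route the account's diagnostic logs to the Log Analytics workspace.
client.diagnostic_settings.create_or_update(
    resource_uri=adls_account_id,
    name="send-to-log-analytics",
    parameters=DiagnosticSettingsResource(
        workspace_id=workspace_id,
        logs=[
            LogSettings(category="Audit", enabled=True),
            LogSettings(category="Requests", enabled=True),
        ],
    ),
)
```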

The issue we've encountered is that our nightly backup process (which copies data from one Data Lake Store to another) generates gigabytes of practically useless log records. We can filter out these logs after they are uploaded to Log Analytics (in queries), but the data still takes up space and costs a lot of money.

Is there a way to filter out certain logs before they are uploaded, so that this data never reaches Log Analytics? (In our case, these are the logs produced by the BACKUP user.) What other options could be used to get rid of the backup logs?
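Today the exclusion happens only at query time, roughly like this (a sketch using the azure-monitor-query package; the AzureDiagnostics table, the identity_s column, and the workspace GUID are assumptions about the workspace schema, not confirmed values):

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Post-ingestion filter: drop the backup account's requests in the query.
# Table and column names (AzureDiagnostics, identity_s) are assumptions --
# check them against your own workspace schema.
QUERY = """
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DATALAKESTORE"
| where Category == "Requests"
| where identity_s !contains "BACKUP"
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",  # placeholder
    query=QUERY,
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```

This works, but the backup traffic has already been ingested (and billed) by the time the query runs, which is exactly the cost we'd like to avoid.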


1 Answer


It is not possible to filter the logs in this manner before they are routed to Log Analytics.

In the future, you may be able to eliminate your backup process in favor of using a geo-redundant Azure Data Lake Store.