We've configured Azure Data Lake Store diagnostic logs to push data to Log Analytics. This data is later used for auditing purposes.
The issue we've encountered: our nightly backup process (which copies data from one Data Lake Store to another) generates gigabytes of practically useless log records. We can filter out these logs after they have been uploaded to Log Analytics (in queries), but the data still takes up space and costs a lot of money.
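For context, the post-ingestion filtering we do today looks roughly like the sketch below. The table and column names (`AzureDiagnostics`, `identity_s`) are assumptions based on how Azure Data Lake Store diagnostics typically land in our workspace; they may differ in other setups:

```
// Sketch of the query-time filtering we currently use.
// AzureDiagnostics / identity_s are assumed names from our environment.
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DATALAKESTORE"
| where identity_s != "BACKUP"   // drop records from the nightly backup identity
```

This works for reporting, but the BACKUP records are still ingested and billed, which is exactly what we'd like to avoid.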
Is there a way to filter out certain logs before they are uploaded to Log Analytics, so that this data never reaches Log Analytics at all? (In our case, these are the logs produced by the BACKUP user.) What other options could we use to get rid of the backup logs?