I have a number of Splunk alerts that fire if a search returns no results. However, occasionally our Splunk search goes down for a minute, causing all of these alerts to fire at once.
I am not in charge of the Splunk server, so I cannot do anything to improve its ability to stay up. I can, however, modify the queries behind the alerts. I am not looking for suggestions on how to keep the search up; that is out of my hands.
Is there a way I can modify a search to check that search itself is working? For example, a way to always return at least 1 result whenever search is functional, so that I can change my alerts to fire when there is exactly 1 result rather than 0.
So the pseudo-query would be something like: a query which always returns exactly 1 result if search is working | append the results of my original Splunk query, which expects > 0 results.
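For what it's worth, here is a rough sketch of the shape I have in mind, assuming I have read access to the `_internal` index to use as an always-populated "canary" source (any index that reliably has recent events would work just as well, and the `row_type` field name is just something I made up):

```
index=_internal sourcetype=splunkd earliest=-5m
| head 1
| eval row_type="canary"    ``` one row that should only exist if search is working ```
| append
    [ search <my original alert query>
    | eval row_type="real" ]
```

The idea being that the alert trigger condition changes from "number of results is 0" to "number of results is exactly 1": if only the canary row comes back, the real query genuinely returned nothing and the alert should fire; if 0 rows come back, the search itself is broken and the alert should stay quiet (or trigger a separate "search is down" alert instead). I'm not sure whether `append` and its subsearch limits behave sensibly here when search is flaky, which is essentially what I'm asking.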