
I have a number of Splunk alerts that fire if there are no results. However, occasionally our Splunk search goes down for a minute, causing all of these alerts to fire.

I am not in charge of the Splunk server, so I cannot do anything to improve its ability to keep search up. I can, however, modify the queries of the alerts. I am not looking for suggestions on how to keep the search up; that is out of my hands.

Is there a way I can modify a search to check that search itself is working? For example, a way to always return at least one result from the search, so that I can modify my alerts to fire when there is exactly one result rather than zero.

So the pseudo-query would be something like:

query which always returns exactly 1 result if search is working | append the results of the original Splunk query that expects > 0 results


1 Answer


Using makeresults may be enough to generate the event you are after.

| makeresults | eval msg="makeresults generates a single event by default" | append [ your other search ]
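A minimal sketch of how this could be wired into an existing alert (the index and sourcetype below are placeholders, not from the original question):

| makeresults | eval heartbeat="search head is up" | append [ search index=your_index sourcetype=your_sourcetype ]

With the heartbeat in place, change the alert's trigger condition from "number of results is equal to 0" to "number of results is equal to 1": exactly one result means only the makeresults row came back (the real search found nothing), while more than one result means the original search returned data.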

However, is your use case to search the last few minutes of data, and is that why your search returns 0 results? Best practice is to set the latest time to a few minutes ago, to avoid any delays due to log transmission or outages. For example, search with earliest=-15m latest=-5m, and run this search every 10 minutes.
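Putting both ideas together, the full alert search might look like this sketch (again, the index and sourcetype are placeholders):

| makeresults | eval heartbeat="search head is up" | append [ search index=your_index sourcetype=your_sourcetype earliest=-15m latest=-5m ]

Scheduling it on a 10-minute cron (*/10 * * * *) means consecutive runs cover contiguous 10-minute windows, so no events are skipped by the delayed latest time.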