I'm set up as follows while we're testing out the ESK stack and learning how all this works:
Client server (CS) running filebeat sends output to logstash running on site aggregator node (AN).
AN runs logstash with some filters and then forwards to rabbitmq on our Elasticsearch/Kibana node (ESK).
ESK runs rabbitmq; logstash pulls messages from rabbitmq and sends the output to elasticsearch (without any filtering). Kibana is our visualization layer (obviously), and since we're all pretty new to elasticsearch we're not doing much with it directly.
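For context, the logstash instance on ESK is shaped roughly like this -- a sketch with placeholder host/queue names, not our exact config (the real configs are in the gist linked below):

```
input {
  rabbitmq {
    host    => "localhost"   # placeholder
    queue   => "logstash"    # placeholder queue name
    durable => true
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # this is the logstash.stdout echo I mention below
  stdout { codec => rubydebug }
}
```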
Here's the problem:
CS generates a message. It definitely gets sent to AN, where logstash filters it and forwards it on (after echoing it to logstash.stdout). The logstash instance on ESK also sees it (and writes it to logstash.stdout). I can see the messages in both logstash instances; they match and are appropriately tagged. But they aren't visible in Kibana.
Our configs and a sample message from both logs are all in gist form here: https://gist.github.com/wortmanb/ebd37b8bea278d06ffa058c1513ef940
Where could these messages be going? They're not showing up in Kibana -- if I filter on messages with tags: "puppet", I get basically nothing during timeframes when I know these messages are flowing.
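For reference, the Kibana search I'm running is essentially `tags: "puppet"`, which as far as I understand corresponds to a query_string query against Elasticsearch. Something like this sketch (the `logstash-*` index pattern and localhost URL are assumptions, not necessarily our setup) is what I'd use to check whether the documents are landing in the index at all, bypassing Kibana:

```python
import json

# Equivalent of the Kibana search-bar filter `tags: "puppet"`,
# expressed as an Elasticsearch query_string query.
query = {
    "query": {
        "query_string": {
            "query": 'tags:"puppet"'
        }
    }
}

body = json.dumps(query)
print(body)

# Hypothetical check, run directly against Elasticsearch:
#   curl -s 'localhost:9200/logstash-*/_search?size=0' \
#        -H 'Content-Type: application/json' -d "$body"
# A nonzero hits.total here, with nothing in Kibana, would point at the
# Kibana side (time filter / index pattern) rather than the pipeline.
```

If that query returns hits, the data is making it into elasticsearch and the problem is on the Kibana side; if not, it's somewhere between the ESK logstash instance and elasticsearch.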
Any debugging suggestions?