
I've recently configured a standalone environment to host my Elastic Stack, as described here:

https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elastic-stack-on-ubuntu-18-04

Overview

The setup is as follows:

Nginx (:80) < Kibana (:5601) < Elasticsearch (:9200) < Logstash

So to access my logs I simply go to <machine-ip>:80 in the browser and log in using the Kibana credentials I set up in the guide.
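
For reference, the proxy in front of Kibana is roughly the block from that guide (a sketch only; the server_name and htpasswd path will be whatever you chose during setup):

server {
    listen 80;

    server_name your_domain;

    # Basic-auth credentials created during the guide
    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    # Everything is proxied through to Kibana on 5601
    location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}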

My logging server is set up correctly to use Filebeat to send system logs to Logstash etc. What I'm not sure about is the correct way to replicate this behaviour on a remote machine.
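
The Filebeat side of that is essentially the stock config pointing at the local Logstash (trimmed down; the guide enables the system module with the command filebeat modules enable system rather than listing log paths by hand):

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.logstash:
  hosts: ["localhost:5044"]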

Question

I would now like to post logs to my log server from another machine, but I'm a little unsure of the best way to approach this. Here is my understanding:

1) Install Logstash + Filebeat on the machine I want to send logs from

2) Read STDOUT from the Docker container(s) using Filebeat + format it in Logstash

3) Send the Logstash output to my logging server

Now the final point is the part I'm not sure about (or maybe the other parts aren't the best way to do it either!).
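
To make points 1 and 2 concrete, I'd expect the Filebeat side on the remote machine to look something like the sketch below (type: container is the newer input for Docker json-file logs; older Filebeat versions use type: docker instead, and 5044 assumes the same Beats port the guide uses):

filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

# Local Logstash from point 1; where its output should then go is what the questions below are about
output.logstash:
  hosts: ["localhost:5044"]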

My questions are

Q1) Where should I post my logs to? Should I be hitting <machine-ip>:80 and talking through Kibana, or should I open port 9200 and talk to Elasticsearch directly? (And if so, how should I authenticate that communication, the way Kibana is protected with credentials?)

Q2) What are the best practices for logging from a Docker container (Node.js in my case)? Should I set it up like points 1 + 2 above, running Logstash / Filebeat on that machine, or is there a better way?

Any help is much appreciated!

Edit: Solution for Q1

I've come up with a solution to Q1 for anyone looking at this in the future:

1) Set up an Nginx proxy listening on port 8080 on the Elastic Stack logging server. Only traffic coming from my application servers is allowed to talk to it.

2) Forward that traffic to the Elasticsearch instance running on port 9200.

The Nginx config looks like this:

server {
    listen 8080;

    # Only the application server(s) may connect; everything else is rejected
    allow xxx.xxx.xxx.xx;
    deny all;

    # Pass everything through to the local Elasticsearch HTTP API
    location / {
      proxy_pass http://localhost:9200;
    }
}
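
With that in place, the Logstash instance on the application server just points its Elasticsearch output at the proxy port instead of at 9200 directly. Something along these lines (the host is the logging server's address and the index name is only an example):

output {
  elasticsearch {
    hosts => ["http://<logging-server-ip>:8080"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}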

Filebeat forwards logs to Logstash; you filter the data, convert it to JSON format and send it to Elasticsearch; Elasticsearch exposes an address for Kibana to listen to; configure the Elasticsearch index and address in Kibana. Look at the docs elastic.co/guide/index.html and this guide is good as well: logz.io/learn/complete-guide-elk-stack/#elasticsearch - Krishna
@Krishna Thanks for the response. I understand how it works, as I already collect system logs through Filebeat + Logstash and send them on to Elasticsearch on my logging server. What I'm unsure about is the secure way to send these from another server to this one. I've edited the question a little to add this part: "My logging server is set up correctly to use Filebeat to send system logs to Logstash etc. What I'm not sure about is the correct way to replicate this behaviour on a remote machine" - Phil bloggs
You can configure the Kibana config file (kibana.yml) to enable credentials / SSL certificate verification for Elasticsearch. elastic.co/guide/en/kibana/6.3/using-kibana-with-security.html - Krishna
Thanks for the link. This is still a little unclear to me though. The link you have provided discusses the setup of X-Pack, which is a paid feature: elastic.co/guide/en/elasticsearch/reference/6.3/…. I've done some more research and was considering just adding my application server's IP to the network.host setting in elasticsearch.yml and opening up port 9200 on the logging server so it can communicate. Thoughts on this? - Phil bloggs
Yes, that would work. Although since your stack is not the paid version, you might have to figure out another way to secure the Elasticsearch endpoint. For starters you can change the default port to something else. I found this article on securing the cluster: dzone.com/articles/securing-your-elasticsearch-cluster-properly - Krishna

1 Answer


If you want to try it, I created this npm package for sending data to a UDP Logstash endpoint: https://www.npmjs.com/package/winston-transport-udp-logstash
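
Under the hood a transport like this just serializes the log record and fires it at a Logstash udp input. As a rough standalone illustration using Node's built-in dgram module (the port, host and payload shape here are only examples; see the package README for the actual transport options):

import { createSocket } from 'dgram';

// Assumes the logging server has a Logstash input like: udp { port => 5000 codec => json }
const socket = createSocket('udp4');
const event = Buffer.from(JSON.stringify({ level: 'info', message: 'hello from the app' }));

socket.send(event, 5000, 'logging-server-ip', (err) => {
  if (err) console.error('failed to ship log event', err);
  socket.close();
});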