
I am trying to set up a Flume agent to source data from a syslog server. Basically, I have set up a syslog server on one machine (call it server1) to receive syslog events and forward all messages to a different server (server2), where the Flume agent is installed; from there, all data is finally sunk to a Kafka cluster.
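For context, the forwarding step on server1 is plain syslog forwarding. With rsyslog, for example, the rule looks along these lines in /etc/rsyslog.conf (a single @ forwards over UDP; that server1 runs rsyslog is an assumption here):

# forward all facilities and severities to server2 over UDP port 9090
*.* @server2:9090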

My Flume configuration is as follows:

# For each one of the sources, the type is defined
agent.sources.syslogSrc.type = syslogudp
agent.sources.syslogSrc.port = 9090
agent.sources.syslogSrc.host = server2

# The channel can be defined as follows.
agent.sources.syslogSrc.channels = memoryChannel

# Each channel's type is defined.
agent.channels.memoryChannel.type = memory

# Other config values specific to each type of channel (sink or source)
# can be defined as well
# In this case, it specifies the capacity of the memory channel
agent.channels.memoryChannel.capacity = 100


# config for kafka sink
agent.sinks.kafkaSink.channel = memoryChannel
agent.sinks.kafkaSink.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.kafkaSink.kafka.topic = flume
agent.sinks.kafkaSink.kafka.bootstrap.servers = <kafka.broker.list>:9092
agent.sinks.kafkaSink.kafka.flumeBatchSize = 20
agent.sinks.kafkaSink.kafka.producer.acks = 1
agent.sinks.kafkaSink.kafka.producer.linger.ms = 1
agent.sinks.kafkaSink.kafka.producer.compression.type = snappy
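For completeness, the agent is started with the standard launcher along these lines; the --name argument must match the property prefix used in the file (here, agent). The config directory and file name are illustrative:

flume-ng agent --conf ./conf --conf-file flume.conf --name agent -Dflume.root.logger=INFO,console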

But somehow the syslog data is not getting ingested into the Flume agent.

I would appreciate your advice.


1 Answer


I have set up a syslog server on one machine (call it server1)

The syslogudp source must bind to the server1 host:

agent.sources.syslogSrc.host = server1
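If you would rather not hard-code the hostname, binding the source to all interfaces should also work:

agent.sources.syslogSrc.host = 0.0.0.0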

then forward all messages to a different server (server2)

The "different server" refers to the sink:

agent.sinks.kafkaSink.kafka.bootstrap.servers = server2:9092
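Once the source binds correctly, you can smoke-test the whole path end to end: hand-craft an RFC 3164 syslog datagram at the source and watch the topic at the sink. A sketch, assuming nc is installed and the stock Kafka console consumer script is on the broker's PATH:

# send one syslog-formatted UDP message to the source
echo '<13>Oct 11 22:14:15 server1 test: hello flume' | nc -u -w1 server1 9090

# confirm it arrives on the Kafka topic
kafka-console-consumer.sh --bootstrap-server server2:9092 --topic flume --from-beginning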

A Flume agent is only a process that hosts these components (source, channel, sink) to facilitate the flow of events.
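Putting it together, a complete definition might look like the sketch below. Note the three declaration lines at the top (agent.sources, agent.channels, agent.sinks): Flume requires them to activate the components, and they are missing from the snippet in the question.

# declare the active components for the agent named "agent"
agent.sources = syslogSrc
agent.channels = memoryChannel
agent.sinks = kafkaSink

# syslog UDP source, bound on server1
agent.sources.syslogSrc.type = syslogudp
agent.sources.syslogSrc.host = server1
agent.sources.syslogSrc.port = 9090
agent.sources.syslogSrc.channels = memoryChannel

# in-memory channel
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 100

# Kafka sink pointing at the broker on server2
agent.sinks.kafkaSink.channel = memoryChannel
agent.sinks.kafkaSink.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.kafkaSink.kafka.topic = flume
agent.sinks.kafkaSink.kafka.bootstrap.servers = server2:9092
# (plus the batch-size and producer tuning lines from your original snippet)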