Hey everyone,
I need some help...
I have the following Logstash configuration files:
agent conf.
input {
  log4j {
    type => "bdj"
    port => 25827
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  redis {
    host => "127.0.0.1"
    data_type => "list"
    key => "logstash"
  }
}
This agent conf. receives the logs over TCP (log4j input) and forwards them to Redis.
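To narrow down where the duplication happens, one thing I can check (assuming Redis is the default local instance on 127.0.0.1:6379; adjust to your setup) is whether events already appear twice in the Redis queue itself, using `redis-cli`:

```shell
# Assumes a default local Redis (127.0.0.1:6379) -- hypothetical defaults.
# Count how many events are queued under the "logstash" list key:
redis-cli -h 127.0.0.1 LLEN logstash

# Peek at the first few queued events to see whether duplicates
# exist before the central config ever reads them:
redis-cli -h 127.0.0.1 LRANGE logstash 0 4
```

If the list already contains each event twice, the duplication is on the agent side; if not, it is happening between Redis and Elasticsearch.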
central conf.
input {
  redis {
    host => "localhost"
    type => "redis-input"
    data_type => "list"
    key => "logstash"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
The central conf., in turn, reads the data from Redis and forwards it to Elasticsearch.
The problem is that the data is being duplicated, as if it were in a loop.
I am running Logstash on Debian as a service.
root@logs:~# uname -a
Linux logs 3.2.0-4-amd64 #1 SMP Debian 3.2.78-1 x86_64 GNU/Linux
Can anyone shed some light on this?