ELK + REDIS - Duplicating data

Hey folks,

I need some help...

I have the following Logstash configuration files:

agent conf.

input {
  # Receive log4j events over TCP on port 25827
  log4j {
    type => "bdj"
    port => 25827
  }
}

filter {
  # Parse the JSON payload carried in the "message" field
  json {
    source => "message"
  }
}

output {
  # Print each event to the console for debugging
  stdout {
    codec => rubydebug
  }
  # Push events onto the "logstash" list in the local Redis instance
  redis {
    host => "127.0.0.1"
    data_type => "list"
    key => "logstash"
  }
}

This agent conf receives the logs via TCP (through the log4j input) and forwards them to Redis.

central conf.

input {
  # Pull events off the "logstash" list in Redis
  redis {
    host => "localhost"
    type => "redis-input"
    data_type => "list"
    key => "logstash"
  }
}

filter {
  # Parse the JSON payload carried in the "message" field
  json {
    source => "message"
  }
}

output {
  stdout { }
  # Index events into Elasticsearch, one index per day
  elasticsearch {
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

The central conf, in turn, pulls the data from Redis and forwards it to Elasticsearch.

The problem is that the data is being duplicated, as if it were in a loop.
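
To narrow down where the duplication starts, the Redis list can be inspected directly (assuming Redis is running locally on its default port):

redis-cli llen logstash          # how many events are currently queued
redis-cli lrange logstash 0 4    # peek at the first five queued events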

I am running Logstash as a service on Debian.

root@logs:~# uname -a
Linux logs 3.2.0-4-amd64 #1 SMP Debian 3.2.78-1 x86_64 GNU/Linux

Can anyone shed some light?

1 answer


Could the duplicate logs be the "ghost" log records seen with the SCALA server?

"An alternative solution to avoid duplicate log records after logstash restart is to set the sincedb_path parameter in the file plug-in as /dev/null, thus indicating for logstash, ignore tracking the last monitored position of the file and always start monitoring from the end of the file. However, this will cause logstash to ignore all updates made to the log file while the logstash agent is inactive."

Check out this help from IBM:

Duplicating log records on the SCALA server
