ELK stack - Filebeat is sending logs directly to Elasticsearch, not to Logstash


I have been facing this problem since I deleted the indices by executing the following command:

curl -XDELETE 'http://localhost:9200/*'
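Note that deleting every index with a wildcard can also remove the Filebeat index template from Elasticsearch. A possible fix, assuming Filebeat 1.x with the template file in its default Debian/Ubuntu location (adjust the path if your install differs), is to reload the template manually:

```shell
# Re-upload the Filebeat index template after wiping all indices.
# Path below is the Filebeat 1.x default on deb/rpm installs -- adjust if needed.
curl -XPUT 'http://localhost:9200/_template/filebeat' \
     -d@/etc/filebeat/filebeat.template.json
```

This only matters if `manage_template => false` is set in the Logstash output (as it is below), since then nothing else will recreate the template for you.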

filebeat.yml

    filebeat:
      prospectors:
        -
          paths:
            - /var/log/syslog
        - input_type : log
          document_type: syslog
      registry_file: /var/lib/filebeat/registry
    output:
      logstash:
        hosts: ["127.0.0.1:5044"]
        bulk_max_size: 1024
    shipper:
    logging:
      files:
        rotateeverybytes: 10485760 # = 10MB

and the Logstash config files. Input config:

    input {
      beats {
        port => 5044
      }
    }

and output config:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        sniffing => true
        manage_template => false
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
      }
    }

The problem: logs are not coming through Logstash; they appear to be arriving directly, because I cannot see the new fields added in Kibana (for example, in the case of the apache-access log, the `type` field value).
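One quick way to check whether events are actually reaching Elasticsearch under the index name Logstash would create (this assumes the default index pattern from the output config above):

```shell
# List all indices; events routed through the Logstash output above
# should show up under daily filebeat-YYYY.MM.dd indices.
curl 'http://localhost:9200/_cat/indices?v'
```

If no `filebeat-*` indices appear, the events are not making it through the Logstash pipeline at all.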

You may have a syntactical error in your Filebeat config. Try changing

 - input_type : log 

to

input_type : log 

That `-` is messing up the config by declaring a second prospector. If your Logstash processing is done by `type`, the improperly typed logs would still make it to Elasticsearch through Logstash, but with no parsing done.
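For reference, a sketch of how the prospector section would look with the stray `-` removed, so that `input_type` and `document_type` belong to the same prospector as `paths` (indentation based on the config shown in the question):

    filebeat:
      prospectors:
        -
          paths:
            - /var/log/syslog
          input_type: log
          document_type: syslog
      registry_file: /var/lib/filebeat/registry

With this layout all three keys sit under a single prospector entry, so every event from /var/log/syslog gets `document_type: syslog` and your type-based Logstash filters should apply.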

