Filebeat
"Whether you're collecting from security devices, cloud, containers, hosts, or OT, Filebeat helps you keep the simple things simple by offering a lightweight way to forward and centralize logs and files." (Elasticsearch)
Reference:
How does Filebeat work?
How to install Filebeat?
From your Kibana console, go to Logging -> Apache logs, then download Filebeat and configure it.
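On Windows, installing Filebeat essentially means downloading the zip from elastic.co and extracting it. A minimal sketch, assuming the archive is extracted to C:\elk\filebeat (that path is my choice; the optional service-install script ships inside the Filebeat zip):
cd C:\elk\filebeat
PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1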
I have modified the Logstash config file so that the input comes from Filebeat, via the Beats input plugin:
cat apache-filebeat.conf
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => {
      "message" => "%{COMBINEDAPACHELOG}"
    }
  }
  mutate {
    convert => { "bytes" => "integer" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    locale => "en"
    remove_field => "timestamp"
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}
output {
  stdout {
    codec => dots
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
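Before starting the pipeline, you can optionally have Logstash validate the config syntax and exit; --config.test_and_exit is a standard Logstash flag:
C:\elk\logstash>bin\logstash.bat -f C:\elk\data\apache-filebeat.conf --config.test_and_exit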
Start the Logstash server from a new terminal and watch its logs to confirm it is listening for Beats connections:
C:\elk\logstash>bin\logstash.bat -f C:\elk\data\apache-filebeat.conf
Sending Logstash's logs to C:/elk/logstash/logs which is now configured via log4j2.properties
[2020-11-14T17:18:12,918][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-11-14T17:18:13,368][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2020-11-14T17:18:16,372][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2020-11-14T17:18:16,784][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-11-14T17:18:16,784][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2020-11-14T17:18:16,971][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-11-14T17:18:17,018][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2020-11-14T17:18:17,018][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2020-11-14T17:18:17,049][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-11-14T17:18:17,065][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2020-11-14T17:18:17,096][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2020-11-14T17:18:17,252][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"C:/elk/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2020-11-14T17:18:17,811][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-11-14T17:18:17,827][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7758f1a6 run>"}
[2020-11-14T17:18:17,936][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-14T17:18:17,943][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2020-11-14T17:18:18,208][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
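As a quick sanity check, you can confirm from another terminal that the Beats listener is actually bound to port 5044 (netstat ships with Windows; the port should show as LISTENING):
C:\>netstat -an | findstr 5044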
It's now time to modify the Filebeat configuration.
Config file: filebeat.yml
Change these settings in the config file:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\elk\data\logs\*
Comment out the Elasticsearch output section and un-comment the Logstash output:
#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["localhost:5044"]
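For reference, after both changes the relevant parts of filebeat.yml should look roughly like this (the commented-out Elasticsearch block is the one you disabled; the surrounding content varies by version):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\elk\data\logs\*
#output.elasticsearch:
#  hosts: ["localhost:9200"]
output.logstash:
  hosts: ["localhost:5044"]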
Save and quit, then start the Filebeat server.
Open a new terminal and execute the command below:
C:\elk\filebeat>filebeat.exe
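If events don't seem to flow, running Filebeat in the foreground with logging to stderr makes problems easier to spot; -e and -c are standard Filebeat flags:
C:\elk\filebeat>filebeat.exe -e -c filebeat.yml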
Go to the Kibana console and check under Management; you will find the indices created by Logstash.
All your data is now being shipped from the host by Filebeat to the Logstash server.
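You can also list the indices directly from Kibana's Dev Tools console (the _cat/indices API is standard); given the index pattern in the Logstash output above, expect entries named like filebeat-<version>-<yyyy.MM.dd>:
GET _cat/indices?v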
That's all for ELK.