Installation of Logstash with Kibana
Install Logstash and start the service. You can test that it is working using the command below:
bin\logstash.bat -e "input { stdin { } } output { stdout { } }"
Whatever you type is taken as input by Logstash and echoed back as an event on stdout.
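For example, typing a line and pressing Enter prints it back as a structured event. The session below is only illustrative; the timestamp and host name will differ on your machine:

hello elk
{
      "@version" => "1",
    "@timestamp" => 2020-11-14T09:30:00.000Z,
          "host" => "mymachine",
       "message" => "hello elk"
}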
Next, I will take an Apache log file and create a Logstash config file that parses it and loads it into Elasticsearch as an index that Kibana can search. Inside your elk folder, create a data folder and save the config file below there as apache.conf.
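Assuming the paths used in this walkthrough, the folder layout looks like this (the folder names are my convention; adjust them to your setup):

C:\elk\
  logstash\            (the Logstash installation)
  data\
    apache.conf        (the config file below)
    logs\
      logs             (the extracted Apache log file)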
apache.conf:
input
{
  file {
    path => "C:/elk/data/logs/logs"
    type => "logs"
    start_position => "beginning"
  }
}
filter
{
  grok {
    match => {
      "message" => "%{COMBINEDAPACHELOG}"
    }
  }
  mutate {
    convert => { "bytes" => "integer" }
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => "en"
    remove_field => [ "timestamp" ]
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}
output
{
  stdout {
    codec => dots
  }
  elasticsearch {
  }
}
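A quick walkthrough of this pipeline: the file input reads the log from the beginning (note that the file input expects forward slashes even in Windows paths); grok parses each line against the COMBINEDAPACHELOG pattern into fields such as clientip, timestamp, request, response, and bytes; mutate converts bytes to an integer so it can be aggregated on; date promotes the parsed timestamp into @timestamp and drops the original field; geoip adds location fields derived from clientip; and useragent splits the raw agent string into browser and OS fields. As a rough illustration (the values here are made up), a combined log line such as

122.166.142.27 - - [01/Jun/2014:12:22:36 +0530] "GET /index.html HTTP/1.1" 200 3024 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64)"

would come out as an event with clientip => "122.166.142.27", response => "200", bytes => 3024, a geoip.location geo_point, and useragent.name / useragent.os fields.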
Download the sample Apache log (https://github.com/elastic/elk-index-size-tests/blob/master/logs.gz), extract it to the path configured above (C:\elk\data\logs\logs), and start the Logstash server with the config file. Logstash parses the log lines and writes them to Elasticsearch, since that is where the output is pointed; we can then see the same data from the Kibana console.
C:\elk\logstash>bin\logstash.bat -f C:\elk\data\apache.conf
Sending Logstash's logs to C:/elk/logstash/logs which is now configured via log4j2.properties
[2020-11-14T09:41:21,710][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"C:/elk/logstash/data/queue"}
[2020-11-14T09:41:21,722][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"C:/elk/logstash/data/dead_letter_queue"}
[2020-11-14T09:41:21,880][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-11-14T09:41:21,927][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"ec39c09a-712e-4d86-a9d8-ab629546e04f", :path=>"C:/elk/logstash/data/uuid"}
[2020-11-14T09:41:22,721][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2020-11-14T09:41:27,029][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2020-11-14T09:41:27,751][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2020-11-14T09:41:27,767][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2020-11-14T09:41:28,095][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2020-11-14T09:41:28,189][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2020-11-14T09:41:28,189][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2020-11-14T09:41:28,236][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1"]}
[2020-11-14T09:41:28,251][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2020-11-14T09:41:28,298][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2020-11-14T09:41:28,392][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2020-11-14T09:41:28,801][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"C:/elk/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2020-11-14T09:41:30,949][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x58a3a982 run>"}
[2020-11-14T09:41:31,074][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-14T09:41:31,851][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
.....................................................................................................
Each dot is one event processed (that is the dots codec from the stdout output). Once the dots stop appearing, the whole file has been indexed and you can close this window. You can confirm in the Kibana console that all the indexes from Logstash are present.
[Ctrl-C]
..................................................................[2020-11-14T10:46:36,690][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2020-11-14T10:46:38,334][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x58a3a982 run>"}
Terminate batch job (Y/N)? y
C:\elk\logstash>
You can also check from the Elasticsearch URL.
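For example, assuming Elasticsearch is running on the default localhost:9200 (the index name follows the default logstash-YYYY.MM.dd pattern used by the elasticsearch output):

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logstash-*/_count?pretty"

The first command lists the daily logstash-* indices; the second returns the total document count, which should match the number of lines in the log file.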
Now log in to Kibana, go to Discover, and create an index pattern (logstash-*) using @timestamp as the time field. Select the time range from 01-June-2014 to 20-July-2014 and you will see the diagram below. Once the data is loaded, play around with it, creating your own visuals and presenting them in the Dashboard.
We can create metrics from various fields and add them to a single dashboard. I created pie charts, bar charts, and geo-location maps in the dashboard showing the total requests during the selected time range. We could load more data and create more dashboards to provide further insights.
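As a sketch of the kind of question you can also answer straight from Elasticsearch (the query is illustrative; response.keyword relies on the default dynamic mapping installed by the template shown in the startup log above):

curl -H "Content-Type: application/json" "http://localhost:9200/logstash-*/_search?size=0&pretty" -d "{ \"aggs\": { \"status_codes\": { \"terms\": { \"field\": \"response.keyword\" } } } }"

This returns the request count per HTTP status code, the same breakdown behind the pie chart.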