Thursday 12 November 2020

Elasticsearch/Logstash/Kibana (ELK) - Part 1

In this article we will learn the basics of the ELK stack, starting with Elasticsearch and Kibana.

Installing Elasticsearch & Kibana:
As part of the prerequisite, ensure you have installed Java.

Download Elasticsearch and Kibana from the Elastic downloads page (https://www.elastic.co/downloads).

Unzip the downloaded files for both.
cd kibana*
vim config/kibana.yml
Search for elasticsearch.url in the config file and uncomment that line.
By default it points to Elasticsearch at http://localhost:9200.
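For reference, the uncommented line in config/kibana.yml looks like the sketch below (note: from Kibana 6.6 onward the setting is named elasticsearch.hosts rather than elasticsearch.url):

# config/kibana.yml - point Kibana at the local Elasticsearch node
elasticsearch.url: "http://localhost:9200"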

Running Elasticsearch and Kibana
First, always start Elasticsearch: bin/elasticsearch
Second, start Kibana: bin/kibana

Open your browser and point to these URLs:
Elasticsearch: http://localhost:9200
Kibana: http://localhost:5601
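You can also verify Elasticsearch from the command line by hitting the root endpoint; it replies with a small JSON document (a trimmed sketch; the node name and version number will differ on your setup):

curl http://localhost:9200

{
  "name" : "node-1",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "6.8.0" },
  "tagline" : "You Know, for Search"
}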

As practice working with ELK, we will populate data into Elasticsearch, retrieve data, and delete data.
In Elasticsearch, data is stored in something called an index.

We will take the example of an HR index: we will create an index called hr, store documents of an employees type, and create each employee with an id.
e.g.
<index>/<type>/<id>
/hr/employees/sunil

PUT /hr/employees/sunil
{
  "Name": "Sunil",
  "EmpID": "123"
}
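Run this in the Kibana Dev Tools console. Elasticsearch acknowledges the write with a response along these lines (a sketch; _version grows with every update):

{
  "_index": "hr",
  "_type": "employees",
  "_id": "sunil",
  "_version": 1,
  "result": "created"
}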


To check that a document exists without retrieving it, use HEAD; it returns only the success code of the API call (200 if the document exists, 404 otherwise).
HEAD /hr/employees/sunil

Retrieve data:
GET /hr/employees/sunil
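The document comes back wrapped in metadata, with the original JSON under _source (a sketch):

{
  "_index": "hr",
  "_type": "employees",
  "_id": "sunil",
  "_version": 1,
  "found": true,
  "_source": {
    "Name": "Sunil",
    "EmpID": "123"
  }
}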

Update data:
POST /hr/employees/sunil/_update
{
  "doc":{
    "Location": "Bengaluru"
  }
}


Whenever data is updated, Elasticsearch does not change the attributes in place; documents are immutable, so the _update call applies the change and reindexes the document itself as a new version.
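You can observe this by retrieving the document again: _version has incremented and the new field is merged into _source (a trimmed sketch):

GET /hr/employees/sunil

{
  "_id": "sunil",
  "_version": 2,
  "_source": {
    "Name": "Sunil",
    "EmpID": "123",
    "Location": "Bengaluru"
  }
}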

Delete data:
DELETE /hr/employees/sunil
The deletion applies only to the document addressed in the call; the index itself still remains.

To delete the entire index as well:
DELETE /hr

Index Components

GET /business
Since the business index does not exist yet, this returns an index_not_found error. Let's create a new index by writing a document into it:

PUT /business/building/200
{
  "address": "498 Dave Street In",
  "floors": 3,
  "offices": 5,
  "loc": {
    "latitude": 23.2332,
    "longitute": 34.23233
  }
}


GET /business
You would get the output below, whose main components are
aliases, mappings, settings.

So when we add more records with different fields, Elasticsearch updates the mappings section by itself; this is Elasticsearch's dynamic mapping.

{
  "business": {
    "aliases": {},
    "mappings": {},
    "settings": {}
  }
}
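In practice, once the building document is indexed the mappings section is no longer empty: dynamic mapping guesses a type for each field, roughly like this (a trimmed sketch; text fields also get a keyword sub-field, omitted here):

{
  "business": {
    "mappings": {
      "building": {
        "properties": {
          "address": { "type": "text" },
          "floors": { "type": "long" },
          "offices": { "type": "long" },
          "loc": {
            "properties": {
              "latitude": { "type": "float" },
              "longitude": { "type": "float" }
            }
          }
        }
      }
    }
  }
}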

PUT /business/building/201
{
  "address": "498 Dave Street In",
  "floors": 3,
  "offices": 5,
  "price": 5000000,
  "loc": {
    "latitude": 23.2332,
    "longitute": 34.23233
  }
}


Note: we can only have one type in an index.
e.g. PUT /business/employees/232 would give an error, as /business is already associated with the "building" type.
So you can create them this way instead, with new indexes:

PUT /employees/_doc/200
{
  "Name": "Sunil",
  "title": "Senior Engineer",
  "joining_data": "Jan 01 2020"
}

PUT /employees/_doc/201
{
  "Name": "Ram",
  "title": "Senior Tech Engineer",
  "joining_data": "Jul 01 2000"
}


PUT /contracts/_doc/220
{
  "Name": "System Admins",
  "start_date": "Jan 10 2015",
  "employees": [200, 201]
}
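Note that the employees array above is plain data; Elasticsearch does not resolve references across indexes for you. To fetch the referenced employee documents you query them yourself, for example with a multi-get (a sketch):

GET /employees/_mget
{
  "ids": ["200", "201"]
}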


Query data

GET business/building/_search
          or
GET business/_search


Search and return only the matching records:
GET business/_search
{
  "query": {
    "term": {
      "address": "498"
    }
  }
}
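A term query looks for the exact token "498" in the index. For analyzed full-text matching you would normally use a match query instead; as a small preview of the text-analysis topic (a sketch):

GET business/_search
{
  "query": {
    "match": {
      "address": "dave street"
    }
  }
}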


The actual request to Elasticsearch that the Kibana console sends under the hood looks like this:

curl -X GET "http://localhost:9200/business/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "term": {
      "address": "498"
    }
  }
}'
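Elasticsearch responds with a hits structure containing the matching documents (a trimmed sketch; on 7.x hits.total becomes an object):

{
  "took": 2,
  "hits": {
    "total": 2,
    "hits": [
      {
        "_index": "business",
        "_type": "building",
        "_id": "200",
        "_source": { "address": "498 Dave Street In", "floors": 3 }
      }
    ]
  }
}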


In the next parts we will look further at text analysis for indexing and searching.

