Tuesday, July 22, 2014

ElasticSearch, LogStash, and Kibana - Beginners guide | LogStash

Part 3 of ElasticSearch, LogStash, and Kibana - Beginners guide

Logstash. Now we're getting to the good stuff. In my setup, Logstash reads data from Redis and sends it directly to Elasticsearch. A Logstash config has three main parts: input, filter, and output. The input tells Logstash where to get data from, the filter tells it what to do with that data, and the output tells it where to send the result.
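As a bare skeleton, a config file has this shape (with the trivial stdin/stdout plugins standing in for the real ones we'll use below):

```conf
input {
  # where events come from
  stdin { }
}

filter {
  # how events get parsed and transformed
}

output {
  # where events are sent
  stdout { }
}
```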


Installing:

Installing is not as straightforward as the other pieces have been so far, but these steps will get it done:

  1. wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | apt-key add -
  2. sudo sh -c "echo 'deb http://packages.elasticsearch.org/logstash/1.4/debian stable main' > /etc/apt/sources.list.d/logstash.list"
  3. apt-get update
  4. apt-get install logstash

To make it easier, run the commands as root and paste them in one at a time.





Configuring:

Create your config file as /etc/logstash/conf.d/central.conf. Here again is what I use.



input {
  # The input section: where to get data from
  redis { # Using Redis
    # IP of the Redis server
    host => "10.100.10.14"
    type => "log_line" # This was specified in Beaver; keep it consistent
    data_type => "list"
    key => "logstash-data" # This is the Redis key (namespace) to look at
  }
}
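To sanity-check the Redis side of the pipeline, you can hand-build an event in the shape Beaver pushes onto the list. This sketch only constructs the JSON payload (the field names mirror the Beaver/Logstash JSON format; the sample values are made up):

```python
import json

# Hypothetical event in the shape Beaver pushes onto the Redis list.
# "type" must match what the redis input above expects: "log_line".
event = {
    "@timestamp": "2014-07-22T10:15:42.000Z",
    "type": "log_line",
    "message": "a raw log line goes here",
}
payload = json.dumps(event)
print(payload)
```

You could then push it onto the list with `redis-cli RPUSH logstash-data '<payload>'` and watch it flow through to Elasticsearch.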



filter {
  if [type] == "log_line" { # Since this is all I import, everything should be type log_line

    # Grok is awesome but can drive you mad.
    # This is my grok pattern for a custom, tab-delimited log file. Lots of regex.
    # The main thing to know is that an almighty GROK DEBUGGER exists!
    grok {
      match => [ "message", "(?<timestamp>[0-9]+/[0-9]+/[0-9]+ [0-9]+:[0-9]+:[0-9]+ [AMP]+)\t(?<msg_ver>|[^\t]+)\t(?<serial>|[^\t]+)\t(?<sip>|[^\t]+)\t(?<serv>|[Server0-9\.]+)\t(?<count>|[^\t]+)\t(?<addr>|[^\t]+)\t(?<o_addr>|[^\t]+)\t(?<sub>|[^\t]+)\t(?<hits>|[^\t]+)\t(?<path>|[^\t]+)\t(?<dom>[^\t]+)" ]
      add_tag => [ "es1_grok" ] # Tag it as grokked
    }

    date { # This tells Logstash how to interpret the given timestamp
      # If there can be multiple formats, you can comma-separate them
      match => [ "timestamp", "M/d/YYYY hh:mm:ss aa", "M/dd/YYYY hh:mm:ss aa", "MM/d/YYYY hh:mm:ss aa", "MM/dd/YYYY hh:mm:ss aa" ]
      add_tag => [ "dated" ]
    }
  }
}
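Grok's `(?<name>pattern)` captures are just named regex groups. As a rough sanity check outside of Logstash, here is the same idea in Python (which spells it `(?P<name>...)`) against a made-up, simplified tab-delimited line, plus the strptime equivalent of the `M/d/YYYY hh:mm:ss aa` Joda pattern. The simplified pattern and sample data are mine, not the real log format:

```python
import re
from datetime import datetime

# Simplified stand-in for the grok pattern above: a timestamp plus two
# tab-delimited fields, using Python's (?P<name>...) named groups.
pattern = re.compile(
    r"(?P<timestamp>\d+/\d+/\d+ \d+:\d+:\d+ [AP]M)\t"
    r"(?P<msg_ver>[^\t]*)\t"
    r"(?P<serial>[^\t]*)"
)

sample = "7/22/2014 1:05:09 PM\t2.1\tSN-0042"  # hypothetical log line
fields = pattern.match(sample).groupdict()
print(fields["timestamp"])  # 7/22/2014 1:05:09 PM

# The date filter's "M/d/YYYY hh:mm:ss aa" Joda pattern corresponds
# roughly to this strptime format string:
ts = datetime.strptime(fields["timestamp"], "%m/%d/%Y %I:%M:%S %p")
print(ts.isoformat())  # 2014-07-22T13:05:09
```

When a line doesn't match, grok tags the event `_grokparsefailure` instead, which is the first thing to look for when debugging.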



 

output {
  # Output to Elasticsearch
  elasticsearch {
    # IP of the Elasticsearch server, can be local or remote
    host => "10.60.0.82"
    # Cluster name, specified later in the Elasticsearch config
    cluster => "es-cluster"
    # Index name, optional. Default is "logstash-%{+YYYY.MM.dd}"
    index => "logstash-es-%{+YYYY.MM.dd}"
  }
}
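For reference, `%{+YYYY.MM.dd}` is a Joda date pattern that Logstash expands per event, so each day's data lands in its own index. In Python terms the daily index name works out like this (the example date is arbitrary; in practice it comes from each event's @timestamp):

```python
from datetime import date

# strftime equivalent of the Joda "YYYY.MM.dd" in the index name
d = date(2014, 7, 22)
index = "logstash-es-" + d.strftime("%Y.%m.%d")
print(index)  # logstash-es-2014.07.22
```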




Use it like any other service: service logstash [start,stop,restart,status]
