How to collect and visualize your logs with the ELK stack (Elasticsearch Logstash Kibana)

This page shows you how to use the ELK stack InstantApp on your C1 server. The ELK stack is an environment that lets you collect and visualize your logs with:

  • Elasticsearch for search and data analytics
  • Logstash for centralized logging, log enrichment and parsing
  • Kibana to visualize data

Requirements

  • You have an account and are logged into cloud.scaleway.com
  • You have configured your SSH Key

There are three steps to deploy the ELK stack InstantApp:

  • Create and start a new C1 server using the ELK stack InstantApp
  • Collect syslogs data with Logstash
  • Visualize your data with Kibana

Step 1 - Create and start a new C1 server using the ELK stack InstantApp

First, we need to create a new server using the ELK stack InstantApp. Click the “Create Server” button in the control panel.

Control Panel

You land on the server creation page, where you must enter your server's information and choose an image.

Create server basic information

After entering your server's basic information, select the ELK stack image for your server:
on the ImageHub tab, select ELK stack and click the “Create Server” button.

The server will be created with a ready-to-use installation of Elasticsearch, Logstash, and Kibana.

Step 2 - Collect Syslogs data with Logstash

In this tutorial, we will see how to collect syslog data and visualize it in Kibana.

Let’s start by creating a new configuration file to collect system logs. Create a new file at /etc/logstash/conf.d/logstash-syslog.conf and fill it with the following:

input {
  file {
    path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]
    type => "syslog"
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}

The configuration above tells Logstash to collect all files with the .log extension in /var/log, as well as /var/log/messages and /var/log/syslog, and to tag these events with the type syslog.

Next, we will add a filter (in the same file, between the input and output blocks) so that syslog lines are parsed into structured fields instead of being stored as a single raw message field, which simplifies analysis.

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
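To get a feel for what the grok pattern above extracts, here is a rough Python equivalent applied to a sample log line. The regex is a simplified stand-in for the real grok definitions (%{SYSLOGTIMESTAMP}, %{SYSLOGHOST}, and friends are more permissive), and the sample line is made up for illustration:

```python
import re
from datetime import datetime

# Simplified stand-in for the grok pattern used in the filter above.
SYSLOG_RE = re.compile(
    r"^(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[\w./-]+)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)$"
)

line = "Jun  9 10:15:32 scw-elk sshd[1234]: Accepted publickey for root"
fields = SYSLOG_RE.match(line).groupdict()
print(fields["syslog_program"])   # sshd
print(fields["syslog_message"])   # Accepted publickey for root

# The date filter then turns syslog_timestamp into a real timestamp,
# handling both the "MMM  d" and "MMM dd" day formats.
ts = datetime.strptime(fields["syslog_timestamp"], "%b %d %H:%M:%S")
print(ts.month, ts.day)           # 6 9
```

Each captured group becomes its own field on the event, which is what makes per-program or per-host filtering possible in Kibana later on.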

Restart Logstash to apply the changes:

service logstash restart

Step 3 - Visualize your data with Kibana

System logs are now collected and stored in Elasticsearch, and you can visualize them with Kibana. Open a browser and go to http://<your_server_public_ip>. You are asked for a login and password, which you can retrieve from the message of the day (MOTD) displayed when you connect to your server over SSH.

Welcome on ELK stack on Scaleway' C1.

 * Kernel:           GNU/Linux 3.2.34-30 armv7l - Marvell (Proprietary)
                     - This kernel has the best performances on this hardware
                     - For mainline kernel with latest features and plenty of modules, use a 3.17 kernel instead
 * Distribution:     ELK stack (2015-06-09) on Ubuntu 14.10
 * Internal ip:      10.1.35.26
 * External ip:      212.47.241.133
 * Disk /dev/nbd0:   scw-app-elk-latest-2015-06-09_18:11 (l_ssd 50G)
 * Uptime:           09:50:11 up 17:31,  0 users,  load average: 3.23, 3.15, 3.08

Links
 * Documentation:    https://scaleway.com/docs/how-to-use-the-elk-stack-instant-apps/
 * Community:        https://community.online.net/c/scaleway
 * Image source:     https://github.com/scaleway/image-app-elk
To access Kibana, open http://xxx.yyy.zzz.www/.
Login with user kibana and password -> ieshahchuemohfohxooshieshieshiojiepiengeng <-
You can hide this message on the next connection by deleting the /etc/update-motd.d/70-elk file.

You land on the Kibana homepage and are asked to configure an index pattern. Index patterns identify the Elasticsearch indices to run searches and analytics against.

Kibana

To create the first index, select @timestamp from the Time-field-name menu and click the Create button.

On the top navigation bar, click the Discover tab.

All collected logs are displayed here, along with a histogram representing log activity.

Kibana

Now it is your turn! Start playing with Kibana and create graphs and filters on your logs :)

Conclusion

The ELK stack lets you search and analyze your data with ease. From here, you can go deeper and create a more complex configuration. For instance, you can use logstash-forwarder, which lets you collect logs from remote servers and send them to Logstash.
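As a sketch of that setup (the hostname, port, and certificate paths below are illustrative, and the lumberjack input plugin must be available on the Logstash side), a minimal logstash-forwarder configuration on a remote server could look like this:

```json
{
  "network": {
    "servers": [ "elk.example.com:5043" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    { "paths": [ "/var/log/syslog", "/var/log/*.log" ],
      "fields": { "type": "syslog" } }
  ]
}
```

with a matching lumberjack input on the ELK server:

```
input {
  lumberjack {
    port => 5043
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
```

Since the forwarded events carry type => "syslog", the grok filter from Step 2 applies to them unchanged.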

If you have any suggestion or question on this documentation, please leave a comment.
