How to use the ELK-Stack (Elastic Stack) InstantApp

This page shows you how to use the ELK stack InstantApp on your Scaleway instance. The ELK stack is an environment that lets you collect and visualize your logs with:

  • Elasticsearch for search and data analytics
  • Logstash for centralized logging, log enrichment and parsing
  • Kibana to visualize data


There are three steps to deploy the ELK stack InstantApp:

  • Create and start a new instance using the ELK stack InstantApp
  • Collect syslog data with Logstash
  • Visualize your data with Kibana

Create and start a new instance using the ELK stack InstantApp

To get started, click the “Create a Server” button in the control panel.

Control Panel

You will land on the server-creation page where you can choose the ELK Stack image in the InstantApps tab for your Cloud Instance:

Control Panel

Choose the server type and click the Create a Server button. The server is created with a ready-to-use installation of Elasticsearch, Kibana, and Logstash.

Collect syslog data with Logstash

In this tutorial we will see how to collect syslog data with Logstash and visualize it in Kibana.

Let’s start by creating a new configuration file to collect system logs. Create the file /etc/logstash/conf.d/logstash-syslog.conf and add the following input and output sections:

input {
  file {
    path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]
    type => "syslog"
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

The input section tells Logstash to watch every file with a .log extension in /var/log, plus /var/log/messages and /var/log/syslog, and to tag each event with the type syslog. The output section sends events to the local Elasticsearch instance and also prints them to stdout for debugging.
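If you are unsure which files those patterns pick up, the matching rule can be sketched in Python (the file names below are made up for illustration; Logstash's file input uses similar glob semantics, where * does not cross directory boundaries):

```python
from pathlib import PurePosixPath

# The path patterns from the Logstash input section above
patterns = ["/var/log/*.log", "/var/log/messages", "/var/log/syslog"]

def is_collected(path: str) -> bool:
    """Return True if a file path matches one of the input patterns.
    PurePosixPath.match, like Logstash's glob, keeps * within one directory."""
    return any(PurePosixPath(path).match(pat) for pat in patterns)

print(is_collected("/var/log/auth.log"))          # True  (matches /var/log/*.log)
print(is_collected("/var/log/syslog"))            # True  (listed explicitly)
print(is_collected("/var/log/nginx/access.log"))  # False (* stays in /var/log)
```

Note that files in subdirectories of /var/log are not collected; you would need to add an extra pattern such as /var/log/*/*.log for that.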

Next, we will add a filter so that syslog lines are parsed into structured fields instead of being stored as a single raw message field, which simplifies the analysis. Add the following to the same file, between the input and output sections:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
Restart Logstash to apply the changes:

service logstash restart

Visualize your data with Kibana

System logs are now collected and stored in Elasticsearch, so you can visualize them with Kibana. Open a browser and go to http://<your_server_public_ip>. You are asked for a login and password, which you can retrieve from the message of the day (MOTD) displayed when you connect to your server:

Welcome on ELK stack on Scaleway' C1.

 * Kernel:           GNU/Linux 3.2.34-30 armv7l - Marvell (Proprietary)
                     - This kernel has the best performances on this hardware
                     - For mainline kernel with latest features and plenty of modules, use a 3.17 kernel instead
 * Distribution:     ELK stack (2015-06-09) on Ubuntu 14.10
 * Internal ip:
 * External ip:
 * Disk /dev/nbd0:   scw-app-elk-latest-2015-06-09_18:11 (l_ssd 50G)
 * Uptime:           09:50:11 up 17:31,  0 users,  load average: 3.23, 3.15, 3.08

 * Documentation:
 * Community:
 * Image source:
To access Kibana, open http://xxx.yyy.zzz.www/.
Login with user kibana and password -> ieshahchuemohfohxooshieshieshiojiepiengeng <-
You can hide this message on the next connection by deleting the /etc/update-motd.d/70-elk file.

You land on the Kibana homepage and are asked to configure an index pattern. Index patterns identify the Elasticsearch indices to run searches and analytics against.


To create the first index pattern, select @timestamp from the Time-field name menu and click the Create button.

On the top navigation bar, click the Discover tab.

This page displays all the collected logs, along with a histogram representing log activity over time.


It is your turn now! Start playing with Kibana: create charts and filters for your logs. :)


The ELK stack lets you search and analyze your data with ease. From here you can go deeper and build a more complex configuration. For instance, you can use logstash-forwarder, which lets you collect logs from remote servers and ship them to Logstash.

If you have any suggestions or questions about this documentation, please leave a comment.
