While Logz.io provides Kibana — the ELK Stack’s visualization tool — as part of its service, a lot of users have asked us to support Grafana. One of the leading open source visualization tools today, Grafana has some added value when compared to Kibana, especially around visualizing time-series data.
We are happy to announce that the Logz.io Grafana plugin is now available as an official data source plugin. Grafana users can now install the plugin and add Logz.io as a data source for analyzing the data stored in Logz.io.
The plugin is basically a fork of the existing Elasticsearch plugin, with the addition of support for custom HTTP headers, which are required for passing a Logz.io API token. To use the integration, you will need write access to your Grafana instance and access to the Logz.io API (available to Enterprise users only).
Installing Grafana
Just in case you do not have a Grafana instance, here are instructions for installing the latest stable version (5.1.2) on Ubuntu/Debian. If you’re using a different OS, refer to Grafana’s great docs here (if you’re using Docker, that’s probably the easiest way to get Grafana up and running).
Download the package and install it with these three commands:
wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_5.1.2_amd64.deb
sudo apt-get install -y adduser libfontconfig
sudo dpkg -i grafana_5.1.2_amd64.deb
Next, start Grafana with:
sudo service grafana-server start
To access Grafana and make sure all is working as expected, open your browser at
http://<serverIP>:3000
Use admin/admin as the credentials to access Grafana:
As you can see, we have no data source in Grafana, so our next step is to hook into the data stored in Logz.io by adding a new data source.
Installing the Logz.io plugin
Before we can add Logz.io as a data source in Grafana, however, we need to download and install the plugin.
There are a number of ways to do this, the simplest being using the Grafana CLI to install from the command line:
grafana-cli plugins install logzio-datasource
The plugin is installed in the Grafana plugins directory at /var/lib/grafana/plugins.
Or, you can try the manual approach by downloading the plugin and copying it into the plugins folder:
git clone https://github.com/logzio/grafana-logzio-datasource.git
sudo cp -a grafana-logzio-datasource/dist/ /var/lib/grafana/plugins/logzio/
Restart Grafana with:
sudo service grafana-server restart
Adding the Logz.io data source to Grafana
Before you add Logz.io as a data source, retrieve an API token.
To do this, access Logz.io and under Cogwheel → Tools → API Tokens, copy one of your tokens or generate a new one.
Open up Grafana and click Add Data Source.
Configure the Logz.io data source as follows.
As the data source name, enter whatever name you want to give the data source. In my case, I’m going to use Metricbeat.
As the type, open the drop-down menu and select Logz.io from the list of available data sources.
In the HTTP section, use this URL – https://api.logz.io/v1/elasticsearch. You can leave the Access type as-is.
You can leave the default settings in the Auth and Advanced HTTP Settings sections untouched.
In the Custom Headers section, add the following header definition:
- Key – X-API-TOKEN
- Value – your Logz.io API token created in the steps above
Last but not least, in the Elasticsearch details section, enter an index name (e.g. metricbeat) and use @timestamp as the Time field name.
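If you prefer to automate the steps above, the same configuration can also be created through Grafana's HTTP API instead of the UI. The sketch below builds the request body for that API; it is a minimal, illustrative example — the function name is my own, the token and index values are placeholders, and the field layout follows Grafana's standard convention of keeping custom header names in jsonData and their secret values in secureJsonData.

```python
import json

def logzio_datasource_payload(name, api_token, index):
    """Build a JSON body for Grafana's POST /api/datasources endpoint.

    Mirrors what the UI form saves: custom header names live in
    jsonData, their (secret) values in secureJsonData.
    """
    return {
        "name": name,                      # e.g. "Metricbeat"
        "type": "logzio",                  # plugin id of the Logz.io data source
        "url": "https://api.logz.io/v1/elasticsearch",
        "access": "proxy",                 # the default Access type
        "database": index,                 # index name, e.g. "metricbeat"
        "jsonData": {
            "timeField": "@timestamp",
            "httpHeaderName1": "X-API-TOKEN",
        },
        "secureJsonData": {
            "httpHeaderValue1": api_token,  # your Logz.io API token
        },
    }

payload = logzio_datasource_payload("Metricbeat", "<your-api-token>", "metricbeat")
print(json.dumps(payload, indent=2))
```

You could then POST this payload to your Grafana instance at /api/datasources, or express the same fields as a provisioning YAML file — either way, the result is equivalent to filling in the form manually.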
The final configuration should look something like this:
Click Save & Test to add the new Logz.io data source. If all goes as expected, you will see a green success message and the Logz.io data source will be added to Grafana.
Congrats! You’ve added Logz.io as a Grafana data source.
Analyzing your Logz.io data in Grafana
Grafana is a different beast than Kibana, but if you're setting up this integration, you're probably already acquainted with the basics (in Grafana, visualizations are called panels).
I’m not going to bore you with how to build new dashboards in Grafana, but one of Grafana’s strongest features is its ecosystem of ready-made dashboards for different data sources and types. With a few adjustments, we can make use of these dashboards.
Browse for a dashboard that interests you on Grafana’s Dashboards page. In my case, I’m going to search for a dashboard for Metricbeat system metrics.
Download the dashboard’s JSON spec, and edit it to work with Logz.io’s data source:
- Change the pluginId from elasticsearch to logzio in both the __inputs and the __requires sections.
- Change the plugin's version in the __requires section to 1.0.0.
"__inputs": [
  {
    "name": "DS_EIGHTY20-ES",
    "label": "elasticsearch",
    "description": "",
    "type": "datasource",
    "pluginId": "logzio",
    "pluginName": "Elasticsearch"
  }
],
"__requires": [
  {
    "type": "datasource",
    "id": "logzio",
    "name": "Elasticsearch",
    "version": "1.0.0"
  },
  {
    "type": "grafana",
    "id": "grafana",
    "name": "Grafana",
    "version": "4.2.0"
  },
  {
    "type": "panel",
    "id": "graph",
    "name": "Graph",
    "version": ""
  },
  {
    "type": "panel",
    "id": "singlestat",
    "name": "Singlestat",
    "version": ""
  }
],
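If you plan to convert more than one dashboard, these two edits are easy to script rather than apply by hand. Here is a small sketch in Python; the function name and the commented file name are illustrative, not part of any official tooling.

```python
import json

def convert_to_logzio(dashboard: dict) -> dict:
    """Point an Elasticsearch dashboard spec at the Logz.io plugin.

    Rewrites pluginId in the __inputs entries and id/version in the
    matching __requires entry, leaving everything else untouched.
    """
    for inp in dashboard.get("__inputs", []):
        if inp.get("pluginId") == "elasticsearch":
            inp["pluginId"] = "logzio"
    for req in dashboard.get("__requires", []):
        if req.get("type") == "datasource" and req.get("id") == "elasticsearch":
            req["id"] = "logzio"
            req["version"] = "1.0.0"
    return dashboard

# Usage (file name is an example):
# with open("metricbeat-dashboard.json") as f:
#     spec = convert_to_logzio(json.load(f))
# print(json.dumps(spec, indent=2))
```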
In Grafana’s toolbar, select + → Create → Import, and paste the updated JSON in the relevant field.
Click Load, and you will be presented with some options for defining the imported dashboard.
Endnotes
Kibana has made a lot of progress towards providing users with visualization tools for time-series data, first introducing Timelion and later adding the Time Series Visual Builder to the list of supported visualization types.
Still, many users prefer to stick with Grafana, which is designed to integrate with a variety of time-series data stores such as Prometheus and Graphite. Indeed, it is not uncommon to find users working with both Kibana and Grafana for different data types and different data stores.
We are glad we can help our users keep to their current workflows and use the tool they prefer. As many of our users know, we are also working on tighter integration with Grafana to support time-series analytics and will be announcing some news about this soon. Stay tuned!