Filebeat, the replacement for Logstash Forwarder, is the ELK Stack’s next-generation shipper for log data: it tails log files and sends the collected data to Logstash for parsing or to Elasticsearch for storage.
Logz.io, our enterprise-grade ELK as a service with added features, allows you to ship logs from Filebeat easily using an automated script. Once the logs are shipped and loaded in Kibana, you can use Logz.io’s features to monitor your logs and predict issues.
Here, I will explain how to establish a pipeline for shipping your logs to Logz.io using Filebeat. (Note: You can also ship logs to Logz.io using Topbeat, Packetbeat, or Winlogbeat; see this knowledge base article for more information.)
Prerequisites

To complete the steps below, you’ll need the following:
- A common Linux distribution, with TCP traffic allowed to port 5000
- An active Logz.io account. If you don’t have one yet, create a free account here.
- 5 minutes of free time!

Step 1: Installing Filebeat

I’m running Ubuntu 12.04, and I’m going to install Filebeat 1.1.1 from the repository. If you’re using a different OS, additional installation instructions are available here.
First, I’m going to download and install the Public Signing Key:
curl https://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
Next, I’m going to save the repository definition to /etc/apt/sources.list.d/beats.list:
echo "deb https://packages.elastic.co/beats/apt stable main" | sudo tee -a /etc/apt/sources.list.d/beats.list
Finally, I’m going to run apt-get update and install Filebeat:
sudo apt-get update && sudo apt-get install filebeat

Step 2: Downloading the Certificate
Our next step is to download a certificate and move it to the correct location, so first run:
wget http://raw.githubusercontent.com/cloudflare/cfssl_trust/master/intermediate_ca/COMODORSADomainValidationSecureServerCA.crt
And then:
sudo mkdir -p /etc/pki/tls/certs
sudo cp COMODORSADomainValidationSecureServerCA.crt /etc/pki/tls/certs/

Step 3: Configuring Filebeat
Our next step is to configure Filebeat to ship logs to Logz.io by tweaking the Filebeat configuration file, which on Linux is located at: /etc/filebeat/filebeat.yml
Before you begin to edit this file, make a backup copy just in case of problems.
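For example, a simple copy will do (the .bak name is just a convention):

sudo cp /etc/filebeat/filebeat.yml /etc/filebeat/filebeat.yml.bak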
Copy and paste the following configuration example into the file:
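As a minimal sketch of what such a Filebeat 1.x configuration can look like, assuming one prospector for plain-text logs and one for JSON logs (the listener host, port, account token, paths, and document types below are placeholders to replace with your own values):

filebeat:
  prospectors:
    # Prospector for plain-text log files
    -
      paths:
        - /var/log/*.log
      fields:
        logzio_codec: plain
        token: <ACCOUNT-TOKEN>   # placeholder: your Logz.io account token
      fields_under_root: true
      document_type: syslog      # placeholder type
    # Prospector for JSON log files
    -
      paths:
        - /var/log/myapp/*.json  # placeholder path
      fields:
        logzio_codec: json
        token: <ACCOUNT-TOKEN>   # placeholder: your Logz.io account token
      fields_under_root: true
      document_type: json
  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    # Placeholder listener host; port 5000 as noted in the prerequisites
    hosts: ["listener.logz.io:5000"]
    tls:
      certificate_authorities: ["/etc/pki/tls/certs/COMODORSADomainValidationSecureServerCA.crt"]

logging:
  # Rotation settings for Filebeat's own log file
  to_files: true
  files:
    path: /var/log/filebeat
    name: filebeat.log
    rotateeverybytes: 10485760  # 10 MB
    level: error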
Defining the Filebeat Prospector

Prospectors are where we define the log files that we want to tail. You can tail JSON files and simple text files. In the example above, I’ve defined a path for tailing any log file under the /var/log/ directory that ends with .log.
Please note that when harvesting JSON files, you need to add ‘logzio_codec: json’ to the fields object. When harvesting plain text lines, you need to add ‘logzio_codec: plain’ to the fields object.
Two additional properties are important for defining the prospector:
- First, the fields_under_root property should always be set to true.
- Second, the document_type property is used to identify the type of log data and should be defined. While not mandatory, defining this property will help optimize Logz.io’s parsing and grokking of your data.

A complete list of known types is available here, and if your type is not listed there, please let us know.
Defining the Filebeat Output

Outputs are responsible for sending the data, in JSON format, to Logstash. In the example above, the Logstash host is defined under the logstash output, along with the location of the certificate that you downloaded earlier and the log rotation settings.
Be sure to use your Logz.io account token in the required token fields, under each prospector’s fields object in the example above.
Step 4: Verifying the Pipeline

That’s it. You’ve successfully installed Filebeat and configured it to ship logs to Logz.io!
Make sure Filebeat is running:
$ cd /etc/init.d
$ ./filebeat status
And if not, enter:
$ sudo ./filebeat start
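One more thing: if Filebeat was already running before you edited the configuration file, restart it so that the new settings take effect. With the init script used above, something like this should work:

$ sudo ./filebeat restart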
To verify the pipeline, head over to your Kibana and see if the log files are being shipped. It may take a minute or two for the pipeline to work — but once you’re up and running, you can start to analyze your logs by performing searches, creating visualizations, using the Logz.io alerting feature to get notifications on events, and using our free ELK Apps library.
Please note that Filebeat saves the offset of the last data read from the file in the registry, so if the agent restarts, it will continue from the saved offset.
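If you’re curious, you can inspect the registry yourself. Assuming the registry_file location from the configuration sketch above, it’s a small JSON file that maps each harvested file to the last offset read:

$ cat /var/lib/filebeat/registry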
Logz.io offers enterprise-grade ELK as a service with alerts, unlimited scalability, and collaborative analytics.
Start your free trial!
