Sending JSON Format Logs to Kibana Using Filebeat, Logstash and Elasticsearch?
To send JSON format logs to Kibana using Filebeat, Logstash, and Elasticsearch, you need to configure each component to handle JSON data correctly. Here’s a step-by-step guide to set up the pipeline:
1. Filebeat Configuration
Filebeat will collect and forward the JSON logs to Logstash.
Filebeat Configuration (`filebeat.yml`):
Define Input: Configure Filebeat to read the log files containing JSON data.
```yaml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/your/logfile.json
    json.keys_under_root: true
    json.overwrite_keys: true
```
- **`json.keys_under_root`:** If `true`, JSON keys will be added to the root level of the event.
- **`json.overwrite_keys`:** If `true` (and `keys_under_root` is enabled), the decoded JSON values overwrite the fields that Filebeat normally adds, such as `type`, in case of conflicts.
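To confirm the input is being decoded, you can append a test event to the watched file. The sketch below is hypothetical: the file path mirrors the placeholder above, and the `level`/`message` fields are just example names, but the event must be a single line of valid JSON for Filebeat's `json` decoding to apply.

```sh
# Hypothetical test event: one JSON object per line, appended to the path configured above.
# With json.keys_under_root enabled, "level" and "message" end up at the root of the event.
echo '{"@timestamp":"2024-01-01T12:00:00Z","level":"info","message":"user logged in"}' \
  >> /path/to/your/logfile.json
```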
Define Output to Logstash: Configure Filebeat to send the data to Logstash.
```yaml
output.logstash:
  hosts: ["localhost:5044"]
```
Adjust the `hosts` value to match the address and port where Logstash is listening.
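Before starting Filebeat, it can help to validate the configuration and the connection to Logstash using Filebeat's built-in `test` subcommands. The config path below is an assumption and may differ on your system.

```sh
# Validate the syntax of filebeat.yml (path is an assumption; adjust for your install).
filebeat test config -c /etc/filebeat/filebeat.yml

# Check that Filebeat can reach the configured Logstash output on localhost:5044.
filebeat test output -c /etc/filebeat/filebeat.yml
```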
2. Logstash Configuration
Logstash will receive the data from Filebeat, process it, and send it to Elasticsearch.
Logstash Configuration (`logstash.conf`):
Define Input: Configure Logstash to receive data from Filebeat.
```conf
input {
  beats {
    port => 5044
  }
}
```
Define Filters: Optionally, use filters to process or enrich the JSON data if needed. For example, you might add fields or modify the data structure.
```conf
filter {
  # Example filter to add a timestamp if not present
  if ![@timestamp] {
    date {
      match => ["log.timestamp", "ISO8601"]
      target => "@timestamp"
    }
  }
}
```
Adjust this filter as needed based on your JSON structure and requirements.
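If you want to try a filter in isolation before wiring up the whole pipeline, you can pipe a single event through a throwaway Logstash pipeline defined on the command line with `-e`. This is only a sketch under a few assumptions: the sample event and its nested timestamp field are hypothetical, the `bin/logstash` path assumes you run it from the Logstash installation directory, and nested fields are referenced with Logstash's bracket syntax (`[log][timestamp]`).

```sh
# Pipe one hand-written JSON event through a temporary pipeline and print the result.
# The event and its [log][timestamp] field are hypothetical; adjust to your real structure.
echo '{"log":{"timestamp":"2024-01-01T12:00:00.000Z"},"message":"test"}' | \
  bin/logstash -e '
    input  { stdin { codec => json } }
    filter { date { match => ["[log][timestamp]", "ISO8601"] target => "@timestamp" } }
    output { stdout { codec => rubydebug } }'
```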
Define Output: Configure Logstash to send the data to Elasticsearch.
```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "your-index-name-%{+YYYY.MM.dd}"
    document_id => "%{[unique_id]}"  # Optional: use a unique field if you want to avoid duplicates
  }
  stdout { codec => rubydebug }  # Optional: for debugging
}
```
- **`hosts`:** The address of your Elasticsearch instance.
- **`index`:** Define the index name pattern. The `%{+YYYY.MM.dd}` part creates daily indices.
- **`document_id`:** Optionally use a unique identifier to prevent duplicate documents.
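Once the pipeline file is assembled, you can ask Logstash to validate it without processing any events. The path to `logstash.conf` below is an assumption; use wherever you saved the file.

```sh
# Check the pipeline configuration for syntax errors and exit (no events are processed).
bin/logstash -f /path/to/logstash.conf --config.test_and_exit

# Start Logstash with the pipeline, reloading automatically when the file changes.
bin/logstash -f /path/to/logstash.conf --config.reload.automatic
```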
3. Elasticsearch Configuration
Ensure Elasticsearch is running and accessible to receive data from Logstash.
Check Elasticsearch Status:
Use the following command to check if Elasticsearch is up and running:
```sh
curl -X GET "localhost:9200/"
```
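After Filebeat and Logstash are running, you can also confirm that documents are actually arriving in the daily indices. The index name below mirrors the `your-index-name-*` placeholder used earlier.

```sh
# List the indices created by the Logstash output (name is the placeholder from above).
curl -X GET "localhost:9200/_cat/indices/your-index-name-*?v"

# Count the documents indexed so far.
curl -X GET "localhost:9200/your-index-name-*/_count?pretty"
```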
Create Index Patterns in Kibana:
- Go to Kibana and create index patterns to match the indices where your logs are stored.
- Navigate to **Management** -> **Index Patterns** and create a pattern that matches `your-index-name-*`.
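If you prefer to script this step instead of clicking through the UI, Kibana's saved objects API can create the index pattern as well. This is a sketch that assumes Kibana is running on `localhost:5601` and that `@timestamp` is your time field.

```sh
# Create an index pattern via the Kibana saved objects API (assumes Kibana on localhost:5601).
curl -X POST "localhost:5601/api/saved_objects/index-pattern" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -d '{"attributes": {"title": "your-index-name-*", "timeFieldName": "@timestamp"}}'
```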
4. Kibana Configuration
In Kibana, you can create visualizations and dashboards to analyze your JSON logs.
- Create Index Pattern:
  - In Kibana, navigate to **Management** -> **Index Patterns**.
  - Create a new index pattern matching the indices created by Logstash (`your-index-name-*`).
- Explore Data:
  - Go to **Discover** to view the incoming logs.
  - Create visualizations and dashboards based on your data.
Summary
- Filebeat: Configured to read JSON logs and forward them to Logstash.
- Logstash: Receives data from Filebeat, optionally processes it, and sends it to Elasticsearch.
- Elasticsearch: Stores the data and makes it available for querying.
- Kibana: Used to create index patterns, visualizations, and dashboards for analyzing the JSON logs.
By following these steps, you should be able to successfully send JSON format logs from Filebeat to Kibana using Logstash and Elasticsearch.