# Sending JSON-Format Logs to Kibana Using Filebeat, Logstash, and Elasticsearch

To send JSON format logs to Kibana using Filebeat, Logstash, and Elasticsearch, you need to configure each component to handle JSON data correctly. Here’s a step-by-step guide to set up the pipeline:

### **1. Filebeat Configuration**

Filebeat will collect and forward the JSON logs to Logstash.

**Filebeat Configuration (`filebeat.yml`):**

1. **Define Input:**
Configure Filebeat to read the log files containing JSON data.
    
    ```yaml
    filebeat.inputs:
      - type: log
        paths:
          - /path/to/your/logfile.json
        json.keys_under_root: true
        json.overwrite_keys: true
    ```
    
    - **`json.keys_under_root`:** If `true`, the decoded JSON keys are placed at the root level of the event rather than under a `json` sub-field.
    - **`json.overwrite_keys`:** If `true` (and `json.keys_under_root` is enabled), the decoded JSON values overwrite the fields Filebeat itself adds (such as `message`) when names conflict.
    
    Note that these `json.*` options belong to the `log` input; in recent Filebeat versions the `log` input is deprecated in favor of `filestream`, which decodes JSON with its `ndjson` parser instead.
2. **Define Output to Logstash:**
Configure Filebeat to send the data to Logstash.
    
    ```yaml
    output.logstash:
      hosts: ["localhost:5044"]
    ```
    
    Adjust the `hosts` value to match the address and port where Logstash is listening.
    
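For example, a log file matching this input configuration contains one JSON object per line (NDJSON). The field names below (`level`, `message`) and the `/tmp/logfile.json` path are illustrative only; a matching line can be produced like this:

```shell
# Append one self-contained JSON event per line to the file Filebeat tails.
# /tmp/logfile.json stands in for the path configured in filebeat.yml.
echo '{"timestamp": "2024-05-01T12:00:00Z", "level": "INFO", "message": "User logged in"}' \
  >> /tmp/logfile.json
```

With `json.keys_under_root: true`, `level` and `message` become top-level fields on the event that Filebeat ships to Logstash.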

### **2. Logstash Configuration**

Logstash will receive the data from Filebeat, process it, and send it to Elasticsearch.

**Logstash Configuration (`logstash.conf`):**

1. **Define Input:**
Configure Logstash to receive data from Filebeat.
    
    ```conf
    input {
      beats {
        port => 5044
      }
    }
    ```
    
2. **Define Filters:**
Optionally, use filters to process or enrich the JSON data if needed. For example, you might add fields or modify the data structure.
    
    ```conf
    filter {
      # Example: promote the event's own timestamp to @timestamp.
      # Logstash always sets @timestamp itself (to the processing time),
      # so condition on the source field rather than on @timestamp.
      # [log][timestamp] is an illustrative field name; adjust to your JSON.
      if [log][timestamp] {
        date {
          match => ["[log][timestamp]", "ISO8601"]
          target => "@timestamp"
        }
      }
    }
    ```
    
    Adjust this filter as needed based on your JSON structure and requirements.
    
3. **Define Output:**
Configure Logstash to send the data to Elasticsearch.
    
    ```conf
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "your-index-name-%{+YYYY.MM.dd}"
        document_id => "%{[unique_id]}"  # Optional: use a unique field if you want to avoid duplicates
      }
      stdout { codec => rubydebug }  # Optional: for debugging
    }
    ```
    
    - **`hosts`:** The address of your Elasticsearch instance.
    - **`index`:** Define the index name pattern. The `%{+YYYY.MM.dd}` part creates daily indices.
    - **`document_id`:** Optionally use a unique identifier to prevent duplicate documents.
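
Once the pipeline is running, you can verify from the command line that documents are reaching Elasticsearch (a sketch, assuming a local instance and the index pattern above):

```shell
# A non-zero "count" confirms that Logstash is indexing events.
curl -s "localhost:9200/your-index-name-*/_count?pretty"
```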

### **3. Elasticsearch Configuration**

Ensure Elasticsearch is running and accessible to receive data from Logstash.

1. **Check Elasticsearch Status:**
    - Use the following command to check if Elasticsearch is up and running:
        
        ```bash
        curl -X GET "localhost:9200/"
        ```
        

### **4. Kibana Configuration**

In Kibana, you can create visualizations and dashboards to analyze your JSON logs.

1. **Create Index Pattern:**
    - In Kibana, navigate to `Management` -> `Index Patterns` (called `Data Views` under `Stack Management` in Kibana 8.x and later).
    - Create a new index pattern matching the indices created by Logstash (`your-index-name-*`).
2. **Explore Data:**
    - Go to `Discover` to view the incoming logs.
    - Create visualizations and dashboards based on your data.

### **Summary**

1. **Filebeat:** Configured to read JSON logs and forward them to Logstash.
2. **Logstash:** Receives data from Filebeat, optionally processes it, and sends it to Elasticsearch.
3. **Elasticsearch:** Stores the data and makes it available for querying.
4. **Kibana:** Create index patterns, visualizations, and dashboards to analyze the JSON logs.

By following these steps, you should be able to send JSON-format logs from Filebeat to Kibana via Logstash and Elasticsearch.
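As a quick end-to-end check, the whole pipeline can be exercised with a short shell snippet. This is a sketch: it assumes all four components are running locally and reuses the placeholder log path and index name from the examples above.

```shell
# End-to-end smoke test (assumes Filebeat, Logstash, and Elasticsearch are
# running locally, configured as described above; path and index name are
# the placeholder values from the examples).
echo "{\"timestamp\": \"$(date -u +%Y-%m-%dT%H:%M:%SZ)\", \"level\": \"INFO\", \"message\": \"pipeline smoke test\"}" \
  >> /path/to/your/logfile.json

# Give Filebeat and Logstash a moment to ship the event.
sleep 5

# Search for the test event; a hit confirms the whole pipeline works.
curl -s "localhost:9200/your-index-name-*/_search?q=message:%22pipeline+smoke+test%22&pretty" \
  || echo "Elasticsearch not reachable on localhost:9200"
```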