# Modify Fluentd JSON Output

Modifying the JSON output in Fluentd allows you to customize the log format to suit your needs, such as adding, removing, or transforming fields before sending the logs to their destination. This is commonly done using the `record_transformer` filter, which can manipulate JSON logs based on your requirements.

### **Step-by-Step Guide to Modify Fluentd JSON Output**

### **1. Basic Fluentd Configuration Example**

Here’s a basic Fluentd configuration that demonstrates how to read JSON logs, transform them, and then output them to a file.

### **Example Input: Reading JSON Logs**

Assuming you have logs coming from a `tail` input plugin:

```
<source>
  @type tail
  path /var/log/app.log
  pos_file /var/log/td-agent/app.log.pos
  tag app.logs
  <parse>
    @type json
  </parse>
</source>
```

Note that the `<parse>` section is the current syntax; the older top-level `format json` parameter is deprecated but still accepted.

### **2. Using `record_transformer` Filter to Modify JSON Logs**

The `record_transformer` filter allows you to add, remove, or modify fields in the JSON log records.

### **Example Filter Configuration**

```
<filter app.logs>
  @type record_transformer
  # Remove fields (e.g., sensitive data); also drop the nested "user"
  # object once its id has been copied to a top-level field
  remove_keys sensitive_data,user
  <record>
    # Add or modify fields
    hostname ${hostname}
    environment production
    # Copy a nested field to a new top-level name
    user_id ${record["user"]["id"]}
  </record>
</filter>
```

### **Explanation**

- **`hostname`**: Adds a field with the current hostname; `${hostname}` is a built-in placeholder of `record_transformer`.
- **`environment`**: Adds a static field with the value `production`.
- **`user_id`**: Copies the nested `user.id` value into a top-level `user_id` field; listing `user` in `remove_keys` then drops the original nested object, completing the rename.
- **`remove_keys`**: Removes the named fields (here `sensitive_data` and `user`) from the log. It is a parameter of the filter itself, not an entry inside `<record>`.
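
To make the effect concrete, here is an illustrative before/after for a single record passing through this filter (field values and the `web-1` hostname are assumed, not taken from a real system):

```
# Input record
{"message": "login ok", "user": {"id": 42}, "sensitive_data": "ssn-123"}

# Output record
{"message": "login ok", "hostname": "web-1", "environment": "production", "user_id": 42}
```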

### **3. Output Configuration**

You can then specify how to output the modified logs. Here’s an example of sending the logs to a file in JSON format:

```
<match app.logs>
  @type file
  path /var/log/fluentd/modified_logs.log
  <format>
    @type json
  </format>
</match>
```
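
The file output buffers records in chunks before writing them to disk. If you want explicit control over how those chunks are cut and flushed, a hedged sketch adding a `<buffer>` section might look like this (the `timekey` values are illustrative, not recommendations):

```
<match app.logs>
  @type file
  path /var/log/fluentd/modified_logs
  <format>
    @type json
  </format>
  <buffer time>
    timekey 1h       # cut one chunk per hour
    timekey_wait 1m  # wait a minute before flushing a closed chunk
  </buffer>
</match>
```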

### **4. Advanced Transformation Using `record_transformer`**

You can also use Ruby code within the `record_transformer` to perform more complex transformations.

### **Example: Advanced Transformation**

```
<filter app.logs>
  @type record_transformer
  enable_ruby true
  <record>
    # Modify a field using Ruby code
    formatted_time ${time.strftime('%Y-%m-%dT%H:%M:%S')}
    # Add a field based on existing fields
    full_message ${record["message"]} - ${record["extra_info"]}
  </record>
</filter>
```

### **Explanation**

- **`formatted_time`**: Uses Ruby to format the timestamp.
- **`full_message`**: Concatenates two fields into a new field.
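
For instance, given an input record like `{"message": "User login", "extra_info": "from 10.0.0.5"}`, the filter above would produce something like the following (timestamp illustrative):

```
{"message": "User login", "extra_info": "from 10.0.0.5",
 "formatted_time": "2024-05-01T12:00:00",
 "full_message": "User login - from 10.0.0.5"}
```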

### **5. Testing and Validation**

After modifying your configuration:

1. **Restart Fluentd** to apply the changes:
    
    ```bash
    sudo systemctl restart td-agent
    ```
    
2. **Check the Output**: Verify the transformed logs in the output file or destination to ensure that the modifications are applied as expected.
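
One way to sanity-check the destination file is a small script that confirms every line parses as JSON and carries the fields your filter added. A minimal Python sketch (the required field names are assumed from the earlier `record_transformer` example):

```python
import json

def validate_log_lines(lines, required_fields=("hostname", "environment")):
    """Return parsed records, raising if a line is not JSON or lacks a required field."""
    records = []
    for line in lines:
        record = json.loads(line)  # raises ValueError on malformed JSON
        missing = [f for f in required_fields if f not in record]
        if missing:
            raise ValueError(f"record is missing {missing}: {record!r}")
        records.append(record)
    return records

# Usage against a sample line; in practice, read lines from the Fluentd output file
sample = ['{"message": "login ok", "hostname": "web-1", "environment": "production"}']
print(validate_log_lines(sample)[0]["environment"])  # production
```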

### **6. Common Use Cases**

- **Redacting Sensitive Information**: Removing or masking sensitive data from logs.
- **Adding Metadata**: Adding contextual information like environment, application name, or hostname.
- **Flattening Nested JSON**: Transforming nested JSON into a flat structure for easier querying.
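
As a sketch of the flattening case, `record_transformer` with Ruby enabled can lift nested keys to the top level (the nested `user` structure here is an assumed example):

```
<filter app.logs>
  @type record_transformer
  enable_ruby true
  remove_keys user
  <record>
    user_id ${record.dig("user", "id")}
    user_name ${record.dig("user", "name")}
  </record>
</filter>
```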

### **7. Using `fluent-plugin-record-reformer`**

If you need more advanced record manipulation, consider the `fluent-plugin-record-reformer` plugin. Unlike `record_transformer`, it is an output plugin: it rewrites records and re-emits them under a new tag, so it is configured in a `<match>` block rather than a `<filter>` block.

### **Example Usage**

```bash
td-agent-gem install fluent-plugin-record-reformer
```

```
<match app.logs>
  @type record_reformer
  # Re-emit the rewritten records under a new tag
  tag reformed.${tag}
  <record>
    # Define field mappings and transformations
    @message ${record["log"]}
    @timestamp ${record["time"]}
  </record>
</match>
```

Because the records are re-emitted with the `reformed.` tag prefix, you also need a separate `<match reformed.**>` block to send them to their final destination.