Hello,
I need help with the configuration of Logstash. I want to remove some fields from my events.
I read up on which fields can be removed, so I am removing the field shown above, but it's not working. Can you please help me out?
I'm new to Elastic and it's a bit urgent.
You should use the nested field syntax:
mutate {
  remove_field => [ "[agent][version][keyword]" ] # or just "agent" to remove the whole agent object with all its nested fields
}
Christian is right: you are trying to remove the field as it appears in Kibana. My mistake. Use:
remove_field => [ "[agent][version]" ]
or the full agent field:
remove_field => [ "agent" ]
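As a complete sketch of where this goes (assuming the mutate sits in a filter block of your pipeline config):

```
filter {
  mutate {
    # drop the nested field; use "agent" instead to drop the whole object
    remove_field => [ "[agent][version]" ]
  }
}
```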
One more question: I have more than one log file in my config and I want to remove the same fields for each file. How can I achieve this?
I'm not sure whether there is a similar way to achieve the approach below in Filebeat itself.
Logstash has the ability to combine multiple config files per pipeline (configured in pipelines.yml under /etc/logstash):
- pipeline.id: pingPoller
  path.config: "/etc/logstash/conf.d/{Ping_dns-input.conf,Ping_server1-input.conf,Ping_server2-input.conf,Ping-filter_output.conf}"
  queue.type: persisted
So if you route your Filebeats through Logstash, it would be quite easy to achieve the removal for all Filebeats in one place.
If you require specific actions per harvester in Filebeat, you could combine the shared actions via a single file included in all pipelines.
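For example (the file names here are hypothetical), each pipeline can list its own input plus a shared filter file that holds the common remove_field logic:

```
# /etc/logstash/pipelines.yml
- pipeline.id: app1
  path.config: "/etc/logstash/conf.d/{app1-input.conf,shared-remove-fields.conf,shared-output.conf}"
- pipeline.id: app2
  path.config: "/etc/logstash/conf.d/{app2-input.conf,shared-remove-fields.conf,shared-output.conf}"
```

Changing shared-remove-fields.conf then affects every pipeline at once.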
processors:
  - dissect:
      # example line: 2021-12-08T08:34:04.370+0100  INFO  [monitoring]  log/log.go:144  Non-zero metrics in the last 30s  {"monitoring": {"metrics
      # data types: string, integer, long, float, double, boolean or ip
      tokenizer: "%{date}\t%{event.type}\t%{class}\t%{script}\t%{messageCut}"
      field: "message"
      target_prefix: ""
  - timestamp:
      field: date
      layouts:
        - '2006-01-02T15:04:05.999Z07:00'
        - '2006-01-02T15:04:05.999Z0700'
        - '2006-01-02T15:04:05.999999999Z07:00'
        #- '2006-01-02T15:04:05.999-07:00'
      test:
        - '2021-12-08T08:34:04.370+0100'
  - drop_fields:
      fields: ["date", "class", "script", "message"]
  - rename:
      fields:
        - from: "messageCut"
          to: "message"
# Change to true to enable this input configuration.
enabled: true
# Paths that should be crawled and fetched. Glob based paths.
paths:
  - /var/data/log/heartbeat/heartbeat.log
fields:
  service.type: heartbeat
  event.module: heartbeat
  event.dataset: heartbeat.beat
fields_under_root: true
We mostly do processing in Logstash, which was built for this purpose.
You could easily add another processor in Filebeat to add fields.
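A minimal sketch of such a processor, using Filebeat's add_fields (the field names are just examples):

```
processors:
  - add_fields:
      target: ""        # empty target puts the fields at the event root
      fields:
        service.type: heartbeat
        event.module: heartbeat
```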