## Overview

Goal: parse logs/events in an ingest pipeline before they are written, unifying field names and types to reduce query ambiguity and improve aggregation quality. Typical uses: log ingestion, event cleansing, and aligning fields across systems.

## Core Setup and Practice

Define the pipeline (grok parsing and normalization); a `date` processor converts the raw `ts` string into a well-formed `@timestamp`:

```
PUT _ingest/pipeline/logs_normalize
{
  "processors": [
    { "grok": { "field": "message", "patterns": ["%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{WORD:service} - %{GREEDYDATA:msg}"] } },
    { "date": { "field": "ts", "formats": ["yyyy-MM-dd HH:mm:ss", "ISO8601"], "target_field": "@timestamp" } },
    { "rename": { "field": "service", "target_field": "app" } },
    { "convert": { "field": "level", "type": "string" } },
    { "remove": { "field": "ts" } }
  ]
}
```
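The pipeline can be dry-run with the `_simulate` endpoint before it is bound to any index; here is a quick check using the same log line as the write example further below:

```
POST _ingest/pipeline/logs_normalize/_simulate
{
  "docs": [
    { "_source": { "message": "2025-11-26 10:00:00 INFO api - start request" } }
  ]
}
```

The response shows the transformed `_source` (`@timestamp`, `level`, `app`, `msg`), so grok and date parsing can be validated without writing any data.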
Bind the pipeline to indices through an index template:

```
PUT _index_template/logs_template
{
  "index_patterns": ["logs-*"],
  "template": {
    "settings": {
      "index.default_pipeline": "logs_normalize"
    },
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date" },
        "level": { "type": "keyword" },
        "app": { "type": "keyword" },
        "msg": { "type": "text" }
      }
    }
  }
}
```
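To confirm that a new index will pick up both the mappings and the default pipeline, the index template simulation API (available since Elasticsearch 7.9) resolves what a matching index would receive:

```
POST _index_template/_simulate_index/logs-2025
```

The resolved settings in the response should include `index.default_pipeline: logs_normalize`.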
Write a sample document and verify the normalized fields:

```
POST logs-2025/_doc
{
  "message": "2025-11-26 10:00:00 INFO api - start request"
}

GET logs-2025/_search
{
  "query": { "term": { "level": "INFO" } },
  "_source": ["@timestamp", "level", "app", "msg"]
}
```
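Because `level` and `app` are `keyword` fields, they aggregate efficiently; for example, a breakdown of log levels per application:

```
GET logs-2025/_search
{
  "size": 0,
  "aggs": {
    "by_app": {
      "terms": { "field": "app" },
      "aggs": {
        "by_level": { "terms": { "field": "level" } }
      }
    }
  }
}
```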
## Verification and Monitoring

Pipeline status and failures: inspect the pipeline definition, and confirm that indexed documents actually carry the normalized fields:

```
GET _ingest/pipeline/logs_normalize

GET logs-2025/_search
{
  "query": { "exists": { "field": "@timestamp" } }
}
```
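Parse failures deserve explicit handling. Node ingest statistics report per-pipeline `count` and `failed` totals, and a pipeline-level `on_failure` block can tag unparseable documents instead of rejecting them. Below is a minimal sketch; the `error.*` field names are an illustrative convention of this example, not something Elasticsearch mandates:

```
GET _nodes/stats/ingest

PUT _ingest/pipeline/logs_normalize
{
  "processors": [
    { "grok": { "field": "message", "patterns": ["%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{WORD:service} - %{GREEDYDATA:msg}"] } },
    { "date": { "field": "ts", "formats": ["yyyy-MM-dd HH:mm:ss", "ISO8601"], "target_field": "@timestamp" } },
    { "rename": { "field": "service", "target_field": "app" } },
    { "convert": { "field": "level", "type": "string" } },
    { "remove": { "field": "ts" } }
  ],
  "on_failure": [
    { "set": { "field": "error.message", "value": "{{ _ingest.on_failure_message }}" } },
    { "set": { "field": "error.processor", "value": "{{ _ingest.on_failure_processor_type }}" } }
  ]
}
```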
Mapping consistency: confirm that `level` and `app` are mapped as `keyword`; a `text` mapping makes aggregations slow. Ingestion quality: track the parse-failure rate (the `failed` counters in the node ingest statistics above) and record the offending input in `on_failure`, as sketched above, so bad data can be triaged.

## Common Pitfalls

- Not setting `index.default_pipeline`, so documents are written without normalization; bind the pipeline in the index template.
- Mapping categorical fields as `text`, which slows aggregation and sorting; they should be `keyword`.
- Grok patterns that do not cover every log format; maintain a pattern set per source system.

## Conclusion

With an ingest pipeline, parsing and normalization happen once at write time, noticeably improving the stability and efficiency of Elasticsearch queries and aggregations.
