Question
Individual processors in an Elastic ingest pipeline have an on_failure attribute, which lets you handle a failure/error in the pipeline. The example in the docs shows setting an additional field on your document:
{
  "description" : "my first pipeline with handled exceptions",
  "processors" : [
    {
      "rename" : {
        "field" : "foo",
        "target_field" : "bar",
        "on_failure" : [
          {
            "set" : {
              "field" : "error.message",
              "value" : "{{ _ingest.on_failure_message }}"
            }
          }
        ]
      }
    }
  ]
}
Is it possible to tell the pipeline to SKIP importing a document if any processors in the pipeline fail?
Answer 1:
You can "hijack" the drop processor to skip the document either directly in the on_failure step (no need for _ingest.on_failure_message if you're aborting anyway):
{
  "description": "my first pipeline with handled exceptions",
  "processors": [
    {
      "rename" : {
        "field" : "foo",
        "target_field": "bar",
        "on_failure" : [
          {
            "drop" : {
              "if" : "true"
            }
          }
        ]
      }
    }
  ]
}
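A quick way to check this behaviour is the _simulate API. This is a minimal sketch: the sample document is made up and deliberately has no foo field, so the rename fails and the drop in on_failure kicks in:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "my first pipeline with handled exceptions",
    "processors": [
      {
        "rename": {
          "field": "foo",
          "target_field": "bar",
          "on_failure": [
            {
              "drop": {
                "if": "true"
              }
            }
          ]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "no foo field here" } }
  ]
}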
or use it as a separate processor, perhaps at the very end, after ctx.error has been set by any of the processors' on_failure handlers:
{
  "description": "my first pipeline with handled exceptions",
  "processors": [
    {
      "rename" : {
        "field" : "foo",
        "target_field" : "bar",
        "on_failure" : [
          {
            "set" : {
              "field" : "error.message",
              "value" : "{{ _ingest.on_failure_message }}"
            }
          }
        ]
      }
    },
    {
      "drop": {
        "if": "ctx.error != null"
      }
    }
  ]
}
Both of these will result in a noop when the pipeline is applied.
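To apply this at index time, reference the stored pipeline on the index request (the index name and pipeline id below are just examples; alternatively, the index setting index.default_pipeline applies it to every document):

PUT my-index/_doc/1?pipeline=my-skip-on-failure-pipeline
{
  "message": "no foo field here"
}

When the pipeline drops the document, the index response reports "result": "noop" and nothing is indexed.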
Source: https://stackoverflow.com/questions/65186939/elastic-pipelines-skip-import-on-failure