This example simulates different log entries that are stored in Elasticsearch.
Each entry defines a log level (INFO, WARN or ERROR), a source application (AppA, AppB or AppC), and a body message that references the related subcomponent (Security, Frontend or Backend).
These logs are stored in Elasticsearch with the following format:
{
  "@timestamp":"2017-04-05T14:23:13.360447+0200",
  "level":"ERROR",
  "app":"AppA",
  "message":"Message 6167 from Security"
}
In the example, a first trigger is defined to fetch documents only for AppA; for this, a match filter is defined in the trigger context:
"context": {
"timestamp": "@timestamp",
"filter": "{\"match\":{\"app\":\"AppA\"}}", (1)
"interval": "30s",
"index": "log",
"mapping": "level:category,@timestamp:ctime,message:text,app:dataId,index:tags" (2)
}
(1) From all possible documents, only those from AppA are of interest to this trigger.
(2) Note that the app field is used as the dataId for Events.
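For illustration, applying this mapping to the sample document above would produce an Event roughly like the following sketch. This assumes the Hawkular Events model, with ctime as the document timestamp converted to epoch milliseconds and the index name carried as a tag; the exact tag layout may differ:
{
  "ctime": 1491394993360,
  "category": "ERROR",
  "dataId": "AppA",
  "text": "Message 6167 from Security",
  "tags": {
    "index": "log"
  }
}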
Once events are mapped, we can define an EventCondition to detect any ERROR log related to Backend components:
"conditions":[
{
"type": "EVENT",
"dataId": "AppA",
"expression": "category == 'ERROR',text contains 'Backend'"
}
]
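Note that the comma in the EventCondition expression acts as a logical AND: the event must have category 'ERROR' and a text containing 'Backend'. Putting context and condition together, the full definition of this first trigger might look like the following sketch, where the id and name values are illustrative and not part of the example:
{
  "trigger": {
    "id": "detect-appa-backend-errors",
    "name": "AppA Backend errors",
    "enabled": true,
    "context": {
      "timestamp": "@timestamp",
      "filter": "{\"match\":{\"app\":\"AppA\"}}",
      "interval": "30s",
      "index": "log",
      "mapping": "level:category,@timestamp:ctime,message:text,app:dataId,index:tags"
    }
  },
  "conditions": [
    {
      "type": "EVENT",
      "dataId": "AppA",
      "expression": "category == 'ERROR',text contains 'Backend'"
    }
  ]
}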
In a similar way, a second trigger is defined in the example to fetch documents for AppB.
"context": {
...
"filter": "{\"match\":{\"app\":\"AppB\"}}", (1)
}
(1) From all possible documents, only those from AppB are of interest to this trigger.
On this second trigger, we want to detect a suspiciously high number of WARN messages in the log. There are several ways to model this; in the example we use a Dampening:
"dampenings": [
{
"triggerMode": "FIRING",
"type":"RELAXED_COUNT",
"evalTrueSetting": 3,
"evalTotalSetting": 10
}
],
"conditions":[
{
"type": "EVENT",
"dataId": "AppB",
"expression": "category == 'WARN'"
}
]
So, this second trigger will fire an alert when at least 3 WARN messages are detected within any 10 consecutive messages processed.
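Combined, the second trigger definition might look like the following sketch; again the id and name are illustrative, and the context repeats the first trigger's settings except for the filter:
{
  "trigger": {
    "id": "detect-appb-warn-bursts",
    "name": "AppB WARN bursts",
    "enabled": true,
    "context": {
      "timestamp": "@timestamp",
      "filter": "{\"match\":{\"app\":\"AppB\"}}",
      "interval": "30s",
      "index": "log",
      "mapping": "level:category,@timestamp:ctime,message:text,app:dataId,index:tags"
    }
  },
  "dampenings": [
    {
      "triggerMode": "FIRING",
      "type": "RELAXED_COUNT",
      "evalTrueSetting": 3,
      "evalTotalSetting": 10
    }
  ],
  "conditions": [
    {
      "type": "EVENT",
      "dataId": "AppB",
      "expression": "category == 'WARN'"
    }
  ]
}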
Finally, the example defines several kinds of notifications: sending an email to administrators and writing the fired alerts back into Elasticsearch.
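As a sketch of how these notifications could be attached to a trigger, each trigger references action definitions by plugin and id. The actionId values below are illustrative, and the email and elasticsearch action plugins are configured separately with their own properties (recipients, target index, and so on):
"actions": [
  {
    "actionPlugin": "email",
    "actionId": "email-to-admins"
  },
  {
    "actionPlugin": "elasticsearch",
    "actionId": "write-alerts-to-elasticsearch"
  }
]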