[Security Solutions][Detection Engine] Adds a merge strategy key to kibana.yml and updates docker to have missing keys from security solutions (#103800)

## Summary

This is a follow-up, considered a critical addition to:
https://github.com/elastic/kibana/pull/102280

This adds the key `xpack.securitySolution.alertMergeStrategy` to `kibana.yml`, which allows users to change the merge strategy between their raw events and the signals/alerts that are generated. It also adds additional Security Solutions keys to the docker container that were previously overlooked.

The values you can set for `xpack.securitySolution.alertMergeStrategy` are:
* `missingFields` (the default)
* `allFields`
* `noFields`

## missingFields

The default merge strategy starting with 7.14. It merges any primitive data types from the [fields API](https://www.elastic.co/guide/en/elasticsearch/reference/current/search-fields.html#search-fields-param) into the resulting signal/alert. This copies over fields such as `constant_keyword`, `copy_to`, `runtime fields`, and `field aliases`, which previously were not copied, as long as they are primitive data types such as `keyword`, `text`, or `numeric` and are not already present in your original `_source` document. It will not copy `geo points` or `nested objects`, and in some cases, if your `_source` contains arrays, top-level objects, or conflicts/ambiguities, it will not merge them. It also does _not_ merge existing values between `_source` and `fields` for `runtime fields`; it only merges missing primitive data types.
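As a simplified sketch of the idea (the helper name and shape here are hypothetical; the real implementation lives in `merge_missing_fields_with_source.ts`), merging only the primitive fields that are absent from `_source` looks roughly like:

```typescript
type Fields = Record<string, unknown[]>;
type Source = Record<string, unknown>;

// Primitive values from the fields API are eligible for merging.
const isPrimitive = (v: unknown): boolean =>
  v === null || ['string', 'number', 'boolean'].includes(typeof v);

// True when the dotted path already exists in _source.
const hasPath = (source: Source, path: string[]): boolean => {
  let node: unknown = source;
  for (const key of path) {
    if (typeof node !== 'object' || node === null || !(key in node)) {
      return false;
    }
    node = (node as Record<string, unknown>)[key];
  }
  return true;
};

// Write a value at a dotted path, creating intermediate objects as needed.
const setPath = (source: Source, path: string[], value: unknown): void => {
  let node: Source = source;
  path.slice(0, -1).forEach((key) => {
    if (typeof node[key] !== 'object' || node[key] === null) {
      node[key] = {};
    }
    node = node[key] as Source;
  });
  node[path[path.length - 1]] = value;
};

// Hypothetical helper: copy a fields-API entry into _source only when the
// path is missing from _source and all values are primitives.
const mergeMissingPrimitives = (source: Source, fields: Fields): Source => {
  const merged: Source = JSON.parse(JSON.stringify(source));
  for (const [dottedPath, values] of Object.entries(fields)) {
    const path = dottedPath.split('.');
    if (!hasPath(merged, path) && values.every(isPrimitive)) {
      // Unbox single-element arrays; keep multi-element arrays as arrays.
      setPath(merged, path, values.length === 1 ? values[0] : values);
    }
  }
  return merged;
};
```

With the test documents below, this would copy `event.dataset` (a `constant_keyword` surfaced only via the fields API) into the alert while leaving an existing `host.name` in `_source` untouched.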

## allFields
A very aggressive merge strategy which should be considered experimental. It does everything `missingFields` does, but in addition it merges existing values between `_source` and `fields`, which means that if you change or override values with `runtime fields`, this strategy will attempt to merge those values. It will also merge your nested fields in most instances, but it will not merge `geo` data types due to ambiguities. If you have multi-fields, it chooses your default field and merges that into `_source`. Because the data copied into an alert/signal can differ substantially from your original `_source`, this is considered an aggressive merge strategy.
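As a rough sketch of the difference (a hypothetical helper, not the Kibana implementation), `allFields` lets values from the fields API win even when the path already exists in `_source`, for example a runtime field overriding an indexed value:

```typescript
type Src = Record<string, unknown>;

// Hypothetical helper: unlike the missingFields sketch, values from the
// fields API overwrite any existing value at the same dotted path.
const overrideFromFields = (source: Src, fields: Record<string, unknown[]>): Src => {
  const merged: Src = { ...source };
  for (const [dottedPath, values] of Object.entries(fields)) {
    const path = dottedPath.split('.');
    let node: Src = merged;
    path.slice(0, -1).forEach((key) => {
      const next = node[key];
      // Shallow-copy each level along the path so the input is not mutated.
      node[key] = typeof next === 'object' && next !== null ? { ...(next as Src) } : {};
      node = node[key] as Src;
    });
    // Overwrite even if the key already exists in _source.
    node[path[path.length - 1]] = values.length === 1 ? values[0] : values;
  }
  return merged;
};
```

Applied to the second test document below, the `host.name` runtime field value replaces the indexed `host_name` value, which is exactly the behavior `missingFields` avoids.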

Both of these strategies attempt to unbox single array elements when it makes sense; they assume you only want values kept in an array when the value is already an array in `_source` or when the fields API returns multiple elements.
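The unboxing rule above can be sketched as a tiny helper (hypothetical, for illustration only):

```typescript
// The fields API always returns arrays of values. A lone element is unboxed
// back to a scalar unless _source already stored the value as an array.
const unbox = (fieldValues: unknown[], sourceValue: unknown): unknown =>
  fieldValues.length === 1 && !Array.isArray(sourceValue)
    ? fieldValues[0]
    : fieldValues;
```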

## noFields

This is the behavior before https://github.com/elastic/kibana/pull/102280 was introduced: a do-nothing strategy. It should only be used if you are seeing problems with alerts/signals being inserted due to conflicts and/or bugs with `missingFields`. We do not anticipate this, but if you find yourself setting `noFields`, please reach out on our [forums](https://discuss.elastic.co/c/security/83) and let us know so we can fix the bug. If you are encountering undesired merge behaviors or have other strategies you want us to implement, let us know on the forums as well.
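Under the hood, the configured value selects one of three merge functions. A sketch of that dispatch with stubbed-out strategies (the real version in this PR lives in `get_strategy.ts` and uses an exhaustive `switch` with `assertUnreachable`):

```typescript
type MergeStrategy = 'missingFields' | 'allFields' | 'noFields';
type Doc = { _source?: Record<string, unknown>; fields?: Record<string, unknown[]> };
type MergeFn = ({ doc }: { doc: Doc }) => Doc;

// Stub strategies for the sketch; the real functions merge `fields` into `_source`.
const mergeMissingFieldsWithSource: MergeFn = ({ doc }) => doc;
const mergeAllFieldsWithSource: MergeFn = ({ doc }) => doc;
// `noFields` really is the identity function: the document is left untouched.
const mergeNoFields: MergeFn = ({ doc }) => doc;

const getMergeStrategy = (strategy: MergeStrategy): MergeFn => {
  switch (strategy) {
    case 'allFields':
      return mergeAllFieldsWithSource;
    case 'missingFields':
      return mergeMissingFieldsWithSource;
    case 'noFields':
      return mergeNoFields;
  }
};
```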

The missing keys added for docker are:

* `xpack.securitySolution.alertMergeStrategy`
* `xpack.securitySolution.alertResultListDefaultDateRange`
* `xpack.securitySolution.endpointResultListDefaultFirstPageIndex`
* `xpack.securitySolution.endpointResultListDefaultPageSize`
* `xpack.securitySolution.maxRuleImportExportSize`
* `xpack.securitySolution.maxRuleImportPayloadBytes`
* `xpack.securitySolution.maxTimelineImportExportSize`
* `xpack.securitySolution.maxTimelineImportPayloadBytes`
* `xpack.securitySolution.packagerTaskInterval`
* `xpack.securitySolution.validateArtifactDownloads`

I intentionally skipped adding the other `kibana.yml` keys, which are considered either experimental flags or internal developer settings; they are not documented and are not supported by us in production.

## Manual testing of the different strategies 

First, add this mapping and these documents in Dev Tools for basic tests:
```json
# Mapping with two constant_keywords and a runtime field
DELETE frank-test-delme-17
PUT frank-test-delme-17
{
  "mappings": {
    "dynamic": "strict",
    "runtime": {
      "host.name": {
        "type": "keyword",
        "script": {
          "source": "emit('changed_hostname')"
        }
      }
    },
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "host": {
        "properties": {
          "name": {
            "type": "keyword"
          }
        }
      },
      "data_stream": {
        "properties": {
          "dataset": {
            "type": "constant_keyword",
            "value": "datastream_dataset_name_1"
          },
          "module": {
            "type": "constant_keyword",
            "value": "datastream_module_name_1"
          }
        }
      },
      "event": {
        "properties": {
          "dataset": {
            "type": "constant_keyword",
            "value": "event_dataset_name_1"
          },
          "module": {
            "type": "constant_keyword",
            "value": "event_module_name_1"
          }
        }
      }
    }
  }
}

# Document without an existing host.name 
PUT frank-test-delme-17/_doc/1
{
  "@timestamp": "2021-06-30T15:46:31.800Z"
}

# Document with an existing host.name
PUT frank-test-delme-17/_doc/2
{
  "@timestamp": "2021-06-30T15:46:31.800Z",
  "host": {
    "name": "host_name"
  }
}

# Query it to ensure fields are returned with data that does not exist in _source
GET frank-test-delme-17/_search
{
  "fields": [
    {
      "field": "*"
    }
  ]
}
```

For each of the different key combinations, do the following:

Run a single detection rule against the index:
<img width="1139" alt="Screen Shot 2021-06-30 at 9 49 12 AM" src="https://user-images.githubusercontent.com/1151048/123997522-b8dc6600-d98d-11eb-9407-5480d5b2cc8a.png">

Ensure two signals are created:
<img width="1376" alt="Screen Shot 2021-06-30 at 10 26 03 AM" src="https://user-images.githubusercontent.com/1151048/123997739-f17c3f80-d98d-11eb-9eb9-90e9410f0cde.png">

In your `kibana.yml` or `kibana.dev.yml`, set this key (or omit it, as it is the default):

```yml
xpack.securitySolution.alertMergeStrategy: 'missingFields'
```

When you click on each signal, you should see that `event.module` and `event.dataset` were copied over, as well as `data_stream.dataset` and `data_stream.module`, since they're `constant_keyword` fields:
<img width="877" alt="Screen Shot 2021-06-30 at 10 20 44 AM" src="https://user-images.githubusercontent.com/1151048/123997961-31432700-d98e-11eb-96ee-06524f21e2d6.png">

However, since this strategy only merges missing fields, you should see that in the first record `host.name` is the runtime field value, because `host.name` does not exist in `_source`, while in the second record it still shows up as `host_name`, since we do not merge overrides right now:
First:
<img width="887" alt="Screen Shot 2021-06-30 at 10 03 31 AM" src="https://user-images.githubusercontent.com/1151048/123998398-b2022300-d98e-11eb-87be-aa5a153a91bc.png">

Second:
<img width="838" alt="Screen Shot 2021-06-30 at 10 03 44 AM" src="https://user-images.githubusercontent.com/1151048/123998413-b4fd1380-d98e-11eb-9821-d6189190918f.png">

When you set this key in your `kibana.yml` or `kibana.dev.yml`:

```yml
xpack.securitySolution.alertMergeStrategy: 'noFields'
```

Expect that `event.module`, `event.dataset`, `data_stream.module`, and `data_stream.dataset` are all non-existent, since we do not copy anything over from `fields` and only use what is within `_source`:
<img width="804" alt="Screen Shot 2021-06-30 at 9 58 25 AM" src="https://user-images.githubusercontent.com/1151048/123998694-f8578200-d98e-11eb-8d71-a0858d3ed3e7.png">

Expect that `host.name` is missing in the first record and has its original value (`host_name`) in the second:

First:
<img width="797" alt="Screen Shot 2021-06-30 at 9 58 37 AM" src="https://user-images.githubusercontent.com/1151048/123998797-10c79c80-d98f-11eb-81b6-5174d8ef14f2.png">

Second:
<img width="806" alt="Screen Shot 2021-06-30 at 9 58 52 AM" src="https://user-images.githubusercontent.com/1151048/123998816-158c5080-d98f-11eb-87a0-0ac2f58793b3.png">

When you set this key in your `kibana.yml` or `kibana.dev.yml`:

```yml
xpack.securitySolution.alertMergeStrategy: 'allFields'
```

Expect that `event.module` and `event.dataset` were copied over as well as `data_stream.dataset` and `data_stream.module` since they're `constant_keyword`:
<img width="864" alt="Screen Shot 2021-06-30 at 10 03 15 AM" src="https://user-images.githubusercontent.com/1151048/123999000-48364900-d98f-11eb-9803-05349744ac10.png">

Expect that both the first and second records contain the runtime field since we merge both of them:
<img width="887" alt="Screen Shot 2021-06-30 at 10 03 31 AM" src="https://user-images.githubusercontent.com/1151048/123999078-58e6bf00-d98f-11eb-83bd-dda6b50fabcd.png">

### Checklist

Delete any items that are not applicable to this PR.

- [x] If a plugin configuration key changed, check if it needs to be allowlisted in the [cloud](https://github.com/elastic/cloud) and added to the [docker list](c29adfef29/src/dev/build/tasks/os_packages/docker_generator/resources/bin/kibana-docker)
Commit `12e7fe50bb` (parent `f65eaa2c49`), authored by Frank Hassanabad on 2021-06-30 15:50:05 -06:00 and committed by GitHub. 17 changed files with 145 additions and 27 deletions.

```diff
@@ -380,6 +380,16 @@ kibana_vars=(
   xpack.security.session.idleTimeout
   xpack.security.session.lifespan
   xpack.security.sessionTimeout
+  xpack.securitySolution.alertMergeStrategy
+  xpack.securitySolution.alertResultListDefaultDateRange
+  xpack.securitySolution.endpointResultListDefaultFirstPageIndex
+  xpack.securitySolution.endpointResultListDefaultPageSize
+  xpack.securitySolution.maxRuleImportExportSize
+  xpack.securitySolution.maxRuleImportPayloadBytes
+  xpack.securitySolution.maxTimelineImportExportSize
+  xpack.securitySolution.maxTimelineImportPayloadBytes
+  xpack.securitySolution.packagerTaskInterval
+  xpack.securitySolution.validateArtifactDownloads
   xpack.spaces.enabled
   xpack.spaces.maxSpaces
   xpack.task_manager.enabled
```

```diff
@@ -21,6 +21,12 @@ export const configSchema = schema.object({
   maxRuleImportPayloadBytes: schema.number({ defaultValue: 10485760 }),
   maxTimelineImportExportSize: schema.number({ defaultValue: 10000 }),
   maxTimelineImportPayloadBytes: schema.number({ defaultValue: 10485760 }),
+  alertMergeStrategy: schema.oneOf(
+    [schema.literal('allFields'), schema.literal('missingFields'), schema.literal('noFields')],
+    {
+      defaultValue: 'missingFields',
+    }
+  ),
   [SIGNALS_INDEX_KEY]: schema.string({ defaultValue: DEFAULT_SIGNALS_INDEX }),
   /**
```

```diff
@@ -30,6 +30,7 @@ export const createMockConfig = (): ConfigType => ({
   },
   packagerTaskInterval: '60s',
   validateArtifactDownloads: true,
+  alertMergeStrategy: 'missingFields',
 });

 export const mockGetCurrentUser = {
```

```diff
@@ -38,7 +38,11 @@ describe('buildBulkBody', () => {
     const ruleSO = sampleRuleSO(getQueryRuleParams());
     const doc = sampleDocNoSortId();
     delete doc._source.source;
-    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(ruleSO, doc);
+    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete fakeSignalSourceHit['@timestamp'];
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: 'someValue' } = {
@@ -102,7 +106,11 @@
       },
     };
     delete doc._source.source;
-    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(ruleSO, doc);
+    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete fakeSignalSourceHit['@timestamp'];
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: 'someValue' } = {
@@ -180,7 +188,11 @@
       dataset: 'socket',
       kind: 'event',
     };
-    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(ruleSO, doc);
+    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete fakeSignalSourceHit['@timestamp'];
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: 'someValue' } = {
@@ -244,7 +256,11 @@
       module: 'system',
       dataset: 'socket',
     };
-    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(ruleSO, doc);
+    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete fakeSignalSourceHit['@timestamp'];
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: 'someValue' } = {
@@ -305,7 +321,11 @@
     doc._source.event = {
       kind: 'event',
     };
-    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(ruleSO, doc);
+    const fakeSignalSourceHit: SignalHitOptionalTimestamp = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete fakeSignalSourceHit['@timestamp'];
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: 'someValue' } = {
@@ -365,7 +385,11 @@
         signal: 123,
       },
     } as unknown) as SignalSourceHit;
-    const { '@timestamp': timestamp, ...fakeSignalSourceHit } = buildBulkBody(ruleSO, doc);
+    const { '@timestamp': timestamp, ...fakeSignalSourceHit } = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: string } = {
       someKey: 'someValue',
       event: {
@@ -421,7 +445,11 @@
         signal: { child_1: { child_2: 'nested data' } },
       },
     } as unknown) as SignalSourceHit;
-    const { '@timestamp': timestamp, ...fakeSignalSourceHit } = buildBulkBody(ruleSO, doc);
+    const { '@timestamp': timestamp, ...fakeSignalSourceHit } = buildBulkBody(
+      ruleSO,
+      doc,
+      'missingFields'
+    );
     const expected: Omit<SignalHit, '@timestamp'> & { someKey: string } = {
       someKey: 'someValue',
       event: {
@@ -645,7 +673,12 @@ describe('buildSignalFromEvent', () => {
     const ancestor = sampleDocWithAncestors().hits.hits[0];
     delete ancestor._source.source;
    const ruleSO = sampleRuleSO(getQueryRuleParams());
-    const signal: SignalHitOptionalTimestamp = buildSignalFromEvent(ancestor, ruleSO, true);
+    const signal: SignalHitOptionalTimestamp = buildSignalFromEvent(
+      ancestor,
+      ruleSO,
+      true,
+      'missingFields'
+    );
     // Timestamp will potentially always be different so remove it for the test
     delete signal['@timestamp'];
```

```diff
@@ -6,7 +6,7 @@
  */

 import { SavedObject } from 'src/core/types';
-import { mergeMissingFieldsWithSource } from './source_fields_merging/strategies/merge_missing_fields_with_source';
+import { getMergeStrategy } from './source_fields_merging/strategies';
 import {
   AlertAttributes,
   SignalSourceHit,
@@ -21,6 +21,7 @@ import { additionalSignalFields, buildSignal } from './build_signal';
 import { buildEventTypeSignal } from './build_event_type_signal';
 import { EqlSequence } from '../../../../common/detection_engine/types';
 import { generateSignalId, wrapBuildingBlocks, wrapSignal } from './utils';
+import type { ConfigType } from '../../../config';

 /**
  * Formats the search_after result for insertion into the signals index. We first create a
@@ -33,9 +34,10 @@ import { generateSignalId, wrapBuildingBlocks, wrapSignal } from './utils';
  */
 export const buildBulkBody = (
   ruleSO: SavedObject<AlertAttributes>,
-  doc: SignalSourceHit
+  doc: SignalSourceHit,
+  mergeStrategy: ConfigType['alertMergeStrategy']
 ): SignalHit => {
-  const mergedDoc = mergeMissingFieldsWithSource({ doc });
+  const mergedDoc = getMergeStrategy(mergeStrategy)({ doc });
   const rule = buildRuleWithOverrides(ruleSO, mergedDoc._source ?? {});
   const signal: Signal = {
     ...buildSignal([mergedDoc], rule),
@@ -65,11 +67,12 @@ export const buildBulkBody = (
 export const buildSignalGroupFromSequence = (
   sequence: EqlSequence<SignalSource>,
   ruleSO: SavedObject<AlertAttributes>,
-  outputIndex: string
+  outputIndex: string,
+  mergeStrategy: ConfigType['alertMergeStrategy']
 ): WrappedSignalHit[] => {
   const wrappedBuildingBlocks = wrapBuildingBlocks(
     sequence.events.map((event) => {
-      const signal = buildSignalFromEvent(event, ruleSO, false);
+      const signal = buildSignalFromEvent(event, ruleSO, false, mergeStrategy);
       signal.signal.rule.building_block_type = 'default';
       return signal;
     }),
@@ -130,9 +133,10 @@ export const buildSignalFromSequence = (
 export const buildSignalFromEvent = (
   event: BaseSignalHit,
   ruleSO: SavedObject<AlertAttributes>,
-  applyOverrides: boolean
+  applyOverrides: boolean,
+  mergeStrategy: ConfigType['alertMergeStrategy']
 ): SignalHit => {
-  const mergedEvent = mergeMissingFieldsWithSource({ doc: event });
+  const mergedEvent = getMergeStrategy(mergeStrategy)({ doc: event });
   const rule = applyOverrides
     ? buildRuleWithOverrides(ruleSO, mergedEvent._source ?? {})
     : buildRuleWithoutOverrides(ruleSO);
```

```diff
@@ -69,6 +69,7 @@ describe('searchAfterAndBulkCreate', () => {
     wrapHits = wrapHitsFactory({
       ruleSO,
       signalsIndex: DEFAULT_SIGNALS_INDEX,
+      mergeStrategy: 'missingFields',
     });
   });
```

```diff
@@ -192,6 +192,7 @@ describe('signal_rule_alert_type', () => {
       version,
       ml: mlMock,
       lists: listMock.createSetup(),
+      mergeStrategy: 'missingFields',
     });
   });
```

```diff
@@ -68,6 +68,7 @@ import {
 import { bulkCreateFactory } from './bulk_create_factory';
 import { wrapHitsFactory } from './wrap_hits_factory';
 import { wrapSequencesFactory } from './wrap_sequences_factory';
+import { ConfigType } from '../../../config';

 export const signalRulesAlertType = ({
   logger,
@@ -75,12 +76,14 @@ export const signalRulesAlertType = ({
   version,
   ml,
   lists,
+  mergeStrategy,
 }: {
   logger: Logger;
   eventsTelemetry: TelemetryEventsSender | undefined;
   version: string;
   ml: SetupPlugins['ml'];
   lists: SetupPlugins['lists'] | undefined;
+  mergeStrategy: ConfigType['alertMergeStrategy'];
 }): SignalRuleAlertTypeDefinition => {
   return {
     id: SIGNALS_ID,
@@ -233,11 +236,13 @@ export const signalRulesAlertType = ({
           const wrapHits = wrapHitsFactory({
             ruleSO: savedObject,
             signalsIndex: params.outputIndex,
+            mergeStrategy,
           });

           const wrapSequences = wrapSequencesFactory({
             ruleSO: savedObject,
             signalsIndex: params.outputIndex,
+            mergeStrategy,
           });

           if (isMlRule(type)) {
```

New file (`get_strategy.ts`):

```ts
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0; you may not use this file except in compliance with the Elastic License
 * 2.0.
 */

import { assertUnreachable } from '../../../../../../common';
import type { ConfigType } from '../../../../../config';
import { MergeStrategyFunction } from '../types';
import { mergeAllFieldsWithSource } from './merge_all_fields_with_source';
import { mergeMissingFieldsWithSource } from './merge_missing_fields_with_source';
import { mergeNoFields } from './merge_no_fields';

export const getMergeStrategy = (
  mergeStrategy: ConfigType['alertMergeStrategy']
): MergeStrategyFunction => {
  switch (mergeStrategy) {
    case 'allFields': {
      return mergeAllFieldsWithSource;
    }
    case 'missingFields': {
      return mergeMissingFieldsWithSource;
    }
    case 'noFields': {
      return mergeNoFields;
    }
    default:
      return assertUnreachable(mergeStrategy);
  }
};
```

```diff
@@ -6,3 +6,4 @@
  */
 export * from './merge_all_fields_with_source';
 export * from './merge_missing_fields_with_source';
+export * from './get_strategy';
```

```diff
@@ -7,9 +7,9 @@

 import { get } from 'lodash/fp';
 import { set } from '@elastic/safer-lodash-set/fp';
-import { SignalSource, SignalSourceHit } from '../../types';
+import { SignalSource } from '../../types';
 import { filterFieldEntries } from '../utils/filter_field_entries';
-import type { FieldsType } from '../types';
+import type { FieldsType, MergeStrategyFunction } from '../types';
 import { isObjectLikeOrArrayOfObjectLikes } from '../utils/is_objectlike_or_array_of_objectlikes';
 import { isNestedObject } from '../utils/is_nested_object';
 import { recursiveUnboxingFields } from '../utils/recursive_unboxing_fields';
@@ -26,7 +26,7 @@ import { isTypeObject } from '../utils/is_type_object';
  * @param throwOnFailSafe Defaults to false, but if set to true it will cause a throw if the fail safe is triggered to indicate we need to add a new explicit test condition
  * @returns The two merged together in one object where we can
  */
-export const mergeAllFieldsWithSource = ({ doc }: { doc: SignalSourceHit }): SignalSourceHit => {
+export const mergeAllFieldsWithSource: MergeStrategyFunction = ({ doc }) => {
   const source = doc._source ?? {};
   const fields = doc.fields ?? {};
   const fieldEntries = Object.entries(fields);
```

```diff
@@ -7,9 +7,9 @@

 import { get } from 'lodash/fp';
 import { set } from '@elastic/safer-lodash-set/fp';
-import { SignalSource, SignalSourceHit } from '../../types';
+import { SignalSource } from '../../types';
 import { filterFieldEntries } from '../utils/filter_field_entries';
-import type { FieldsType } from '../types';
+import type { FieldsType, MergeStrategyFunction } from '../types';
 import { recursiveUnboxingFields } from '../utils/recursive_unboxing_fields';
 import { isTypeObject } from '../utils/is_type_object';
 import { arrayInPathExists } from '../utils/array_in_path_exists';
@@ -22,11 +22,7 @@ import { isNestedObject } from '../utils/is_nested_object';
  * @param throwOnFailSafe Defaults to false, but if set to true it will cause a throw if the fail safe is triggered to indicate we need to add a new explicit test condition
  * @returns The two merged together in one object where we can
  */
-export const mergeMissingFieldsWithSource = ({
-  doc,
-}: {
-  doc: SignalSourceHit;
-}): SignalSourceHit => {
+export const mergeMissingFieldsWithSource: MergeStrategyFunction = ({ doc }) => {
   const source = doc._source ?? {};
   const fields = doc.fields ?? {};
   const fieldEntries = Object.entries(fields);
```

New file (`merge_no_fields.ts`):

```ts
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0; you may not use this file except in compliance with the Elastic License
 * 2.0.
 */

import { MergeStrategyFunction } from '../types';

/**
 * Does nothing and does not merge source with fields
 * @param doc The doc to return and do nothing
 * @returns The doc as a no operation and do nothing
 */
export const mergeNoFields: MergeStrategyFunction = ({ doc }) => doc;
```

```diff
@@ -5,7 +5,14 @@
  * 2.0.
  */

+import { SignalSourceHit } from '../types';
+
 /**
  * A bit stricter typing since the default fields type is an "any"
  */
 export type FieldsType = string[] | number[] | boolean[] | object[];
+
+/**
+ * The type of the merge strategy functions which must implement to be part of the strategy group
+ */
+export type MergeStrategyFunction = ({ doc }: { doc: SignalSourceHit }) => SignalSourceHit;
```

```diff
@@ -9,13 +9,16 @@ import { SearchAfterAndBulkCreateParams, WrapHits, WrappedSignalHit } from './ty
 import { generateId } from './utils';
 import { buildBulkBody } from './build_bulk_body';
 import { filterDuplicateSignals } from './filter_duplicate_signals';
+import type { ConfigType } from '../../../config';

 export const wrapHitsFactory = ({
   ruleSO,
   signalsIndex,
+  mergeStrategy,
 }: {
   ruleSO: SearchAfterAndBulkCreateParams['ruleSO'];
   signalsIndex: string;
+  mergeStrategy: ConfigType['alertMergeStrategy'];
 }): WrapHits => (events) => {
   const wrappedDocs: WrappedSignalHit[] = events.flatMap((doc) => [
     {
@@ -26,7 +29,7 @@ export const wrapHitsFactory = ({
         String(doc._version),
         ruleSO.attributes.params.ruleId ?? ''
       ),
-      _source: buildBulkBody(ruleSO, doc),
+      _source: buildBulkBody(ruleSO, doc, mergeStrategy),
     },
   ]);
```

```diff
@@ -7,18 +7,21 @@

 import { SearchAfterAndBulkCreateParams, WrappedSignalHit, WrapSequences } from './types';
 import { buildSignalGroupFromSequence } from './build_bulk_body';
+import { ConfigType } from '../../../config';

 export const wrapSequencesFactory = ({
   ruleSO,
   signalsIndex,
+  mergeStrategy,
 }: {
   ruleSO: SearchAfterAndBulkCreateParams['ruleSO'];
   signalsIndex: string;
+  mergeStrategy: ConfigType['alertMergeStrategy'];
 }): WrapSequences => (sequences) =>
   sequences.reduce(
     (acc: WrappedSignalHit[], sequence) => [
       ...acc,
-      ...buildSignalGroupFromSequence(sequence, ruleSO, signalsIndex),
+      ...buildSignalGroupFromSequence(sequence, ruleSO, signalsIndex, mergeStrategy),
     ],
     []
   );
```

```diff
@@ -387,6 +387,7 @@ export class Plugin implements IPlugin<PluginSetup, PluginStart, SetupPlugins, S
       version: this.context.env.packageInfo.version,
       ml: plugins.ml,
       lists: plugins.lists,
+      mergeStrategy: this.config.alertMergeStrategy,
     });

     const ruleNotificationType = rulesNotificationAlertType({
       logger: this.logger,
```