Rename 'Ingest Node Pipelines' to 'Ingest Pipelines' (#113783)

While Elasticsearch ingest pipelines require a node with the `ingest`
role, we don't need to include `ingest node` in the feature name.

While there are no official plans, the Elasticsearch team has discussed removing
the `ingest` role in the future. The rename also better aligns the Kibana UI with the
Elasticsearch docs.

The PR also makes some related changes to the Kibana docs.

Relates to https://github.com/elastic/elasticsearch/pull/70253.
James Rodewig 2021-10-05 16:03:11 -04:00 committed by GitHub
parent a9a923d5ee
commit 40f2784f37
17 changed files with 40 additions and 38 deletions


@@ -9,21 +9,22 @@ structure it. Grok is good for parsing syslog, apache, and other
webserver logs, mysql logs, and in general, any log format that is
written for human consumption.
Grok patterns are supported in the ingest node
{ref}/grok-processor.html[grok processor] and the Logstash
{logstash-ref}/plugins-filters-grok.html[grok filter]. See
{logstash-ref}/plugins-filters-grok.html#_grok_basics[grok basics]
for more information on the syntax for a grok pattern.
Grok patterns are supported in {es} {ref}/runtime.html[runtime fields], the {es}
{ref}/grok-processor.html[grok ingest processor], and the {ls}
{logstash-ref}/plugins-filters-grok.html[grok filter]. For syntax, see
{ref}/grok.html[Grokking grok].
The Elastic Stack ships
with more than 120 reusable grok patterns. See
https://github.com/elastic/elasticsearch/tree/master/libs/grok/src/main/resources/patterns[Ingest node grok patterns] and https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[Logstash grok patterns]
for the complete list of patterns.
The {stack} ships with more than 120 reusable grok patterns. For a complete
list of patterns, see
https://github.com/elastic/elasticsearch/tree/master/libs/grok/src/main/resources/patterns[{es}
grok patterns] and
https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[{ls}
grok patterns].
Because
ingest node and Logstash share the same grok implementation and pattern
{es} and {ls} share the same grok implementation and pattern
libraries, any grok pattern that you create in the *Grok Debugger* will work
in ingest node and Logstash.
in both {es} and {ls}.
[float]
[[grokdebugger-getting-started]]
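The grok wording above can be exercised directly against a cluster. Below is a minimal sketch, assuming a local {es} instance and the 7.x `@elastic/elasticsearch` client (both assumptions, not part of this commit), that runs the built-in `%{COMMONAPACHELOG}` pattern through the {es} grok ingest processor via the simulate API:

```ts
import { Client } from '@elastic/elasticsearch';

// Assumed local cluster; adjust the node URL and auth for your environment.
const client = new Client({ node: 'http://localhost:9200' });

async function tryGrokPattern() {
  // Simulate an inline pipeline whose grok processor parses an apache-style log line.
  // %{COMMONAPACHELOG} is one of the built-in patterns referenced above.
  const { body } = await client.ingest.simulate({
    body: {
      pipeline: {
        processors: [{ grok: { field: 'message', patterns: ['%{COMMONAPACHELOG}'] } }],
      },
      docs: [
        {
          _source: {
            message:
              '127.0.0.1 - - [05/Oct/2021:16:03:11 -0400] "GET /index.html HTTP/1.1" 200 1070',
          },
        },
      ],
    },
  });
  // Each simulated doc contains the structured fields extracted by grok.
  console.log(JSON.stringify(body.docs, null, 2));
}

tryGrokPattern().catch(console.error);
```

The same pattern pasted into the *Grok Debugger* should yield equivalent structured output, since {es} and {ls} share the grok implementation.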


@@ -458,7 +458,7 @@ the infrastructure monitoring use-case within Kibana.
|{kib-repo}blob/{branch}/x-pack/plugins/ingest_pipelines/README.md[ingestPipelines]
|The ingest_pipelines plugin provides Kibana support for Elasticsearch's ingest nodes. Please refer to the Elasticsearch documentation for more details.
|The ingest_pipelines plugin provides Kibana support for Elasticsearch's ingest pipelines.
|{kib-repo}blob/{branch}/x-pack/plugins/lens/readme.md[lens]


@@ -293,7 +293,7 @@ This content has moved. Refer to <<dashboard, **Dashboard**>>.
This content has moved. Refer to <<dashboard, **Dashboard**>>.
[role="exclude",id="ingest-node-pipelines"]
== Ingest Node Pipelines
== Ingest Pipelines
This content has moved. Refer to {ref}/ingest.html[Ingest pipelines].


@@ -17,7 +17,7 @@ Consult your administrator if you do not have the appropriate access.
[cols="50, 50"]
|===
| {ref}/ingest.html[Ingest Node Pipelines]
| {ref}/ingest.html[Ingest Pipelines]
| Create and manage ingest pipelines that let you perform common transformations
and enrichments on your data.


@@ -189,8 +189,9 @@ If you configured the monitoring cluster to use encrypted communications, you
must access it via HTTPS. For example, use a `hosts` setting like
`https://es-mon-1:9200`.
IMPORTANT: The {es} {monitor-features} use ingest pipelines, therefore the
cluster that stores the monitoring data must have at least one ingest node.
IMPORTANT: The {es} {monitor-features} use ingest pipelines. The
cluster that stores the monitoring data must have at least one node with the
`ingest` role.
If the {es} {security-features} are enabled on the monitoring cluster, you
must provide a valid user ID and password so that {metricbeat} can send metrics
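As a companion to the requirement above, here is a small sketch, again assuming the 7.x `@elastic/elasticsearch` client and a hypothetical monitoring cluster URL, that checks whether at least one node carries the `ingest` role before shipping monitoring data:

```ts
import { Client } from '@elastic/elasticsearch';

// Hypothetical monitoring cluster; use your own host, credentials, and TLS settings.
const client = new Client({ node: 'https://es-mon-1:9200' });

async function assertIngestCapableNode() {
  const { body } = await client.nodes.info();
  // body.nodes maps node IDs to node metadata, including each node's roles.
  const nodes = Object.values(body.nodes) as Array<{ name: string; roles: string[] }>;
  const ingestNodes = nodes.filter((node) => node.roles.includes('ingest'));
  if (ingestNodes.length === 0) {
    throw new Error('The monitoring cluster has no node with the `ingest` role');
  }
  console.log(`Ingest-capable nodes: ${ingestNodes.map((node) => node.name).join(', ')}`);
}

assertIngestCapableNode().catch(console.error);
```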


@@ -86,7 +86,7 @@ function TutorialFleetInstructions({ http, basePath, isDarkTheme }: Props) {
'xpack.apm.tutorial.apmServer.fleet.message',
{
defaultMessage:
'The APM integration installs Elasticsearch templates and Ingest Node pipelines for APM data.',
'The APM integration installs Elasticsearch templates and ingest pipelines for APM data.',
}
)}
footer={


@@ -1,9 +1,9 @@
# Ingest Node Pipelines UI
# Ingest Pipelines UI
## Summary
The `ingest_pipelines` plugin provides Kibana support for [Elasticsearch's ingest nodes](https://www.elastic.co/guide/en/elasticsearch/reference/master/ingest.html). Please refer to the Elasticsearch documentation for more details.
The `ingest_pipelines` plugin provides Kibana support for [Elasticsearch's ingest pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/master/ingest.html).
This plugin allows Kibana to create, edit, clone and delete ingest node pipelines. It also provides support to simulate a pipeline.
This plugin allows Kibana to create, edit, clone and delete ingest pipelines. It also provides support to simulate a pipeline.
It requires a Basic license and the following cluster privileges: `manage_pipeline` and `cluster:monitor/nodes/info`.
@@ -11,7 +11,7 @@ It requires a Basic license and the following cluster privileges: `manage_pipeli
## Development
A new app called Ingest Node Pipelines is registered in the Management section and follows a typical CRUD UI pattern. The client-side portion of this app lives in [public/application](public/application) and uses endpoints registered in [server/routes/api](server/routes/api). For more information on the pipeline processors editor component, check out the [component readme](public/application/components/pipeline_processors_editor/README.md).
A new app called Ingest Pipelines is registered in the Management section and follows a typical CRUD UI pattern. The client-side portion of this app lives in [public/application](public/application) and uses endpoints registered in [server/routes/api](server/routes/api). For more information on the pipeline processors editor component, check out the [component readme](public/application/components/pipeline_processors_editor/README.md).
See the [kibana contributing guide](https://github.com/elastic/kibana/blob/master/CONTRIBUTING.md) for instructions on setting up your development environment.
@@ -25,7 +25,7 @@ The app has the following test coverage:
### Quick steps for manual testing
You can run the following request in Console to create an ingest node pipeline:
You can run the following request in Console to create an ingest pipeline:
```
PUT _ingest/pipeline/test_pipeline
@@ -73,7 +73,7 @@ PUT _ingest/pipeline/test_pipeline
}
```
Then, go to the Ingest Node Pipelines UI to edit, delete, clone, or view details of the pipeline.
Then, go to the Ingest Pipelines UI to edit, delete, clone, or view details of the pipeline.
To simulate a pipeline, go to the "Edit" page of your pipeline. Click the "Add documents" link under the "Processors" section. You may add the following sample documents to test the pipeline:
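The full request body and sample documents are truncated by the diff hunks above. For completeness, here is a hedged sketch of the same manual-testing flow through the 7.x `@elastic/elasticsearch` client; the processors and sample document are illustrative placeholders, not the pipeline body from the README:

```ts
import { Client } from '@elastic/elasticsearch';

// Assumed local development cluster backing Kibana.
const client = new Client({ node: 'http://localhost:9200' });

async function createAndSimulateTestPipeline() {
  // Create a named pipeline to work with in the Ingest Pipelines UI.
  await client.ingest.putPipeline({
    id: 'test_pipeline',
    body: {
      description: 'Example pipeline for manual testing',
      processors: [
        { set: { field: 'environment', value: 'testing' } },
        { lowercase: { field: 'name', ignore_missing: true } },
      ],
    },
  });

  // Simulate the stored pipeline against a sample document, mirroring the
  // "Add documents" flow on the pipeline's Edit page.
  const { body } = await client.ingest.simulate({
    id: 'test_pipeline',
    body: { docs: [{ _source: { name: 'John Doe' } }] },
  });
  console.log(JSON.stringify(body.docs, null, 2));
}

createAndSimulateTestPipeline().catch(console.error);
```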


@@ -52,11 +52,11 @@ describe('<PipelinesList />', () => {
// Verify app title
expect(exists('appTitle')).toBe(true);
expect(find('appTitle').text()).toEqual('Ingest Node Pipelines');
expect(find('appTitle').text()).toEqual('Ingest Pipelines');
// Verify documentation link
expect(exists('documentationLink')).toBe(true);
expect(find('documentationLink').text()).toBe('Ingest Node Pipelines docs');
expect(find('documentationLink').text()).toBe('Ingest Pipelines docs');
// Verify create button exists
expect(exists('createPipelineButton')).toBe(true);


@@ -557,7 +557,7 @@ export const mapProcessorTypeToDescriptor: MapProcessorTypeToDescriptor = {
defaultMessage: 'Pipeline',
}),
typeDescription: i18n.translate('xpack.ingestPipelines.processors.description.pipeline', {
defaultMessage: 'Runs another ingest node pipeline.',
defaultMessage: 'Runs another ingest pipeline.',
}),
getDefaultDescription: ({ name }) =>
i18n.translate('xpack.ingestPipelines.processors.defaultDescription.pipeline', {


@@ -153,7 +153,7 @@ export const PipelinesList: React.FunctionComponent<RouteComponentProps> = ({
<span data-test-subj="appTitle">
<FormattedMessage
id="xpack.ingestPipelines.list.listTitle"
defaultMessage="Ingest Node Pipelines"
defaultMessage="Ingest Pipelines"
/>
</span>
}
@@ -172,7 +172,7 @@ export const PipelinesList: React.FunctionComponent<RouteComponentProps> = ({
>
<FormattedMessage
id="xpack.ingestPipelines.list.pipelinesDocsLinkText"
defaultMessage="Ingest Node Pipelines docs"
defaultMessage="Ingest Pipelines docs"
/>
</EuiButtonEmpty>,
]}


@@ -11,7 +11,7 @@ import { ManagementAppMountParams } from '../../../../../../src/plugins/manageme
type SetBreadcrumbs = ManagementAppMountParams['setBreadcrumbs'];
const homeBreadcrumbText = i18n.translate('xpack.ingestPipelines.breadcrumb.pipelinesLabel', {
defaultMessage: 'Ingest Node Pipelines',
defaultMessage: 'Ingest Pipelines',
});
export class BreadcrumbService {


@@ -25,7 +25,7 @@ export class IngestPipelinesPlugin
apiService.setup(http, uiMetricService);
const pluginName = i18n.translate('xpack.ingestPipelines.appTitle', {
defaultMessage: 'Ingest Node Pipelines',
defaultMessage: 'Ingest Pipelines',
});
management.sections.section.ingest.registerApp({


@@ -14,14 +14,14 @@ export default function ({ getService, getPageObjects }: any) {
const log = getService('log');
const a11y = getService('a11y'); /* this is the wrapping service around axe */
describe('Ingest Node Pipelines', async () => {
describe('Ingest Pipelines', async () => {
before(async () => {
await putSamplePipeline(esClient);
await common.navigateToApp('ingestPipelines');
});
it('List View', async () => {
await retry.waitFor('Ingest Node Pipelines page to be visible', async () => {
await retry.waitFor('Ingest Pipelines page to be visible', async () => {
await common.navigateToApp('ingestPipelines');
return testSubjects.exists('pipelineDetailsLink') ? true : false;
});


@@ -8,7 +8,7 @@
import { FtrProviderContext } from '../../../ftr_provider_context';
export default function ({ loadTestFile }: FtrProviderContext) {
describe('Ingest Node Pipelines', () => {
describe('Ingest Pipelines', () => {
loadTestFile(require.resolve('./ingest_pipelines'));
});
}


@@ -145,7 +145,7 @@ export default function ({ getService }: FtrProviderContext) {
await createPipeline({ body: PIPELINE, id: PIPELINE_ID }, true);
} catch (err) {
// eslint-disable-next-line no-console
console.log('[Setup error] Error creating ingest node pipeline');
console.log('[Setup error] Error creating ingest pipeline');
throw err;
}
});
@@ -225,7 +225,7 @@ export default function ({ getService }: FtrProviderContext) {
await createPipeline({ body: PIPELINE, id: PIPELINE_ID }, true);
} catch (err) {
// eslint-disable-next-line no-console
console.log('[Setup error] Error creating ingest node pipeline');
console.log('[Setup error] Error creating ingest pipeline');
throw err;
}
});


@@ -60,7 +60,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
it('should render the "Ingest" section with ingest pipelines', async () => {
await PageObjects.common.navigateToApp('management');
const sections = await managementMenu.getSections();
// We gave the ingest node pipelines user access to advanced settings to allow them to use ingest node pipelines.
// We gave the ingest pipelines user access to advanced settings to allow them to use ingest pipelines.
// See https://github.com/elastic/kibana/pull/102409/
expect(sections).to.have.length(2);
expect(sections[0]).to.eql({


@@ -29,10 +29,10 @@ export default ({ getPageObjects, getService }: FtrProviderContext) => {
});
it('Loads the app', async () => {
log.debug('Checking for section heading to say Ingest Node Pipelines.');
log.debug('Checking for section heading to say Ingest Pipelines.');
const headingText = await pageObjects.ingestPipelines.sectionHeadingText();
expect(headingText).to.be('Ingest Node Pipelines');
expect(headingText).to.be('Ingest Pipelines');
});
it('Creates a pipeline', async () => {