Merge branch 'master' into data/allow-custom-formatting

Timothy Sullivan 2020-07-13 08:50:21 -07:00
commit 34df3318bf
186 changed files with 5723 additions and 3309 deletions

.gitignore vendored

@@ -31,6 +31,7 @@ disabledPlugins
webpackstats.json
/config/*
!/config/kibana.yml
!/config/node.options
coverage
selenium
.babel_register_cache.json


@@ -1,739 +1,5 @@
# Contributing to Kibana
We understand that you may not have days at a time to work on Kibana. We ask that you read our contributing guidelines carefully so that you spend less time, overall, struggling to push your PR through our code review processes.
We understand that you may not have days at a time to work on Kibana. We ask that you read our [developer guide](https://www.elastic.co/guide/en/kibana/master/development.html) carefully so that you spend less time, overall, struggling to push your PR through our code review processes.
At the same time, reading the contributing guidelines will give you a better idea of how to post meaningful issues that will be more easily parsed, considered, and resolved. A big win for everyone involved! :tada:
## Table of Contents
A high level overview of our contributing guidelines.
- [Effective issue reporting in Kibana](#effective-issue-reporting-in-kibana)
- [Voicing the importance of an issue](#voicing-the-importance-of-an-issue)
- ["My issue isn't getting enough attention"](#my-issue-isnt-getting-enough-attention)
- ["I want to help!"](#i-want-to-help)
- [How We Use Git and GitHub](#how-we-use-git-and-github)
- [Forking](#forking)
- [Branching](#branching)
- [Commits and Merging](#commits-and-merging)
- [Rebasing and fixing merge conflicts](#rebasing-and-fixing-merge-conflicts)
- [What Goes Into a Pull Request](#what-goes-into-a-pull-request)
- [Contributing Code](#contributing-code)
- [Setting Up Your Development Environment](#setting-up-your-development-environment)
- [Increase node.js heap size](#increase-nodejs-heap-size)
- [Running Elasticsearch Locally](#running-elasticsearch-locally)
- [Nightly snapshot (recommended)](#nightly-snapshot-recommended)
- [Keeping data between snapshots](#keeping-data-between-snapshots)
- [Source](#source)
- [Archive](#archive)
- [Sample Data](#sample-data)
- [Running Elasticsearch Remotely](#running-elasticsearch-remotely)
- [Running remote clusters](#running-remote-clusters)
- [Running Kibana](#running-kibana)
- [Running Kibana in Open-Source mode](#running-kibana-in-open-source-mode)
- [Unsupported URL Type](#unsupported-url-type)
- [Customizing `config/kibana.dev.yml`](#customizing-configkibanadevyml)
- [Potential Optimization Pitfalls](#potential-optimization-pitfalls)
- [Setting Up SSL](#setting-up-ssl)
- [Linting](#linting)
- [Setup Guide for VS Code Users](#setup-guide-for-vs-code-users)
- [Internationalization](#internationalization)
- [Localization](#localization)
- [Styling with SASS](#styling-with-sass)
- [Testing and Building](#testing-and-building)
- [Debugging server code](#debugging-server-code)
- [Instrumenting with Elastic APM](#instrumenting-with-elastic-apm)
- [Unit testing frameworks](#unit-testing-frameworks)
- [Running specific Kibana tests](#running-specific-kibana-tests)
- [Debugging Unit Tests](#debugging-unit-tests)
- [Unit Testing Plugins](#unit-testing-plugins)
- [Automated Accessibility Testing](#automated-accessibility-testing)
- [Cross-browser compatibility](#cross-browser-compatibility)
- [Testing compatibility locally](#testing-compatibility-locally)
- [Running Browser Automation Tests](#running-browser-automation-tests)
- [Building OS packages](#building-os-packages)
- [Writing documentation](#writing-documentation)
- [Release Notes Process](#release-notes-process)
- [Signing the contributor license agreement](#signing-the-contributor-license-agreement)
- [Submitting a Pull Request](#submitting-a-pull-request)
- [Code Reviewing](#code-reviewing)
- [Getting to the Code Review Stage](#getting-to-the-code-review-stage)
- [Reviewing Pull Requests](#reviewing-pull-requests)
Don't fret, it's not as daunting as the table of contents makes it out to be!
## Effective issue reporting in Kibana
### Voicing the importance of an issue
We seriously appreciate thoughtful comments. If an issue is important to you, add a comment with a solid write up of your use case and explain why it's so important. Please avoid posting comments comprised solely of a thumbs up emoji 👍.
If you share your thoughts, we might even be able to come up with creative solutions to your specific problem. If everything you'd like to say has already been brought up but you'd still like to add a token of support, feel free to add a [👍 thumbs up reaction](https://github.com/blog/2119-add-reactions-to-pull-requests-issues-and-comments) on the issue itself and on the comment which best summarizes your thoughts.
### "My issue isn't getting enough attention"
First of all, **sorry about that!** We want you to have a great time with Kibana.
There are hundreds of open issues, and prioritizing what to work on is an important aspect of our daily jobs. We prioritize issues according to impact and difficulty, so some issues can be neglected while we work on more pressing ones.
Feel free to bump your issues if you think they've been neglected for a prolonged period.
### "I want to help!"
**Now we're talking**. If you have a bug fix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
We enjoy working with contributors to get their code accepted. There are many approaches to fixing a problem and it is important to find the best approach before writing too much code.
## How We Use Git and GitHub
### Forking
We follow the [GitHub forking model](https://help.github.com/articles/fork-a-repo/) for collaborating
on Kibana code. This model assumes that you have a remote called `upstream` which points to the
official Kibana repo, which we'll refer to in later code snippets.
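If you haven't configured that remote yet, a minimal one-time setup after cloning your fork looks like this:
```bash
# add the official repo as the `upstream` remote
git remote add upstream https://github.com/elastic/kibana.git
# verify: `origin` should point to your fork, `upstream` to elastic/kibana
git remote -v
```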
### Branching
* All work on the next major release goes into master.
* Past major release branches are named `{majorVersion}.x`. They contain work that will go into the next minor release. For example, if the next minor release is `5.2.0`, work for it should go into the `5.x` branch.
* Past minor release branches are named `{majorVersion}.{minorVersion}`. They contain work that will go into the next patch release. For example, if the next patch release is `5.3.1`, work for it should go into the `5.3` branch.
* All work is done on feature branches and merged into one of these branches.
* Where appropriate, we'll backport changes into older release branches.
### Commits and Merging
* Feel free to make as many commits as you want, while working on a branch.
* When submitting a PR for review, please perform an interactive rebase to present a logical history that's easy for the reviewers to follow (see the sketch after this list).
* Please use your commit messages to include helpful information on your changes, e.g. changes to APIs, UX changes, bugs fixed, and an explanation of *why* you made the changes that you did.
* Resolve merge conflicts by rebasing the target branch over your feature branch, and force-pushing (see below for instructions).
* When merging, we'll squash your commits into a single commit.
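For example, a minimal interactive-rebase sketch for tidying work-in-progress commits before review (assuming `master` as the base branch, as in the instructions below):
```bash
git rebase -i master
# in the editor that opens, leave the first commit as `pick` and change the
# follow-up fixup commits to `squash` (or `fixup` to discard their messages)
```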
#### Rebasing and fixing merge conflicts
Rebasing can be tricky, and fixing merge conflicts can be even trickier because it involves force pushing. This is all compounded by the fact that attempting to push a rebased branch remotely will be rejected by git, and you'll be prompted to do a `pull`, which is not at all what you should do (this will really mess up your branch's history).
Here's how you should rebase master onto your branch, and how to fix merge conflicts when they arise.
First, make sure master is up-to-date.
```
git checkout master
git fetch upstream
git rebase upstream/master
```
Then, check out your branch and rebase master on top of it, which will apply all of the new commits on master to your branch, and then apply all of your branch's new commits after that.
```
git checkout name-of-your-branch
git rebase master
```
You want to make sure there are no merge conflicts. If there are merge conflicts, git will pause the rebase and allow you to fix the conflicts before continuing.
You can use `git status` to see which files contain conflicts. They'll be the ones that aren't staged for commit. Open those files, and look for where git has marked the conflicts. Resolve the conflicts so that the changes you want to make to the code have been incorporated in a way that doesn't destroy work that's been done in master. Refer to master's commit history on GitHub if you need to gain a better understanding of how code is conflicting and how best to resolve it.
Once you've resolved all of the merge conflicts, use `git add -A` to stage them to be committed, and then use `git rebase --continue` to tell git to continue the rebase.
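In other words, after resolving each conflicted file:
```bash
git add -A
git rebase --continue
```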
When the rebase has completed, you will need to force push your branch because the history is now completely different than what's on the remote. **This is potentially dangerous** because it will completely overwrite what you have on the remote, so you need to be sure that you haven't lost any work when resolving merge conflicts. (If there weren't any merge conflicts, then you can force push without having to worry about this.)
```
git push origin name-of-your-branch --force
```
This will overwrite the remote branch with what you have locally. You're done!
**Note that you should not run `git pull`**, for example in response to a push rejection like this:
```
! [rejected] name-of-your-branch -> name-of-your-branch (non-fast-forward)
error: failed to push some refs to 'https://github.com/YourGitHubHandle/kibana.git'
hint: Updates were rejected because the tip of your current branch is behind
hint: its remote counterpart. Integrate the remote changes (e.g.
hint: 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.
```
Assuming you've successfully rebased and you're happy with the code, you should force push instead.
### What Goes Into a Pull Request
* Please include an explanation of your changes in your PR description.
* Links to relevant issues, external resources, or related PRs are very important and useful.
* Please update any tests that pertain to your code, and add new tests where appropriate.
* See [Submitting a Pull Request](#submitting-a-pull-request) for more info.
## Contributing Code
These guidelines will help you get your Pull Request into shape so that a code review can start as soon as possible.
### Setting Up Your Development Environment
Fork, then clone the `kibana` repo and change directory into it
```bash
git clone https://github.com/[YOUR_USERNAME]/kibana.git kibana
cd kibana
```
Install the version of Node.js listed in the `.node-version` file. This can be automated with tools such as [nvm](https://github.com/creationix/nvm), [nvm-windows](https://github.com/coreybutler/nvm-windows) or [avn](https://github.com/wbyoung/avn). As we also include a `.nvmrc` file you can switch to the correct version when using nvm by running:
```bash
nvm use
```
Install the latest version of [yarn](https://yarnpkg.com).
Bootstrap Kibana and install all the dependencies
```bash
yarn kbn bootstrap
```
> Node.js native modules could be in use and node-gyp is the tool used to build them. There are tools you need to install per platform and python versions you need to be using. Please see https://github.com/nodejs/node-gyp#installation and follow the guide according to your platform.
(You can also run `yarn kbn` to see the other available commands. For more info about this tool, see https://github.com/elastic/kibana/tree/master/packages/kbn-pm.)
When switching branches which use different versions of npm packages you may need to run:
```bash
yarn kbn clean
```
If you have failures during `yarn kbn bootstrap` you may have some corrupted packages in your yarn cache, which you can clean with:
```bash
yarn cache clean
```
#### Increase node.js heap size
Kibana is a big project and for some commands it can happen that the process hits the default heap limit and crashes with an out-of-memory error. If you run into this problem, you can increase maximum heap size by setting the `--max_old_space_size` option on the command line. To set the limit for all commands, simply add the following line to your shell config: `export NODE_OPTIONS="--max_old_space_size=2048"`.
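For example, in your `~/.bashrc` or `~/.zshrc` (the 2048 MB value comes from the paragraph above; adjust as needed):
```bash
# raise the default Node.js old-space heap limit to 2 GB
export NODE_OPTIONS="--max_old_space_size=2048"
```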
### Running Elasticsearch Locally
There are a few options when it comes to running Elasticsearch locally:
#### Nightly snapshot (recommended)
These snapshots are built nightly and expire after a couple of weeks. If you are running from an old, untracked branch, this snapshot might not exist, in which case you will need to run from source or an archive.
```bash
yarn es snapshot
```
##### Keeping data between snapshots
If you want to keep the data inside your Elasticsearch between usages of this command, you should use the following command to keep your data folder outside the downloaded snapshot folder:
```bash
yarn es snapshot -E path.data=../data
```
The same parameter can be used with the source and archive commands shown in the following paragraphs.
#### Source
By default, it will reference an [elasticsearch](https://github.com/elastic/elasticsearch) checkout named `elasticsearch`, located as a sibling of the Kibana directory. If you wish to use a checkout in another location, you can provide it by supplying `--source-path`.
```bash
yarn es source
```
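For example, with a checkout in a non-default location (the path is hypothetical):
```bash
yarn es source --source-path=/path/to/your/elasticsearch
```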
#### Archive
Use this if you already have a distributable. For released versions, one can be obtained on the [Elasticsearch downloads](https://www.elastic.co/downloads/elasticsearch) page.
```bash
yarn es archive <full_path_to_archive>
```
**Each of these will run Elasticsearch with a `basic` license. Additional options are available; pass `--help` for more information.**
##### Sample Data
If you're just getting started with Elasticsearch, you could use the following command to populate your instance with a few fake logs to hit the ground running.
```bash
node scripts/makelogs --auth <username>:<password>
```
> The default username and password combination are `elastic:changeme`
> Make sure to execute `node scripts/makelogs` *after* elasticsearch is up and running!
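For example, with the default credentials mentioned above:
```bash
node scripts/makelogs --auth elastic:changeme
```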
### Running Elasticsearch Remotely
You can save some system resources, and the effort of generating sample data, if you have a remote Elasticsearch cluster to connect to. (**Elasticians: you do! Check with your team about where to find credentials**)
You'll need to [create a `kibana.dev.yml`](#customizing-configkibanadevyml) and add the following to it:
```
elasticsearch.hosts:
- {{ url }}
elasticsearch.username: {{ username }}
elasticsearch.password: {{ password }}
elasticsearch.ssl.verificationMode: none
```
If many other users will be interacting with your remote cluster, you'll want to add the following to avoid causing conflicts:
```
kibana.index: '.{YourGitHubHandle}-kibana'
xpack.task_manager.index: '.{YourGitHubHandle}-task-manager-kibana'
```
### Running remote clusters
Set up remote clusters for cross-cluster search (CCS) and cross-cluster replication (CCR).
Start your primary cluster by running:
```bash
yarn es snapshot -E path.data=../data_prod1
```
Start your remote cluster by running:
```bash
yarn es snapshot -E transport.port=9500 -E http.port=9201 -E path.data=../data_prod2
```
Once both clusters are running, start Kibana. Kibana will connect to the primary cluster.
Set up the remote cluster in Kibana from either the `Management` -> `Elasticsearch` -> `Remote Clusters` UI or by running the following script in `Console`.
```
PUT _cluster/settings
{
  "persistent": {
    "cluster": {
      "remote": {
        "cluster_one": {
          "seeds": [
            "localhost:9500"
          ]
        }
      }
    }
  }
}
```
Follow the [cross-cluster search](https://www.elastic.co/guide/en/kibana/current/management-cross-cluster-search.html) instructions for setting up index patterns to search across clusters.
### Running Kibana
Change to your local Kibana directory.
Start the development server.
```bash
yarn start
```
> On Windows, you'll need to use Git Bash, Cygwin, or a similar shell that exposes the `sh` command. And to successfully build you'll need Cygwin optional packages zip, tar, and shasum.
Now you can point your web browser to http://localhost:5601 and start using Kibana! When running `yarn start`, Kibana will also log that it is listening on port 5603 due to the base path proxy, but you should still access Kibana on port 5601.
By default, you can log in with username `elastic` and password `changeme`. See the `--help` options on `yarn es <command>` if you'd like to configure a different password.
#### Running Kibana in Open-Source mode
If you're looking to only work with the open-source software, supply the license type to `yarn es`:
```bash
yarn es snapshot --license oss
```
And start Kibana with only open-source code:
```bash
yarn start --oss
```
#### Unsupported URL Type
If you're installing dependencies and seeing an error that looks something like
```
Unsupported URL Type: link:packages/eslint-config-kibana
```
you're likely running `npm`. To install dependencies in Kibana you need to run `yarn kbn bootstrap`. For more info, see [Setting Up Your Development Environment](#setting-up-your-development-environment) above.
#### Customizing `config/kibana.dev.yml`
The `config/kibana.yml` file stores user configuration directives. Since this file is checked into source control, however, developer preferences can't be saved without the risk of accidentally committing the modified version. To make customizing configuration easier during development, the Kibana CLI will look for a `config/kibana.dev.yml` file if run with the `--dev` flag. This file behaves just like the non-dev version and accepts any of the [standard settings](https://www.elastic.co/guide/en/kibana/current/settings.html).
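A minimal sketch of creating such a file (`server.port` is a standard setting; the value is illustrative):
```bash
# create config/kibana.dev.yml with a dev-only override; it is picked up
# when Kibana runs with --dev, which `yarn start` does by default
echo 'server.port: 5602' >> config/kibana.dev.yml
```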
#### Potential Optimization Pitfalls
- Webpack is trying to include a file in the bundle that I deleted and is now complaining that it is missing
- A module id that used to resolve to a single file now resolves to a directory, but webpack isn't adapting
- (if you discover other scenarios, please send a PR!)
#### Setting Up SSL
Kibana includes self-signed certificates that can be used for development purposes in the browser and for communicating with Elasticsearch: `yarn start --ssl` & `yarn es snapshot --ssl`.
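For example, running both with their self-signed certificates (the two commands from the sentence above):
```bash
yarn es snapshot --ssl   # Elasticsearch with TLS enabled
yarn start --ssl         # Kibana served over HTTPS
```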
### Linting
A note about linting: We use [eslint](http://eslint.org) to check that the [styleguide](STYLEGUIDE.md) is being followed. It runs in a pre-commit hook and as a part of the tests, but most contributors integrate it with their code editors for real-time feedback.
Here are some hints for getting eslint setup in your favorite editor:
Editor | Plugin
-----------|-------------------------------------------------------------------------------
Sublime | [SublimeLinter-eslint](https://github.com/roadhump/SublimeLinter-eslint#installation)
Atom | [linter-eslint](https://github.com/AtomLinter/linter-eslint#installation)
VSCode | [ESLint](https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint)
IntelliJ | Settings » Languages & Frameworks » JavaScript » Code Quality Tools » ESLint
`vi` | [scrooloose/syntastic](https://github.com/scrooloose/syntastic)
Another tool we use for enforcing consistent coding style is EditorConfig, which can be set up by installing a plugin in your editor that dynamically updates its configuration. Take a look at the [EditorConfig](http://editorconfig.org/#download) site to find a plugin for your editor, and browse our [`.editorconfig`](https://github.com/elastic/kibana/blob/master/.editorconfig) file to see what config rules we set up.
#### Setup Guide for VS Code Users
Note that for VSCode, to enable "live" linting of TypeScript (and other) file types, you will need to modify your local settings, as shown below. The default for the ESLint extension is to only lint JavaScript file types.
```json
"eslint.validate": [
"javascript",
"javascriptreact",
{ "language": "typescript", "autoFix": true },
{ "language": "typescriptreact", "autoFix": true }
]
```
`eslint` can automatically fix trivial lint errors when you save a file by adding this line to your settings.
```json
"eslint.autoFixOnSave": true,
```
:warning: It is **not** recommended to use the [`Prettier` extension/IDE plugin](https://prettier.io/) while maintaining the Kibana project. Formatting and styling rules are set in the multiple `.eslintrc.js` files across the project and some of them use the [NPM version of Prettier](https://www.npmjs.com/package/prettier). Using the IDE extension might cause conflicts, applying the formatting to too many files that shouldn't be prettier-ized and/or highlighting errors that are actually OK.
### Internationalization
All user-facing labels and info texts in Kibana should be internationalized. Please take a look at the [readme](packages/kbn-i18n/README.md) and the [guideline](packages/kbn-i18n/GUIDELINE.md) of the i18n package on how to do so.
In order to enable translations in the React parts of the application, the topmost component of every `ReactDOM.render` call should be the `Context` component from the `i18n` core service:
```jsx
const I18nContext = coreStart.i18n.Context;
ReactDOM.render(
<I18nContext>
{myComponentTree}
</I18nContext>,
container
);
```
There are a number of tools created to support internationalization in Kibana that allow you to validate internationalized labels,
extract them to a `JSON` file, or integrate translations back into Kibana. To learn more, please read the corresponding [readme](src/dev/i18n/README.md) file.
### Localization
We cannot support accepting contributions to the translations from any source other than the translators we have engaged to do the work.
We have yet to develop a proper process for accepting contributed translations. We certainly appreciate that people care enough about the localization effort to want to help improve the quality. We aim to build out a more comprehensive localization process for the future and will notify you once contributions can be supported, but for the time being, we are not able to incorporate suggestions.
### Styling with SASS
When writing a new component, create a sibling SASS file of the same name and import directly into the JS/TS component file. Doing so ensures the styles are never separated or lost on import and allows for better modularization (smaller individual plugin asset footprint).
All SASS (.scss) files will automatically build with the [EUI](https://elastic.github.io/eui/#/guidelines/sass) & Kibana invisibles (SASS variables, mixins, functions) from the [`globals_[theme].scss` file](src/legacy/ui/public/styles/_globals_v7light.scss).
**Example:**
```tsx
// component.tsx
import './component.scss';
export const Component = () => {
return (
<div className="plgComponent" />
);
}
```
```scss
// component.scss
.plgComponent { ... }
```
Do not use the underscore `_` SASS file naming pattern when importing directly into a JavaScript file.
### Testing and Building
To ensure that your changes will not break other functionality, please run the test suite and build process before submitting your Pull Request.
Before running the tests you will need to install the project's dependencies as described above.
Once that's done, just run:
```bash
yarn test && yarn build --skip-os-packages
```
You can get all build options using the following command:
```bash
yarn build --help
```
macOS users on a machine with a discrete graphics card may see significant speedups (up to 2x) when running tests by changing your terminal emulator's GPU settings. In iTerm2:
- Open Preferences (Command + ,)
- In the General tab, under the "Magic" section, ensure "GPU rendering" is checked
- Open "Advanced GPU Settings..."
- Uncheck the "Prefer integrated to discrete GPU" option
- Restart iTerm
#### Debugging Server Code
`yarn debug` will start the server with Node's inspect flag. Kibana's development mode will start three processes on ports `9229`, `9230`, and `9231`. Chrome's developer tools need to be configured to connect to all three connections. Add `localhost:<port>` for each Kibana process in Chrome's developer tools connection tab.
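For example:
```bash
yarn debug
# then, in Chrome's developer tools connection tab, add each process:
#   localhost:9229, localhost:9230, localhost:9231
```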
#### Instrumenting with Elastic APM
Kibana ships with the [Elastic APM Node.js Agent](https://github.com/elastic/apm-agent-nodejs) built-in for debugging purposes.
Its default configuration is meant to be used by core Kibana developers only, but it can easily be re-configured to your needs.
In its default configuration it's disabled and will, once enabled, send APM data to a centrally managed Elasticsearch cluster accessible only to Elastic employees.
To change the location where data is sent, use the [`serverUrl`](https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuration.html#server-url) APM config option.
To activate the APM agent, use the [`active`](https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuration.html#active) APM config option.
All config options can be set either via environment variables, or by creating an appropriate config file under `config/apm.dev.js`.
For more information about configuring the APM agent, please refer to [the documentation](https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuring-the-agent.html).
Example `config/apm.dev.js` file:
```js
module.exports = {
active: true,
};
```
The APM [Real User Monitoring agent](https://www.elastic.co/guide/en/apm/agent/rum-js/current/index.html) is not available in the Kibana distributables; however, the agent can be enabled by setting `ELASTIC_APM_ACTIVE` to `true`:
```bash
ELASTIC_APM_ACTIVE=true yarn start
# activates both the Node.js and RUM agents
```
Once the agent is active, it will trace all incoming HTTP requests to Kibana, monitor for errors, and collect process-level metrics.
The collected data will be sent to the APM Server and is viewable in the APM UI in Kibana.
#### Unit testing frameworks
Kibana is migrating unit testing from Mocha to Jest. Legacy unit tests still
exist in Mocha, but all new unit tests should be written in Jest. Mocha tests
are contained in `__tests__` directories, whereas Jest tests are stored in
the same directory as source code files, with the `.test.js` suffix.
#### Running specific Kibana tests
The following table outlines possible test file locations and how to invoke them:
| Test runner | Test location | Runner command (working directory is kibana root) |
| ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
| Jest | `src/**/*.test.js`<br>`src/**/*.test.ts` | `yarn test:jest -t regexp [test path]` |
| Jest (integration) | `**/integration_tests/**/*.test.js` | `yarn test:jest_integration -t regexp [test path]` |
| Mocha | `src/**/__tests__/**/*.js`<br>`!src/**/public/__tests__/*.js`<br>`packages/kbn-datemath/test/**/*.js`<br>`packages/kbn-dev-utils/src/**/__tests__/**/*.js`<br>`tasks/**/__tests__/**/*.js` | `node scripts/mocha --grep=regexp [test path]` |
| Functional | `test/*integration/**/config.js`<br>`test/*functional/**/config.js`<br>`test/accessibility/config.js` | `yarn test:ftr:server --config test/[directory]/config.js`<br>`yarn test:ftr:runner --config test/[directory]/config.js --grep=regexp` |
| Karma | `src/**/public/__tests__/*.js` | `yarn test:karma:debug` |
For X-Pack tests located in `x-pack/`, see [X-Pack Testing](x-pack/README.md#testing).
Test runner arguments:
- Where applicable, the optional arguments `-t=regexp` or `--grep=regexp` will only run tests or test suites whose descriptions match the regular expression.
- `[test path]` is the relative path to the test file.
Examples:
- Run the entire elasticsearch_service test suite:
```
yarn test:jest src/core/server/elasticsearch/elasticsearch_service.test.ts
```
- Run the jest test case whose description matches `stops both admin and data clients`:
```
yarn test:jest -t 'stops both admin and data clients' src/core/server/elasticsearch/elasticsearch_service.test.ts
```
- Run the api integration test case whose description matches the given string:
```
yarn test:ftr:server --config test/api_integration/config.js
yarn test:ftr:runner --config test/api_integration/config.js --grep='should return 404 if id does not match any sample data sets'
```
#### Debugging Unit Tests
The standard `yarn test` task runs several sub tasks and can take several minutes to complete, making debugging failures pretty painful. In order to ease the pain, specialized tasks provide alternate methods for running the tests.
You could also add the `--debug` option so that `node` is run using the `--debug-brk` flag. You'll need to connect a remote debugger such as [`node-inspector`](https://github.com/node-inspector/node-inspector) to proceed in this mode.
```bash
node scripts/mocha --debug <file>
```
With `yarn test:karma`, you can run only the browser tests. Coverage reports are available for browser tests by running `yarn test:coverage`. You can find the results under the `coverage/` directory that will be created upon completion.
```bash
yarn test:karma
```
Using `yarn test:karma:debug` initializes an environment for debugging the browser tests. It includes a dedicated instance of the Kibana server for building the test bundle, and a Karma server. When running this task, the build is optimized for the first time, and then a Karma-owned instance of the browser is opened. Click the "debug" button to open a new tab that executes the unit tests.
```bash
yarn test:karma:debug
```
In the screenshot below, you'll notice the URL is `localhost:9876/debug.html`. You can append a `grep` query parameter to this URL and set it to a string value which will be used to exclude tests which don't match. For example, if you changed the URL to `localhost:9876/debug.html?grep=my test` and then refreshed the browser, you'd only see tests run which contain "my test" in the test description.
![Browser test debugging](http://i.imgur.com/DwHxgfq.png)
#### Unit Testing Plugins
This should work super if you're using the [Kibana plugin generator](https://github.com/elastic/kibana/tree/master/packages/kbn-plugin-generator). If you're not using the generator, well, you're on your own. We suggest you look at how the generator works.
To run the tests for just your particular plugin run the following command from your plugin:
```bash
yarn test:mocha
yarn test:karma:debug # remove the debug flag to run them once and close
```
#### Automated Accessibility Testing
To run the tests locally:
1. In one terminal window run `node scripts/functional_tests_server --config test/accessibility/config.ts`
2. In another terminal window run `node scripts/functional_test_runner.js --config test/accessibility/config.ts`
To run the x-pack tests, swap the config file out for `x-pack/test/accessibility/config.ts`.
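That is, the same two commands with the x-pack config swapped in:
```bash
node scripts/functional_tests_server --config x-pack/test/accessibility/config.ts
node scripts/functional_test_runner.js --config x-pack/test/accessibility/config.ts
```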
After the server is up, you can go to this instance of Kibana at `localhost:5620`.
The testing is done using [axe](https://github.com/dequelabs/axe-core). The same checks that run in CI
can be run locally using their browser plugins:
- [Chrome](https://chrome.google.com/webstore/detail/axe-web-accessibility-tes/lhdoppojpmngadmnindnejefpokejbdd?hl=en-US)
- [Firefox](https://addons.mozilla.org/en-US/firefox/addon/axe-devtools/)
#### Cross-browser Compatibility
##### Testing Compatibility Locally
###### Testing IE on OS X
* [Download VMWare Fusion](http://www.vmware.com/products/fusion/fusion-evaluation.html).
* [Download IE virtual machines](https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/#downloads) for VMWare.
* Open VMWare and go to Window > Virtual Machine Library. Unzip the virtual machine and drag the .vmx file into your Virtual Machine Library.
* Right-click on the virtual machine you just added to your library and select "Snapshots...", and then click the "Take" button in the modal that opens. You can roll back to this snapshot when the VM expires in 90 days.
* In System Preferences > Sharing, change your computer name to be something simple, e.g. "computer".
* Run Kibana with `yarn start --host=computer.local` (substituting your computer name).
* Now you can run your VM, open the browser, and navigate to `http://computer.local:5601` to test Kibana.
* Alternatively, you can use BrowserStack.
##### Running Browser Automation Tests
[Read about the `FunctionalTestRunner`](https://www.elastic.co/guide/en/kibana/current/development-functional-tests.html) to learn more about how you can run and develop functional tests for Kibana core and plugins.
You can also look into the [Scripts README.md](./scripts/README.md) to learn more about using the node scripts we provide for building Kibana, running integration tests, and starting up Kibana and Elasticsearch while you develop.
### Building OS packages
Packages are built using fpm, dpkg, and rpm. Package building has only been tested on Linux and is not supported on any other platform.
```bash
apt-get install ruby-dev rpm
gem install fpm -v 1.5.0
yarn build --skip-archives
```
To specify a package to build you can add `rpm` or `deb` as an argument.
```bash
yarn build --rpm
```
Distributable packages can be found in `target/` after the build completes.
### Writing documentation
Kibana documentation is written in [asciidoc](http://asciidoc.org/) format in
the `docs/` directory.
To build the docs, clone the [elastic/docs](https://github.com/elastic/docs)
repo as a sibling of your Kibana repo. Follow the instructions in that project's
README for getting the docs tooling set up.
**To build the Kibana docs and open them in your browser:**
```bash
./docs/build_docs --doc kibana/docs/index.asciidoc --chunk 1 --open
```
or
```bash
node scripts/docs.js --open
```
### Release Notes Process
Part of this process only applies to maintainers, since it requires access to GitHub labels.
Kibana publishes [Release Notes](https://www.elastic.co/guide/en/kibana/current/release-notes.html) for major and minor releases. The Release Notes summarize what the PRs accomplish in language that is meaningful to users. To generate the Release Notes, the team runs a script against this repo to collect the merged PRs against the release.
#### Create the Release Notes text
The text that appears in the Release Notes is pulled directly from your PR title, or a single paragraph of text that you specify in the PR description.
To use a single paragraph of text, enter `Release note:` or a `## Release note` header in the PR description, followed by your text. For example, refer to this [PR](https://github.com/elastic/kibana/pull/65796) that uses the `## Release note` header.
When you create the Release Notes text, use the following best practices:
* Use present tense.
* Use sentence case.
* When you create a feature PR, start with `Adds`.
* When you create an enhancement PR, start with `Improves`.
* When you create a bug fix PR, start with `Fixes`.
* When you create a deprecation PR, start with `Deprecates`.
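Putting the header form and these practices together, a hypothetical PR description snippet (the wording is illustrative only):
```
## Release note
Fixes an issue where dashboard tooltips rendered behind panels.
```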
#### Add your labels
1. Label the PR with the targeted version (ex: `v7.3.0`).
2. Label the PR with the appropriate GitHub labels:
* For a new feature or functionality, use `release_note:enhancement`.
* For an external-facing fix, use `release_note:fix`. We do not include docs, build, and test fixes in the Release Notes, or unreleased issues that are only on `master`.
* For a deprecated feature, use `release_note:deprecation`.
* For a breaking change, use `release_note:breaking`.
* To **NOT** include your changes in the Release Notes, use `release_note:skip`.
We also produce a blog post that details more important breaking API changes in every major and minor release. When your PR includes a breaking API change, add the `release_note:dev_docs` label, and add a brief summary of the break at the bottom of the PR using the format below:
```
# Dev Docs
## Name the feature with the break (ex: Visualize Loader)
Summary of the change. Anything under `# Dev Docs` is used in the blog.
```
## Signing the contributor license agreement
Please make sure you have signed the [Contributor License Agreement](http://www.elastic.co/contributor-agreement/). We are not asking you to assign copyright to us, but to give us the right to distribute your code without restriction. We ask this of all contributors in order to assure our users of the origin and continuing existence of the code. You only need to sign the CLA once.
## Submitting a Pull Request
Push your local changes to your forked copy of the repository and submit a Pull Request. In the Pull Request, describe what your changes do and mention the number of the issue where discussion has taken place, e.g., "Closes #123".
Always submit your pull against `master` unless the bug is only present in an older version. If the bug affects both `master` and another branch, say so in your pull.
Then sit back and wait. There will probably be discussion about the Pull Request and, if any changes are needed, we'll work with you to get your Pull Request merged into Kibana.
## Code Reviewing
After a pull is submitted, it needs to get to review. If you have commit permission on the Kibana repo you will probably perform these steps while submitting your Pull Request. If not, a member of the Elastic organization will do them for you, though you can help by suggesting a reviewer for your changes if you've interacted with someone while working on the issue.
### Getting to the Code Review Stage
1. Assign the `review` label. This signals to the team that someone needs to give this attention.
1. Do **not** assign a version label. Someone from Elastic staff will assign a version label, if necessary, when your Pull Request is ready to be merged.
1. Find someone to review your pull. Don't just pick any yahoo, pick the right person. The right person might be the original reporter of the issue, but it might also be the person most familiar with the code you've changed. If neither of those things apply, or your change is small in scope, try to find someone on the Kibana team without a ton of existing reviews on their plate. As a rule, most pulls will require 2 reviewers, but the first reviewer will pick the 2nd.
### Reviewing Pull Requests
So, you've been assigned a pull to review. Check out our [pull request review guidelines](https://www.elastic.co/guide/en/kibana/master/pr-review.html) for our general philosophy for pull request reviewers.
Thank you so much for reading our guidelines! :tada:
Our developer guide is written in asciidoc and located under [./docs/developer](./docs/developer) if you want to make edits or access it in raw form.

config/node.options Normal file

@@ -0,0 +1,6 @@
## Node command line options
## See `node --help` and `node --v8-options` for available options
## Please note you should specify one option per line
## max size of old space in megabytes
#--max-old-space-size=4096


@@ -1,38 +0,0 @@
[[add-data-guide]]
== Add Data Guide
`Add Data` in the Kibana Home application contains tutorials for setting up data flows in the Elastic stack.
Each tutorial contains three sets of instructions:
* `On Premise.` Set up a data flow when both Kibana and Elasticsearch are running on premise.
* `On Premise Elastic Cloud.` Set up a data flow when Kibana is running on premise and Elasticsearch is running on Elastic Cloud.
* `Elastic Cloud.` Set up a data flow when both Kibana and Elasticsearch are running on Elastic Cloud.
[float]
=== Creating a new tutorial
1. Create a new directory in the link:https://github.com/elastic/kibana/tree/master/src/plugins/home/server/tutorials[tutorials directory].
2. In the new directory, create a file called `index.ts` that exports a function.
The function must return a function object that conforms to the `TutorialSchema` interface link:https://github.com/elastic/kibana/blob/master/src/plugins/home/server/services/tutorials/lib/tutorial_schema.ts[tutorial schema].
3. Register the tutorial in link:https://github.com/elastic/kibana/blob/master/src/plugins/home/server/tutorials/register.ts[register.ts] by adding it to the `builtInTutorials`.
// TODO update path once assets are migrated
4. Add image assets to the link:https://github.com/elastic/kibana/tree/master/src/legacy/core_plugins/kibana/public/home/tutorial_resources[tutorial_resources directory].
5. Run Kibana locally to preview the tutorial.
6. Create a PR and go through the review process to get the changes approved.
If you are creating a new plugin and the tutorial is only related to that plugin, you can also place the `TutorialSchema` object into your plugin folder. Add `home` to the `requiredPlugins` list in your `kibana.json` file.
Then register the tutorial object by calling `home.tutorials.registerTutorial(tutorialObject)` in the `setup` lifecycle of your server plugin.
[float]
==== Variables
String values can contain variables that are substituted when rendered. Variables are specified by `{}`.
For example: `{config.docs.version}` is rendered as `6.2` when running the tutorial in Kibana 6.2.
link:https://github.com/elastic/kibana/blob/master/src/legacy/core_plugins/kibana/public/home/np_ready/components/tutorial/replace_template_strings.js#L23[Provided variables]
[float]
==== Markdown
String values can contain limited Markdown syntax.
link:https://github.com/elastic/kibana/blob/master/src/legacy/core_plugins/kibana/public/home/components/tutorial/content.js#L8[Enabled Markdown grammars]


@@ -0,0 +1,18 @@
[[development-basepath]]
=== Considerations for basepath
In dev mode, {kib} by default runs behind a proxy which adds a random path component to its URL.
You can set this explicitly using `server.basePath` in <<settings>>.
Use the `server.rewriteBasePath` setting to tell {kib} if it should remove the basePath from requests it receives, and to prevent a deprecation warning at startup. The `server.basePath` setting cannot end in a slash (`/`).
If you want to turn off the basepath when in development mode, start {kib} with the `--no-basepath` flag:
[source,bash]
----
yarn start --no-basepath
----
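Conversely, a sketch of pinning the base path explicitly in `config/kibana.dev.yml` (both settings are standard; the path value is illustrative and must not end in a slash):
[source,bash]
----
cat >> config/kibana.dev.yml <<'EOF'
server.basePath: "/my-kibana"
server.rewriteBasePath: true
EOF
----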


@@ -1,7 +1,7 @@
[[development-es-snapshots]]
=== Daily Elasticsearch Snapshots
For local development and CI, Kibana, by default, uses Elasticsearch snapshots that are built daily when running tasks that require Elasticsearch (e.g. functional tests).
For local development and CI, {kib}, by default, uses Elasticsearch snapshots that are built daily when running tasks that require Elasticsearch (e.g. functional tests).
A snapshot is just a group of tarballs, one for each supported distribution/architecture/os of Elasticsearch, and a JSON-based manifest file containing metadata about the distributions.
@@ -9,13 +9,13 @@ https://ci.kibana.dev/es-snapshots[A dashboard] is available that shows the curr
==== Process Overview
1. Elasticsearch snapshots are built for each current tracked branch of Kibana.
1. Elasticsearch snapshots are built for each current tracked branch of {kib}.
2. Each snapshot is uploaded to a public Google Cloud Storage bucket, `kibana-ci-es-snapshots-daily`.
** At this point, the snapshot is not automatically used in CI or local development. It needs to be tested/verified first.
3. Each snapshot is tested with the latest commit of the corresponding Kibana branch, using the full CI suite.
3. Each snapshot is tested with the latest commit of the corresponding {kib} branch, using the full CI suite.
4. After CI
** If the snapshot passes, it is promoted and automatically used in CI and local development.
** If the snapshot fails, the issue must be investigated and resolved. A new incompatibility may exist between Elasticsearch and Kibana.
** If the snapshot fails, the issue must be investigated and resolved. A new incompatibility may exist between Elasticsearch and {kib}.
==== Using the latest snapshot
@@ -39,7 +39,7 @@ KBN_ES_SNAPSHOT_USE_UNVERIFIED=true node scripts/functional_tests_server
Currently, there is no way to run your pull request against the latest unverified snapshot without a code change. You can, however, do it with a small one:
1. Edit `Jenkinsfile` in the root of the Kibana repo
1. Edit `Jenkinsfile` in the root of the {kib} repo
2. Add `env.KBN_ES_SNAPSHOT_USE_UNVERIFIED = 'true'` at the top of the file.
3. Commit the change
@@ -75,13 +75,13 @@ The file structure for this bucket looks like this:
==== How snapshots are built, tested, and promoted
Each day, a https://kibana-ci.elastic.co/job/elasticsearch+snapshots+trigger/[Jenkins job] runs that triggers Elasticsearch builds for each currently tracked branch/version. This job is automatically updated with the correct branches whenever we release new versions of Kibana.
Each day, a https://kibana-ci.elastic.co/job/elasticsearch+snapshots+trigger/[Jenkins job] runs that triggers Elasticsearch builds for each currently tracked branch/version. This job is automatically updated with the correct branches whenever we release new versions of {kib}.
===== Build
https://kibana-ci.elastic.co/job/elasticsearch+snapshots+build/[This Jenkins job] builds the Elasticsearch snapshots and uploads them to GCS.
The Jenkins job pipeline definition is https://github.com/elastic/kibana/blob/master/.ci/es-snapshots/Jenkinsfile_build_es[in the kibana repo].
The Jenkins job pipeline definition is https://github.com/elastic/kibana/blob/master/.ci/es-snapshots/Jenkinsfile_build_es[in the {kib} repo].
1. Checkout Elasticsearch repo for the given branch/version.
2. Run `./gradlew -p distribution/archives assemble --parallel` to create all of the Elasticsearch distributions.
@@ -91,15 +91,15 @@ The Jenkins job pipeline definition is https://github.com/elastic/kibana/blob/ma
** e.g. `<version>/archives/<unique id>`
6. Replace `<version>/manifest-latest.json` in GCS with this newest manifest.
** This allows the `KBN_ES_SNAPSHOT_USE_UNVERIFIED` flag to work.
7. Trigger the verification job, to run the full Kibana CI test suite with this snapshot.
7. Trigger the verification job, to run the full {kib} CI test suite with this snapshot.
===== Verification and Promotion
https://kibana-ci.elastic.co/job/elasticsearch+snapshots+verify/[This Jenkins job] tests the latest Elasticsearch snapshot with the full Kibana CI pipeline, and promotes if it there are no test failures.
https://kibana-ci.elastic.co/job/elasticsearch+snapshots+verify/[This Jenkins job] tests the latest Elasticsearch snapshot with the full {kib} CI pipeline, and promotes if it there are no test failures.
The Jenkins job pipeline definition is https://github.com/elastic/kibana/blob/master/.ci/es-snapshots/Jenkinsfile_verify_es[in the kibana repo].
The Jenkins job pipeline definition is https://github.com/elastic/kibana/blob/master/.ci/es-snapshots/Jenkinsfile_verify_es[in the {kib} repo].
1. Checkout Kibana and set up CI environment as normal.
1. Checkout {kib} and set up CI environment as normal.
2. Set the `ES_SNAPSHOT_MANIFEST` env var to point to the latest snapshot manifest.
3. Run CI (functional tests, integration tests, etc).
4. After CI


@@ -0,0 +1,12 @@
[[advanced]]
== Advanced
* <<running-elasticsearch>>
* <<development-es-snapshots>>
* <<development-basepath>>
include::development-es-snapshots.asciidoc[]
include::running-elasticsearch.asciidoc[]
include::development-basepath.asciidoc[]


@@ -0,0 +1,118 @@
[[running-elasticsearch]]
=== Running Elasticsearch during development
There are many ways to run Elasticsearch while you are developing.
[float]
==== By snapshot
This will run a snapshot of Elasticsearch that is usually built nightly. Read more about <<development-es-snapshots>>.
[source,bash]
----
yarn es snapshot
----
See all available options, like how to specify a specific license, with the `--help` flag.
[source,bash]
----
yarn es snapshot --help
----
`trial` will give you access to all capabilities.
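For example, a sketch of starting with a `trial` license (the `--license` flag is among the options listed by `--help`):
[source,bash]
----
yarn es snapshot --license trial
----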
**Keeping data between snapshots**
If you want to keep the data inside your Elasticsearch between usages of this command, you should use the following command to keep your data folder outside the downloaded snapshot folder:
[source,bash]
----
yarn es snapshot -E path.data=../data
----
==== By source
If you have the Elasticsearch repo checked out locally and wish to run against that, use `source`. By default, it will reference an elasticsearch checkout named `elasticsearch`, located as a sibling of the {kib} directory. If you wish to use a checkout in another location, you can provide it by supplying `--source-path`.
[source,bash]
----
yarn es source
----
==== From an archive
Use this if you already have a distributable. For released versions, one can be obtained on the Elasticsearch downloads page.
[source,bash]
----
yarn es archive <full_path_to_archive>
----
Each of these will run Elasticsearch with a `basic` license. Additional options are available; pass `--help` for more information.
==== From a remote host
You can save some system resources, and the effort of generating sample data, if you have a remote Elasticsearch cluster to connect to. (Elasticians: you do! Check with your team about where to find credentials)
You'll need to create a kibana.dev.yml (<<customize-kibana-yml>>) and add the following to it:
[source,bash]
----
elasticsearch.hosts:
- {{ url }}
elasticsearch.username: {{ username }}
elasticsearch.password: {{ password }}
elasticsearch.ssl.verificationMode: none
----
If many other users will be interacting with your remote cluster, you'll want to add the following to avoid causing conflicts:
[source,bash]
----
kibana.index: '.{YourGitHubHandle}-kibana'
xpack.task_manager.index: '.{YourGitHubHandle}-task-manager-kibana'
----
===== Running remote clusters
Set up remote clusters for cross-cluster search (CCS) and cross-cluster replication (CCR).
Start your primary cluster by running:
[source,bash]
----
yarn es snapshot -E path.data=../data_prod1
----
Start your remote cluster by running:
[source,bash]
----
yarn es snapshot -E transport.port=9500 -E http.port=9201 -E path.data=../data_prod2
----
Once both clusters are running, start {kib}. {kib} will connect to the primary cluster.
Set up the remote cluster in {kib} from either the Management -> Elasticsearch -> Remote Clusters UI or by running the following script in Console.
[source,bash]
----
PUT _cluster/settings
{
  "persistent": {
    "cluster": {
      "remote": {
        "cluster_one": {
          "seeds": [
            "localhost:9500"
          ]
        }
      }
    }
  }
}
----
Follow the cross-cluster search instructions for setting up index patterns to search across clusters (<<management-cross-cluster-search>>).


@@ -0,0 +1,38 @@
[[add-data-tutorials]]
=== Add data tutorials
`Add Data` in the {kib} Home application contains tutorials for setting up data flows in the Elastic stack.
Each tutorial contains three sets of instructions:
* `On Premise.` Set up a data flow when both {kib} and Elasticsearch are running on premise.
* `On Premise Elastic Cloud.` Set up a data flow when {kib} is running on premise and Elasticsearch is running on Elastic Cloud.
* `Elastic Cloud.` Set up a data flow when both {kib} and Elasticsearch are running on Elastic Cloud.
[float]
==== Creating a new tutorial
1. Create a new directory in the link:https://github.com/elastic/kibana/tree/master/src/plugins/home/server/tutorials[tutorials directory].
2. In the new directory, create a file called `index.ts` that exports a function.
The function must return a function object that conforms to the `TutorialSchema` interface link:{kib-repo}tree/{branch}/src/plugins/home/server/services/tutorials/lib/tutorial_schema.ts[tutorial schema].
3. Register the tutorial in link:{kib-repo}tree/{branch}/src/plugins/home/server/tutorials/register.ts[register.ts] by adding it to the `builtInTutorials`.
// TODO update path once assets are migrated
4. Add image assets to the link:{kib-repo}tree/{branch}/src/legacy/core_plugins/kibana/public/home/tutorial_resources[tutorial_resources directory].
5. Run {kib} locally to preview the tutorial.
6. Create a PR and go through the review process to get the changes approved.
If you are creating a new plugin and the tutorial is only related to that plugin, you can also place the `TutorialSchema` object into your plugin folder. Add `home` to the `requiredPlugins` list in your `kibana.json` file.
Then register the tutorial object by calling `home.tutorials.registerTutorial(tutorialObject)` in the `setup` lifecycle of your server plugin.
[float]
===== Variables
String values can contain variables that are substituted when rendered. Variables are specified by `{}`.
For example: `{config.docs.version}` is rendered as `6.2` when running the tutorial in {kib} 6.2.
link:{kib-repo}tree/{branch}/src/legacy/core_plugins/kibana/public/home/np_ready/components/tutorial/replace_template_strings.js#L23[Provided variables]
[float]
===== Markdown
String values can contain limited Markdown syntax.
link:{kib-repo}tree/{branch}/src/legacy/core_plugins/kibana/public/home/components/tutorial/content.js#L8[Enabled Markdown grammars]


@@ -1,13 +1,13 @@
[[development-visualize-index]]
== Developing Visualizations
=== Developing Visualizations
[IMPORTANT]
==============================================
These pages document internal APIs and are not guaranteed to be supported across future versions of Kibana.
These pages document internal APIs and are not guaranteed to be supported across future versions of {kib}.
==============================================
The internal APIs for creating custom visualizations are in a state of heavy churn as
they are being migrated to the new Kibana platform, and large refactorings have been
they are being migrated to the new {kib} platform, and large refactorings have been
happening across minor releases in the `7.x` series. In particular, in `7.5` and later
we have made significant changes to the legacy APIs as we work to gradually replace them.
@@ -20,7 +20,7 @@ If you would like to keep up with progress on the visualizations plugin in the m
here are a few resources:
* The <<breaking-changes,breaking changes>> documentation, where we try to capture any changes to the APIs as they occur across minors.
* link:https://github.com/elastic/kibana/issues/44121[Meta issue] which is tracking the move of the plugin to the new Kibana platform
* link:https://github.com/elastic/kibana/issues/44121[Meta issue] which is tracking the move of the plugin to the new {kib} platform
* Our link:https://www.elastic.co/blog/join-our-elastic-stack-workspace-on-slack[Elastic Stack workspace on Slack].
* The {kib-repo}blob/{branch}/src/plugins/visualizations[source code], which will continue to be
the most accurate source of information.


@@ -0,0 +1,25 @@
[[kibana-architecture]]
== Architecture
[IMPORTANT]
==============================================
{kib} developer services and APIs are in a state of constant development. We cannot provide backwards compatibility at this time due to the high rate of change.
==============================================
Our developer services are changing all the time. One of the best ways to discover and learn about them is to read the available
READMEs from all the plugins inside our {kib-repo}tree/{branch}/src/plugins[open source plugins folder] and our
{kib-repo}/tree/{branch}/x-pack/plugins[commercial plugins folder].
A few services also automatically generate API documentation, which can be browsed inside the {kib-repo}tree/{branch}/docs/development[docs/development section of our repo].
A few notable services are called out below.
* <<development-security>>
* <<add-data-tutorials>>
* <<development-visualize-index>>
include::add-data-tutorials.asciidoc[]
include::development-visualize-index.asciidoc[]
include::security/index.asciidoc[]


@@ -1,13 +1,13 @@
[[development-plugin-feature-registration]]
=== Plugin feature registration
==== Plugin feature registration
If your plugin will be used with {kib}'s default distribution, then you have the ability to register the features that your plugin provides. Features are typically apps in {kib}; once registered, you can toggle them via Spaces, and secure them via Roles when security is enabled.
==== UI Capabilities
===== UI Capabilities
Registering features also gives your plugin access to “UI Capabilities”. These capabilities are boolean flags that you can use to conditionally render your interface, based on the current user's permissions. For example, you can hide or disable a Save button if the current user is not authorized.
==== Registering a feature
===== Registering a feature
Feature registration is controlled via the built-in `xpack_main` plugin. To register a feature, call `xpack_main`'s `registerFeature` function from your plugin's `init` function, and provide the appropriate details:
Registering a feature consists of several fields, for example `navLinkId`,
the ID of the navigation link associated with your feature. For more
information, consult the
{kib-repo}blob/{branch}/x-pack/plugins/features/server/feature_registry.ts[feature registry].
====== Privilege definition
The `privileges` section of feature registration allows plugins to implement read/write and read-only modes for their applications.
For a full explanation of fields and options, consult the {kib-repo}blob/{branch}/x-pack/plugins/features/server/feature_registry.ts[feature registry interface].
===== Using UI Capabilities
UI Capabilities are available to your public (client) plugin code. These capabilities are read-only, and are used to inform the UI. This object is namespaced by feature id. For example, if your feature id is “foo”, then your UI Capabilities are stored at `uiCapabilities.foo`.
To access capabilities, import them from `ui/capabilities`:
["source","javascript"]
-----------
import { uiCapabilities } from 'ui/capabilities';

const canUserSave = uiCapabilities.foo.save;
if (canUserSave) {
  // render the Save button
}
-----------
[[example-1-canvas]]
===== Example 1: Canvas Application
["source","javascript"]
-----------
init(server) {
  server.plugins.xpack_main.registerFeature({
    id: 'canvas',
    name: 'Canvas',
    navLinkId: 'canvas',
    app: ['canvas', 'kibana'],
    catalogue: ['canvas'],
    privileges: {
      all: {
        savedObject: { all: ['canvas-workpad'], read: ['index-pattern'] },
        ui: ['save'],
      },
      read: {
        savedObject: { all: [], read: ['index-pattern', 'canvas-workpad'] },
        ui: [],
      },
    },
  });
}
-----------
This shows how the Canvas application might register itself as a {kib} feature.
Note that it specifies different `savedObject` access levels for each privilege:
- Users with read/write access (`all` privilege) need to be able to read/write `canvas-workpad` saved objects, and they need read-only access to `index-pattern` saved objects.
- Users with read-only access (`read` privilege) do not need to have read/write access to any saved objects, but instead get read-only access to `index-pattern` and `canvas-workpad` saved objects.
Additionally, Canvas registers the `canvas` UI app and `canvas` catalogue entry. This tells {kib} that these entities are available for users with either the `read` or `all` privilege.
The `all` privilege defines a single “save” UI Capability. To access this in the UI, Canvas could:
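A sketch of what that check might look like, mirroring the earlier `ui/capabilities` example (the rendering logic is illustrative):

["source","javascript"]
-----------
import { uiCapabilities } from 'ui/capabilities';

const canUserSave = uiCapabilities.canvas.save;
if (canUserSave) {
  // render the Save button
}
-----------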
Because the `read` privilege does not define the `save` capability, users with read-only access will have their `uiCapabilities.canvas.save` flag set to `false`.
[[example-2-dev-tools]]
===== Example 2: Dev Tools
["source","javascript"]
-----------
@ -199,7 +199,7 @@ server.route({
-----------
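Privileges can also gate server functionality: a privilege may grant API access tags, and routes can then require those tags. A minimal sketch, assuming an illustrative `dev_tools` feature id, route path, and tag name:

["source","javascript"]
-----------
init(server) {
  server.plugins.xpack_main.registerFeature({
    id: 'dev_tools',
    name: 'Dev Tools',
    app: ['kibana'],
    catalogue: ['console'],
    privileges: {
      all: {
        api: ['console'],
        savedObject: { all: [], read: ['config'] },
        ui: ['show'],
      },
    },
  });

  // A route can then require the API access tag granted by the privilege above.
  server.route({
    path: '/api/console/proxy',
    method: 'POST',
    options: {
      tags: ['access:console'],
    },
    handler() {
      // ...
    },
  });
}
-----------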
[[example-3-discover]]
===== Example 3: Discover
Discover takes advantage of subfeature privileges to allow fine-grained access control. In this example,
a single "Create Short URLs" subfeature privilege is defined, which allows users to grant access to this feature without having to grant the `all` privilege to Discover. In other words, you can grant `read` access to Discover, and also grant the ability to create short URLs.
[[development-security]]
=== Security
{kib} has generally been able to implement security transparently to core and plugin developers, and this largely remains the case. {kib} relies on two methods that the Elasticsearch `Cluster` provides: `callWithRequest` and `callWithInternalUser`.
`callWithRequest` executes requests against Elasticsearch using the authentication credentials of the {kib} end-user. So, if you log into {kib} with the user of `foo`, when `callWithRequest` is used {kib} executes the request against Elasticsearch as the user `foo`. Historically, `callWithRequest` has been used extensively to perform actions that are initiated at the request of {kib} end-users.
`callWithInternalUser` executes requests against Elasticsearch using the internal {kib} server user, and has historically been used for performing actions that aren't initiated by {kib} end users; for example, creating the initial `.kibana` index or performing health checks against Elasticsearch.
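For illustration, a legacy-platform route handler might use both methods like this sketch (the route path and index name are illustrative):

["source","javascript"]
-----------
server.route({
  path: '/api/my_plugin/example',
  method: 'GET',
  async handler(request) {
    const { callWithRequest, callWithInternalUser } =
      server.plugins.elasticsearch.getCluster('data');

    // Runs against Elasticsearch as the authenticated end-user:
    const userVisible = await callWithRequest(request, 'search', {
      index: 'my-index',
    });

    // Runs as the internal Kibana server user, regardless of who is logged in:
    const health = await callWithInternalUser('cluster.health');

    return { userVisible, health };
  },
});
-----------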
However, with the changes that role-based access control (RBAC) introduces, this is no longer cut and dried. {kib} now requires that all access to the `.kibana` index go through the `SavedObjectsClient`. This used to be a best practice, as the `SavedObjectsClient` was responsible for translating the documents stored in Elasticsearch to and from Saved Objects, but RBAC now takes advantage of this abstraction to implement access control and to determine when to use `callWithRequest` versus `callWithInternalUser`.
include::rbac.asciidoc[]
[[development-security-rbac]]
==== Role-based access control
Role-based access control (RBAC) in {kib} relies upon the
{ref}/security-privileges.html#application-privileges[application privileges]
that {es} exposes. This enables {kib} to define the privileges it wishes to
grant to users, assign them to the relevant users using roles, and then
authorize the user to perform a specific action. This is handled within a
secured instance of the `SavedObjectsClient` and available transparently to
consumers when using `request.getSavedObjectsClient()` or
`savedObjects.getScopedSavedObjectsClient()`.
[[development-rbac-privileges]]
===== {kib} Privileges
When {kib} first starts up, it executes the following `POST` request against {es}. This synchronizes the definition of the privileges with various `actions` which are later used to authorize a user:
----------------------------------
POST /_security/privilege
Content-Type: application/json
Authorization: Basic kibana changeme
{
   "kibana-.kibana":{
       ...
   }
}
----------------------------------
The application is created by concatenating the prefix of `kibana-` with the value of `kibana.index` from the `kibana.yml`.
[[development-rbac-assigning-privileges]]
===== Assigning {kib} Privileges
{kib} privileges are assigned to specific roles using the `applications` element. For example, the following role assigns the <<kibana-privileges-all, all>> privilege at `*` `resources` (which will in the future be used to secure spaces) to the default {kib} `application`:
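Such a role might be created with a request like the following sketch (the role name is illustrative, and `kibana-.kibana` assumes the default `kibana.index`):

["source","js"]
----------------------------------
PUT /_security/role/my_kibana_user
{
  "applications": [
    {
      "application": "kibana-.kibana",
      "privileges": ["all"],
      "resources": ["*"]
    }
  ]
}
----------------------------------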
Roles that grant <<kibana-privileges>> should be managed using the <<role-management-api>> or the
{ref}/security-api.html#security-user-apis[user management APIs].
[[development-rbac-authorization]]
===== Authorization
The {es} {ref}/security-api-has-privileges.html[has privileges API] determines whether the user is authorized to perform a specific action:
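The request is shaped roughly like the following sketch (the action shown is illustrative):

["source","js"]
----------------------------------
POST /_security/user/_has_privileges
{
  "applications": [
    {
      "application": "kibana-.kibana",
      "resources": ["*"],
      "privileges": ["action:login"]
    }
  ]
}
----------------------------------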
[[development-best-practices]]
== Best practices
Consider these best practices, whether developing code directly to the {kib} repo or building your own plugins.
They are intended to support our https://github.com/elastic/engineering/blob/master/kibana_dev_principles.md[{kib} development principles].
[float]
=== Performance
Are you planning with scalability in mind?
* Consider data with many fields
* Consider data with high cardinality fields
* Consider large data sets that span a long time range
* Do you make lots of requests to the server?
** If so, have you considered using the streaming {kib-repo}tree/{branch}/src/plugins/bfetch[bfetch service]?
[float]
=== Accessibility
Did you know {kib} makes a public statement about our commitment to
creating an accessible product for people with disabilities?
https://www.elastic.co/guide/en/kibana/master/accessibility.html[We do]!
It's very important that all of our apps are accessible.
* Learn how https://elastic.github.io/eui/#/guidelines/accessibility[EUI
tackles accessibility]
* If you don't use EUI, follow the same EUI accessibility standards
[[kibana-localization-best-practices]]
[float]
=== Localization
{kib} is translated into other languages. Use our i18n utilities to
ensure your public-facing strings will be translated, so that all
{kib} apps are localized.
* Read and adhere to our
{kib-repo}blob/{branch}/packages/kbn-i18n/GUIDELINE.md[i18n
guidelines]
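For instance, a public-facing string is declared through the i18n utilities rather than hard-coded; a minimal sketch (the message id and default message are illustrative):

["source","javascript"]
-----------
import { i18n } from '@kbn/i18n';

const title = i18n.translate('myPlugin.dashboard.title', {
  defaultMessage: 'My dashboard',
});
-----------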
[float]
=== Conventions
* Become familiar with our
{kib-repo}blob/{branch}/STYLEGUIDE.md[styleguide]
(use TypeScript!)
* Write all new code on
{kib-repo}blob/{branch}/src/core/README.md[the
platform], and following
{kib-repo}blob/{branch}/src/core/CONVENTIONS.md[conventions]
* _Always_ use the `SavedObjectsClient` for reading and writing Saved
Objects (see the sketch after this list).
* Add `README`s to all your plugins and services.
* Make your public APIs as small as possible. You will have to maintain
them, and consider backward compatibility when making any changes to
them.
* Use https://elastic.github.io/eui[EUI] for all your basic UI
components to create a consistent UI experience.
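As an illustration of the Saved Objects convention above, reads and writes go through the scoped client rather than querying the `.kibana` index directly. A sketch, assuming a {kib}-platform route handler context (the saved object type and id are illustrative):

["source","javascript"]
-----------
async function getDashboardTitle(context, id) {
  // The scoped client applies the current user's permissions to every call.
  const savedObjectsClient = context.core.savedObjects.client;
  const dashboard = await savedObjectsClient.get('dashboard', id);
  return dashboard.attributes.title;
}
-----------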
[float]
=== Re-inventing the wheel
Over-refactoring can be a problem in its own right, but it's still
important to be aware of the existing services that are out there and
use them when it makes sense. We have service oriented teams dedicated
to providing our solution developers the tools needed to iterate faster.
They take care of the nitty gritty so you can focus on creative
solutions to your particular problem sphere. Some examples of common
services you should consider:
* {kib-repo}tree/{branch}/src/plugins/data/README.md[Data
services]
** {kib-repo}tree/{branch}/src/plugins/data/public/search/README.md[Search
strategies]
*** Use the `esSearchStrategy` to make raw queries to ES that will
support async searching and partial results, as well as injecting the
right advanced settings like whether to include frozen indices or not.
* {kib-repo}tree/{branch}/src/plugins/embeddable/README.md[Embeddables]
** Rendering maps, visualizations, dashboards in your application
** Register new widgets that can be added to a dashboard or Canvas
workpad, or rendered in another plugin.
* {kib-repo}tree/{branch}/src/plugins/ui_actions/README.md[UiActions]
** Let other plugins inject functionality into your application
** Inject custom functionality into other plugins
* Stateless helper utilities
** {kib-repo}tree/{branch}/src/plugins/kibana_utils/docs/state_sync/README.md[state
syncing] and
{kib-repo}tree/{branch}/src/plugins/kibana_utils/docs/state_containers/README.md[state
container] utilities provided by
{kib-repo}tree/{branch}/src/plugins/kibana_utils/README.md[kibana_utils],
for example if you want to sync your application state to the URL (see the
sketch below)
** {kib-repo}tree/{branch}/src/plugins/kibana_react/README.md[kibana_react]
for react specific helpers
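For example, the state syncing utilities can keep a state container in sync with the URL. A minimal sketch (the import path, storage key, and state shape are illustrative):

["source","javascript"]
-----------
import {
  createStateContainer,
  createKbnUrlStateStorage,
  syncState,
} from '../../src/plugins/kibana_utils/public';

const stateContainer = createStateContainer({ filter: '' });
const stateStorage = createKbnUrlStateStorage();

// Keep the `_a` query parameter in sync with the container's state.
const { start, stop } = syncState({
  storageKey: '_a',
  stateContainer,
  stateStorage,
});
start();
-----------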
Re-using these services will help create a consistent experience across
{kib} from every solution.
[float]
=== Backward compatibility
Eventually we want to guarantee to our plugin developers that their plugins will not break from minor to minor.
Any time you create or change a public API, keep this in mind, and consider potential
backward compatibility issues. While we have a formal
{kib-repo}tree/{branch}/src/core/server/saved_objects/migrations/README.md[saved
object migration system] and are working on adding a formal state migration system, introducing state changes and migrations in a
minor always comes with a risk. Consider this before making huge and
risky changes in minors, _especially_ to saved objects.
* Are you persisting state from registries? Consider what will happen if
the author of the implementation changed their interfaces.
* Are you adding implementations to registries? Consider that someone
may be persisting your data, and that making changes to your public
interfaces can break their code.
Be very careful when changing the shape of saved objects or persistable
data.
Saved object exported from past {kib} versions should continue to work.
In addition, if users are relying on state stored in your apps URL as
part of your public contract, keep in mind that you may also need to
provide backwards compatibility for bookmarked URLs.
[float]
=== Testing & stability
Review:
* <<development-unit-tests>>
* <<stability>>
* <<security-best-practices>>
include::stability.asciidoc[]
include::security.asciidoc[]
[[security-best-practices]]
=== Security best practices
* XSS
** Check for usages of `dangerouslySetInnerHtml`, `Element.innerHTML`,
`Element.outerHTML`
** Ensure all user input is properly escaped.
** Ensure any input in `$.html`, `$.append`, `$.appendTo`,
`$.prepend`, or `$.prependTo` is escaped. Instead use `$.text`, or
don't use jQuery at all.
* CSRF
** Ensure all APIs are running inside the {kib} HTTP service.
* RCE
** Ensure no usages of `eval`
** Ensure no usages of dynamic requires
** Check for template injection
** Check for usages of templating libraries, including `_.template`, and
ensure that user-provided input isn't influencing the template and is
only used as data for rendering the template.
** Check for possible prototype pollution.
* Prototype Pollution
** Check for instances of `anObject[a][b] = c` where `a`, `b`, and `c` are
user-defined. This includes code paths where the following logical code
steps could be performed in separate files by completely different
operations, or recursively using dynamic operations (a minimal guard is
sketched at the end of this section).
** Validate any user input, including API
url-parameters/query-parameters/payloads, preferably against a schema
which only allows specific keys/values. At a very minimum, black-list
`__proto__` and `prototype.constructor` for use within keys
** When calling APIs which spawn new processes or potentially perform
code generation from strings, defensively protect against Prototype
Pollution by checking `Object.hasOwnProperty` if the arguments to the
APIs originate from an Object. An example is the Code app's
https://github.com/elastic/kibana/blob/b49192626a8528af5d888545fb14cd1ce66a72e7/x-pack/legacy/plugins/code/server/lsp/workspace_command.ts#L40-L44[spawnProcess].
*** Common Node.js offenders: `child_process.spawn`,
`child_process.exec`, `eval`, `Function('some string')`,
`vm.runIn*Context(x)`
*** Common Client-side offenders: `eval`, `Function('some string')`,
`setTimeout('some string', num)`, `setInterval('some string', num)`
* Check for accidental reveal of sensitive information
** The biggest culprit is errors which contain stack traces or other
sensitive information which end up in the HTTP Response
* Check for mishandled API requests
** Ensure no sensitive cookies are forwarded to external resources.
** Ensure that all user controllable variables that are used in
constructing a URL are escaped properly. This is relevant when using
`transport.request` with the Elasticsearch client as no automatic
escaping is performed.
* Reverse tabnabbing -
https://github.com/OWASP/CheatSheetSeries/blob/master/cheatsheets/HTML5_Security_Cheat_Sheet.md#tabnabbing
** When there are user controllable links or hard-coded links to
third-party domains that specify target="_blank" or target="_window", the a tag should have the rel="noreferrer noopener" attribute specified.
Allowing users to input markdown is a common culprit; a custom link renderer should be used.
* SSRF - https://www.owasp.org/index.php/Server_Side_Request_Forgery
** All network requests made from the {kib} server should use an explicit
configuration or white-list specified in the `kibana.yml`.
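To illustrate the prototype pollution guard mentioned above, a minimal sketch (function and key names are illustrative):

["source","javascript"]
-----------
const FORBIDDEN_KEYS = ['__proto__', 'prototype', 'constructor'];

function safeSet(target, outerKey, innerKey, value) {
  // Reject keys that could reach Object.prototype.
  if (FORBIDDEN_KEYS.includes(outerKey) || FORBIDDEN_KEYS.includes(innerKey)) {
    throw new Error(`Forbidden key in assignment: ${outerKey}.${innerKey}`);
  }
  // Only write through own properties, never inherited ones.
  if (!Object.prototype.hasOwnProperty.call(target, outerKey)) {
    target[outerKey] = {};
  }
  target[outerKey][innerKey] = value;
}
-----------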
[[stability]]
=== Stability
Ensure your feature will work under all possible {kib} scenarios.
[float]
==== Environmental configuration scenarios
* Cloud
** Does the feature work on a *cloud environment*?
** Does it create a setting that needs to be exposed, or configured
differently than the default, on Cloud? (whitelisting of certain
settings/users? See
https://www.elastic.co/guide/en/cloud/current/ec-add-user-settings.html and
https://www.elastic.co/guide/en/cloud/current/ec-manage-kibana-settings.html)
** Is there a significant performance impact that may affect Cloud
{kib} instances?
** Does it need to be aware of running in a container? (for example
monitoring)
* Multiple {kib} instances
** Pointing to the same index
** Pointing to different indexes
*** Should make sure that the {kib} index is not hardcoded anywhere.
*** Should not be storing a bunch of stuff in {kib} memory.
*** Should emulate a high availability deployment.
*** Anticipating different timing related issues due to shared resource
access.
*** We need to make sure security is set up in a specific way for
non-standard {kib} indices. (create their own custom roles)
* {kib} running behind a reverse proxy or load balancer, without sticky
sessions. (we have had many discuss/SDH tickets around this)
* If a proxy/loadbalancer is running between ES and {kib}
[float]
==== `kibana.yml` settings
* Using a custom {kib} index alias
* When optional dependencies are disabled
** Ensure all your required dependencies are listed in the `kibana.json`
dependency list!
[float]
==== Test coverage
* Does the feature have sufficient unit test coverage? (does it handle
storeinSessions?)
* Does the feature have sufficient Functional UI test coverage?
* Does the feature have sufficient REST API test coverage?
* Does the feature have sufficient Integration test coverage?
[float]
==== Browser coverage
Refer to the list of browsers and operating systems {kib} supports:
https://www.elastic.co/support/matrix

Does the feature work efficiently on the list of supported browsers?
[float]
==== Upgrade and migration scenarios

* Does the feature affect old indices or saved objects?
* Has the feature been tested with {kib} aliases?
* Are the read/write privileges of the indices the same before and after the
upgrade?
[[development-accessibility-tests]]
==== Automated Accessibility Testing
To run the tests locally:
[arabic]
. In one terminal window run
`node scripts/functional_tests_server --config test/accessibility/config.ts`
. In another terminal window run
`node scripts/functional_test_runner.js --config test/accessibility/config.ts`
To run the x-pack tests, swap the config file out for
`x-pack/test/accessibility/config.ts`.
After the server is up, you can go to this instance of {kib} at
`localhost:5620`.
The testing is done using https://github.com/dequelabs/axe-core[axe].
The same thing that runs in CI can be run locally using their browser
plugins:
* https://chrome.google.com/webstore/detail/axe-web-accessibility-tes/lhdoppojpmngadmnindnejefpokejbdd?hl=en-US[Chrome]
* https://addons.mozilla.org/en-US/firefox/addon/axe-devtools/[Firefox]
[[development-documentation]]
=== Documentation during development
Docs should be written during development and accompany PRs when relevant. There are multiple types of documentation, and different places to add each.
[float]
==== Developer services documentation
Documentation about specific services a plugin offers should be encapsulated in:
* README.asciidoc at the base of the plugin folder.
* Typescript comments for all public services.
[float]
==== End user documentation
Documentation about user facing features should be written in http://asciidoc.org/[asciidoc] at
{kib-repo}tree/master/docs[https://github.com/elastic/kibana/tree/master/docs]
To build the docs, you must clone the https://github.com/elastic/docs[elastic/docs]
repo as a sibling of your {kib} repo. Follow the instructions in that project's
README for getting the docs tooling set up.
**To build the docs:**

["source","bash"]
-----------
node scripts/docs.js --open
-----------
[float]
==== General developer documentation and guidelines
General developer guidelines and documentation, like this right here, should be written in http://asciidoc.org/[asciidoc]
at {kib-repo}tree/master/docs/developer[https://github.com/elastic/kibana/tree/master/docs/developer]
[[development-functional-tests]]
=== Functional Testing
We use functional tests to make sure the {kib} UI works as expected. It replaces hours of manual testing by automating user interaction. To have better control over our functional test environment, and to make it more accessible to plugin authors, {kib} uses a tool called the `FunctionalTestRunner`.
[float]
==== Running functional tests
The `FunctionalTestRunner` is very bare bones and gets most of its functionality from its config file, located at {blob}test/functional/config.js[test/functional/config.js]. If you're writing a plugin outside the {kib} repo, you will have your own config file.
See <<external-plugin-functional-tests>> for more info.
There are three ways to run the tests depending on your goals:
1. Easiest option:
** Description: Starts up {kib} & Elasticsearch servers, followed by running tests. This is much slower when running the tests multiple times because of the slow startup time for the servers. Recommended for single runs.
** `node scripts/functional_tests`
*** does everything in a single command, including running Elasticsearch and {kib} locally
*** tears down everything after the tests run
*** exit code reports success/failure of the tests
2. Best for development:
** Description: Two commands, run in separate terminals, separate the components that are long-running and slow from those that are ephemeral and fast. Tests can be re-run much faster, and this still runs Elasticsearch & {kib} locally.
** `node scripts/functional_tests_server`
*** starts Elasticsearch and {kib} servers
*** slow to start
*** can be reused for multiple executions of the tests, thereby saving some time when re-running tests
*** automatically restarts the {kib} server when relevant changes are detected
** `node scripts/functional_test_runner`
*** runs the tests against {kib} & Elasticsearch servers that were started by `node scripts/functional_tests_server`
*** exit code reports success or failure of the tests
3. Custom option:
** Description: Runs tests against instances of Elasticsearch & {kib} started some other way (like Elastic Cloud, or an instance you are managing in some other way).
** just executes the functional tests
** url, credentials, etc. for Elasticsearch and {kib} are specified via environment variables
** Here's an example that runs against an Elastic Cloud instance. Note that you must run the same branch of tests as the version of {kib} you're testing.
+
["source","shell"]
----------
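# These TEST_* environment variable names are what the FunctionalTestRunner
# reads; the host names and credentials below are illustrative placeholders.
export TEST_KIBANA_PROTOCOL=https
export TEST_KIBANA_HOSTNAME=my-kibana-instance.internal.net
export TEST_KIBANA_PORT=443
export TEST_KIBANA_USER=kibana
export TEST_KIBANA_PASS=password

export TEST_ES_PROTOCOL=https
export TEST_ES_HOSTNAME=my-es-cluster.internal.net
export TEST_ES_PORT=9243
export TEST_ES_USER=elastic
export TEST_ES_PASS=password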
node scripts/functional_test_runner --exclude-tag skipCloud
----------
When run without any arguments the `FunctionalTestRunner` automatically loads the configuration in the standard location, but you can override that behavior with the `--config` flag. List configs with multiple --config arguments.
* `--config test/functional/config.js` starts Elasticsearch and {kib} servers with the WebDriver tests configured to run in Chrome.
* `--config test/functional/config.firefox.js` starts Elasticsearch and {kib} servers with the WebDriver tests configured to run in Firefox.
* `--config test/api_integration/config.js` starts Elasticsearch and {kib} servers with the api integration tests configuration.
* `--config test/accessibility/config.ts` starts Elasticsearch and {kib} servers with the WebDriver tests configured to run an accessibility audit using https://www.deque.com/axe/[axe].
There are also command line flags for `--bail` and `--grep`, which behave just like their mocha counterparts. For instance, use `--grep=foo` to run only tests that match a regular expression.
The tests are written in https://mochajs.org[mocha] using {kib-repo}tree/{branch}/packages/kbn-expect[@kbn/expect] for assertions.
We use https://www.w3.org/TR/webdriver1/[WebDriver Protocol] to run tests in both Chrome and Firefox with the help of https://sites.google.com/a/chromium.org/chromedriver/[chromedriver] and https://firefox-source-docs.mozilla.org/testing/geckodriver/[geckodriver]. When the `FunctionalTestRunner` launches, remote service creates a new webdriver session, which starts the driver and a stripped-down browser instance. We use `browser` service and `webElementWrapper` class to wrap up https://seleniumhq.github.io/selenium/docs/api/javascript/module/selenium-webdriver/[Webdriver API].
The `FunctionalTestRunner` automatically transpiles functional tests using babel, so that tests can use the same ECMAScript features that {kib} source code uses. See {blob}style_guides/js_style_guide.md[style_guides/js_style_guide.md].
[float]
===== Definitions
The `FunctionalTestRunner` comes with three built-in services:
* Phases include: `beforeLoadTests`, `beforeTests`, `beforeEachTest`, `cleanup`
[float]
===== {kib} Services
The {kib} functional tests define the vast majority of the actual functionality used by tests.
**browser**:::
* Source: {blob}test/functional/services/browser.ts[test/functional/services/browser.ts]
**kibanaServer:**:::
* Source: {blob}test/common/services/kibana_server/kibana_server.js[test/common/services/kibana_server/kibana_server.js]
* Helpers for interacting with {kib}'s server
* Commonly used methods:
** `kibanaServer.uiSettings.update()`
** `kibanaServer.version.get()`
["source","javascript"]
-----------
const log = getService('log');

// log.debug only writes when using the `--debug` or `--verbose` flag.
log.debug('done clicking menu');
-----------
[float]
==== macOS testing performance tip

macOS users on a machine with a discrete graphics card may see significant speedups (up to 2x) when running tests by changing their terminal emulator's GPU settings. In iTerm2:
* Open Preferences (Command + ,)
* In the General tab, under the "Magic" section, ensure "GPU rendering" is checked
* Open "Advanced GPU Settings..."
* Uncheck the "Prefer integrated to discrete GPU" option
* Restart iTerm
[[development-github]]
=== How we use Git and GitHub
[float]
==== Forking
We follow the https://help.github.com/articles/fork-a-repo/[GitHub
forking model] for collaborating on {kib} code. This model assumes that
you have a remote called `upstream` which points to the official {kib}
repo, which we'll refer to in later code snippets.
[float]
==== Branching
* All work on the next major release goes into master.
* Past major release branches are named `{majorVersion}.x`. They contain
work that will go into the next minor release. For example, if the next
minor release is `5.2.0`, work for it should go into the `5.x` branch.
* Past minor release branches are named `{majorVersion}.{minorVersion}`.
They contain work that will go into the next patch release. For example,
if the next patch release is `5.3.1`, work for it should go into the
`5.3` branch.
* All work is done on feature branches and merged into one of these
branches.
* Where appropriate, we'll backport changes into older release branches.
[float]
==== Commits and Merging
* Feel free to make as many commits as you want, while working on a
branch.
* When submitting a PR for review, please perform an interactive rebase
to present a logical history that's easy for the reviewers to follow.
* Please use your commit messages to include helpful information on your
changes, e.g. changes to APIs, UX changes, bugs fixed, and an
explanation of _why_ you made the changes that you did.
* Resolve merge conflicts by rebasing the target branch over your
feature branch, and force-pushing (see below for instructions).
* When merging, we'll squash your commits into a single commit.
[float]
===== Rebasing and fixing merge conflicts
Rebasing can be tricky, and fixing merge conflicts can be even trickier
because it involves force pushing. This is all compounded by the fact
that attempting to push a rebased branch remotely will be rejected by
git, and you'll be prompted to do a `pull`, which is not at all what you
should do (this will really mess up your branch's history).
Here's how you should rebase master onto your branch, and how to fix
merge conflicts when they arise.
First, make sure master is up-to-date.
["source","shell"]
-----------
git checkout master
git fetch upstream
git rebase upstream/master
-----------
Then, check out your branch and rebase master on top of it, which will
apply all of the new commits on master to your branch, and then apply
all of your branch's new commits after that.
["source","shell"]
-----------
git checkout name-of-your-branch
git rebase master
-----------
You want to make sure there are no merge conflicts. If there are merge
conflicts, git will pause the rebase and allow you to fix the conflicts
before continuing.
You can use `git status` to see which files contain conflicts. They'll
be the ones that aren't staged for commit. Open those files, and look
for where git has marked the conflicts. Resolve the conflicts so that
the changes you want to make to the code have been incorporated in a way
that doesn't destroy work that's been done in master. Refer to master's
commit history on GitHub if you need to gain a better understanding of how code is conflicting and how best to resolve it.
Once you've resolved all of the merge conflicts, use `git add -A` to stage them to be committed, and then use
`git rebase --continue` to tell git to continue the rebase.
When the rebase has completed, you will need to force push your branch because the history is now completely different than what's on the remote. This is potentially dangerous because it will completely overwrite what you have on the remote, so you need to be sure that you haven't lost any work when resolving merge conflicts. (If there weren't any merge conflicts, then you can force push without having to worry about this.)
["source","shell"]
-----------
git push origin name-of-your-branch --force
-----------
This will overwrite the remote branch with what you have locally. You're done!
**Note that you should not run git pull**, for example in response to a push rejection like this:
["source","shell"]
-----------
! [rejected] name-of-your-branch -> name-of-your-branch (non-fast-forward)
error: failed to push some refs to 'https://github.com/YourGitHubHandle/kibana.git'
hint: Updates were rejected because the tip of your current branch is behind
hint: its remote counterpart. Integrate the remote changes (e.g.
hint: 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.
-----------
Assuming you've successfully rebased and you're happy with the code, you should force push instead.
[float]
==== Creating a pull request
See <<development-pull-request>> for the next steps on getting your code changes merged into {kib}.
[[development-pull-request]]
=== Submitting a pull request
[float]
==== What Goes Into a Pull Request
* Please include an explanation of your changes in your PR description.
* Links to relevant issues, external resources, or related PRs are very important and useful.
* Please update any tests that pertain to your code, and add new tests where appropriate.
* Update or add docs when appropriate. Read more about <<development-documentation>>.
[float]
==== Submitting a Pull Request
1. Push your local changes to your forked copy of the repository and submit a pull request.
2. Describe what your changes do and mention the number of the issue where discussion has taken place, e.g., “Closes #123”.
3. Assign the `review` and `💝community` labels (assuming you are not a member of the Elastic organization). This signals to the team that someone needs to give this attention.
4. Do *not* assign a version label. Someone from Elastic staff will assign a version label, if necessary, when your Pull Request is ready to be merged.
5. If you would like someone specific to review your pull request, assign them. Otherwise an Elastic staff member will assign the appropriate person.
Always submit your pull request against master unless the bug is only present in an older version. If the bug affects both master and another branch, say so in your pull request.
Then sit back and wait. There will probably be discussion about the Pull Request and, if any changes are needed, we'll work with you to get your Pull Request merged into {kib}.
[float]
==== What to expect during the pull request review process
Most PRs go through several iterations of feedback and updates. Depending on the scope and complexity of the PR, the process can take weeks. Please
be patient and understand we hold our code base to a high standard.
Check out our <<pr-review>> for our general philosophy for pull request reviews.
[[development-tests]]
=== Testing
To ensure that your changes will not break other functionality, please run the test suite and build (<<building-kibana>>) before submitting your Pull Request.
[float]
==== Running specific {kib} tests
The following table outlines possible test file locations and how to
invoke them:
[width="100%",cols="7%,59%,34%",options="header",]
|===
|Test runner |Test location |Runner command (working directory is {kib}
root)
|Jest |`src/**/*.test.js` `src/**/*.test.ts`
|`yarn test:jest -t regexp [test path]`
|Jest (integration) |`**/integration_tests/**/*.test.js`
|`yarn test:jest_integration -t regexp [test path]`
|Mocha
|`src/**/__tests__/**/*.js` `!src/**/public/__tests__/*.js` `packages/kbn-datemath/test/**/*.js` `packages/kbn-dev-utils/src/**/__tests__/**/*.js` `tasks/**/__tests__/**/*.js`
|`node scripts/mocha --grep=regexp [test path]`
|Functional
|`test/*integration/**/config.js` `test/*functional/**/config.js` `test/accessibility/config.js`
|`yarn test:ftr:server --config test/[directory]/config.js` `yarn test:ftr:runner --config test/[directory]/config.js --grep=regexp`
|Karma |`src/**/public/__tests__/*.js` |`yarn test:karma:debug`
|===
For X-Pack tests located in `x-pack/` see
link:{kib-repo}tree/{branch}/x-pack/README.md#testing[X-Pack Testing]
Test runner arguments:

* Where applicable, the optional arguments `-t=regexp` or `--grep=regexp`
will only run tests or test suites whose descriptions match the regular
expression.
* `[test path]` is the relative path to the test file.

Examples:

* Run the entire elasticsearch_service test suite:
`yarn test:jest src/core/server/elasticsearch/elasticsearch_service.test.ts`
* Run the Jest test case whose description matches
`stops both admin and data clients`:
`yarn test:jest -t 'stops both admin and data clients' src/core/server/elasticsearch/elasticsearch_service.test.ts`
* Run the API integration test case whose description matches the given
string:
+
["source","shell"]
----------
yarn test:ftr:server --config test/api_integration/config.js
yarn test:ftr:runner --config test/api_integration/config.js --grep=regexp
----------
[float]
==== Cross-browser compatibility
**Testing IE on OS X**
* http://www.vmware.com/products/fusion/fusion-evaluation.html[Download
VMWare Fusion].
* https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/#downloads[Download
IE virtual machines] for VMWare.
* Open VMWare and go to Window > Virtual Machine Library. Unzip the
virtual machine and drag the .vmx file into your Virtual Machine
Library.
* Right-click on the virtual machine you just added to your library and
select "`Snapshots…`", and then click the "`Take`" button in the modal
that opens. You can roll back to this snapshot when the VM expires in 90
days.
* In System Preferences > Sharing, change your computer name to be
something simple, e.g. "`computer`".
* Run {kib} with `yarn start --host=computer.local` (substituting
your computer name).
* Now you can run your VM, open the browser, and navigate to
`http://computer.local:5601` to test {kib}.
* Alternatively, you can use https://www.browserstack.com[BrowserStack].
[float]
==== Running browser automation tests
Check out <<development-functional-tests>> to learn more about how you can run
and develop functional tests for {kib} core and plugins.
You can also look into the {kib-repo}tree/{branch}/scripts/README.md[Scripts README.md]
to learn more about using the node scripts we provide for building
{kib}, running integration tests, and starting up {kib} and
Elasticsearch while you develop.
[float]
==== More testing information
* <<development-functional-tests>>
* <<development-unit-tests>>
* <<development-accessibility-tests>>
include::development-functional-tests.asciidoc[]
include::development-unit-tests.asciidoc[]
include::development-accessibility-tests.asciidoc[]
[[development-unit-tests]]
==== Unit testing frameworks
We use unit tests to make sure that individual software units of {kib} perform as they were designed to.
{kib} is migrating unit testing from `Mocha` to `Jest`. Legacy unit tests
still exist in Mocha but all new unit tests should be written in Jest.
[float]
===== Mocha (legacy)
Mocha tests are contained in `__tests__` directories.
Jest tests can be run with:

["source","bash"]
-----------
yarn test:jest
-----------
[float]
====== Writing Jest Unit Tests
In order to write those tests there are two main things you need to be aware of.
The first one is the difference between `jest.mock` and `jest.doMock`,
and the second one is the mock files pattern described below, which matters
especially for the tests implemented in TypeScript in order to benefit from the
auto-inference types feature.
[float]
====== Jest.mock vs Jest.doMock
Both methods are essentially the same at their roots, however the `jest.mock`
calls will get hoisted to the top of the file and can only reference variables
prefixed with `mock`. `jest.doMock` calls, by contrast, are not hoisted and can
reference almost any variable; however, we have to ensure those referenced
variables are instantiated at the time we need them, which leads us to the next
section where we'll talk about our jest mock files pattern.
[float]
====== Jest Mock Files Pattern
Especially in TypeScript it is pretty common to have `jest.doMock` calls in unit
tests which reference, for example, imported types. To avoid errors from
referencing variables before they are instantiated, extract such mocks into a
separate file named after the test (for example `mymodule.test.mocks.ts`) and
import it from the test file,
like: `import * as Mocks from './mymodule.test.mocks'`,
`import { mockX } from './mymodule.test.mocks'`
or just `import './mymodule.test.mocks'` if there isn't anything
exported to be used.
[float]
[[debugging-unit-tests]]
===== Debugging Unit Tests
The standard `yarn test` task runs several sub tasks and can take
several minutes to complete, making debugging failures pretty painful.
In order to ease the pain, specialized tasks provide alternate methods
for running the tests.

You could also add the `--debug` option so that `node` is run using
the `--debug-brk` flag. You'll need to connect a remote debugger such
as https://github.com/node-inspector/node-inspector[`node-inspector`]
to proceed in this mode.
[source,bash]
----
node scripts/mocha --debug <file>
----
With `yarn test:karma`, you can run only the browser tests. Coverage
reports are available for browser tests by running
`yarn test:coverage`. You can find the results under the `coverage/`
directory that will be created upon completion.
[source,bash]
----
yarn test:karma
----
Using `yarn test:karma:debug` initializes an environment for debugging
the browser tests. It includes a dedicated instance of the {kib} server
for building the test bundle, and a karma server. When running this task
the build is optimized for the first time and then a karma-owned
instance of the browser is opened. Click the "`debug`" button to open a
new tab that executes the unit tests.
[source,bash]
----
yarn test:karma:debug
----
In the screenshot below, you'll notice the URL is
`localhost:9876/debug.html`. You can append a `grep` query parameter
to this URL and set it to a string value which will be used to exclude
tests which don't match. For example, if you changed the URL to
`localhost:9876/debug.html?query=my test` and then refreshed the
browser, you'd only see tests run which contain "`my test`" in the test
description.
image:http://i.imgur.com/DwHxgfq.png[Browser test debugging]
[float]
===== Unit Testing Plugins
This should work well if you're using the
https://github.com/elastic/kibana/tree/master/packages/kbn-plugin-generator[{kib}
plugin generator]. If you're not using the generator, well, you're on
your own. We suggest you look at how the generator works.
To run the tests for just your particular plugin run the following
command from your plugin:
[source,bash]
----
yarn test:mocha
yarn test:karma:debug # remove the debug flag to run them once and close
----
[[contributing]]
== Contributing
Whether you want to fix a bug, implement a feature, or add some other improvements or APIs, the following sections will
guide you on the process.
Read <<development-getting-started>> to get your environment up and running, then read <<development-best-practices>>.
* <<development-tests>>
* <<development-github>>
* <<interpreting-ci-failures>>
* <<development-documentation>>
* <<development-pull-request>>
* <<kibana-issue-reporting>>
* <<signing-contributor-agreement>>
* <<kibana-localization>>
* <<kibana-release-notes-process>>
* <<kibana-linting>>
[discrete]
[[signing-contributor-agreement]]
=== Signing the contributor license agreement
Please make sure you have signed the http://www.elastic.co/contributor-agreement/[Contributor License Agreement]. We are not asking you to assign copyright to us, but to give us the right to distribute your code without restriction. We ask this of all contributors in order to assure our users of the origin and continuing existence of the code. You only need to sign the CLA once.
[float]
[[kibana-localization]]
=== Localization
Read <<kibana-localization-best-practices>> for details on our localization practices.
Note that we cannot support accepting contributions to the translations from any source other than the translators we have engaged to do the work.
We have yet to develop a proper process for accepting contributed translations. We certainly appreciate that people care enough about the localization effort to want to help improve the quality. We aim to build out a more comprehensive localization process in the future and will notify you once contributions can be supported, but for the time being, we are not able to incorporate suggestions.
[float]
[[kibana-release-notes-process]]
=== Release Notes Process
Part of this process only applies to maintainers, since it requires
access to GitHub labels.
{kib} publishes https://www.elastic.co/guide/en/kibana/current/release-notes.html[Release Notes] for major and minor releases.
The Release Notes summarize what the PRs accomplish in language that is meaningful to users.
To generate the Release Notes, the team runs a script against this repo to collect the merged PRs against the release.
[float]
==== Create the Release Notes text
The text that appears in the Release Notes is pulled directly from your PR title, or a single paragraph of text that you specify in the PR description.
To use a single paragraph of text, enter `Release note:` or a `## Release note` header in the PR description, followed by your text. For example, refer to this https://github.com/elastic/kibana/pull/65796[PR] that uses the `## Release note` header.
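For example, a PR description might end with a paragraph like this sketch (the wording is illustrative):

["source","markdown"]
-----------
## Release note
Fixes an issue where the spaces selector did not persist the selected space.
-----------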
When you create the Release Notes text, use the following best practices:
* Use present tense.
* Use sentence case.
* When you create a feature PR, start with `Adds`.
* When you create an enhancement PR, start with `Improves`.
* When you create a bug fix PR, start with `Fixes`.
* When you create a deprecation PR, start with `Deprecates`.
[float]
==== Add your labels
[arabic]
. Label the PR with the targeted version (ex: `v7.3.0`).
. Label the PR with the appropriate GitHub labels:
* For a new feature or functionality, use `release_note:enhancement`.
* For an external-facing fix, use `release_note:fix`. We do not include docs, build, and test fixes in the Release Notes, or unreleased issues that are only on `master`.
* For a deprecated feature, use `release_note:deprecation`.
* For a breaking change, use `release_note:breaking`.
* To **NOT** include your changes in the Release Notes, use `release_note:skip`.
include::development-github.asciidoc[]
include::development-tests.asciidoc[]
include::interpreting-ci-failures.asciidoc[]
include::development-documentation.asciidoc[]
include::development-pull-request.asciidoc[]
include::kibana-issue-reporting.asciidoc[]
include::pr-review.asciidoc[]
include::linting.asciidoc[]
[[interpreting-ci-failures]]
=== Interpreting CI Failures
{kib} CI uses a Jenkins feature called "Pipelines" to automate testing of the code in pull requests and on tracked branches. Pipelines are defined within the repository via the `Jenkinsfile` at the root of the project.
More information about Jenkins Pipelines can be found link:https://jenkins.io/doc/book/pipeline/[in the Jenkins book].
[float]
==== GitHub Checks
When a test fails it will be reported to Github via Github Checks. We currently bucket tests into several categories which run in parallel to make CI faster. Groups like `ciGroup{X}` get a single check in Github, and other tests like linting, or type checks, get their own checks.
Clicking the link next to the check in the conversation tab of a pull request will take you to the log output from that section of the tests. If that log output is truncated, or doesn't clearly identify what happened, you can usually get more complete information by visiting Jenkins directly.
[float]
==== Viewing Job Executions in Jenkins
To view the results of a job execution in Jenkins, either click the link in the comment left by `@elasticmachine` or search for the `kibana-ci` check in the list at the bottom of the PR. This link will take you to the top-level page for the specific job execution that failed.
image::images/job_view.png[]
4. *Pipeline Steps:* A breakdown of the pipeline that was executed, along with individual log output for each step in the pipeline.
[float]
==== Viewing ciGroup/test Logs
To view the logs for a specific failed ciGroup, jest, mocha, type checkers, linters, etc., click on the *Pipeline Steps* link from the Job page.
[[kibana-issue-reporting]]
=== Effective issue reporting in {kib}
[float]
==== Voicing the importance of an issue
We seriously appreciate thoughtful comments. If an issue is important to
you, add a comment with a solid write up of your use case and explain
why it's so important. Please avoid posting comments comprised solely of
a thumbs up emoji 👍.
Granted that you share your thoughts, we might even be able to come up
with creative solutions to your specific problem. If everything you'd
like to say has already been brought up but you'd still like to add a
token of support, feel free to add a
https://github.com/blog/2119-add-reactions-to-pull-requests-issues-and-comments[👍
thumbs up reaction] on the issue itself and on the comment which best
summarizes your thoughts.
[float]
==== "`My issue isnt getting enough attention`"
First of all, *sorry about that!* We want you to have a great time with
{kib}.
There are hundreds of open issues and prioritizing what to work on is an
important aspect of our daily jobs. We prioritize issues according to
impact and difficulty, so some issues can be neglected while we work on
more pressing issues.
Feel free to bump your issues if you think they've been neglected for a
prolonged period.
[float]
==== "`I want to help!`"
*Now we're talking*. If you have a bug fix or new feature that you would
like to contribute to {kib}, please *find or open an issue about it
before you start working on it.* Talk about what you would like to do.
It may be that somebody is already working on it, or that there are
particular issues that you should know about before implementing the
change.
We enjoy working with contributors to get their code accepted. There are
many approaches to fixing a problem and it is important to find the best
approach before writing too much code.
[[kibana-linting]]
=== Linting
A note about linting: We use http://eslint.org[eslint] to check that the
link:STYLEGUIDE.md[styleguide] is being followed. It runs in a
pre-commit hook and as a part of the tests, but most contributors
integrate it with their code editors for real-time feedback.
Here are some hints for getting eslint setup in your favorite editor:
[width="100%",cols="13%,87%",options="header",]
|===
|Editor |Plugin
|Sublime
|https://github.com/roadhump/SublimeLinter-eslint#installation[SublimeLinter-eslint]
|Atom
|https://github.com/AtomLinter/linter-eslint#installation[linter-eslint]
|VSCode
|https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint[ESLint]
|IntelliJ |Settings » Languages & Frameworks » JavaScript » Code Quality
Tools » ESLint
|`vi` |https://github.com/scrooloose/syntastic[scrooloose/syntastic]
|===
Another tool we use for enforcing consistent coding style is
EditorConfig, which can be set up by installing a plugin in your editor
that dynamically updates its configuration. Take a look at the
http://editorconfig.org/#download[EditorConfig] site to find a plugin
for your editor, and browse our
https://github.com/elastic/kibana/blob/master/.editorconfig[`.editorconfig`]
file to see what config rules we set up.
[float]
==== Setup Guide for VS Code Users
Note that for VSCode, to enable "`live`" linting of TypeScript (and
other) file types, you will need to modify your local settings, as shown
below. The default for the ESLint extension is to only lint JavaScript
file types.
[source,json]
----
"eslint.validate": [
"javascript",
"javascriptreact",
{ "language": "typescript", "autoFix": true },
{ "language": "typescriptreact", "autoFix": true }
]
----
`eslint` can automatically fix trivial lint errors when you save a
file by adding this line to your settings:
[source,json]
----
"eslint.autoFixOnSave": true,
----
:warning: It is *not* recommended to use the
https://prettier.io/[`Prettier` extension/IDE plugin] while
maintaining the {kib} project. Formatting and styling rules are set in
the multiple `.eslintrc.js` files across the project and some of them
use the https://www.npmjs.com/package/prettier[NPM version of Prettier].
Using the IDE extension might cause conflicts, applying the formatting
to too many files that shouldn't be prettier-ized and/or highlighting
errors that are actually OK.
[[pr-review]]
=== Pull request review guidelines
Every change made to {kib} must be held to a high standard, and while the responsibility for quality in a pull request ultimately lies with the author, {kib} team members have the responsibility as reviewers to verify during their review process.
Frankly, it's impossible to build a concrete list of requirements that encompass all of the possible situations we'll encounter when reviewing pull requests, so instead this document tries to lay out a common set of the few obvious requirements while also outlining a general philosophy that we should have when approaching any PR review.
While the review process is always done by Elastic staff members, these guidelines apply to all pull requests regardless of whether they are authored by community members or Elastic staff.
[float]
==== Target audience
The target audience for this document are pull request reviewers. For {kib} maintainers, the PR review is the only part of the contributing process in which we have complete control. The author of any given pull request may not be up to speed on the latest expectations we have for pull requests, and they may have never read our guidelines at all. It's our responsibility as reviewers to guide folks through this process, but it's hard to do that consistently without a common set of documented principles.
Pull request authors can benefit from reading this document as well because it'll help establish a common set of expectations between authors and reviewers early.
[float]
==== Reject fast
Every pull request is different, and before reviewing any given PR, reviewers should consider the optimal way to approach the PR review so that if the change is ultimately rejected, it is done so as early in the process as possible.
For example, a reviewer may want to do a product level review as early as possible, so that a change which is not going to be accepted is rejected before anyone invests time in a detailed code review.
[float]
==== The big three
There are a lot of discrete requirements and guidelines we want to follow in all of our pull requests, but three things in particular stand out as important above all the rest.
This isn't simply a question of enough test files. The code in the tests themselves must be held to the same quality standard as the rest of the code.
All of our code should have unit tests that verify its behaviors, including not only the "happy path", but also edge cases, error handling, etc. When you change an existing API of a module, then there should always be at least one failing unit test, which in turn means we need to verify that all code consuming that API properly handles the change if necessary. For modules at a high enough level, this will mean we have breaking change in the product, which we'll need to handle accordingly.
In addition to extensive unit test coverage, PRs should include relevant functional and integration tests. In some cases, we may simply be testing a programmatic interface (e.g. a service) that is integrating with the file system, the network, Elasticsearch, etc. In other cases, we'll be testing REST APIs over HTTP or comparing screenshots/snapshots with prior known acceptable state. In the worst case, we are doing browser-based functional testing on a running instance of {kib} using selenium.
Enhancements are pretty much always going to have extensive unit tests as a base as well as functional and integration testing. Bug fixes should always include regression tests to ensure that the same bug does not manifest again in the future.
--
[float]
=== Product level review
==== Product level review
Reviewers are not simply evaluating the code itself, they are also evaluating the quality of the user-facing change in the product. This generally means they need to check out the branch locally and "play around" with it. In addition to the "do we want this change in the product" details, the reviewer should be looking for bugs and evaluating how approachable and useful the feature is as implemented. Special attention should be given to error scenarios and edge cases to ensure they are all handled well within the product.
[float]
=== Consistency, style, readability
==== Consistency, style, readability
Having a relatively consistent codebase is an important part of us building a sustainable project. With dozens of active contributors at any given time, we rely on automation to help ensure consistency: we enforce a comprehensive set of linting rules through CI. We're also rolling out prettier to make this even more automatic.
For things that can't be easily automated, we maintain a link:https://github.com/elastic/kibana/blob/master/STYLEGUIDE.md[style guide] that authors should adhere to and reviewers should keep in mind when they review a pull request.
For things that can't be easily automated, we maintain a link:{kib-repo}tree/{branch}/STYLEGUIDE.md[style guide] that authors should adhere to and reviewers should keep in mind when they review a pull request.
Beyond that, we're into subjective territory. Statements like "this isn't very readable" are hardly helpful since they can't be qualified, but that doesn't mean a reviewer should outright ignore code that is hard to understand due to how it is written. There isn't one definitively "best" way to write any particular code, so pursuing such shouldn't be our goal. Instead, reviewers and authors alike must accept that there are likely many different appropriate ways to accomplish the same thing with code, and so long as the contribution is utilizing one of those ways, then we're in good shape.
@@ -87,7 +87,7 @@ There may also be times when a person is inspired by a particular contribution t
[float]
=== Nitpicking
==== Nitpicking
Nitpicking is when a reviewer identifies trivial and unimportant details in a pull request and asks the author to change them. This is a completely subjective category that is impossible to define universally, and it's equally impractical to define a blanket policy on nitpicking that everyone will be happy with.
@@ -97,13 +97,13 @@ Often, reviewers have an opinion about whether the feedback they are about to gi
[float]
=== Handling disagreements
==== Handling disagreements
Conflicting opinions between reviewers and authors happen, and sometimes it is hard to reconcile those opinions. Ideally folks can work together in the spirit of these guidelines toward a consensus, but if that doesn't work out it may be best to bring a third person into the discussion. Our pull requests generally have two reviewers, so an appropriate third person may already be obvious. Otherwise, reach out to the functional area that is most appropriate or to technical leadership if an area isn't obvious.
[float]
=== Inappropriate review feedback
==== Inappropriate review feedback
Whether or not a bit of feedback is appropriate for a pull request is often dependent on the motivation for giving the feedback in the first place.
@@ -113,7 +113,7 @@ Inflammatory feedback such as "this is crap" isn't feedback at all. It's both me
[float]
=== A checklist
==== A checklist
Establishing a comprehensive checklist for all of the things that should happen in all possible pull requests is impractical, but that doesn't mean we lack a concrete set of minimum requirements that we can enumerate. The following items should be double checked for any pull request:

View file

@@ -1,24 +0,0 @@
[[core-development]]
== Core Development
* <<development-basepath>>
* <<development-dependencies>>
* <<development-modules>>
* <<development-elasticsearch>>
* <<development-unit-tests>>
* <<development-functional-tests>>
* <<development-es-snapshots>>
include::core/development-basepath.asciidoc[]
include::core/development-dependencies.asciidoc[]
include::core/development-modules.asciidoc[]
include::core/development-elasticsearch.asciidoc[]
include::core/development-unit-tests.asciidoc[]
include::core/development-functional-tests.asciidoc[]
include::core/development-es-snapshots.asciidoc[]

View file

@@ -1,85 +0,0 @@
[[development-basepath]]
=== Considerations for basePath
All communication from the Kibana UI to the server needs to respect the
`server.basePath`. Here are the "blessed" strategies for dealing with this
based on the context:
[float]
==== Getting a static asset url
Use webpack to import the asset into the build. This will give you a URL in
JavaScript and give webpack a chance to perform optimizations and
cache-busting.
["source","shell"]
-----------
// in plugin/public/main.js
import uiChrome from 'ui/chrome';
import logoUrl from 'plugins/facechimp/assets/banner.png';
uiChrome.setBrand({
  logo: `url(${logoUrl}) center no-repeat`
});
-----------
[float]
==== API requests from the front-end
Use `chrome.addBasePath()` to append the basePath to the front of the url.
["source","shell"]
-----------
import chrome from 'ui/chrome';
$http.get(chrome.addBasePath('/api/plugin/things'));
-----------
[float]
==== Server side
Append `request.getBasePath()` to any absolute URL path.
["source","shell"]
-----------
const basePath = server.config().get('server.basePath');
server.route({
  path: '/redirect',
  handler(request, h) {
    return h.redirect(`${request.getBasePath()}/otherLocation`);
  }
});
-----------
[float]
==== BasePathProxy in dev mode
The Kibana dev server automatically runs behind a proxy with a random
`server.basePath`. This way developers will be constantly verifying that their
code works with basePath while they write it.
To accomplish this, the `serve` task does a few things:
1. change the port for the server to the `dev.basePathProxyTarget` setting (default `5603`)
2. start a `BasePathProxy` at `server.port`
- picks a random 3-letter value for `randomBasePath`
- redirects from `/` to `/{randomBasePath}`
- redirects from `/{any}/app/{appName}` to `/{randomBasePath}/app/{appName}` so that refreshes should work
- proxies all requests starting with `/{randomBasePath}/` to the Kibana server
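For illustration, suppose the proxy picked `abc` as its random base path (a hypothetical value, it changes on every start); requests then behave roughly like this:
["source","shell"]
-----------
# / redirects to the random base path
curl -I "http://localhost:5601/"               # 302 -> /abc
# app urls on another prefix are redirected onto the random base path
curl -I "http://localhost:5601/xyz/app/kibana" # redirected to /abc/app/kibana
# requests under the random base path are proxied to the Kibana server
curl "http://localhost:5601/abc/app/kibana"
-----------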
If you're writing scripts that interact with the Kibana API, the base path proxy will likely
make this difficult. To bypass the base path proxy for a single request, prefix urls with
`__UNSAFE_bypassBasePath` and the request will be routed to the development Kibana server.
["source","shell"]
-----------
curl "http://elastic:changeme@localhost:5601/__UNSAFE_bypassBasePath/api/status"
-----------
This proxy can sometimes have unintended side effects in development, so when
needed you can opt out by passing the `--no-base-path` flag to the `serve` task
or `yarn start`.
["source","shell"]
-----------
yarn start --no-base-path
-----------

View file

@@ -1,103 +0,0 @@
[[development-dependencies]]
=== Managing Dependencies
While developing plugins for use in the Kibana front-end environment you will
probably want to include a library or two (at least). While that should be
simple to do 90% of the time, there are always outliers, and some of those
outliers are very popular projects.
Before you can use an external library with Kibana you have to install it. You
do that using...
[float]
==== yarn (preferred method)
Once you've http://npmsearch.com[found] a dependency you want to add, you can
install it like so:
["source","shell"]
-----------
yarn add some-neat-library
-----------
At the top of a JavaScript file, just import the library using its name:
["source","shell"]
-----------
import someNeatLibrary from 'some-neat-library';
-----------
Just like working in node.js, front-end code can require node modules installed
by yarn without any additional configuration.
[float]
==== webpackShims
When a library you want to use does use ES6 or CommonJS modules but is not
available with yarn, you can copy the source of the library into a webpackShim.
["source","shell"]
-----------
# create a directory for our new library to live
mkdir -p webpackShims/some-neat-library
# download the library you want to use into that directory
curl https://cdnjs.com/some-neat-library/library.js > webpackShims/some-neat-library/index.js
-----------
Then include the library in your JavaScript code as you normally would:
["source","shell"]
-----------
import someNeatLibrary from 'some-neat-library';
-----------
[float]
==== Shimming third party code
Some JavaScript libraries do not declare their dependencies in a way that tools
like webpack can understand. It is also often the case that libraries do not
`export` their provided values, but simply write them to a global variable name
(or something to that effect).
When pulling code like this into Kibana we need to write "shims" that will
adapt the third party code to work with our application, other libraries, and
module system. To do this we can utilize the `webpackShims` directory.
The easiest way to explain how to write a shim is to show you some. Here is our
webpack shim for jQuery:
["source","shell"]
-----------
// webpackShims/jquery.js
module.exports = window.jQuery = window.$ = require('../node_modules/jquery/dist/jquery');
require('ui/jquery/findTestSubject')(window.$);
-----------
This shim is loaded up anytime an `import 'jquery';` statement is found by
webpack, because of the way that `webpackShims` behaves like `node_modules`.
When that happens, the shim does two things:
. Assigns the exported value of the actual jQuery module to the window at `$` and `jQuery`, allowing libraries like angular to detect that jQuery is available, and uses it as the module's export value.
. Includes a jQuery plugin that we wrote, so that every time a file imports jQuery it will get both jQuery and the `$.findTestSubject` helper function.
Here is what our webpack shim for angular looks like:
["source","shell"]
-----------
// webpackShims/angular.js
require('jquery');
require('../node_modules/angular/angular');
require('../node_modules/angular-elastic/elastic');
require('ui/modules').get('kibana', ['monospaced.elastic']);
module.exports = window.angular;
-----------
What this shim does is fairly simple if you go line by line:
. makes sure that jQuery is loaded before angular (which actually runs the shim)
. loads the angular.js file from the node_modules directory
. loads the angular-elastic plugin, a plugin we want to always be included whenever we import angular
. uses the `ui/modules` module to add the module exported by angular-elastic as a dependency to the `kibana` angular module
. finally, exports the window.angular variable. This means that writing `import angular from 'angular';` will properly set the angular variable to the angular library, rather than undefined which is the default behavior.

View file

@@ -1,40 +0,0 @@
[[development-elasticsearch]]
=== Communicating with Elasticsearch
Kibana exposes two clients on the server and browser for communicating with Elasticsearch.
There is an 'admin' client which is used for managing Kibana's state, and a 'data' client for all
other requests. The clients use the {jsclient-current}/index.html[elasticsearch.js library].
[float]
[[client-server]]
=== Server clients
Server clients are exposed through the elasticsearch plugin.
[source,javascript]
----
const adminCluster = server.plugins.elasticsearch.getCluster('admin');
const dataCluster = server.plugins.elasticsearch.getCluster('data');
//ping as the configured elasticsearch.user in kibana.yml
adminCluster.callWithInternalUser('ping');
//ping as the user specified in the current requests header
adminCluster.callWithRequest(req, 'ping');
----
[float]
[[client-browser]]
=== Browser clients
Browser clients are exposed through AngularJS services.
[source,javascript]
----
uiModules.get('kibana')
  .run(function (es) {
    es.ping()
      .catch(err => {
        console.log('error pinging servers');
      });
  });
----

View file

@@ -1,63 +0,0 @@
[[development-modules]]
=== Modules and Autoloading
[float]
==== Autoloading
Because of the disconnect between JS modules and angular directives, filters,
and services, it is difficult to know what you need to import. It is even more
difficult to know if you broke something by removing an import that looked
unused.
To prevent this from being an issue the ui module provides "autoloading"
modules. The sole purpose of these modules is to extend the environment with
certain components. Here is a breakdown of those modules:
- *`import 'ui/autoload/modules'`*
Imports angular and several ui services and "components" which Kibana
depends on without importing. The full list of imports is hard coded in the
module. Hopefully this list will shrink over time as we properly map out
the required modules and import them where they are actually necessary.
- *`import 'ui/autoload/all'`*
Imports all of the modules
[float]
==== Resolving Require Paths
Kibana uses Webpack to bundle its dependencies.
Here is how import/require statements are resolved to a file:
. Check the beginning of the module path
* if the path starts with a '.'
** append it to the directory of the current file
** proceed to *3*
* if the path starts with a '/'
** search for this exact path
** proceed to *3*
* proceed to *2*
. Search for a named module
* `moduleName = dirname(require path)`
* match if `moduleName` is or starts with one of these aliases
** replace the alias with the match and continue to *3*
* match when any of these conditions are met:
** `./webpackShims/${moduleName}` is a directory
** `./node_modules/${moduleName}` is a directory
* if no match was found
** move to the parent directory
** start again at *2.iii* until reaching the root directory or a match is found
* if a match was found
** replace the `moduleName` prefix from the require statement with the full path of the match and proceed to *3*
. Search for a file
* the first of the following paths that resolves to a **file** is our match
** path + '.js'
** path + '.json'
** path
** path/${basename(path)} + '.js'
** path/${basename(path)} + '.json'
** path/${basename(path)}
** path/index + '.js'
** path/index + '.json'
** path/index
* if none of the paths matches then an error is thrown

View file

@@ -0,0 +1,39 @@
[[building-kibana]]
=== Building a {kib} distributable
The following command will build a {kib} production distributable.
[source,bash]
----
yarn build --skip-os-packages
----
You can get all build options using the following command:
[source,bash]
----
yarn build --help
----
[float]
==== Building OS packages
Packages are built using fpm, dpkg, and rpm. Package building has only been tested on Linux and is not supported on any other platform.
[source,bash]
----
apt-get install ruby-dev rpm
gem install fpm -v 1.5.0
yarn build --skip-archives
----
To specify a package to build you can add `rpm` or `deb` as an argument.
[source,bash]
----
yarn build --rpm
----
Distributable packages can be found in `target/` after the build completes.

View file

@@ -0,0 +1,59 @@
[[kibana-debugging]]
=== Debugging {kib}
For information about how to debug unit tests, refer to <<debugging-unit-tests>>.
[float]
==== Server Code
`yarn debug` will start the server with Node's inspect flag. {kib}'s development mode will start three processes on ports `9229`, `9230`, and `9231`. Chrome's developer tools need to be configured to connect to all three processes. Add `localhost:<port>` for each {kib} process in Chrome's developer tools connection tab.
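For example, a minimal session (assuming the default ports above):

[source,bash]
----
yarn debug
----

Then open `chrome://inspect` in Chrome and add `localhost:9229`, `localhost:9230`, and `localhost:9231` under the connection configuration so the developer tools can attach to each process.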
[float]
==== Instrumenting with Elastic APM
{kib} ships with the
https://github.com/elastic/apm-agent-nodejs[Elastic APM Node.js Agent]
built-in for debugging purposes.
Its default configuration is meant to be used by core {kib} developers
only, but it can easily be re-configured to your needs. In its default
configuration it's disabled and will, once enabled, send APM data to a
centrally managed Elasticsearch cluster accessible only to Elastic
employees.
To change the location where data is sent, use the
https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuration.html#server-url[`serverUrl`]
APM config option. To activate the APM agent, use the
https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuration.html#active[`active`]
APM config option.
All config options can be set either via environment variables, or by
creating an appropriate config file under `config/apm.dev.js`. For
more information about configuring the APM agent, please refer to
https://www.elastic.co/guide/en/apm/agent/nodejs/current/configuring-the-agent.html[the
documentation].
Example `config/apm.dev.js` file:
[source,js]
----
module.exports = {
  active: true,
};
----
APM
https://www.elastic.co/guide/en/apm/agent/rum-js/current/index.html[Real
User Monitoring agent] is not available in the {kib} distributables;
however, the agent can be enabled by setting `ELASTIC_APM_ACTIVE` to
`true`:
....
ELASTIC_APM_ACTIVE=true yarn start
// activates both Node.js and RUM agent
....
Once the agent is active, it will trace all incoming HTTP requests to
{kib}, monitor for errors, and collect process-level metrics. The
collected data will be sent to the APM Server and is viewable in the APM
UI in {kib}.

View file

@@ -5,54 +5,35 @@ Here are some resources that are helpful for getting started with plugin develop
[float]
==== Some light reading
Our {kib-repo}blob/master/CONTRIBUTING.md[contributing guide] can help you get a development environment going.
If you haven't already, start with <<development-getting-started>>. If you are planning to add your plugin to the {kib} repo, read the <<contributing>> guide, if you are building a plugin externally, read <<external-plugin-development>>. In both cases, read up on our recommended <<development-best-practices>>.
[float]
==== Plugin Generator
==== Creating an empty plugin
We recommend that you kick-start your plugin by generating it with the {kib-repo}tree/{branch}/packages/kbn-plugin-generator[Kibana Plugin Generator]. Run the following in the Kibana repo, and you will be asked a couple questions, see some progress bars, and have a freshly generated plugin ready for you to play with in Kibana's `plugins` folder.
You can use the <<automatic-plugin-generator>> to get a basic structure for a new plugin. Plugins that are not part of the
{kib} repo should be developed inside the `plugins` folder. If you are building a new plugin to check in to the {kib} repo,
you will choose between a few locations:
["source","shell"]
-----------
node scripts/generate_plugin my_plugin_name # replace "my_plugin_name" with your desired plugin name
-----------
[float]
==== Directory structure for plugins
The Kibana directory must be named `kibana`, and your plugin directory should be located in the root of `kibana` in a `plugins` directory, for example:
["source","shell"]
----
.
└── kibana
    └── plugins
        ├── foo-plugin
        └── bar-plugin
----
[float]
==== References in the code
- {kib-repo}blob/{branch}/src/legacy/server/plugins/lib/plugin.js[Plugin class]: What options does the `kibana.Plugin` class accept?
- <<development-uiexports>>: What type of exports are available?
- {kib-repo}tree/{branch}/x-pack/plugins[x-pack/plugins] for commercially licensed plugins
- {kib-repo}tree/{branch}/src/plugins[src/plugins] for open source licensed plugins
- {kib-repo}tree/{branch}/examples[examples] for developer example plugins (these will not be included in the distributables)
[float]
==== Elastic UI Framework
If you're developing a plugin that has a user interface, take a look at our https://elastic.github.io/eui[Elastic UI Framework].
It documents the CSS and React components we use to build Kibana's user interface.
It documents the CSS and React components we use to build {kib}'s user interface.
You're welcome to use these components, but be aware that they are rapidly evolving, and we might introduce breaking changes that will disrupt your plugin's UI.
[float]
==== TypeScript Support
Plugin code can be written in http://www.typescriptlang.org/[TypeScript] if desired.
We recommend your plugin code is written in http://www.typescriptlang.org/[TypeScript].
To enable TypeScript support, create a `tsconfig.json` file at the root of your plugin that looks something like this:
["source","js"]
-----------
{
  // extend Kibana's tsconfig, or use your own settings
  // extend {kib}'s tsconfig, or use your own settings
  "extends": "../../kibana/tsconfig.json",
  // tell the TypeScript compiler where to find your source files
@@ -64,10 +45,17 @@ To enable TypeScript support, create a `tsconfig.json` file at the root of your
-----------
TypeScript code is automatically converted into JavaScript during development,
but not in the distributable version of Kibana. If you use the
{kib-repo}blob/{branch}/packages/kbn-plugin-helpers[@kbn/plugin-helpers] to build your plugin, then your `.ts` and `.tsx` files will be permanently transpiled before your plugin is archived. If you have your own build process, make sure to run the TypeScript compiler on your source files and ship the compilation output so that your plugin will work with the distributable version of Kibana.
but not in the distributable version of {kib}. If you use the
{kib-repo}blob/{branch}/packages/kbn-plugin-helpers[@kbn/plugin-helpers] to build your plugin, then your `.ts` and `.tsx` files will be permanently transpiled before your plugin is archived. If you have your own build process, make sure to run the TypeScript compiler on your source files and ship the compilation output so that your plugin will work with the distributable version of {kib}.
[float]
==== {kib} platform migration guide
{kib-repo}blob/{branch}/src/core/MIGRATION.md#migrating-legacy-plugins-to-the-new-platform[This guide]
provides an action plan for moving a legacy plugin to the new platform.
[float]
==== Externally developed plugins
If you are building a plugin outside of the {kib} repo, read <<external-plugin-development>>.

View file

@@ -0,0 +1,144 @@
[[development-getting-started]]
== Getting started
Get started building your own plugins, or contributing directly to the {kib} repo.
[float]
[[get-kibana-code]]
=== Get the code
https://help.github.com/en/github/getting-started-with-github/fork-a-repo[Fork], then https://help.github.com/en/github/getting-started-with-github/fork-a-repo#step-2-create-a-local-clone-of-your-fork[clone] the {kib-repo}[{kib} repo] and change directory into it:
[source,bash]
----
git clone https://github.com/[YOUR_USERNAME]/kibana.git kibana
cd kibana
----
[float]
=== Install dependencies
Install the version of Node.js listed in the `.node-version` file. This
can be automated with tools such as
https://github.com/creationix/nvm[nvm],
https://github.com/coreybutler/nvm-windows[nvm-windows] or
https://github.com/wbyoung/avn[avn]. As we also include a `.nvmrc` file
you can switch to the correct version when using nvm by running:
[source,bash]
----
nvm use
----
Install the latest version of https://yarnpkg.com[yarn].
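If you already have npm available, one common way to install it is:

[source,bash]
----
npm install --global yarn
----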
Bootstrap {kib} and install all the dependencies:
[source,bash]
----
yarn kbn bootstrap
----
____
Node.js native modules could be in use and node-gyp is the tool used to
build them. There are tools you need to install per platform and Python
versions you need to be using. Please see
https://github.com/nodejs/node-gyp#installation[https://github.com/nodejs/node-gyp#installation]
and follow the guide according to your platform.
____
(You can also run `yarn kbn` to see the other available commands. For
more info about this tool, see
{kib-repo}tree/{branch}/packages/kbn-pm[{kib-repo}tree/{branch}/packages/kbn-pm].)
When switching branches which use different versions of npm packages you
may need to run:
[source,bash]
----
yarn kbn clean
----
If you have failures during `yarn kbn bootstrap` you may have some
corrupted packages in your yarn cache which you can clean with:
[source,bash]
----
yarn cache clean
----
[float]
=== Configure environmental settings
[[increase-nodejs-heap-size]]
[float]
==== Increase node.js heap size
{kib} is a big project, and some commands can hit the default Node.js heap
limit and crash with an out-of-memory error. If you run into this problem,
you can increase the maximum heap size by setting the `--max_old_space_size`
option on the command line. To set the limit for all commands, simply add
the following line to your shell config:
`export NODE_OPTIONS="--max_old_space_size=2048"`.
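For example, to raise the limit to 2 GB for a single command rather than globally:

[source,bash]
----
NODE_OPTIONS="--max_old_space_size=2048" yarn kbn bootstrap
----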
[float]
=== Run Elasticsearch
Run the latest Elasticsearch snapshot. Specify an optional license with the `--license` flag.
[source,bash]
----
yarn es snapshot --license trial
----
`trial` will give you access to all capabilities.
Read about more options for <<running-elasticsearch>>, like connecting to a remote host, running from source,
preserving data in between runs, running remote clusters, etc.
[float]
=== Run {kib}
In another terminal window, start up {kib}. Include developer examples by adding an optional `--run-examples` flag.
[source,bash]
----
yarn start --run-examples
----
View all available options by running `yarn start --help`.
Read about more advanced options for <<running-kibana-advanced>>.
[float]
=== Code away!
You are now ready to start developing. Changes to your files should be picked up automatically. Server side changes will
cause the {kib} server to reboot.
[float]
=== More information
* <<running-kibana-advanced>>
* <<sample-data>>
* <<kibana-debugging>>
* <<kibana-sass>>
* <<building-kibana>>
* <<development-plugin-resources>>
include::running-kibana-advanced.asciidoc[]
include::sample-data.asciidoc[]
include::debugging.asciidoc[]
include::sass.asciidoc[]
include::building-kibana.asciidoc[]
include::development-plugin-resources.asciidoc[]

View file

@@ -0,0 +1,87 @@
[[running-kibana-advanced]]
=== Running {kib}
Change to your local {kib} directory. Start the development server.
[source,bash]
----
yarn start
----
____
On Windows, you'll need to use Git Bash, Cygwin, or a similar shell that
exposes the `sh` command. To build successfully, you'll also need the
Cygwin optional packages zip, tar, and shasum.
____
Now you can point your web browser to http://localhost:5601 and start
using {kib}! When running `yarn start`, {kib} will also log that it
is listening on port 5603 due to the base path proxy, but you should
still access {kib} on port 5601.
By default, you can log in with username `elastic` and password
`changeme`. See the `--help` options on `yarn es <command>` if
you'd like to configure a different password.
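For example:

[source,bash]
----
yarn es snapshot --help
----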
[float]
==== Running {kib} in Open-Source mode
If you're looking to only work with the open-source software, supply the
license type to `yarn es`:
[source,bash]
----
yarn es snapshot --license oss
----
And start {kib} with only open-source code:
[source,bash]
----
yarn start --oss
----
[float]
==== Unsupported URL Type
If you're installing dependencies and seeing an error that looks
something like
....
Unsupported URL Type: link:packages/eslint-config-kibana
....
you're likely running `npm`. To install dependencies in {kib} you
need to run `yarn kbn bootstrap`. For more info, see
link:#setting-up-your-development-environment[Setting Up Your
Development Environment] above.
[float]
[[customize-kibana-yml]]
==== Customizing `config/kibana.dev.yml`
The `config/kibana.yml` file stores user configuration directives.
Since this file is checked into source control, however, developer
preferences can't be saved without the risk of accidentally committing
the modified version. To make customizing configuration easier during
development, the {kib} CLI will look for a `config/kibana.dev.yml`
file if run with the `--dev` flag. This file behaves just like the
non-dev version and accepts any of the
https://www.elastic.co/guide/en/kibana/current/settings.html[standard
settings].
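For example, you could create the file like this (the settings shown are ordinary `kibana.yml` settings and purely illustrative):

[source,bash]
----
cat > config/kibana.dev.yml <<'EOF'
# dev-only overrides; this file is not checked into source control
elasticsearch.hosts: ["http://localhost:9200"]
logging.verbose: true
EOF
----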
[float]
==== Potential Optimization Pitfalls
* Webpack is trying to include a file in the bundle that I deleted and
is now complaining that it is missing
* A module id that used to resolve to a single file now resolves to a
directory, but webpack isn't adapting
* (if you discover other scenarios, please send a PR!)
[float]
==== Setting Up SSL
{kib} includes self-signed certificates that can be used for
development purposes in the browser and for communicating with
Elasticsearch: `yarn start --ssl` & `yarn es snapshot --ssl`.
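For example, run each of the following in its own terminal:

[source,bash]
----
yarn es snapshot --ssl   # Elasticsearch with the bundled dev certificates
yarn start --ssl         # {kib} served over HTTPS
----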

View file

@@ -0,0 +1,31 @@
[[sample-data]]
=== Installing sample data
There are a couple of easy ways to get data ingested into Elasticsearch.
[float]
==== Sample data packages available for one-click installation
The easiest way is to install one or more of our available sample data packages. If you have no data, you should be
prompted to install when running {kib} for the first time. You can also access and install the sample data packages
by going to the home page and clicking "add sample data".
[float]
==== makelogs script
The provided `makelogs` script will generate sample data.
[source,bash]
----
node scripts/makelogs --auth <username>:<password>
----
The default username and password combination is `elastic:changeme`.
Make sure to execute `node scripts/makelogs` *after* Elasticsearch is up and running!
[float]
==== CSV upload
If running with a platinum or trial license, you can also use the CSV uploader provided inside the Machine learning app.
Navigate to the Data visualizer to upload your data from a file.

View file

@@ -0,0 +1,36 @@
[[kibana-sass]]
=== Styling with SASS
When writing a new component, create a sibling SASS file of the same
name and import directly into the JS/TS component file. Doing so ensures
the styles are never separated or lost on import and allows for better
modularization (smaller individual plugin asset footprint).
All SASS (.scss) files will automatically build with the
https://elastic.github.io/eui/#/guidelines/sass[EUI] & {kib} invisibles (SASS variables, mixins, functions) from
the {kib-repo}tree/{branch}/src/legacy/ui/public/styles/_globals_v7light.scss[globals_THEME.scss] file.
*Example:*
[source,tsx]
----
// component.tsx
import './component.scss';
export const Component = () => {
  return (
    <div className="plgComponent" />
  );
}
----
[source,scss]
----
// component.scss
.plgComponent { ... }
----
Do not use the underscore `_` SASS file naming pattern when importing
directly into a JavaScript file.

View file

@@ -3,25 +3,27 @@
[partintro]
--
Contributing to Kibana can be daunting at first, but it doesn't have to be. If
you're planning a pull request to the Kibana repository, you may want to start
with <<core-development>>.
Contributing to {kib} can be daunting at first, but it doesn't have to be. The following sections should get you up and
running in no time. If you have any problems, file an issue in the https://github.com/elastic/kibana/issues[Kibana repo].
* <<development-getting-started>>
* <<development-best-practices>>
* <<contributing>>
* <<external-plugin-development>>
* <<kibana-architecture>>
* <<advanced>>
If you'd prefer to use Kibana's internal plugin API, then check out
<<plugin-development>>.
--
include::core-development.asciidoc[]
include::getting-started/index.asciidoc[]
include::plugin-development.asciidoc[]
include::best-practices/index.asciidoc[]
include::visualize/development-visualize-index.asciidoc[]
include::architecture/index.asciidoc[]
include::add-data-guide.asciidoc[]
include::contributing/index.asciidoc[]
include::security/index.asciidoc[]
include::plugin/index.asciidoc[]
include::pr-review.asciidoc[]
include::testing/interpreting-ci-failures.asciidoc[]
include::advanced/index.asciidoc[]

View file

@@ -1,24 +0,0 @@
[[plugin-development]]
== Plugin Development
[IMPORTANT]
==============================================
The Kibana plugin interfaces are in a state of constant development. We cannot provide backwards compatibility for plugins due to the high rate of change. Kibana enforces that the installed plugins match the version of Kibana itself. Plugin developers will have to release a new version of their plugin for each new Kibana release as a result.
==============================================
* <<development-plugin-resources>>
* <<development-uiexports>>
* <<development-plugin-feature-registration>>
* <<development-plugin-functional-tests>>
* <<development-plugin-localization>>
include::plugin/development-plugin-resources.asciidoc[]
include::plugin/development-uiexports.asciidoc[]
include::plugin/development-plugin-feature-registration.asciidoc[]
include::plugin/development-plugin-functional-tests.asciidoc[]
include::plugin/development-plugin-localization.asciidoc[]

View file

@@ -1,16 +0,0 @@
[[development-uiexports]]
=== UI Exports
An aggregate list of available UiExport types:
[cols="<h,<",options="header",]
|=======================================================================
| Type | Purpose
| hacks | Any module that should be included in every application
| visTypes | Modules that register providers with the `ui/registry/vis_types` registry.
| inspectorViews | Modules that register custom inspector views via the `viewRegistry` in `ui/inspector`.
| chromeNavControls | Modules that register providers with the `ui/registry/chrome_header_nav_controls` registry.
| navbarExtensions | Modules that register providers with the setup contract of the `navigation` plugin.
| docViews | Modules that register providers with the setup contract method `addDocView` of the `discover` plugin.
| app | Adds an application to the system. This uiExport type is defined as an object of metadata rather than just a module id.
|=======================================================================

View file

@@ -1,12 +1,12 @@
[[development-plugin-functional-tests]]
=== Functional Tests for Plugins
[[external-plugin-functional-tests]]
=== Functional Tests for Plugins outside the {kib} repo
Plugins use the `FunctionalTestRunner` by running it out of the Kibana repo. Ensure that your Kibana Development Environment is set up properly before continuing.
Plugins use the `FunctionalTestRunner` by running it out of the {kib} repo. Ensure that your {kib} Development Environment is set up properly before continuing.
[float]
==== Writing your own configuration
Every project or plugin should have its own `FunctionalTestRunner` config file. Just like Kibana's, this config file will define all of the test files to load, providers for Services and PageObjects, as well as configuration options for certain services.
Every project or plugin should have its own `FunctionalTestRunner` config file. Just like {kib}'s, this config file will define all of the test files to load, providers for Services and PageObjects, as well as configuration options for certain services.
To get started copy and paste this example to `test/functional/config.js`:
@@ -22,7 +22,7 @@ import { MyAppPageProvider } from './services/my_app_page';
// that returns an object with the project's config values
export default async function ({ readConfigFile }) {
// read the Kibana config file so that we can utilize some of
// read the {kib} config file so that we can utilize some of
// its services and PageObjects
const kibanaConfig = await readConfigFile(resolveKibanaPath('test/functional/config.js'));
@@ -41,7 +41,7 @@ export default async function ({ readConfigFile }) {
},
// just like services, PageObjects are defined as a map of
// names to Providers. Merge in Kibana's or pick specific ones
// names to Providers. Merge in {kib}'s or pick specific ones
pageObjects: {
management: kibanaConfig.get('pageObjects.management'),
myApp: MyAppPageProvider,
@@ -50,7 +50,7 @@ export default async function ({ readConfigFile }) {
// the apps section defines the urls that
// `PageObjects.common.navigateTo(appKey)` will use.
// Merge urls for your plugin with the urls defined in
// Kibana's config in order to use this helper
// {kib}'s config in order to use this helper
apps: {
...kibanaConfig.get('apps'),
myApp: {
@@ -85,5 +85,5 @@ node ../../kibana/scripts/functional_test_runner
[float]
==== Using esArchiver
We're working on documentation for this, but for now the best place to look is the original {kibana-pull}10359[pull request].
We're working on documentation for this, but for now the best place to look is the original {kib-repo}/issues/10359[pull request].

View file

@@ -1,12 +1,12 @@
[[development-plugin-localization]]
=== Localization for plugins
[[external-plugin-localization]]
=== Localization for plugins outside the {kib} repo
To introduce localization for your plugin, use our i18n tool to create IDs and default messages. You can then extract these IDs with respective default messages into localization JSON files for Kibana to use when running your plugin.
To introduce localization for your plugin, use our i18n tool to create IDs and default messages. You can then extract these IDs with respective default messages into localization JSON files for {kib} to use when running your plugin.
[float]
==== Adding localization to your plugin
You must add a `translations` directory at the root of your plugin. This directory will contain the translation files that Kibana uses.
You must add a `translations` directory at the root of your plugin. This directory will contain the translation files that {kib} uses.
["source","shell"]
-----------
@@ -20,13 +20,13 @@ You must add a `translations` directory at the root of your plugin. This directo
[float]
==== Using Kibana i18n tooling
To simplify the localization process, Kibana provides tools for the following functions:
==== Using {kib} i18n tooling
To simplify the localization process, {kib} provides tools for the following functions:
* Verify all translations have translatable strings and extract default messages from templates
* Verify translation files and integrate them into Kibana
* Verify translation files and integrate them into {kib}
To use Kibana i18n tooling, create a `.i18nrc.json` file with the following configs:
To use {kib} i18n tooling, create a `.i18nrc.json` file with the following configs:
* `paths`. The directory from which the i18n translation IDs are extracted.
* `exclude`. The list of files to exclude while parsing paths.
@@ -47,7 +47,7 @@ To use Kibana i18n tooling, create a `.i18nrc.json` file with the following conf
}
-----------
An example Kibana `.i18nrc.json` is {blob}.i18nrc.json[here].
An example {kib} `.i18nrc.json` is {blob}.i18nrc.json[here].
Full documentation about i18n tooling is {blob}src/dev/i18n/README.md[here].
@@ -83,10 +83,10 @@ node scripts/i18n_check --fix --include-config ../kibana-extra/myPlugin/.i18nrc.
[float]
==== Implementing i18n in the UI
Kibana relies on several UI frameworks (ReactJS and AngularJS) and
{kib} relies on several UI frameworks (ReactJS and AngularJS) and
requires localization in different environments (browser and NodeJS).
The internationalization engine is framework agnostic and consumable in
all parts of Kibana (ReactJS, AngularJS and NodeJS).
all parts of {kib} (ReactJS, AngularJS and NodeJS).
To simplify
internationalization in UI frameworks, additional abstractions are

View file

@@ -0,0 +1,42 @@
[[external-plugin-development]]
== External plugin development
[IMPORTANT]
==============================================
The {kib} plugin interfaces are in a state of constant development. We cannot provide backwards compatibility for plugins due to the high rate of change. {kib} enforces that the installed plugins match the version of {kib} itself. Plugin developers will have to release a new version of their plugin for each new {kib} release as a result.
==============================================
Most developers who contribute code directly to the {kib} repo are writing code inside plugins, so our <<contributing>> docs are the best place to
start. However, there are a few differences when developing plugins outside the {kib} repo. These differences are covered here.
[float]
[[automatic-plugin-generator]]
==== Automatic plugin generator
We recommend that you kick-start your plugin by generating it with the {kib-repo}tree/{branch}/packages/kbn-plugin-generator[Kibana Plugin Generator]. Run the following in the {kib} repo, and you will be asked a couple questions, see some progress bars, and have a freshly generated plugin ready for you to play with in {kib}'s `plugins` folder.
["source","shell"]
-----------
node scripts/generate_plugin my_plugin_name # replace "my_plugin_name" with your desired plugin name
-----------
[float]
=== Plugin location
The {kib} directory must be named `kibana`, and your plugin directory should be located in the root of `kibana` in a `plugins` directory, for example:
["source","shell"]
----
.
└── kibana
    └── plugins
        ├── foo-plugin
        └── bar-plugin
----
* <<external-plugin-functional-tests>>
* <<external-plugin-localization>>
include::external-plugin-functional-tests.asciidoc[]
include::external-plugin-localization.asciidoc[]

View file

@@ -1,12 +0,0 @@
[[development-security]]
== Security
Kibana has generally been able to implement security transparently to core and plugin developers, and this largely remains the case. {kib} relies on two methods that the <<development-elasticsearch, elasticsearch plugin>>'s `Cluster` provides: `callWithRequest` and `callWithInternalUser`.
`callWithRequest` executes requests against Elasticsearch using the authentication credentials of the Kibana end-user. So, if you log into Kibana as the user `foo`, then when `callWithRequest` is used, {kib} executes the request against Elasticsearch as the user `foo`. Historically, `callWithRequest` has been used extensively to perform actions that are initiated at the request of Kibana end-users.
`callWithInternalUser` executes requests against Elasticsearch using the internal Kibana server user, and has historically been used for performing actions that aren't initiated by Kibana end users; for example, creating the initial `.kibana` index or performing health checks against Elasticsearch.
However, with the changes that role-based access control (RBAC) introduces, this is no longer cut and dried. {kib} now requires that all access to the `.kibana` index go through the `SavedObjectsClient`. This used to be a best practice, as the `SavedObjectsClient` was responsible for translating the documents stored in Elasticsearch to and from Saved Objects, but RBAC is now taking advantage of this abstraction to implement access control and determine when to use `callWithRequest` versus `callWithInternalUser`.
include::rbac.asciidoc[]

View file

@@ -19,7 +19,7 @@ Maps makes requests directly from the browser to EMS.
To connect to EMS when your Kibana server and browser are in an internal network:
. Set `map.proxyElasticMapsServiceInMaps` to `true` in your <<settings, kibana.yml>> file to proxy EMS requests through the Kibana server.
. Update your firewall rules to whitelist connections from your Kibana server to the EMS domains.
. Update your firewall rules to allow connections from your Kibana server to the EMS domains.
NOTE: Coordinate map and region map visualizations do not support `map.proxyElasticMapsServiceInMaps` and will not proxy EMS requests through the Kibana server.

View file

@@ -167,9 +167,9 @@ These can be used to automatically update the list of hosts as a cluster is resi
Kibana has a default maximum memory limit of 1.4 GB, and in most cases, we recommend leaving this unconfigured. In some scenarios, such as large reporting jobs,
it may make sense to tweak limits to meet more specific requirements.
You can modify this limit by setting `--max-old-space-size` in the `NODE_OPTIONS` environment variable. For deb and rpm packages, this is passed in via `/etc/default/kibana` and can be appended to the bottom of the file.
You can modify this limit by setting `--max-old-space-size` in the `node.options` config file, which can be found inside the `kibana/config` folder or in any other folder configured with the `KIBANA_PATH_CONF` environment variable (for example, on Debian-based systems, `/etc/kibana`).
The option accepts a limit in MB:
--------
NODE_OPTIONS="--max-old-space-size=2048" bin/kibana
--max-old-space-size=2048
--------
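Lines that begin with `#` are treated as comments and ignored, so a hypothetical `node.options` file could look like this:
--------
# Node.js options applied when starting Kibana
--max-old-space-size=2048
--------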

View file

@@ -67,7 +67,7 @@
"uiFramework:documentComponent": "cd packages/kbn-ui-framework && yarn documentComponent",
"kbn:watch": "node scripts/kibana --dev --logging.json=false",
"build:types": "tsc --p tsconfig.types.json",
"docs:acceptApiChanges": "node --max-old-space-size=6144 scripts/check_published_api_changes.js --accept",
"docs:acceptApiChanges": "node --max-old-space-size=6144 scripts/check_published_api_changes.js --accept",
"kbn:bootstrap": "node scripts/register_git_hook",
"spec_to_console": "node scripts/spec_to_console",
"backport-skip-ci": "backport --prDescription \"[skip-ci]\"",
@@ -87,6 +87,7 @@
"**/@types/hoist-non-react-statics": "^3.3.1",
"**/@types/chai": "^4.2.11",
"**/cypress/@types/lodash": "^4.14.155",
"**/cypress/lodash": "^4.15.19",
"**/typescript": "3.9.5",
"**/graphql-toolkit/lodash": "^4.17.15",
"**/hoist-non-react-statics": "^3.3.2",
@@ -255,7 +256,6 @@
"redux-actions": "^2.6.5",
"redux-thunk": "^2.3.0",
"regenerator-runtime": "^0.13.3",
"regression": "2.0.1",
"request": "^2.88.0",
"require-in-the-middle": "^5.0.2",
"reselect": "^4.0.0",
@@ -407,7 +407,7 @@
"babel-eslint": "^10.0.3",
"babel-jest": "^25.5.1",
"babel-plugin-istanbul": "^6.0.0",
"backport": "5.4.6",
"backport": "5.5.1",
"chai": "3.5.0",
"chance": "1.0.18",
"cheerio": "0.22.0",

View file

@@ -18,5 +18,10 @@
*/
require('../src/setup_node_env/node_version_validator');
var process = require('process');
// forward command line args to backport
var args = process.argv.slice(2);
var backport = require('backport');
backport.run();
backport.run({}, args);

View file

@@ -14,6 +14,7 @@ while [ -h "$SCRIPT" ] ; do
done
DIR="$(dirname "${SCRIPT}")/.."
CONFIG_DIR=${KIBANA_PATH_CONF:-"$DIR/config"}
NODE="${DIR}/node/bin/node"
test -x "$NODE"
if [ ! -x "$NODE" ]; then
@@ -21,4 +22,8 @@ if [ ! -x "$NODE" ]; then
exit 1
fi
NODE_OPTIONS="--no-warnings --max-http-header-size=65536 ${NODE_OPTIONS}" NODE_ENV=production exec "${NODE}" "${DIR}/src/cli" ${@}
if [ -f "${CONFIG_DIR}/node.options" ]; then
  KBN_NODE_OPTS="$(grep -v ^# < ${CONFIG_DIR}/node.options | xargs)"
fi
NODE_OPTIONS="--no-warnings --max-http-header-size=65536 $KBN_NODE_OPTS $NODE_OPTIONS" NODE_ENV=production exec "${NODE}" "${DIR}/src/cli" ${@}

View file

@@ -14,6 +14,7 @@ while [ -h "$SCRIPT" ] ; do
done
DIR="$(dirname "${SCRIPT}")/.."
CONFIG_DIR=${KIBANA_PATH_CONF:-"$DIR/config"}
NODE="${DIR}/node/bin/node"
test -x "$NODE"
if [ ! -x "$NODE" ]; then
@@ -21,4 +22,8 @@ if [ ! -x "$NODE" ]; then
exit 1
fi
"${NODE}" "${DIR}/src/cli_keystore" "$@"
if [ -f "${CONFIG_DIR}/node.options" ]; then
  KBN_NODE_OPTS="$(grep -v ^# < ${CONFIG_DIR}/node.options | xargs)"
fi
NODE_OPTIONS="$KBN_NODE_OPTS $NODE_OPTIONS" "${NODE}" "${DIR}/src/cli_keystore" "$@"

View file

@@ -1,6 +1,6 @@
@echo off
SETLOCAL
SETLOCAL ENABLEDELAYEDEXPANSION
set SCRIPT_DIR=%~dp0
for %%I in ("%SCRIPT_DIR%..") do set DIR=%%~dpfI
@@ -12,6 +12,21 @@ If Not Exist "%NODE%" (
Exit /B 1
)
set CONFIG_DIR=%KIBANA_PATH_CONF%
If [%KIBANA_PATH_CONF%] == [] (
  set CONFIG_DIR=%DIR%\config
)
IF EXIST "%CONFIG_DIR%\node.options" (
  for /F "eol=# tokens=*" %%i in (%CONFIG_DIR%\node.options) do (
    If [!NODE_OPTIONS!] == [] (
      set "NODE_OPTIONS=%%i"
    ) Else (
      set "NODE_OPTIONS=!NODE_OPTIONS! %%i"
    )
  )
)
TITLE Kibana Keystore
"%NODE%" "%DIR%\src\cli_keystore" %*

View file

@@ -14,6 +14,7 @@ while [ -h "$SCRIPT" ] ; do
done
DIR="$(dirname "${SCRIPT}")/.."
CONFIG_DIR=${KIBANA_PATH_CONF:-"$DIR/config"}
NODE="${DIR}/node/bin/node"
test -x "$NODE"
if [ ! -x "$NODE" ]; then
@@ -21,4 +22,8 @@ if [ ! -x "$NODE" ]; then
exit 1
fi
NODE_OPTIONS="--no-warnings ${NODE_OPTIONS}" NODE_ENV=production exec "${NODE}" "${DIR}/src/cli_plugin" "$@"
if [ -f "${CONFIG_DIR}/node.options" ]; then
  KBN_NODE_OPTS="$(grep -v ^# < ${CONFIG_DIR}/node.options | xargs)"
fi
NODE_OPTIONS="--no-warnings $KBN_NODE_OPTS $NODE_OPTIONS" NODE_ENV=production exec "${NODE}" "${DIR}/src/cli_plugin" "$@"

View file

@@ -1,6 +1,6 @@
@echo off
SETLOCAL
SETLOCAL ENABLEDELAYEDEXPANSION
set SCRIPT_DIR=%~dp0
for %%I in ("%SCRIPT_DIR%..") do set DIR=%%~dpfI
@@ -13,9 +13,26 @@ If Not Exist "%NODE%" (
Exit /B 1
)
TITLE Kibana Server
set CONFIG_DIR=%KIBANA_PATH_CONF%
If [%KIBANA_PATH_CONF%] == [] (
  set CONFIG_DIR=%DIR%\config
)
set "NODE_OPTIONS=--no-warnings %NODE_OPTIONS%" && "%NODE%" "%DIR%\src\cli_plugin" %*
IF EXIST "%CONFIG_DIR%\node.options" (
  for /F "eol=# tokens=*" %%i in (%CONFIG_DIR%\node.options) do (
    If [!NODE_OPTIONS!] == [] (
      set "NODE_OPTIONS=%%i"
    ) Else (
      set "NODE_OPTIONS=!NODE_OPTIONS! %%i"
    )
  )
)
:: Include pre-defined node option
set "NODE_OPTIONS=--no-warnings %NODE_OPTIONS%"
TITLE Kibana Server
"%NODE%" "%DIR%\src\cli_plugin" %*
:finally

View file

@@ -1,6 +1,6 @@
@echo off
SETLOCAL
SETLOCAL ENABLEDELAYEDEXPANSION
set SCRIPT_DIR=%~dp0
for %%I in ("%SCRIPT_DIR%..") do set DIR=%%~dpfI
@@ -14,7 +14,27 @@ If Not Exist "%NODE%" (
Exit /B 1
)
set "NODE_OPTIONS=--no-warnings --max-http-header-size=65536 %NODE_OPTIONS%" && "%NODE%" "%DIR%\src\cli" %*
set CONFIG_DIR=%KIBANA_PATH_CONF%
If [%KIBANA_PATH_CONF%] == [] (
  set CONFIG_DIR=%DIR%\config
)
IF EXIST "%CONFIG_DIR%\node.options" (
  for /F "eol=# tokens=*" %%i in (%CONFIG_DIR%\node.options) do (
    If [!NODE_OPTIONS!] == [] (
      set "NODE_OPTIONS=%%i"
    ) Else (
      set "NODE_OPTIONS=!NODE_OPTIONS! %%i"
    )
  )
)
:: Include pre-defined node option
set "NODE_OPTIONS=--no-warnings --max-http-header-size=65536 %NODE_OPTIONS%"
:: This should run independently as the last instruction
:: as we need NODE_OPTIONS previously set to expand
"%NODE%" "%DIR%\src\cli" %*
:finally

View file

@@ -43,6 +43,7 @@ export const CopySourceTask = {
'typings/**',
'webpackShims/**',
'config/kibana.yml',
'config/node.options',
'tsconfig*.json',
'.i18nrc.json',
'kibana.d.ts',

View file

@@ -30,7 +30,7 @@ import {
} from '../types';
const templateMatchRE = /{{([\s\S]+?)}}/g;
const whitelistUrlSchemes = ['http://', 'https://'];
const allowedUrlSchemes = ['http://', 'https://'];
const URL_TYPES = [
{
@@ -161,7 +161,7 @@ export class UrlFormat extends FieldFormat {
return this.generateImgHtml(url, imageLabel);
default:
const inWhitelist = whitelistUrlSchemes.some((scheme) => url.indexOf(scheme) === 0);
const inWhitelist = allowedUrlSchemes.some((scheme) => url.indexOf(scheme) === 0);
if (!inWhitelist && !parsedUrl) {
  return url;
}

View file

@@ -1956,42 +1956,43 @@ export const UI_SETTINGS: {
// src/plugins/data/public/index.ts:136:21 - (ae-forgotten-export) The symbol "getEsQueryConfig" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:136:21 - (ae-forgotten-export) The symbol "luceneStringToDsl" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:136:21 - (ae-forgotten-export) The symbol "decorateQuery" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "FieldFormatsRegistry" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "BoolFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "BytesFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "ColorFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "DurationFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "IpFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "NumberFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "PercentFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "RelativeDateFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "SourceFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "StaticLookupFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "UrlFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "StringFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:176:26 - (ae-forgotten-export) The symbol "TruncateFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "isFilterable" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "isNestedField" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "validateIndexPattern" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "getFromSavedObject" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "flattenHitWrapper" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:232:27 - (ae-forgotten-export) The symbol "formatHitProvider" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:369:20 - (ae-forgotten-export) The symbol "getRequestInspectorStats" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:369:20 - (ae-forgotten-export) The symbol "getResponseInspectorStats" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:369:20 - (ae-forgotten-export) The symbol "tabifyAggResponse" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:369:20 - (ae-forgotten-export) The symbol "tabifyGetColumns" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:371:1 - (ae-forgotten-export) The symbol "CidrMask" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:372:1 - (ae-forgotten-export) The symbol "dateHistogramInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:381:1 - (ae-forgotten-export) The symbol "InvalidEsCalendarIntervalError" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:382:1 - (ae-forgotten-export) The symbol "InvalidEsIntervalFormatError" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:383:1 - (ae-forgotten-export) The symbol "Ipv4Address" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:384:1 - (ae-forgotten-export) The symbol "isDateHistogramBucketAggConfig" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:388:1 - (ae-forgotten-export) The symbol "isValidEsInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:389:1 - (ae-forgotten-export) The symbol "isValidInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:392:1 - (ae-forgotten-export) The symbol "parseInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:393:1 - (ae-forgotten-export) The symbol "propFilter" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:396:1 - (ae-forgotten-export) The symbol "toAbsoluteDates" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/query/state_sync/connect_to_query_state.ts:40:60 - (ae-forgotten-export) The symbol "FilterStateStore" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "FieldFormatsRegistry" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "BoolFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "BytesFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "ColorFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "DateNanosFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "DurationFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "IpFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "NumberFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "PercentFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "RelativeDateFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "SourceFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "StaticLookupFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "UrlFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "StringFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:177:26 - (ae-forgotten-export) The symbol "TruncateFormat" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "isFilterable" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "isNestedField" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "validateIndexPattern" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "getFromSavedObject" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "flattenHitWrapper" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:233:27 - (ae-forgotten-export) The symbol "formatHitProvider" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:370:20 - (ae-forgotten-export) The symbol "getRequestInspectorStats" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:370:20 - (ae-forgotten-export) The symbol "getResponseInspectorStats" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:370:20 - (ae-forgotten-export) The symbol "tabifyAggResponse" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:370:20 - (ae-forgotten-export) The symbol "tabifyGetColumns" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:372:1 - (ae-forgotten-export) The symbol "CidrMask" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:373:1 - (ae-forgotten-export) The symbol "dateHistogramInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:382:1 - (ae-forgotten-export) The symbol "InvalidEsCalendarIntervalError" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:383:1 - (ae-forgotten-export) The symbol "InvalidEsIntervalFormatError" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:384:1 - (ae-forgotten-export) The symbol "Ipv4Address" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:385:1 - (ae-forgotten-export) The symbol "isDateHistogramBucketAggConfig" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:389:1 - (ae-forgotten-export) The symbol "isValidEsInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:390:1 - (ae-forgotten-export) The symbol "isValidInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:393:1 - (ae-forgotten-export) The symbol "parseInterval" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:394:1 - (ae-forgotten-export) The symbol "propFilter" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/index.ts:397:1 - (ae-forgotten-export) The symbol "toAbsoluteDates" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/query/state_sync/connect_to_query_state.ts:41:60 - (ae-forgotten-export) The symbol "FilterStateStore" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/types.ts:52:5 - (ae-forgotten-export) The symbol "createFiltersFromValueClickAction" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/types.ts:53:5 - (ae-forgotten-export) The symbol "createFiltersFromRangeSelectAction" needs to be exported by the entry point index.d.ts
// src/plugins/data/public/types.ts:61:5 - (ae-forgotten-export) The symbol "IndexPatternSelectProps" needs to be exported by the entry point index.d.ts
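For context: the `ae-forgotten-export` lines above are @microsoft/api-extractor warnings produced while regenerating the plugin's public API report. Each one means an exported declaration references a type that is not itself exported from the entry point. A minimal, hypothetical reproduction in TypeScript:

  // index.ts (entry point)
  interface Internal {
    value: number; // Internal is referenced by the public API but never exported
  }
  export function make(): Internal {
    return { value: 1 };
  }
  // api-extractor reports: (ae-forgotten-export) The symbol "Internal" needs to be
  // exported by the entry point index.d.ts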


@ -24,6 +24,7 @@ import { BaseStateContainer } from '../../../../kibana_utils/public';
import { QuerySetup, QueryStart } from '../query_service';
import { QueryState, QueryStateChange } from './types';
import { FilterStateStore, COMPARE_ALL_OPTIONS, compareFilters } from '../../../common';
import { validateTimeRange } from '../timefilter';
/**
* Helper to setup two-way syncing of global data and a state container
@ -159,9 +160,9 @@ export const connectToQueryState = <S extends QueryState>(
// cloneDeep is required because services are mutating passed objects
// and state in state container is frozen
if (syncConfig.time) {
const time = state.time || timefilter.getTimeDefaults();
const time = validateTimeRange(state.time) ? state.time : timefilter.getTimeDefaults();
if (!_.isEqual(time, timefilter.getTime())) {
timefilter.setTime(_.cloneDeep(time));
timefilter.setTime(_.cloneDeep(time!));
}
}
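In effect, a time range that fails datemath validation is now replaced with the time filter defaults before it reaches `timefilter.setTime`. A minimal sketch of the fallback, assuming `state.time` was parsed from a URL such as `#/?_g=(time:(from:now-15m,to:null))` (the case exercised by the functional test further down in this diff):

  // 'null' is not a valid datemath expression, so validateTimeRange() returns false
  const state = { time: { from: 'now-15m', to: 'null' } };
  const time = validateTimeRange(state.time) ? state.time : timefilter.getTimeDefaults();
  // time now holds the defaults (e.g. { from: 'now-15m', to: 'now' }) instead of the broken range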


@ -24,3 +24,4 @@ export { Timefilter, TimefilterContract } from './timefilter';
export { TimeHistory, TimeHistoryContract } from './time_history';
export { changeTimeFilter, convertRangeFilterToTimeRangeString } from './lib/change_time_filter';
export { extractTimeFilter } from './lib/extract_time_filter';
export { validateTimeRange } from './lib/validate_timerange';


@ -16,24 +16,37 @@
* specific language governing permissions and limitations
* under the License.
*/
import { PluginInitializerContext } from 'kibana/public';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { npStart, npSetup } from 'ui/new_platform';
import {
TableVisPlugin,
TablePluginSetupDependencies,
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
} from '../../../../../../plugins/vis_type_table/public/plugin';
const plugins: Readonly<TablePluginSetupDependencies> = {
expressions: npSetup.plugins.expressions,
visualizations: npSetup.plugins.visualizations,
};
import { validateTimeRange } from './validate_timerange';
const pluginInstance = new TableVisPlugin({} as PluginInitializerContext);
describe('Validate timerange', () => {
test('Validate no range', () => {
const ok = validateTimeRange();
export const setup = pluginInstance.setup(npSetup.core, plugins);
export const start = pluginInstance.start(npStart.core, {
data: npStart.plugins.data,
kibanaLegacy: npStart.plugins.kibanaLegacy,
expect(ok).toBe(false);
});
test('normal range', () => {
const ok = validateTimeRange({
to: 'now',
from: 'now-7d',
});
expect(ok).toBe(true);
});
test('bad from time', () => {
const ok = validateTimeRange({
to: 'nowa',
from: 'now-7d',
});
expect(ok).toBe(false);
});
test('bad to time', () => {
const ok = validateTimeRange({
to: 'now',
from: 'nowa-7d',
});
expect(ok).toBe(false);
});
});


@ -0,0 +1,28 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import dateMath from '@elastic/datemath';
import { TimeRange } from '../../../../common';
export function validateTimeRange(time?: TimeRange): boolean {
if (!time) return false;
const momentDateFrom = dateMath.parse(time.from);
const momentDateTo = dateMath.parse(time.to);
return !!(momentDateFrom && momentDateFrom.isValid() && momentDateTo && momentDateTo.isValid());
}


@ -316,6 +316,83 @@ describe('Terms Agg Other bucket helper', () => {
}
});
test('excludes exists filter for scripted fields', () => {
const aggConfigs = getAggConfigs(nestedTerm.aggs);
aggConfigs.aggs[1].params.field.scripted = true;
const agg = buildOtherBucketAgg(
aggConfigs,
aggConfigs.aggs[1] as IBucketAggConfig,
nestedTermResponse
);
const expectedResponse = {
'other-filter': {
aggs: undefined,
filters: {
filters: {
'-IN': {
bool: {
must: [],
filter: [{ match_phrase: { 'geo.src': 'IN' } }],
should: [],
must_not: [
{
script: {
script: {
lang: undefined,
params: { value: 'ios' },
source: '(undefined) == value',
},
},
},
{
script: {
script: {
lang: undefined,
params: { value: 'win xp' },
source: '(undefined) == value',
},
},
},
],
},
},
'-US': {
bool: {
must: [],
filter: [{ match_phrase: { 'geo.src': 'US' } }],
should: [],
must_not: [
{
script: {
script: {
lang: undefined,
params: { value: 'ios' },
source: '(undefined) == value',
},
},
},
{
script: {
script: {
lang: undefined,
params: { value: 'win xp' },
source: '(undefined) == value',
},
},
},
],
},
},
},
},
},
};
expect(agg).toBeDefined();
if (agg) {
expect(agg()).toEqual(expectedResponse);
}
});
test('returns false when nested terms agg has no buckets', () => {
const aggConfigs = getAggConfigs(nestedTerm.aggs);
const agg = buildOtherBucketAgg(


@ -202,10 +202,12 @@ export const buildOtherBucketAgg = (
return;
}
if (
!aggWithOtherBucket.params.missingBucket ||
agg.buckets.some((bucket: { key: string }) => bucket.key === '__missing__')
) {
const hasScriptedField = !!aggWithOtherBucket.params.field.scripted;
const hasMissingBucket = !!aggWithOtherBucket.params.missingBucket;
const hasMissingBucketKey = agg.buckets.some(
(bucket: { key: string }) => bucket.key === '__missing__'
);
if (!hasScriptedField && (!hasMissingBucket || hasMissingBucketKey)) {
filters.push(
buildExistsFilter(
aggWithOtherBucket.params.field,
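The rewritten condition makes the exists-filter rules explicit; the test added above ('excludes exists filter for scripted fields') pins the new case. Presumably the filter is skipped for scripted fields because a scripted field has no indexed value for an `exists` query to match. The resulting behavior, enumerated as a sketch:

  // hasScriptedField | hasMissingBucket | hasMissingBucketKey | exists filter pushed?
  // true              | any              | any                 | no  (new behavior)
  // false             | false            | any                 | yes
  // false             | true             | true                | yes
  // false             | true             | false               | no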


@ -20,4 +20,3 @@
export { SEARCH_EMBEDDABLE_TYPE } from './constants';
export * from './types';
export * from './search_embeddable_factory';
export * from './search_embeddable';


@ -38,7 +38,7 @@ import * as columnActions from '../angular/doc_table/actions/columns';
import searchTemplate from './search_template.html';
import { ISearchEmbeddable, SearchInput, SearchOutput } from './types';
import { SortOrder } from '../angular/doc_table/components/table_header/helpers';
import { getSortForSearchSource } from '../angular/doc_table/lib/get_sort_for_search_source';
import { getSortForSearchSource } from '../angular/doc_table';
import {
getRequestInspectorStats,
getResponseInspectorStats,


@ -28,8 +28,8 @@ import {
} from '../../../../embeddable/public';
import { TimeRange } from '../../../../data/public';
import { SearchEmbeddable } from './search_embeddable';
import { SearchInput, SearchOutput } from './types';
import { SearchInput, SearchOutput, SearchEmbeddable } from './types';
import { SEARCH_EMBEDDABLE_TYPE } from './constants';
interface StartServices {
@ -92,7 +92,8 @@ export class SearchEmbeddableFactory
const savedObject = await getServices().getSavedSearchById(savedObjectId);
const indexPattern = savedObject.searchSource.getField('index');
const { executeTriggerActions } = await this.getStartServices();
return new SearchEmbeddable(
const { SearchEmbeddable: SearchEmbeddableClass } = await import('./search_embeddable');
return new SearchEmbeddableClass(
{
savedSearch: savedObject,
$rootScope,


@ -17,7 +17,12 @@
* under the License.
*/
import { EmbeddableInput, EmbeddableOutput, IEmbeddable } from 'src/plugins/embeddable/public';
import {
Embeddable,
EmbeddableInput,
EmbeddableOutput,
IEmbeddable,
} from 'src/plugins/embeddable/public';
import { SortOrder } from '../angular/doc_table/components/table_header/helpers';
import { Filter, IIndexPattern, TimeRange, Query } from '../../../../data/public';
import { SavedSearch } from '../..';
@ -40,3 +45,7 @@ export interface SearchOutput extends EmbeddableOutput {
export interface ISearchEmbeddable extends IEmbeddable<SearchInput, SearchOutput> {
getSavedSearch(): SavedSearch;
}
export interface SearchEmbeddable extends Embeddable<SearchInput, SearchOutput> {
type: string;
}


@ -66,6 +66,7 @@ import {
DISCOVER_APP_URL_GENERATOR,
DiscoverUrlGenerator,
} from './url_generator';
import { SearchEmbeddableFactory } from './application/embeddable';
declare module '../../share/public' {
export interface UrlGeneratorStateMapping {
@ -345,12 +346,7 @@ export class DiscoverPlugin
/**
* register embeddable with a slimmer embeddable version of inner angular
*/
private async registerEmbeddable(
core: CoreSetup<DiscoverStartPlugins>,
plugins: DiscoverSetupPlugins
) {
const { SearchEmbeddableFactory } = await import('./application/embeddable');
private registerEmbeddable(core: CoreSetup<DiscoverStartPlugins>, plugins: DiscoverSetupPlugins) {
if (!this.getEmbeddableInjector) {
throw Error('Discover plugin method getEmbeddableInjector is undefined');
}


@ -28,7 +28,7 @@ export async function findAll<T extends SavedObjectAttributes>(
savedObjectsClient: ISavedObjectsRepository,
opts: SavedObjectsFindOptions
): Promise<Array<SavedObject<T>>> {
const { page = 1, perPage = 100, ...options } = opts;
const { page = 1, perPage = 10000, ...options } = opts;
const { saved_objects: savedObjects, total } = await savedObjectsClient.find<T>({
...options,
page,


@ -116,7 +116,7 @@ describe('hash unhash url', () => {
expect(mockStorage.length).toBe(3);
});
it('hashes only whitelisted properties', () => {
it('hashes only allow-listed properties', () => {
const stateParamKey1 = '_g';
const stateParamValue1 = '(yes:!t)';
const stateParamKey2 = '_a';
@ -227,7 +227,7 @@ describe('hash unhash url', () => {
);
});
it('unhashes only whitelisted properties', () => {
it('un-hashes only allow-listed properties', () => {
const stateParamKey1 = '_g';
const stateParamValueHashed1 = 'h@4e60e02';
const state1 = { yes: true };


@ -35,7 +35,7 @@ export const hashUrl = createQueryReplacer(hashQuery);
// naive hack, but this allows decoupling these utils from AppState, GlobalState for now
// when removing AppState, GlobalState and migrating to IState containers,
// need to make sure that apps explicitly passing this whitelist to hash
// need to make sure that apps explicitly passing this allow-list to hash
const __HACK_HARDCODED_LEGACY_HASHABLE_PARAMS = ['_g', '_a', '_s'];
function createQueryMapper(queryParamMapper: (q: string) => string | null) {
return (


@ -0,0 +1,84 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { createInteractionPositionTracker } from './open_context_menu';
import { fireEvent } from '@testing-library/dom';
let targetEl: Element;
const top = 100;
const left = 100;
const right = 200;
const bottom = 200;
beforeEach(() => {
targetEl = document.createElement('div');
jest.spyOn(targetEl, 'getBoundingClientRect').mockImplementation(() => ({
top,
left,
right,
bottom,
width: right - left,
height: bottom - top,
x: left,
y: top,
toJSON: () => {},
}));
document.body.append(targetEl);
});
afterEach(() => {
targetEl.remove();
});
test('should use last clicked element position if mouse position is outside target element', () => {
const { resolveLastPosition } = createInteractionPositionTracker();
fireEvent.click(targetEl, { clientX: 0, clientY: 0 });
const { x, y } = resolveLastPosition();
expect(y).toBe(bottom);
expect(x).toBe(left + (right - left) / 2);
});
test('should use mouse position if mouse is inside the clicked element', () => {
const { resolveLastPosition } = createInteractionPositionTracker();
const mouseX = 150;
const mouseY = 150;
fireEvent.click(targetEl, { clientX: mouseX, clientY: mouseY });
const { x, y } = resolveLastPosition();
expect(x).toBe(mouseX);
expect(y).toBe(mouseY);
});
test('should use the position of the previous element if the latest element is no longer in the DOM', () => {
const { resolveLastPosition } = createInteractionPositionTracker();
const detachedElement = document.createElement('div');
const spy = jest.spyOn(detachedElement, 'getBoundingClientRect');
fireEvent.click(targetEl);
fireEvent.click(detachedElement);
const { x, y } = resolveLastPosition();
expect(y).toBe(bottom);
expect(x).toBe(left + (right - left) / 2);
expect(spy).not.toBeCalled();
});


@ -26,14 +26,86 @@ import ReactDOM from 'react-dom';
let activeSession: ContextMenuSession | null = null;
const CONTAINER_ID = 'contextMenu-container';
let initialized = false;
/**
* Tries to find the best position for opening the context menu, using mousemove and click events
* The returned position is relative to the document
*/
export function createInteractionPositionTracker() {
let lastMouseX = 0;
let lastMouseY = 0;
const lastClicks: Array<{ el?: Element; mouseX: number; mouseY: number }> = [];
const MAX_LAST_CLICKS = 10;
/**
* Track both `mouseup` and `click`
* `mouseup` is for clicks and brushes with the mouse
* `click` is a fallback for keyboard interactions
*/
document.addEventListener('mouseup', onClick, true);
document.addEventListener('click', onClick, true);
document.addEventListener('mousemove', onMouseUpdate, { passive: true });
document.addEventListener('mouseenter', onMouseUpdate, { passive: true });
function onClick(event: MouseEvent) {
lastClicks.push({
el: event.target as Element,
mouseX: event.clientX,
mouseY: event.clientY,
});
if (lastClicks.length > MAX_LAST_CLICKS) {
lastClicks.shift();
}
}
function onMouseUpdate(event: MouseEvent) {
lastMouseX = event.clientX;
lastMouseY = event.clientY;
}
return {
resolveLastPosition: (): { x: number; y: number } => {
const lastClick = [...lastClicks]
.reverse()
.find(({ el }) => el && document.body.contains(el));
if (!lastClick) {
// fallback to last mouse position
return {
x: lastMouseX,
y: lastMouseY,
};
}
const { top, left, bottom, right } = lastClick.el!.getBoundingClientRect();
const mouseX = lastClick.mouseX;
const mouseY = lastClick.mouseY;
if (top <= mouseY && bottom >= mouseY && left <= mouseX && right >= mouseX) {
// click was inside target element
return {
x: mouseX,
y: mouseY,
};
} else {
// keyboard edge case: no cursor position, so use the target element position instead
return {
x: left + (right - left) / 2,
y: bottom,
};
}
},
};
}
const { resolveLastPosition } = createInteractionPositionTracker();
function getOrCreateContainerElement() {
let container = document.getElementById(CONTAINER_ID);
const y = getMouseY() + document.body.scrollTop;
let { x, y } = resolveLastPosition();
y = y + window.scrollY;
x = x + window.scrollX;
if (!container) {
container = document.createElement('div');
container.style.left = getMouseX() + 'px';
container.style.left = x + 'px';
container.style.top = y + 'px';
container.style.position = 'absolute';
@ -44,38 +116,12 @@ function getOrCreateContainerElement() {
container.id = CONTAINER_ID;
document.body.appendChild(container);
} else {
container.style.left = getMouseX() + 'px';
container.style.left = x + 'px';
container.style.top = y + 'px';
}
return container;
}
let x: number = 0;
let y: number = 0;
function initialize() {
if (!initialized) {
document.addEventListener('mousemove', onMouseUpdate, false);
document.addEventListener('mouseenter', onMouseUpdate, false);
initialized = true;
}
}
function onMouseUpdate(e: any) {
x = e.pageX;
y = e.pageY;
}
function getMouseX() {
return x;
}
function getMouseY() {
return y;
}
initialize();
/**
* A FlyoutSession describes the session of one opened flyout panel. It offers
* methods to close the flyout panel again. If you open a flyout panel you should make
@ -87,16 +133,6 @@ initialize();
* @extends EventEmitter
*/
class ContextMenuSession extends EventEmitter {
/**
* Binds the current flyout session to an Angular scope, meaning this flyout
* session will be closed as soon as the Angular scope gets destroyed.
* @param {object} scope - An angular scope object to bind to.
*/
public bindToAngularScope(scope: ng.IScope): void {
const removeWatch = scope.$on('$destroy', () => this.close());
this.on('closed', () => removeWatch());
}
/**
* Closes the opened flyout as long as it's still the open one.
* If this is not the active session anymore, this method won't do anything.
@ -151,6 +187,7 @@ export function openContextMenu(
panelPaddingSize="none"
anchorPosition="downRight"
withTitle
ownFocus={true}
>
<EuiContextMenu
initialPanelId="mainMenu"


@ -83,14 +83,16 @@ export class CollectorSet {
);
}
const collectorTypesNotReady: string[] = [];
let allReady = true;
for (const collector of collectorSet.collectors.values()) {
if (!(await collector.isReady())) {
allReady = false;
collectorTypesNotReady.push(collector.type);
}
}
const collectorTypesNotReady = (
await Promise.all(
[...collectorSet.collectors.values()].map(async (collector) => {
if (!(await collector.isReady())) {
return collector.type;
}
})
)
).filter((collectorType): collectorType is string => !!collectorType);
const allReady = collectorTypesNotReady.length === 0;
if (!allReady && this.maximumWaitTimeForAllCollectorsInS >= 0) {
const nowTimestamp = +new Date();
@ -119,21 +121,24 @@ export class CollectorSet {
callCluster: LegacyAPICaller,
collectors: Map<string, Collector<any, any>> = this.collectors
) => {
const responses = [];
for (const collector of collectors.values()) {
this.logger.debug(`Fetching data from ${collector.type} collector`);
try {
responses.push({
type: collector.type,
result: await collector.fetch(callCluster),
});
} catch (err) {
this.logger.warn(err);
this.logger.warn(`Unable to fetch data from ${collector.type} collector`);
}
}
const responses = await Promise.all(
[...collectors.values()].map(async (collector) => {
this.logger.debug(`Fetching data from ${collector.type} collector`);
try {
return {
type: collector.type,
result: await collector.fetch(callCluster),
};
} catch (err) {
this.logger.warn(err);
this.logger.warn(`Unable to fetch data from ${collector.type} collector`);
}
})
);
return responses;
return responses.filter(
(response): response is { type: string; result: unknown } => typeof response !== 'undefined'
);
};
/*

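Both hunks above replace a serial for-await loop with a concurrent map whose empty results are filtered out through a type guard. The same pattern in isolation, as a sketch with hypothetical names rather than the plugin's API:

  async function notReadyTypes(
    collectors: Array<{ type: string; isReady: () => Promise<boolean> }>
  ): Promise<string[]> {
    // Run every readiness check concurrently instead of one at a time.
    const maybeTypes = await Promise.all(
      collectors.map(async (c) => ((await c.isReady()) ? undefined : c.type))
    );
    // The predicate narrows Array<string | undefined> down to string[].
    return maybeTypes.filter((t): t is string => !!t);
  }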

@ -19,44 +19,71 @@
import $ from 'jquery';
import moment from 'moment';
import ngMock from 'ng_mock';
import expect from '@kbn/expect';
import angular from 'angular';
import 'angular-mocks';
import sinon from 'sinon';
import './legacy';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { npStart } from 'ui/new_platform';
import { round } from 'lodash';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { getInnerAngular } from '../../../../../../plugins/vis_type_table/public/get_inner_angular';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { initTableVisLegacyModule } from '../../../../../../plugins/vis_type_table/public/table_vis_legacy_module';
import { getFieldFormatsRegistry } from '../../../../test_utils/public/stub_field_formats';
import { coreMock } from '../../../../core/public/mocks';
import { initAngularBootstrap } from '../../../kibana_legacy/public';
import { setUiSettings } from '../../../data/public/services';
import { UI_SETTINGS } from '../../../data/public/';
import { CSV_SEPARATOR_SETTING, CSV_QUOTE_VALUES_SETTING } from '../../../share/public';
import { setFormatService } from '../services';
import { getInnerAngular } from '../get_inner_angular';
import { initTableVisLegacyModule } from '../table_vis_legacy_module';
import { tabifiedData } from './tabified_data';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { configureAppAngularModule } from '../../../../../../plugins/kibana_legacy/public/angular';
const uiSettings = new Map();
describe('Table Vis - AggTable Directive', function () {
const core = coreMock.createStart();
core.uiSettings.set = jest.fn((key, value) => {
uiSettings.set(key, value);
});
core.uiSettings.get = jest.fn((key) => {
const defaultValues = {
dateFormat: 'MMM D, YYYY @ HH:mm:ss.SSS',
'dateFormat:tz': 'UTC',
[UI_SETTINGS.SHORT_DOTS_ENABLE]: true,
[UI_SETTINGS.FORMAT_CURRENCY_DEFAULT_PATTERN]: '($0,0.[00])',
[UI_SETTINGS.FORMAT_NUMBER_DEFAULT_PATTERN]: '0,0.[000]',
[UI_SETTINGS.FORMAT_PERCENT_DEFAULT_PATTERN]: '0,0.[000]%',
[UI_SETTINGS.FORMAT_NUMBER_DEFAULT_LOCALE]: 'en',
[UI_SETTINGS.FORMAT_DEFAULT_TYPE_MAP]: {},
[CSV_SEPARATOR_SETTING]: ',',
[CSV_QUOTE_VALUES_SETTING]: true,
};
return defaultValues[key] || uiSettings.get(key);
});
let $rootScope;
let $compile;
let settings;
const initLocalAngular = () => {
const tableVisModule = getInnerAngular('kibana/table_vis', npStart.core);
configureAppAngularModule(tableVisModule, npStart.core, true);
const tableVisModule = getInnerAngular('kibana/table_vis', core);
initTableVisLegacyModule(tableVisModule);
};
beforeEach(initLocalAngular);
beforeEach(ngMock.module('kibana/table_vis'));
beforeEach(
ngMock.inject(function ($injector, config) {
beforeEach(() => {
setUiSettings(core.uiSettings);
setFormatService(getFieldFormatsRegistry(core));
initAngularBootstrap();
initLocalAngular();
angular.mock.module('kibana/table_vis');
angular.mock.inject(($injector, config) => {
settings = config;
$rootScope = $injector.get('$rootScope');
$compile = $injector.get('$compile');
})
);
});
});
let $scope;
beforeEach(function () {
@ -66,7 +93,7 @@ describe('Table Vis - AggTable Directive', function () {
$scope.$destroy();
});
it('renders a simple response properly', function () {
test('renders a simple response properly', function () {
$scope.dimensions = {
metrics: [{ accessor: 0, format: { id: 'number' }, params: {} }],
buckets: [],
@ -78,12 +105,12 @@ describe('Table Vis - AggTable Directive', function () {
);
$scope.$digest();
expect($el.find('tbody').length).to.be(1);
expect($el.find('td').length).to.be(1);
expect($el.find('td').text()).to.eql('1,000');
expect($el.find('tbody').length).toBe(1);
expect($el.find('td').length).toBe(1);
expect($el.find('td').text()).toEqual('1,000');
});
it('renders nothing if the table is empty', function () {
test('renders nothing if the table is empty', function () {
$scope.dimensions = {};
$scope.table = null;
const $el = $compile('<kbn-agg-table table="table" dimensions="dimensions"></kbn-agg-table>')(
@ -91,10 +118,10 @@ describe('Table Vis - AggTable Directive', function () {
);
$scope.$digest();
expect($el.find('tbody').length).to.be(0);
expect($el.find('tbody').length).toBe(0);
});
it('renders a complex response properly', async function () {
test('renders a complex response properly', async function () {
$scope.dimensions = {
buckets: [
{ accessor: 0, params: {} },
@ -112,37 +139,37 @@ describe('Table Vis - AggTable Directive', function () {
$compile($el)($scope);
$scope.$digest();
expect($el.find('tbody').length).to.be(1);
expect($el.find('tbody').length).toBe(1);
const $rows = $el.find('tbody tr');
expect($rows.length).to.be.greaterThan(0);
expect($rows.length).toBeGreaterThan(0);
function validBytes(str) {
const num = str.replace(/,/g, '');
if (num !== '-') {
expect(num).to.match(/^\d+$/);
expect(num).toMatch(/^\d+$/);
}
}
$rows.each(function () {
// 6 cells in every row
const $cells = $(this).find('td');
expect($cells.length).to.be(6);
expect($cells.length).toBe(6);
const txts = $cells.map(function () {
return $(this).text().trim();
});
// two character country code
expect(txts[0]).to.match(/^(png|jpg|gif|html|css)$/);
expect(txts[0]).toMatch(/^(png|jpg|gif|html|css)$/);
validBytes(txts[1]);
// country
expect(txts[2]).to.match(/^\w\w$/);
expect(txts[2]).toMatch(/^\w\w$/);
validBytes(txts[3]);
// os
expect(txts[4]).to.match(/^(win|mac|linux)$/);
expect(txts[4]).toMatch(/^(win|mac|linux)$/);
validBytes(txts[5]);
});
});
@ -153,9 +180,9 @@ describe('Table Vis - AggTable Directive', function () {
moment.tz.setDefault(settings.get('dateFormat:tz'));
}
const off = $scope.$on('change:config.dateFormat:tz', setDefaultTimezone);
const oldTimezoneSetting = settings.get('dateFormat:tz');
settings.set('dateFormat:tz', 'UTC');
setDefaultTimezone();
$scope.dimensions = {
buckets: [
@ -181,24 +208,24 @@ describe('Table Vis - AggTable Directive', function () {
$compile($el)($scope);
$scope.$digest();
expect($el.find('tfoot').length).to.be(1);
expect($el.find('tfoot').length).toBe(1);
const $rows = $el.find('tfoot tr');
expect($rows.length).to.be(1);
expect($rows.length).toBe(1);
const $cells = $($rows[0]).find('th');
expect($cells.length).to.be(6);
expect($cells.length).toBe(6);
for (let i = 0; i < 6; i++) {
expect($($cells[i]).text().trim()).to.be(expected[i]);
expect($($cells[i]).text().trim()).toBe(expected[i]);
}
settings.set('dateFormat:tz', oldTimezoneSetting);
off();
setDefaultTimezone();
}
it('as count', async function () {
test('as count', async function () {
await totalsRowTest('count', ['18', '18', '18', '18', '18', '18']);
});
it('as min', async function () {
test('as min', async function () {
await totalsRowTest('min', [
'',
'2014-09-28',
@ -208,7 +235,7 @@ describe('Table Vis - AggTable Directive', function () {
'11',
]);
});
it('as max', async function () {
test('as max', async function () {
await totalsRowTest('max', [
'',
'2014-10-03',
@ -218,16 +245,16 @@ describe('Table Vis - AggTable Directive', function () {
'837',
]);
});
it('as avg', async function () {
test('as avg', async function () {
await totalsRowTest('avg', ['', '', '87,221.5', '', '64.667', '206.833']);
});
it('as sum', async function () {
test('as sum', async function () {
await totalsRowTest('sum', ['', '', '1,569,987', '', '1,164', '3,723']);
});
});
describe('aggTable.toCsv()', function () {
it('escapes rows and columns properly', function () {
test('escapes rows and columns properly', function () {
const $el = $compile('<kbn-agg-table table="table" dimensions="dimensions"></kbn-agg-table>')(
$scope
);
@ -244,12 +271,12 @@ describe('Table Vis - AggTable Directive', function () {
rows: [{ a: 1, b: 2, c: '"foobar"' }],
};
expect(aggTable.toCsv()).to.be(
expect(aggTable.toCsv()).toBe(
'one,two,"with double-quotes("")"' + '\r\n' + '1,2,"""foobar"""' + '\r\n'
);
});
it('exports rows and columns properly', async function () {
test('exports rows and columns properly', async function () {
$scope.dimensions = {
buckets: [
{ accessor: 0, params: {} },
@ -274,7 +301,7 @@ describe('Table Vis - AggTable Directive', function () {
$tableScope.table = $scope.table;
const raw = aggTable.toCsv(false);
expect(raw).to.be(
expect(raw).toBe(
'"extension: Descending","Average bytes","geo.src: Descending","Average bytes","machine.os: Descending","Average bytes"' +
'\r\n' +
'png,412032,IT,9299,win,0' +
@ -304,7 +331,7 @@ describe('Table Vis - AggTable Directive', function () {
);
});
it('exports formatted rows and columns properly', async function () {
test('exports formatted rows and columns properly', async function () {
$scope.dimensions = {
buckets: [
{ accessor: 0, params: {} },
@ -332,7 +359,7 @@ describe('Table Vis - AggTable Directive', function () {
$tableScope.formattedColumns[0].formatter.convert = (v) => `${v}_formatted`;
const formatted = aggTable.toCsv(true);
expect(formatted).to.be(
expect(formatted).toBe(
'"extension: Descending","Average bytes","geo.src: Descending","Average bytes","machine.os: Descending","Average bytes"' +
'\r\n' +
'"png_formatted",412032,IT,9299,win,0' +
@ -363,7 +390,7 @@ describe('Table Vis - AggTable Directive', function () {
});
});
it('renders percentage columns', async function () {
test('renders percentage columns', async function () {
$scope.dimensions = {
buckets: [
{ accessor: 0, params: {} },
@ -390,8 +417,8 @@ describe('Table Vis - AggTable Directive', function () {
$scope.$digest();
const $headings = $el.find('th');
expect($headings.length).to.be(7);
expect($headings.eq(3).text().trim()).to.be('Average bytes percentages');
expect($headings.length).toBe(7);
expect($headings.eq(3).text().trim()).toBe('Average bytes percentages');
const countColId = $scope.table.columns.find((col) => col.name === $scope.percentageCol).id;
const counts = $scope.table.rows.map((row) => row[countColId]);
@ -400,7 +427,7 @@ describe('Table Vis - AggTable Directive', function () {
$percentageColValues.each((i, value) => {
const percentage = `${round((counts[i] / total) * 100, 3)}%`;
expect(value).to.be(percentage);
expect(value).toBe(percentage);
});
});
@ -420,7 +447,7 @@ describe('Table Vis - AggTable Directive', function () {
window.Blob = origBlob;
});
it('calls _saveAs properly', function () {
test('calls _saveAs properly', function () {
const $el = $compile('<kbn-agg-table table="table" dimensions="dimensions">')($scope);
$scope.$digest();
@ -440,19 +467,19 @@ describe('Table Vis - AggTable Directive', function () {
aggTable.csv.filename = 'somefilename.csv';
aggTable.exportAsCsv();
expect(saveAs.callCount).to.be(1);
expect(saveAs.callCount).toBe(1);
const call = saveAs.getCall(0);
expect(call.args[0]).to.be.a(FakeBlob);
expect(call.args[0].slices).to.eql([
expect(call.args[0]).toBeInstanceOf(FakeBlob);
expect(call.args[0].slices).toEqual([
'one,two,"with double-quotes("")"' + '\r\n' + '1,2,"""foobar"""' + '\r\n',
]);
expect(call.args[0].opts).to.eql({
expect(call.args[0].opts).toEqual({
type: 'text/plain;charset=utf-8',
});
expect(call.args[1]).to.be('somefilename.csv');
expect(call.args[1]).toBe('somefilename.csv');
});
it('should use the export-title attribute', function () {
test('should use the export-title attribute', function () {
const expected = 'export file name';
const $el = $compile(
`<kbn-agg-table table="table" dimensions="dimensions" export-title="exportTitle">`
@ -468,7 +495,7 @@ describe('Table Vis - AggTable Directive', function () {
$tableScope.exportTitle = expected;
$scope.$digest();
expect(aggTable.csv.filename).to.equal(`${expected}.csv`);
expect(aggTable.csv.filename).toEqual(`${expected}.csv`);
});
});
});


@ -18,38 +18,50 @@
*/
import $ from 'jquery';
import ngMock from 'ng_mock';
import angular from 'angular';
import 'angular-mocks';
import expect from '@kbn/expect';
import './legacy';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { getInnerAngular } from '../../../../../../plugins/vis_type_table/public/get_inner_angular';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { initTableVisLegacyModule } from '../../../../../../plugins/vis_type_table/public/table_vis_legacy_module';
import { getFieldFormatsRegistry } from '../../../../test_utils/public/stub_field_formats';
import { coreMock } from '../../../../core/public/mocks';
import { initAngularBootstrap } from '../../../kibana_legacy/public';
import { setUiSettings } from '../../../data/public/services';
import { setFormatService } from '../services';
import { getInnerAngular } from '../get_inner_angular';
import { initTableVisLegacyModule } from '../table_vis_legacy_module';
import { tabifiedData } from './tabified_data';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { npStart } from 'ui/new_platform';
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { configureAppAngularModule } from '../../../../../../plugins/kibana_legacy/public/angular';
const uiSettings = new Map();
describe('Table Vis - AggTableGroup Directive', function () {
const core = coreMock.createStart();
let $rootScope;
let $compile;
core.uiSettings.set = jest.fn((key, value) => {
uiSettings.set(key, value);
});
core.uiSettings.get = jest.fn((key) => {
return uiSettings.get(key);
});
const initLocalAngular = () => {
const tableVisModule = getInnerAngular('kibana/table_vis', npStart.core);
configureAppAngularModule(tableVisModule, npStart.core, true);
const tableVisModule = getInnerAngular('kibana/table_vis', core);
initTableVisLegacyModule(tableVisModule);
};
beforeEach(initLocalAngular);
beforeEach(ngMock.module('kibana/table_vis'));
beforeEach(
ngMock.inject(function ($injector) {
beforeEach(() => {
setUiSettings(core.uiSettings);
setFormatService(getFieldFormatsRegistry(core));
initAngularBootstrap();
initLocalAngular();
angular.mock.module('kibana/table_vis');
angular.mock.inject(($injector) => {
$rootScope = $injector.get('$rootScope');
$compile = $injector.get('$compile');
})
);
});
});
let $scope;
beforeEach(function () {


@ -19,6 +19,7 @@
import $ from 'jquery';
import _ from 'lodash';
import angular from 'angular';
import tableCellFilterHtml from './table_cell_filter.html';
export function KbnRows($compile) {
@ -65,7 +66,9 @@ export function KbnRows($compile) {
if (column.filterable && contentsIsDefined) {
$cell = createFilterableCell(contents);
$cellContent = $cell.find('[data-cell-content]');
// In jest tests 'angular' uses jqLite, whose find() method only looks elements up by tag name.
// Because of this, we get the cell content via querySelector so that the tests pass.
$cellContent = angular.element($cell[0].querySelector('[data-cell-content]'));
} else {
$cell = $cellContent = createCell();
}
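The comment refers to a known jqLite limitation: AngularJS's bundled jQuery substitute supports only tag-name selectors in find(). A small sketch of the difference (assuming `angular` is the AngularJS global in scope):

  const $cell = angular.element('<td><div data-cell-content>x</div></td>');

  // Full jQuery resolves the attribute selector; jqLite (used in jest tests)
  // limits find() to tag names, so this returns an empty set there.
  $cell.find('[data-cell-content]');

  // Going through the raw DOM node works in both environments:
  const $cellContent = angular.element($cell[0].querySelector('[data-cell-content]'));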


@ -20,11 +20,20 @@
import { buildProcessorFunction } from '../build_processor_function';
import { processors } from '../response_processors/table';
import { getLastValue } from '../../../../common/get_last_value';
import regression from 'regression';
import { first, get } from 'lodash';
import { overwrite } from '../helpers';
import { getActiveSeries } from '../helpers/get_active_series';
function trendSinceLastBucket(data) {
if (data.length < 2) {
return 0;
}
const currentBucket = data[data.length - 1];
const prevBucket = data[data.length - 2];
const trend = (currentBucket[1] - prevBucket[1]) / currentBucket[1];
return Number.isNaN(trend) ? 0 : trend;
}
export function processBucket(panel) {
return (bucket) => {
const series = getActiveSeries(panel).map((series) => {
@ -38,14 +47,12 @@ export function processBucket(panel) {
};
overwrite(bucket, series.id, { meta, timeseries });
}
const processor = buildProcessorFunction(processors, bucket, panel, series);
const result = first(processor([]));
if (!result) return null;
const data = get(result, 'data', []);
const linearRegression = regression.linear(data);
result.slope = trendSinceLastBucket(data);
result.last = getLastValue(data);
result.slope = linearRegression.equation[0];
return result;
});
return { key: bucket.key, series };
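Where the previous implementation fitted a linear regression over the whole series, `trendSinceLastBucket` only compares the last two buckets. A few worked examples with hypothetical `[timestamp, value]` pairs:

  trendSinceLastBucket([[1, 10], [2, 15]]); // (15 - 10) / 15 ≈ 0.33 → upward trend
  trendSinceLastBucket([[1, 15], [2, 10]]); // (10 - 15) / 10 = -0.5 → downward trend
  trendSinceLastBucket([[1, 0], [2, 0]]);   // 0 / 0 is NaN, coerced to 0 → flat
  trendSinceLastBucket([[1, 10]]);          // fewer than two buckets → 0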


@ -0,0 +1,159 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { processBucket } from './process_bucket';
function createValueObject(key, value, seriesId) {
return { key_as_string: `${key}`, doc_count: value, key, [seriesId]: { value } };
}
function createBucketsObjects(size, sort, seriesId) {
const values = Array(size)
.fill(1)
.map((_, i) => i + 1);
if (sort === 'flat') {
return values.map((_, i) => createValueObject(i, 1, seriesId));
}
if (sort === 'desc') {
return values.reverse().map((v, i) => createValueObject(i, v, seriesId));
}
return values.map((v, i) => createValueObject(i, v, seriesId));
}
function createPanel(series) {
return {
type: 'table',
time_field: '',
series: series.map((seriesId) => ({
id: seriesId,
metrics: [{ id: seriesId, type: 'count' }],
trend_arrows: 1,
})),
};
}
function createBuckets(series) {
return [
{ key: 'A', trend: 'asc', size: 10 },
{ key: 'B', trend: 'desc', size: 10 },
{ key: 'C', trend: 'flat', size: 10 },
{ key: 'D', trend: 'asc', size: 1, expectedTrend: 'flat' },
].map(({ key, trend, size, expectedTrend }) => {
const baseObj = {
key,
expectedTrend: expectedTrend || trend,
};
for (const seriesId of series) {
baseObj[seriesId] = {
meta: {
timeField: 'timestamp',
seriesId: seriesId,
},
buckets: createBucketsObjects(size, trend, seriesId),
};
}
return baseObj;
});
}
function trendChecker(trend, slope) {
switch (trend) {
case 'asc':
return slope > 0;
case 'desc':
return slope <= 0;
case 'flat':
return slope === 0;
default:
throw Error(`Slope value ${slope} not valid for trend "${trend}"`);
}
}
describe('processBucket(panel)', () => {
describe('single metric panel', () => {
let panel;
const SERIES_ID = 'series-id';
beforeEach(() => {
panel = createPanel([SERIES_ID]);
});
test('return the correct trend direction', () => {
const bucketProcessor = processBucket(panel);
const buckets = createBuckets([SERIES_ID]);
for (const bucket of buckets) {
const result = bucketProcessor(bucket);
expect(result.key).toEqual(bucket.key);
expect(trendChecker(bucket.expectedTrend, result.series[0].slope)).toBeTruthy();
}
});
test('properly handle 0 values for trend', () => {
const bucketProcessor = processBucket(panel);
const bucketforNaNResult = {
key: 'NaNScenario',
expectedTrend: 'flat',
[SERIES_ID]: {
meta: {
timeField: 'timestamp',
seriesId: SERIES_ID,
},
buckets: [
// this is a flat case, but 0/0 does not produce a valid number (NaN)
createValueObject(0, 0, SERIES_ID),
createValueObject(1, 0, SERIES_ID),
],
},
};
const result = bucketProcessor(bucketforNaNResult);
expect(result.key).toEqual(bucketforNaNResult.key);
expect(trendChecker(bucketforNaNResult.expectedTrend, result.series[0].slope)).toEqual(true);
});
test('has the side effect of creating the timeseries property if missing on the bucket', () => {
const bucketProcessor = processBucket(panel);
const buckets = createBuckets([SERIES_ID]);
for (const bucket of buckets) {
bucketProcessor(bucket);
expect(bucket[SERIES_ID].buckets).toBeUndefined();
expect(bucket[SERIES_ID].timeseries).toBeDefined();
}
});
});
describe('multiple metrics panel', () => {
let panel;
const SERIES = ['series-id-1', 'series-id-2'];
beforeEach(() => {
panel = createPanel(SERIES);
});
test('return the correct trend direction', () => {
const bucketProcessor = processBucket(panel);
const buckets = createBuckets(SERIES);
for (const bucket of buckets) {
const result = bucketProcessor(bucket);
expect(result.key).toEqual(bucket.key);
expect(trendChecker(bucket.expectedTrend, result.series[0].slope)).toBeTruthy();
expect(trendChecker(bucket.expectedTrend, result.series[1].slope)).toBeTruthy();
}
});
});
});


@ -254,6 +254,19 @@ export default function ({ getService, getPageObjects }) {
});
});
describe('invalid time range in URL', function () {
it('should get the default timerange', async function () {
const prevTime = await PageObjects.timePicker.getTimeConfig();
await PageObjects.common.navigateToUrl('discover', '#/?_g=(time:(from:now-15m,to:null))', {
useActualUrl: true,
});
await PageObjects.header.awaitKibanaChrome();
const time = await PageObjects.timePicker.getTimeConfig();
expect(time.start).to.be(prevTime.start);
expect(time.end).to.be(prevTime.end);
});
});
describe('empty query', function () {
it('should update the histogram timerange when the query is resubmitted', async function () {
await kibanaServer.uiSettings.update({
@ -268,17 +281,6 @@ export default function ({ getService, getPageObjects }) {
});
});
describe('invalid time range in URL', function () {
it('should display an "Invalid time range" toast', async function () {
await PageObjects.common.navigateToUrl('discover', '#/?_g=(time:(from:now-15m,to:null))', {
useActualUrl: true,
});
await PageObjects.header.awaitKibanaChrome();
const toastMessage = await PageObjects.common.closeToast();
expect(toastMessage).to.be('Invalid time range');
});
});
describe('managing fields', function () {
it('should add a field, sort by it, remove it and also sorting by it', async function () {
await PageObjects.timePicker.setDefaultAbsoluteRangeViaUiSettings();


@ -29,7 +29,7 @@ export default function ({ getService, getPageObjects }) {
const retry = getService('retry');
// Flaky: https://github.com/elastic/kibana/issues/71216
describe.skip('doc link in discover', function contextSize() {
describe('doc link in discover', function contextSize() {
beforeEach(async function () {
log.debug('load kibana index with default index pattern');
await esArchiver.loadIfNeeded('discover');
@ -63,20 +63,28 @@ export default function ({ getService, getPageObjects }) {
await filterBar.addFilter('agent', 'is', 'Missing/Fields');
await PageObjects.discover.waitUntilSearchingHasFinished();
// navigate to the doc view
await docTable.clickRowToggle({ rowIndex: 0 });
await retry.try(async () => {
// navigate to the doc view
await docTable.clickRowToggle({ rowIndex: 0 });
const details = await docTable.getDetailsRow();
await docTable.addInclusiveFilter(details, 'referer');
await PageObjects.discover.waitUntilSearchingHasFinished();
const details = await docTable.getDetailsRow();
await docTable.addInclusiveFilter(details, 'referer');
await PageObjects.discover.waitUntilSearchingHasFinished();
const hasInclusiveFilter = await filterBar.hasFilter('referer', 'exists', true, false, true);
expect(hasInclusiveFilter).to.be(true);
const hasInclusiveFilter = await filterBar.hasFilter(
'referer',
'exists',
true,
false,
true
);
expect(hasInclusiveFilter).to.be(true);
await docTable.removeInclusiveFilter(details, 'referer');
await PageObjects.discover.waitUntilSearchingHasFinished();
const hasExcludeFilter = await filterBar.hasFilter('referer', 'exists', true, false, false);
expect(hasExcludeFilter).to.be(true);
await docTable.removeInclusiveFilter(details, 'referer');
await PageObjects.discover.waitUntilSearchingHasFinished();
const hasExcludeFilter = await filterBar.hasFilter('referer', 'exists', true, false, false);
expect(hasExcludeFilter).to.be(true);
});
});
});
}


@ -179,9 +179,12 @@ export function ListingTableProvider({ getService, getPageObjects }: FtrProvider
* @param promptBtnTestSubj testSubj locator for Prompt button
*/
public async clickNewButton(promptBtnTestSubj: string): Promise<void> {
await retry.try(async () => {
await retry.tryForTime(20000, async () => {
// The newItemButton is only visible when the listing table is displayed with items in it.
if (await testSubjects.exists('newItemButton')) {
const isNewItemButtonPresent = await testSubjects.exists('newItemButton', {
timeout: 5000,
});
if (isNewItemButtonPresent) {
await testSubjects.click('newItemButton');
} else {
// no items exist, click createPromptButton to create new dashboard/visualization

x-pack/.gitignore

@ -6,6 +6,7 @@
/test/page_load_metrics/screenshots
/test/functional/apps/reporting/reports/session
/test/reporting/configs/failure_debug/
/plugins/reporting/.chromium/
/legacy/plugins/reporting/.chromium/
/legacy/plugins/reporting/.phantom/
/plugins/reporting/chromium/


@ -4,5 +4,16 @@
* you may not use this file except in compliance with the Elastic License.
*/
import { i18n } from '@kbn/i18n';
export const ENVIRONMENT_ALL = 'ENVIRONMENT_ALL';
export const ENVIRONMENT_NOT_DEFINED = 'ENVIRONMENT_NOT_DEFINED';
export function getEnvironmentLabel(environment: string) {
if (environment === ENVIRONMENT_NOT_DEFINED) {
return i18n.translate('xpack.apm.filter.environment.notDefinedLabel', {
defaultMessage: 'Not defined',
});
}
return environment;
}


@ -20,6 +20,7 @@ import { EuiTabLink } from '../../shared/EuiTabLink';
import { ServiceMapLink } from '../../shared/Links/apm/ServiceMapLink';
import { ServiceOverviewLink } from '../../shared/Links/apm/ServiceOverviewLink';
import { SettingsLink } from '../../shared/Links/apm/SettingsLink';
import { AnomalyDetectionSetupLink } from '../../shared/Links/apm/AnomalyDetectionSetupLink';
import { TraceOverviewLink } from '../../shared/Links/apm/TraceOverviewLink';
import { SetupInstructionsLink } from '../../shared/Links/SetupInstructionsLink';
import { ServiceMap } from '../ServiceMap';
@ -118,6 +119,9 @@ export function Home({ tab }: Props) {
</EuiButtonEmpty>
</SettingsLink>
</EuiFlexItem>
<EuiFlexItem grow={false}>
<AnomalyDetectionSetupLink />
</EuiFlexItem>
<EuiFlexItem grow={false}>
<SetupInstructionsLink />
</EuiFlexItem>


@ -22,7 +22,7 @@ import { i18n } from '@kbn/i18n';
import { useFetcher, FETCH_STATUS } from '../../../../hooks/useFetcher';
import { useApmPluginContext } from '../../../../hooks/useApmPluginContext';
import { createJobs } from './create_jobs';
import { ENVIRONMENT_NOT_DEFINED } from '../../../../../common/environment_filter_values';
import { getEnvironmentLabel } from '../../../../../common/environment_filter_values';
interface Props {
currentEnvironments: string[];
@ -45,11 +45,13 @@ export const AddEnvironments = ({
);
const environmentOptions = data.map((env) => ({
label: env === ENVIRONMENT_NOT_DEFINED ? NOT_DEFINED_OPTION_LABEL : env,
label: getEnvironmentLabel(env),
value: env,
disabled: currentEnvironments.includes(env),
}));
const [isSaving, setIsSaving] = useState(false);
const [selectedOptions, setSelected] = useState<
Array<EuiComboBoxOptionOption<string>>
>([]);
@ -127,9 +129,12 @@ export const AddEnvironments = ({
</EuiFlexItem>
<EuiFlexItem grow={false}>
<EuiButton
isLoading={isSaving}
isDisabled={isSaving || selectedOptions.length === 0}
fill
disabled={selectedOptions.length === 0}
onClick={async () => {
setIsSaving(true);
const selectedEnvironments = selectedOptions.map(
({ value }) => value as string
);
@ -140,6 +145,7 @@ export const AddEnvironments = ({
if (success) {
onCreateJobSuccess();
}
setIsSaving(false);
}}
>
{i18n.translate(
@ -155,10 +161,3 @@ export const AddEnvironments = ({
</EuiPanel>
);
};
const NOT_DEFINED_OPTION_LABEL = i18n.translate(
'xpack.apm.filter.environment.notDefinedLabel',
{
defaultMessage: 'Not defined',
}
);


@ -15,7 +15,11 @@ import { LicensePrompt } from '../../../shared/LicensePrompt';
import { useLicense } from '../../../../hooks/useLicense';
import { APIReturnType } from '../../../../services/rest/createCallApmApi';
const DEFAULT_VALUE: APIReturnType<'/api/apm/settings/anomaly-detection'> = {
export type AnomalyDetectionApiResponse = APIReturnType<
'/api/apm/settings/anomaly-detection'
>;
const DEFAULT_VALUE: AnomalyDetectionApiResponse = {
jobs: [],
hasLegacyJobs: false,
};
@ -80,7 +84,7 @@ export const AnomalyDetection = () => {
) : (
<JobsList
status={status}
anomalyDetectionJobsByEnv={data.jobs}
jobs={data.jobs}
hasLegacyJobs={data.hasLegacyJobs}
onAddEnvironments={() => {
setViewAddEnvironments(true);


@ -19,27 +19,22 @@ import { FormattedMessage } from '@kbn/i18n/react';
import { FETCH_STATUS } from '../../../../hooks/useFetcher';
import { ITableColumn, ManagedTable } from '../../../shared/ManagedTable';
import { LoadingStatePrompt } from '../../../shared/LoadingStatePrompt';
import { AnomalyDetectionJobByEnv } from '../../../../../typings/anomaly_detection';
import { MLJobLink } from '../../../shared/Links/MachineLearningLinks/MLJobLink';
import { MLLink } from '../../../shared/Links/MachineLearningLinks/MLLink';
import { ENVIRONMENT_NOT_DEFINED } from '../../../../../common/environment_filter_values';
import { getEnvironmentLabel } from '../../../../../common/environment_filter_values';
import { LegacyJobsCallout } from './legacy_jobs_callout';
import { AnomalyDetectionApiResponse } from './index';
const columns: Array<ITableColumn<AnomalyDetectionJobByEnv>> = [
type Jobs = AnomalyDetectionApiResponse['jobs'];
const columns: Array<ITableColumn<Jobs[0]>> = [
{
field: 'environment',
name: i18n.translate(
'xpack.apm.settings.anomalyDetection.jobList.environmentColumnLabel',
{ defaultMessage: 'Environment' }
),
render: (environment: string) => {
if (environment === ENVIRONMENT_NOT_DEFINED) {
return i18n.translate('xpack.apm.filter.environment.notDefinedLabel', {
defaultMessage: 'Not defined',
});
}
return environment;
},
render: getEnvironmentLabel,
},
{
field: 'job_id',
@ -64,13 +59,13 @@ const columns: Array<ITableColumn<AnomalyDetectionJobByEnv>> = [
interface Props {
status: FETCH_STATUS;
onAddEnvironments: () => void;
anomalyDetectionJobsByEnv: AnomalyDetectionJobByEnv[];
jobs: Jobs;
hasLegacyJobs: boolean;
}
export const JobsList = ({
status,
onAddEnvironments,
anomalyDetectionJobsByEnv,
jobs,
hasLegacyJobs,
}: Props) => {
const isLoading =
@ -135,7 +130,7 @@ export const JobsList = ({
)
}
columns={columns}
items={isLoading || hasFetchFailure ? [] : anomalyDetectionJobsByEnv}
items={jobs}
/>
<EuiSpacer size="l" />


@ -89,7 +89,6 @@ export function TransactionDetails() {
<EuiSpacer size="s" />
<TransactionCharts
hasMLJob={false}
charts={transactionChartsData}
urlParams={urlParams}
location={location}


@ -130,8 +130,6 @@ export function TransactionOverview() {
<EuiSpacer size="s" />
<TransactionCharts
// TODO [APM ML] set hasMLJob prop when ML integration is reintroduced:
hasMLJob={false}
charts={transactionCharts}
location={location}
urlParams={urlParams}


@ -0,0 +1,63 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import React from 'react';
import { EuiButtonEmpty, EuiToolTip, EuiIcon } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { APMLink } from './APMLink';
import { getEnvironmentLabel } from '../../../../../common/environment_filter_values';
import { useUrlParams } from '../../../../hooks/useUrlParams';
import { useFetcher, FETCH_STATUS } from '../../../../hooks/useFetcher';
export function AnomalyDetectionSetupLink() {
const { uiFilters } = useUrlParams();
const environment = uiFilters.environment;
const { data = { jobs: [], hasLegacyJobs: false }, status } = useFetcher(
(callApmApi) =>
callApmApi({ pathname: `/api/apm/settings/anomaly-detection` }),
[],
{ preservePreviousData: false }
);
const isFetchSuccess = status === FETCH_STATUS.SUCCESS;
// Show alert if there are no jobs OR if no job matches the current environment
const showAlert =
isFetchSuccess && !data.jobs.some((job) => environment === job.environment);
return (
<APMLink path="/settings/anomaly-detection">
<EuiButtonEmpty size="s" color="primary" iconType="inspect">
{ANOMALY_DETECTION_LINK_LABEL}
</EuiButtonEmpty>
{showAlert && (
<EuiToolTip position="bottom" content={getTooltipText(environment)}>
<EuiIcon type="alert" color="danger" />
</EuiToolTip>
)}
</APMLink>
);
}
function getTooltipText(environment?: string) {
if (!environment) {
return i18n.translate('xpack.apm.anomalyDetectionSetup.notEnabledText', {
defaultMessage: `Anomaly detection is not yet enabled. Click to continue setup.`,
});
}
return i18n.translate(
'xpack.apm.anomalyDetectionSetup.notEnabledForEnvironmentText',
{
defaultMessage: `Anomaly detection is not yet enabled for the "{currentEnvironment}" environment. Click to continue setup.`,
values: { currentEnvironment: getEnvironmentLabel(environment) },
}
);
}
const ANOMALY_DETECTION_LINK_LABEL = i18n.translate(
'xpack.apm.anomalyDetectionSetup.linkLabel',
{ defaultMessage: `Anomaly detection` }
);
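
Reviewer note: the `showAlert` condition collapses two cases into one expression — it is true when the fetch succeeded and either no jobs exist at all or none matches the selected environment (`Array.prototype.some` returns false on an empty array). A standalone illustration of that predicate, using a deliberately minimal job shape:

// Minimal job shape, assumed for illustration only.
type ApmMlJob = { environment: string };

function shouldShowAlert(jobs: ApmMlJob[], environment?: string): boolean {
  // True when no job matches the selected environment — including
  // the empty-list case, since some() on [] is always false.
  return !jobs.some((job) => environment === job.environment);
}

shouldShowAlert([], 'production'); // true — nothing configured yet
shouldShowAlert([{ environment: 'production' }], 'production'); // false
shouldShowAlert([{ environment: 'production' }], 'staging'); // true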


@@ -42,7 +42,6 @@ import {
} from '../../../../../common/transaction_types';
interface TransactionChartProps {
-hasMLJob: boolean;
charts: ITransactionChartData;
location: Location;
urlParams: IUrlParams;
@@ -96,18 +95,17 @@ export class TransactionCharts extends Component<TransactionChartProps> {
};
public renderMLHeader(hasValidMlLicense: boolean | undefined) {
-const { hasMLJob } = this.props;
-if (!hasValidMlLicense || !hasMLJob) {
+const { mlJobId } = this.props.charts;
+if (!hasValidMlLicense || !mlJobId) {
return null;
}
-const { serviceName, kuery } = this.props.urlParams;
+const { serviceName, kuery, transactionType } = this.props.urlParams;
if (!serviceName) {
return null;
}
-const linkedJobId = ''; // TODO [APM ML] link to ML job id for the selected environment
const hasKuery = !isEmpty(kuery);
const icon = hasKuery ? (
<EuiIconTip
@@ -140,7 +138,13 @@ export class TransactionCharts extends Component<TransactionChartProps> {
}
)}{' '}
</span>
-<MLJobLink jobId={linkedJobId}>View Job</MLJobLink>
+<MLJobLink
+jobId={mlJobId}
+serviceName={serviceName}
+transactionType={transactionType}
+>
+View Job
+</MLJobLink>
</ShiftedEuiText>
</EuiFlexItem>
);


@@ -33,6 +33,7 @@ export interface ITpmBucket {
export interface ITransactionChartData {
tpmSeries: ITpmBucket[];
responseTimeSeries: TimeSeries[];
+mlJobId: string | undefined;
}
const INITIAL_DATA = {
@@ -62,6 +63,7 @@ export function getTransactionCharts(
return {
tpmSeries,
responseTimeSeries,
+mlJobId: anomalyTimeseries?.jobId,
};
}
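
Reviewer note: across these chart files, the standalone `hasMLJob` boolean is replaced by carrying `mlJobId` inside the chart data itself, so a single `undefined` check both gates the ML header and supplies the id for `MLJobLink`. A condensed sketch of the resulting flow, with types simplified for illustration:

// Simplified types — the real chart data carries the series as well.
interface AnomalyTimeseries { jobId: string }
interface ChartData { mlJobId: string | undefined }

function getChartData(anomalyTimeseries?: AnomalyTimeseries): ChartData {
  // Optional chaining leaves mlJobId undefined when there is no anomaly
  // data, which is exactly the case where the ML header should not render.
  return { mlJobId: anomalyTimeseries?.jobId };
}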


@@ -10,12 +10,11 @@ import { snakeCase } from 'lodash';
import { PromiseReturnType } from '../../../../observability/typings/common';
import { Setup } from '../helpers/setup_request';
import {
-SERVICE_ENVIRONMENT,
TRANSACTION_DURATION,
PROCESSOR_EVENT,
} from '../../../common/elasticsearch_fieldnames';
-import { ENVIRONMENT_NOT_DEFINED } from '../../../common/environment_filter_values';
import { APM_ML_JOB_GROUP, ML_MODULE_ID_APM_TRANSACTION } from './constants';
+import { getEnvironmentUiFilterES } from '../helpers/convert_ui_filters/get_environment_ui_filter_es';
export type CreateAnomalyDetectionJobsAPIResponse = PromiseReturnType<
typeof createAnomalyDetectionJobs
@@ -89,9 +88,7 @@ async function createAnomalyDetectionJob({
filter: [
{ term: { [PROCESSOR_EVENT]: 'transaction' } },
{ exists: { field: TRANSACTION_DURATION } },
-environment === ENVIRONMENT_NOT_DEFINED
-? ENVIRONMENT_NOT_DEFINED_FILTER
-: { term: { [SERVICE_ENVIRONMENT]: environment } },
+...getEnvironmentUiFilterES(environment),
],
},
},
@@ -109,13 +106,3 @@
],
});
}
-const ENVIRONMENT_NOT_DEFINED_FILTER = {
-bool: {
-must_not: {
-exists: {
-field: SERVICE_ENVIRONMENT,
-},
-},
-},
-};
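
Reviewer note: the inline `ENVIRONMENT_NOT_DEFINED` ternary and the `must_not`/`exists` constant are folded into the shared `getEnvironmentUiFilterES` helper, spread into the filter array. A plausible sketch of that helper, inferred from the filters it replaces — the real implementation lives in `../helpers/convert_ui_filters/get_environment_ui_filter_es`:

// Inferred sketch only. The field value and sentinel below are assumptions;
// the diff shows only the SERVICE_ENVIRONMENT and ENVIRONMENT_NOT_DEFINED
// constants, not their values. The empty-array branch is an assumption that
// explains why the call site spreads the result.
const SERVICE_ENVIRONMENT = 'service.environment';
const ENVIRONMENT_NOT_DEFINED = 'ENVIRONMENT_NOT_DEFINED';

function getEnvironmentUiFilterES(environment?: string): object[] {
  if (!environment) {
    return []; // no environment selected: contribute no filter
  }
  if (environment === ENVIRONMENT_NOT_DEFINED) {
    // "Not defined" means the document lacks the field entirely.
    return [{ bool: { must_not: { exists: { field: SERVICE_ENVIRONMENT } } } }];
  }
  return [{ term: { [SERVICE_ENVIRONMENT]: environment } }];
}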


@@ -6,7 +6,7 @@
import { Logger } from 'kibana/server';
import { Setup } from '../helpers/setup_request';
-import { getMlJobsWithAPMGroup } from './get_ml_jobs_by_group';
+import { getMlJobsWithAPMGroup } from './get_ml_jobs_with_apm_group';
export async function getAnomalyDetectionJobs(setup: Setup, logger: Logger) {
const { ml } = setup;

Some files were not shown because too many files have changed in this diff.