Merge puppeteer into 6.x (#22533)

Chris Davies 2018-08-30 11:48:10 -04:00 committed by GitHub
parent 2a190be694
commit 18ca9014e4
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
47 changed files with 651 additions and 1198 deletions

x-pack/build_chromium/.gitignore (vendored, new file)
@@ -0,0 +1 @@
*.pyc

@@ -1,12 +1,123 @@
# Chromium build

We ship our own headless build of Chromium, which is significantly smaller than the standard binaries shipped by Google. The scripts in this folder can be used to initialize the build environments and run the build on Mac, Windows, and Linux.

The official Chromium build process is poorly documented and seems to have breaking changes fairly regularly. The build prerequisites and build flags change over time, so it is likely that the scripts in this directory will be out of date by the time we have to do another Chromium build. This document is an attempt to note all of the gotchas we've come across while building, so that the next time we have to tinker here, we'll have a good starting point.

# Building on Linux
The following script will run in the elastic/ubuntu-16.04-x86_64 virtual machine. Note that the virtual machine's hard disk should be resized to at least 40GB.

# Building on Windows
Some dependencies must be installed on the build machine to build on Windows; they are specified [here](https://chromium.googlesource.com/chromium/src/+/master/docs/windows_build_instructions.md). You don't have to install the depot_tools, just Visual Studio and the Windows SDK. When building on Windows, the `--workspace` flag is likely required because of the NTFS max path length.

# Building on macOS
Xcode and the OS X 10.10 SDK are currently required, but the full requirements are specified [here](https://chromium.googlesource.com/chromium/src/+/master/docs/mac_build_instructions.md). You don't have to install the depot_tools.
## Updating the revision
If you want to build the headless_shell that mirrors a specific version of the Chrome browser, take the version number displayed in `Chrome -> About`, plug it into the "Version" input [here](https://omahaproxy.appspot.com/), and it will give you the commit that you can feed into the build script via the `--git-sha` argument. When you upgrade the version, it's highly recommended to update the default SHA in the build script as well.
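If you do this often, the omahaproxy lookup can be scripted. A minimal sketch, assuming omahaproxy's `deps.json` endpoint returns a `chromium_commit` field (verify against the service before relying on it):

```python
import json
from urllib.request import urlopen

DEPS_URL = 'https://omahaproxy.appspot.com/deps.json?version='

def lookup_chromium_commit(version, fetch=None):
    """Map a Chrome version (e.g. 68.0.3440.106) to the Chromium git
    commit suitable for build.py's --git-sha argument."""
    if fetch is None:
        fetch = lambda url: urlopen(url).read().decode('utf-8')
    deps = json.loads(fetch(DEPS_URL + version))
    # 'chromium_commit' is an assumed field name; check deps.json yourself
    return deps['chromium_commit']
```

The `fetch` parameter is just there so the lookup can be exercised without network access.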
## Build args
Chromium is built via a build tool called "ninja". The build can be configured by specifying build flags either in an "args.gn" file or via commandline args. We have an "args.gn" file per platform:
- mac: darwin/args.gn
- linux: linux/args.gn
- windows: windows/args.gn
The various build flags are not well documented. Some are documented [here](https://www.chromium.org/developers/gn-build-configuration); others, such as `enable_basic_printing = false`, I only found by poking through third-party build scripts.
As of this writing, there is an officially supported headless Chromium build args file for Linux: `build/args/headless.gn`. This does not work on Windows or Mac, so we have taken that as our starting point, and modified it until the Windows / Mac builds succeeded.
## VMs
I ran Linux and Windows VMs in GCP with the following specs:
- 8 core vCPU
- 30GB RAM
- 128GB hard drive
- Ubuntu 18.04 LTS (not minimal)
- Windows Server 2016 (full, with desktop)
The more cores the better, as the build makes effective use of each. For Linux, Ubuntu is the only officially supported build target.
- Linux:
- SSH in using [gcloud](https://cloud.google.com/sdk/)
- Get the ssh command in the [GCP console](https://console.cloud.google.com/) -> VM instances -> your-vm-name -> SSH -> gcloud
- The in-browser SSH UI is sluggish, so use the command-line tool
- Windows:
- Install Microsoft's Remote Desktop tools
- Get the RDP file in the [GCP console](https://console.cloud.google.com/) -> VM instances -> your-vm-name -> RDP -> Download the RDP file
- Edit it in Microsoft Remote Desktop:
- Display -> Resolution (1280 x 960 or something reasonable)
- Local Resources -> Folders, then select the folder(s) you want to share; share at least the `build_chromium` folder
- Save
## Initializing each VM / environment
You only need to initialize each environment once.
Create the build folder:
- Mac / Linux: `mkdir -p ~/chromium`
- Windows: `mkdir c:\chromium`
Copy the `x-pack/build_chromium` folder to each. Replace `you@your-machine` with the correct username and VM name:
- Mac: `cp -r ~/dev/elastic/kibana/x-pack/build_chromium ~/chromium/build_chromium`
- Linux: `gcloud compute scp --recurse ~/dev/elastic/kibana/x-pack/build_chromium you@your-machine:~/chromium --zone=us-east1-b`
- Windows: Copy the `build_chromium` folder via the RDP GUI into `c:\chromium\build_chromium`
There is an init script for each platform. This downloads and installs the necessary prerequisites, sets environment variables, etc.
- Mac: `~/chromium/build_chromium/darwin/init.sh`
- Linux: `~/chromium/build_chromium/linux/init.sh`
- Windows: `c:\chromium\build_chromium\windows\init.bat`
On Windows, at least, you will need to do a number of extra steps:
- Follow the prompts in the Visual Studio installation process, click "Install" and wait a while
- Once it's installed, open Control Panel and turn on Debugging Tools for Windows:
- Control Panel → Programs → Programs and Features → Select the “Windows Software Development Kit” → Change → Change → Check “Debugging Tools For Windows” → Change
- Press enter in the terminal to continue running the init
## Building
Find the sha of the Chromium commit you wish to build. Most likely, you want to build the Chromium revision that is tied to the version of puppeteer that we're using.
Find the Chromium revision (modify the following command to be wherever you have the kibana source installed):
- `cat ~/dev/elastic/kibana/x-pack/node_modules/puppeteer-core/package.json | grep chromium_revision`
- Take the revision number from that, and tack it to the end of this URL: https://crrev.com
- (For example: https://crrev.com/575458)
- Grab the SHA from there
- (For example, rev 575458 has sha 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479)
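The steps above can be sketched in Python as well; this assumes puppeteer-core pins the revision under a `puppeteer.chromium_revision` key in its package.json (check the version you actually have installed):

```python
import json
import os

def chromium_revision_from_puppeteer(kibana_root):
    """Read puppeteer-core's pinned Chromium revision and build the
    crrev.com URL where the corresponding commit SHA can be found."""
    pkg_path = os.path.join(kibana_root, 'x-pack', 'node_modules',
                            'puppeteer-core', 'package.json')
    with open(pkg_path) as f:
        pkg = json.load(f)
    # The key layout is an assumption; grep package.json if it differs
    revision = pkg['puppeteer']['chromium_revision']
    return revision, 'https://crrev.com/' + revision
```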
Note: On Linux, you should run the build command in tmux so that the build keeps going if your SSH session disconnects. Just type `tmux` into your terminal to start a session. If you get disconnected, you can reattach like so:
- SSH into the server
- Run `tmux list-sessions`
- Run `tmux attach -t {session_id}`, replacing {session_id} with the value from the list-sessions output
To run the build, replace the sha in the following commands with the sha that you wish to build:
- Mac: `python ~/chromium/build_chromium/build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
- Linux: `python ~/chromium/build_chromium/build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
- Windows: `python c:\chromium\build_chromium\build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
## Artifacts
After the build completes, there will be a .zip file and a .md5 file in `~/chromium/chromium/src/out/headless`. These are named like so: `chromium-{first_7_of_SHA}-{platform}`, for example: `chromium-4747cc2-linux`.
The zip files need to be deployed to S3. For testing, I drop them into `headless-shell-dev`, but for production they need to be in `headless-shell`. The `x-pack/plugins/reporting/server/browsers/chromium/paths.js` file then needs to be updated with the correct `archiveChecksum`, `archiveFilename`, and `baseUrl`.
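For the `archiveChecksum` value, you can recompute the md5 of the zip locally; a sketch mirroring `md5_file` in `build_util.py`:

```python
import hashlib

def archive_checksum(zip_path):
    """Hex md5 digest of a build artifact, streamed in chunks so large
    zips don't have to fit in memory."""
    md5 = hashlib.md5()
    with open(zip_path, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 16), b''):
            md5.update(chunk)
    return md5.hexdigest()
```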
*If you're building in the cloud, don't forget to turn off your VM after retrieving the build artifacts!*
## Diagnosing runtime failures
After getting the build to pass, the resulting binaries often failed to run or would hang.
You can run the headless browser manually to see what errors it is generating (replace `c:\dev\data` with the path to a dummy folder you've created on your system):
`headless_shell.exe --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --user-data-dir=c:\dev\data --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=3333 about:blank`
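If the process starts but you're not sure it's healthy, you can query the DevTools HTTP endpoint exposed on the `--remote-debugging-port`. A small sketch; `/json/version` is the standard DevTools endpoint, but double-check the field names against your Chromium revision:

```python
import json
from urllib.request import urlopen

def parse_version_info(payload):
    """Pull the browser name and debugger URL out of a /json/version response."""
    info = json.loads(payload)
    return info.get('Browser'), info.get('webSocketDebuggerUrl')

def check_headless(port=3333):
    """Returns (browser, ws_url) if headless_shell is up and answering
    on its remote debugging port."""
    with urlopen('http://localhost:%d/json/version' % port) as resp:
        return parse_version_info(resp.read().decode('utf-8'))
```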
## Resources
The following links provide helpful context about how the Chromium build works, and its prerequisites:
- https://www.chromium.org/developers/how-tos/get-the-code/working-with-release-branches
- https://chromium.googlesource.com/chromium/src/+/master/docs/windows_build_instructions.md
- https://chromium.googlesource.com/chromium/src/+/master/docs/mac_build_instructions.md
- https://chromium.googlesource.com/chromium/src/+/master/docs/linux_build_instructions.md
- Some build-flag descriptions: https://www.chromium.org/developers/gn-build-configuration
- The serverless Chromium project was indispensable: https://github.com/adieuadieu/serverless-chrome/blob/b29445aa5a96d031be2edd5d1fc8651683bf262c/packages/lambda/builds/chromium/build/build.sh

@@ -1,148 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import fs from 'fs';
import path from 'path';
import { spawn } from 'child_process';
import mkdirp from 'mkdirp';
import { promisify } from 'bluebird';
import { platformArchives } from './platform_archives';
import { createZip } from './create_zip';
const DEPOT_TOOLS_GIT_URL = 'https://chromium.googlesource.com/chromium/tools/depot_tools.git';
class Paths {
constructor(workspace) {
this.workspace = workspace;
this.depotTools = path.join(this.workspace, 'depot_tools');
this.chromium = path.join(this.workspace, 'chromium');
this.chromiumSrc = path.join(this.chromium, 'src');
this.gclientConfig = path.join(this.chromium, '.gclient');
this.headlessOutput = path.join(this.chromiumSrc, 'out', 'Headless');
this.headlessOutputArgs = path.join(this.headlessOutput, 'args.gn');
this.target = path.join(this.workspace, 'target');
}
getHeadlessArgs = (platform) => path.join(__dirname, `${platform}.args.gn`);
}
const fsp = {
mkdir: promisify(fs.mkdir, fs),
mkdirp: promisify(mkdirp),
readFile: promisify(fs.readFile, fs),
writeFile: promisify(fs.writeFile, fs),
};
const log = (msg) => {
console.log(msg);
};
const fromExitCode = (process) => {
return new Promise((resolve, reject) => {
process.on('exit', code => code === 0 ? resolve() : reject(new Error(`Child process exited with code ${code}`)));
});
};
const printOutput = (proc) => {
proc.stdout.on('data', buffer => console.log(buffer.toString()));
proc.stderr.on('data', buffer => console.error(buffer.toString()));
return proc;
};
const pathEnvName = process.platform === 'win32' ? 'Path' : 'PATH';
export async function build(command) {
const paths = new Paths(command.workspace);
if (!fs.existsSync(paths.workspace)) {
log(`creating workspace`);
await fsp.mkdir(paths.workspace);
}
log('setting up depot_tools');
if (!fs.existsSync(paths.depotTools)) {
log(`cloning depot_tools`);
await fromExitCode(spawn('git', ['clone', DEPOT_TOOLS_GIT_URL, paths.depotTools], { cwd: paths.workspace, shell: true }));
} else {
log(`depot_tools already cloned, updating`);
await fromExitCode(spawn('git', ['pull'], { cwd: paths.depotTools, env: process.env, shell: true }));
}
const depotToolsPathEnv = {
...process.env,
DEPOT_TOOLS_WIN_TOOLCHAIN: 0,
[pathEnvName]: `${process.env[pathEnvName]}${path.delimiter}${paths.depotTools}`,
};
if (!fs.existsSync(paths.chromium)) {
log(`creating chromium directory`);
await fsp.mkdir(paths.chromium);
}
if (!fs.existsSync(paths.gclientConfig)) {
log(`creating .gclient`);
const solution = `{
'url': 'https://chromium.googlesource.com/chromium/src.git',
'managed': False,
'name': 'src',
'deps_file': '.DEPS.git',
'custom_deps': {},
}`.replace(/\n/g, '');
await fromExitCode(printOutput(spawn('gclient', [
'config',
`--spec "solutions=[${solution.replace(/\n/g, '')}]"`
], { cwd: paths.chromium, env: depotToolsPathEnv, shell: true })));
}
log(`syncing src`);
await fromExitCode(printOutput(spawn('gclient', [
'sync',
'-r',
`src@${command.gitSha}`,
'--no-history',
'--nohooks'
], { cwd: paths.chromium, env: depotToolsPathEnv, shell: true })));
if (process.platform === 'linux') {
log(`installing build dependencies`);
await fromExitCode(printOutput(spawn('build/install-build-deps.sh', [
'--no-prompt',
'--no-nacl'
], { cwd: paths.chromiumSrc, env: depotToolsPathEnv, shell: true })));
}
log(`running hooks`);
await fromExitCode(printOutput(spawn('gclient', [
'runhooks'
], { cwd: paths.chromium, env: depotToolsPathEnv, shell: true })));
log(`generating ninja`);
await fsp.mkdirp(paths.headlessOutput);
const argsContent = await fsp.readFile(paths.getHeadlessArgs(process.platform));
await fsp.writeFile(paths.headlessOutputArgs, argsContent);
await fromExitCode(printOutput(spawn('gn', [
'gen',
'out/Headless'
], { cwd: paths.chromiumSrc, env: depotToolsPathEnv, shell: true })));
log("building");
await fromExitCode(printOutput(spawn('ninja', [
`-C ${paths.headlessOutput}`, 'headless_shell'
], { cwd: paths.chromiumSrc, env: depotToolsPathEnv, shell: true })));
if (!fs.existsSync(paths.target)) {
log(`creating target folder`);
await fsp.mkdir(paths.target);
}
const platformArchive = platformArchives[process.platform];
const zipFilename = `chromium-${command.gitSha.substr(0, 7)}-${process.platform}.zip`;
await createZip(paths.headlessOutput, platformArchive.files, platformArchive.directoryName, paths.target, zipFilename);
console.log(`Archive created at ${path.join(paths.target, zipFilename)}`);
}

@@ -0,0 +1,92 @@
import subprocess, os, sys, platform, zipfile, hashlib, shutil
from build_util import runcmd, mkdir, md5_file, script_dir, root_dir, configure_environment
# This file builds Chromium headless on Windows, Mac, and Linux.
# Verify that we have an argument, and if not print instructions
if (len(sys.argv) < 2):
print('Usage:')
print('python build.py {chromium_version}')
print('Example:')
print('python build.py 68.0.3440.106')
print('python build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479')
sys.exit(1)
# The version of Chromium we wish to build. This can be any valid git
# commit, tag, or branch, so: 68.0.3440.106 or
# 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479
source_version = sys.argv[1]
print('Building Chromium ' + source_version)
# Set the environment variables required by the build tools
print('Configuring the build environment')
configure_environment()
# Sync the codebase to the correct version, syncing master first
# to ensure that we actually have all the versions we may refer to
print('Syncing source code')
os.chdir(os.path.join(root_dir, 'chromium/src'))
runcmd('git checkout master')
runcmd('git fetch origin')
runcmd('gclient sync --with_branch_heads --with_tags --jobs 16')
runcmd('git checkout ' + source_version)
runcmd('gclient sync --with_branch_heads --with_tags --jobs 16')
runcmd('gclient runhooks')
# Copy build args/{Linux | Darwin | Windows}.gn from the root of our directory to out/headless/args.gn,
platform_build_args = os.path.join(script_dir, platform.system().lower(), 'args.gn')
print('Generating platform-specific args')
print('Copying build args: ' + platform_build_args + ' to out/headless/args.gn')
mkdir('out/headless')
shutil.copyfile(platform_build_args, 'out/headless/args.gn')
runcmd('gn gen out/headless')
# Build Chromium... this takes *forever* on underpowered VMs
print('Compiling... this will take a while')
runcmd('autoninja -C out/headless headless_shell')
# Optimize the output on Linux and Mac by stripping inessentials from the binary
if platform.system() != 'Windows':
print('Optimizing headless_shell')
shutil.move('out/headless/headless_shell', 'out/headless/headless_shell_raw')
runcmd('strip -o out/headless/headless_shell out/headless/headless_shell_raw')
# Create the zip and generate the md5 hash using filenames like:
# chromium-4747cc2-linux.zip
base_filename = 'out/headless/chromium-' + source_version[:7].strip('.') + '-' + platform.system().lower()
zip_filename = base_filename + '.zip'
md5_filename = base_filename + '.md5'
print('Creating ' + zip_filename)
archive = zipfile.ZipFile(zip_filename, mode='w', compression=zipfile.ZIP_DEFLATED)
def archive_file(name):
"""A little helper function to write individual files to the zip file"""
from_path = os.path.join('out/headless', name)
to_path = os.path.join('headless_shell-' + platform.system().lower(), name)
archive.write(from_path, to_path)
# Each platform has slightly different requirements for what dependencies
# must be bundled with the Chromium executable.
if platform.system() == 'Linux':
archive_file('headless_shell')
elif platform.system() == 'Windows':
archive_file('headless_shell.exe')
archive_file('dbghelp.dll')
archive_file('icudtl.dat')
archive_file('natives_blob.bin')
archive_file('snapshot_blob.bin')
elif platform.system() == 'Darwin':
archive_file('headless_shell')
archive_file('natives_blob.bin')
archive_file('snapshot_blob.bin')
archive_file('Helpers/crashpad_handler')
archive.close()
print('Creating ' + md5_filename)
with open (md5_filename, 'w') as f:
f.write(md5_file(zip_filename))

@@ -0,0 +1,33 @@
import os, hashlib
# This file contains various utility functions used by the init and build scripts
# Compute the root build and script directory as relative to this file
script_dir = os.path.realpath(os.path.join(__file__, '..'))
root_dir = os.path.realpath(os.path.join(script_dir, '..'))
def runcmd(cmd):
"""Executes a string command in the shell"""
print(cmd)
result = os.system(cmd)
if result != 0:
raise Exception(cmd + ' returned ' + str(result))
def mkdir(dir):
"""Makes a directory if it doesn't exist"""
if not os.path.exists(dir):
print('mkdir -p ' + dir)
return os.makedirs(dir)
def md5_file(filename):
"""Builds a hex md5 hash of the given file"""
md5 = hashlib.md5()
with open(filename, 'rb') as f:
for chunk in iter(lambda: f.read(128 * md5.block_size), b''):
md5.update(chunk)
return md5.hexdigest()
def configure_environment():
"""Configures temporary environment variables required by Chromium's build"""
depot_tools_path = os.path.join(root_dir, 'depot_tools')
os.environ['PATH'] = depot_tools_path + os.pathsep + os.environ['PATH']

@@ -1,25 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import path from 'path';
import { Command } from 'commander';
import { build } from './build';
const cmd = new Command('node scripts/build_chromium');
cmd.option('--workspace [path]', 'working directory for building chromium', path.join(__dirname, '.workspace'))
.option('--git-sha [sha]', 'chromium src git SHA to checkout', '503a3e48dffe2d5bcbacef72d33b6e1801d061a2')
.parse(process.argv);
(async function () {
try {
await build(cmd);
}
catch (err) {
console.error(err.stack);
process.exit(1);
}
}());

@@ -1,25 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import path from 'path';
import rename from 'gulp-rename';
import vfs from 'vinyl-fs';
import zip from 'gulp-zip';
// we create the zip, but we want all the files to be in a folder, the zipDirectoryName,
// so we have to rename them as such
export function createZip(dir, files, zipDirectoryName, zipPath, zipFilename) {
return new Promise(function (resolve, reject) {
vfs.src(files, { cwd: dir, base: dir })
.pipe(rename(function (filePath) {
filePath.dirname = path.join(zipDirectoryName, filePath.dirname);
}))
.pipe(zip(zipFilename))
.pipe(vfs.dest(zipPath))
.on('end', resolve)
.on('error', reject);
});
}

@@ -1,4 +1,4 @@
# This based on the //build/headless.gn, but modified to work with OSX.
# Based on //build/headless.gn
# Embed resource.pak into binary to simplify deployment.
headless_use_embedded_resources = true
@@ -7,13 +7,17 @@ headless_use_embedded_resources = true
# into binary.
icu_use_data_file = false
# Use embedded data instead of external files for headless in order
# to simplify deployment.
v8_use_external_startup_data = false
enable_nacl = false
enable_print_preview = false
enable_basic_printing = false
enable_remoting = false
use_alsa = false
use_ash = false
use_cups = false
use_dbus = false
use_gconf = false
use_gio = false
use_kerberos = false
use_libpci = false

@@ -0,0 +1,5 @@
#!/bin/bash
# Launch the cross-platform init script using a relative path
# from this script's location.
python "`dirname "$0"`/../init.py"

@@ -0,0 +1,32 @@
import os, platform
from build_util import runcmd, mkdir, md5_file, root_dir, configure_environment
# This is a cross-platform initialization script which should only be run
# once per environment, and isn't intended to be run directly. You should
# run the appropriate platform init script (e.g. Linux/init.sh) which will
# call this once the platform-specific initialization has completed.
os.chdir(root_dir)
# Configure git
runcmd('git config --global core.autocrlf false')
runcmd('git config --global core.filemode false')
runcmd('git config --global branch.autosetuprebase always')
# Grab Chromium's custom build tools, if they aren't already installed
# (On Windows, they are installed before this Python script is run)
if not os.path.isdir('depot_tools'):
runcmd('git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git')
# Put depot_tools on the path so we can properly run the fetch command
configure_environment()
# Fetch the Chromium source code
mkdir('chromium')
os.chdir('chromium')
runcmd('fetch chromium')
# Build Linux deps
if platform.system() == 'Linux':
os.chdir('src')
runcmd('build/install-build-deps.sh')

@@ -0,0 +1,13 @@
#!/bin/bash
# Initializes a Linux environment. This need only be done once per
# machine. The OS needs to be a flavor that supports apt-get, such as Ubuntu.
if ! [ -x "$(command -v python)" ]; then
echo "Installing Python"
sudo apt-get --assume-yes install python
fi
# Launch the cross-platform init script using a relative path
# from this script's location.
python "`dirname "$0"`/../init.py"

@@ -1,33 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export const platformArchives = {
darwin: {
directoryName: 'headless_shell-darwin',
files: [
'headless_shell',
'natives_blob.bin',
'snapshot_blob.bin',
'Helpers/crashpad_handler'
]
},
linux: {
directoryName: 'headless_shell-linux',
files: [
'headless_shell'
]
},
win32: {
directoryName: 'headless_shell-win32',
files: [
'dbghelp.dll',
'headless_shell.exe',
'icudtl.dat',
'natives_blob.bin',
'snapshot_blob.bin',
]
}
};

@@ -1,16 +1,19 @@
# This based on the //build/headless.gn, but modified to work with Windows.
# Based on //build/headless.gn
# Embed resource.pak into binary to simplify deployment.
headless_use_embedded_resources = true
# Use embedded data instead of external files for headless in order
# to simplify deployment.
v8_use_external_startup_data = false
enable_nacl = false
enable_print_preview = false
enable_basic_printing = false
enable_remoting = false
use_alsa = false
use_ash = false
use_cups = false
use_dbus = false
use_gconf = false
use_gio = false
use_libpci = false
use_pulseaudio = false

@@ -0,0 +1,31 @@
: This only needs to be run once per environment to set it up.
: This requires a GUI, as the VS installation is graphical.
@echo off
: Install Visual Studio (this requires user interaction, and takes quite a while)
: Most of the subsequent commands can be run in parallel with this (downloading, unzipping,
: grabbing the source, etc). This must be completed before building, though.
@echo "Installing Visual Studio"
powershell -command "& {iwr -outf c:\chromium\install_vs.exe https://download.visualstudio.microsoft.com/download/pr/f9c35424-ffad-4b44-bb8f-d4e3968e90ce/f75403c967456e32e758ef558957f345/vs_community.exe}"
install_vs.exe --add Microsoft.VisualStudio.Workload.NativeDesktop --add Microsoft.VisualStudio.Component.VC.ATLMFC --includeRecommended
: Install Chromium's custom build tools
@echo "Installing Chromium build tools"
powershell -command "& {iwr -outf %~dp0../../depot_tools.zip https://storage.googleapis.com/chrome-infra/depot_tools.zip}"
powershell -command "& {Expand-Archive %~dp0../../depot_tools.zip -DestinationPath %~dp0../../depot_tools}"
: Set the environment variables required by depot_tools
@echo "When Visual Studio is installed, you need to enable the Windows SDK in Control Panel. After that, press <enter> here to continue initialization"
pause
SETX PATH "%~dp0..\..\depot_tools;%path%"
SETX DEPOT_TOOLS_WIN_TOOLCHAIN 0
call gclient
python %~dp0../init.py

@@ -59,7 +59,8 @@
"mustache": "^2.3.0",
"mutation-observer": "^1.0.3",
"node-fetch": "^2.1.2",
"pdf-image": "1.1.0",
"pdf-image": "2.0.0",
"pdfjs-dist": "^2.0.489",
"pixelmatch": "4.0.2",
"proxyquire": "1.7.11",
"react-test-renderer": "^16.2.0",
@@ -98,7 +99,6 @@
"bluebird": "3.1.1",
"boom": "3.1.1",
"brace": "0.11.1",
"chrome-remote-interface": "0.24.2",
"classnames": "2.2.5",
"concat-stream": "1.5.1",
"d3": "3.5.6",
@@ -135,6 +135,7 @@
"polished": "^1.9.2",
"prop-types": "^15.6.0",
"puid": "1.0.5",
"puppeteer-core": "^1.7.0",
"react": "^16.3.0",
"react-clipboard.js": "^1.1.2",
"react-dom": "^16.3.0",
@@ -169,4 +170,4 @@
"engines": {
"yarn": "^1.6.0"
}
}
}

@@ -6,9 +6,7 @@
import * as Rx from 'rxjs';
import { first, tap, mergeMap } from 'rxjs/operators';
import path from 'path';
import fs from 'fs';
import moment from 'moment';
import getPort from 'get-port';
import { promisify } from 'bluebird';
import { LevelLogger } from '../../../../server/lib/level_logger';
@@ -24,8 +22,6 @@ export function screenshotsObservableFactory(server) {
const browserDriverFactory = server.plugins.reporting.browserDriverFactory;
const captureConfig = config.get('xpack.reporting.capture');
const dataDirectory = config.get('path.data');
const asyncDurationLogger = async (description, promise) => {
const start = new Date();
const result = await promise;
@@ -33,16 +29,6 @@ export function screenshotsObservableFactory(server) {
return result;
};
const startRecording = (browser) => {
if (captureConfig.record) {
if (!browser.record) {
throw new Error('Unable to record capture with current browser');
}
browser.record(path.join(dataDirectory, `recording-${moment().utc().format().replace(/:/g, '_')}`));
}
};
const openUrl = async (browser, url, headers) => {
const waitForSelector = '.application';
@@ -256,7 +242,6 @@
const screenshot$ = driver$.pipe(
tap(browser => startRecording(browser)),
tap(() => logger.debug(`opening ${url}`)),
mergeMap(
browser => openUrl(browser, url, headers),
@@ -280,20 +265,20 @@
browser => getNumberOfItems(browser, layout),
(browser, itemsCount) => ({ browser, itemsCount })
),
tap(() => logger.debug('setting viewport')),
mergeMap(
({ browser, itemsCount }) => setViewport(browser, itemsCount, layout),
({ browser, itemsCount }) => ({ browser, itemsCount }),
),
tap(({ itemsCount }) => logger.debug(`waiting for ${itemsCount} to be in the DOM`)),
mergeMap(
({ browser, itemsCount }) => waitForElementsToBeInDOM(browser, itemsCount, layout),
({ browser, itemsCount }) => ({ browser, itemsCount })
),
tap(() => logger.debug('setting viewport')),
mergeMap(
({ browser, itemsCount }) => setViewport(browser, itemsCount, layout),
({ browser }) => browser
),
tap(() => logger.debug('positioning elements')),
mergeMap(
browser => positionElements(browser, layout),
browser => browser
({ browser }) => positionElements(browser, layout),
({ browser }) => browser
),
tap(() => logger.debug('waiting for rendering to complete')),
mergeMap(

@@ -37,7 +37,7 @@ export const reporting = (kibana) => {
'plugins/reporting/controls/visualize',
'plugins/reporting/controls/dashboard',
],
hacks: [ 'plugins/reporting/hacks/job_completion_notifier'],
hacks: ['plugins/reporting/hacks/job_completion_notifier'],
home: ['plugins/reporting/register_feature'],
managementSections: ['plugins/reporting/views/management'],
injectDefaultVars(server, options) {
@@ -77,7 +77,6 @@ export const reporting = (kibana) => {
timeout: Joi.number().integer().default(120000),
}).default(),
capture: Joi.object({
record: Joi.boolean().default(false),
zoom: Joi.number().integer().default(2),
viewport: Joi.object({
width: Joi.number().integer().default(1950),

@@ -1,13 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export async function ignoreSSLErrorsBehavior(Security) {
await Security.enable();
await Security.setOverrideCertificateErrors({ override: true });
Security.certificateError(({ eventId }) => {
Security.handleCertificateError({ eventId, action: 'continue' });
});
}

@@ -4,79 +4,30 @@
* you may not use this file except in compliance with the Elastic License.
*/
import fs from 'fs';
import path from 'path';
import moment from 'moment';
import { promisify, delay } from 'bluebird';
import { transformFn } from './transform_fn';
import { ignoreSSLErrorsBehavior } from './ignore_ssl_errors';
import { screenshotStitcher, CapturePngSizeError } from './screenshot_stitcher';
export class HeadlessChromiumDriver {
  constructor(page, { maxScreenshotDimension, logger }) {
    this._page = page;
    this._maxScreenshotDimension = maxScreenshotDimension;
    this._waitForDelayMs = 100;
    this._zoom = 1;
    this._logger = logger;
  }
  async open(url, { headers, waitForSelector }) {
    this._logger.debug(`HeadlessChromiumDriver:opening url ${url}`);
    await this._page.setExtraHTTPHeaders(headers);
    await this._page.goto(url, { waitUntil: 'networkidle0' });
    this.documentNode = await this._page.evaluateHandle(() => document);
    await this._page.waitFor(waitForSelector);
  }
  async screenshot(elementPosition = null) {
    let clip;
    if (elementPosition) {
      const { boundingClientRect, scroll = { x: 0, y: 0 } } = elementPosition;
      clip = {
        x: boundingClientRect.left + scroll.x,
        y: boundingClientRect.top + scroll.y,
        height: boundingClientRect.height,
        width: boundingClientRect.width,
      };
    }
    const screenshot = await this._page.screenshot({
      encoding: 'base64',
      clip,
    });
    return screenshot;
  }
  async evaluate({ fn, args = [] }) {
    const result = await this._page.evaluate(fn, ...args);
    return result;
  }
  waitForSelector(selector) {
    return this._page.waitFor(selector);
  }
  async waitFor({ fn, args, toEqual }) {
    while (true) {
      const result = await this.evaluate({ fn, args });
      if (result === toEqual) {
        return;
      }
      await new Promise(r => setTimeout(r, this._waitForDelayMs));
    }
  }
  async setViewport({ width, height, zoom }) {
    this._logger.debug(`Setting viewport to width: ${width}, height: ${height}, zoom: ${zoom}`);
    await this._page.setViewport({
      width: Math.floor(width / zoom),
      height: Math.floor(height / zoom),
      deviceScaleFactor: zoom,
      isMobile: false,
    });
    this._zoom = zoom;
  }
}
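The driver's `waitFor` method is a plain polling loop: re-run a check function until it returns the expected value, sleeping between attempts. The same pattern in isolation (the name and the bail-out counter are illustrative additions, not part of the Kibana driver, which loops indefinitely):

```javascript
// Minimal sketch of the driver's polling loop. Unlike the driver, this
// version gives up after `maxAttempts` so a bad predicate cannot spin forever.
async function waitForValue(fn, toEqual, { delayMs = 100, maxAttempts = 50 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await fn();
    if (result === toEqual) {
      return result;
    }
    // mirrors this._waitForDelayMs: sleep between checks
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  throw new Error(`value never became ${toEqual}`);
}
```

In the driver the check function runs inside the page via `this._page.evaluate`; here it is any async or sync callable.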


@ -1,115 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
// No types found for this package. May want to investigate an alternative with types.
// @ts-ignore: implicit any for JS file
import $streamToObservable from '@samverschueren/stream-to-observable';
import { PNG } from 'pngjs';
import * as Rx from 'rxjs';
import { ObservableInput } from 'rxjs';
import { map, mergeMap, reduce, switchMap, tap, toArray } from 'rxjs/operators';
import { Logger, Screenshot, Size } from './types';
export class CapturePngSizeError extends Error {
constructor(
actualSize: { width: number; height: number },
expectedSize: { width: number; height: number }
) {
super();
this.message =
`Capture PNG size error. Please visit https://github.com/elastic/kibana/issues/19563 to report this error. ` +
`Screenshot captured of size ${actualSize.width}x${
actualSize.height
} was not of expected size ${expectedSize.width}x${expectedSize.height}`;
}
}
// if we're only given one screenshot, and it matches the given size
// we're going to skip the combination and just use it
const canUseFirstScreenshot = (
screenshots: Screenshot[],
size: { width: number; height: number }
) => {
if (screenshots.length !== 1) {
return false;
}
const firstScreenshot = screenshots[0];
return (
firstScreenshot.rectangle.width === size.width &&
firstScreenshot.rectangle.height === size.height
);
};
/**
* Combines the screenshot clips into a single screenshot of size `outputSize`.
* @param screenshots - Array of screenshots to combine
* @param outputSize - Final output size that the screenshots should match up with
* @param logger - logger for extra debug output
*/
export function $combine(
screenshots: Screenshot[],
outputSize: Size,
logger: Logger
): Rx.Observable<string> {
logger.debug(
`Combining screenshot clips into final, scaled output dimension of ${JSON.stringify(
outputSize
)}`
);
if (screenshots.length === 0) {
return Rx.throwError('Unable to combine 0 screenshots');
}
if (canUseFirstScreenshot(screenshots, outputSize)) {
return Rx.of(screenshots[0].data);
}
// Turn the screenshot data into actual PNGs
const pngs$ = Rx.from(screenshots).pipe(
mergeMap(
(screenshot: Screenshot): ObservableInput<PNG> => {
const png = new PNG();
const buffer = Buffer.from(screenshot.data, 'base64');
const parseAsObservable = Rx.bindNodeCallback(png.parse.bind(png));
return parseAsObservable(buffer);
},
(screenshot: Screenshot, png: PNG) => {
if (
png.width !== screenshot.rectangle.width ||
png.height !== screenshot.rectangle.height
) {
const error = new CapturePngSizeError(png, screenshot.rectangle);
logger.error(error.message);
throw error;
}
return { screenshot, png };
}
)
);
const output$ = pngs$.pipe(
reduce((output: PNG, input: { screenshot: Screenshot; png: PNG }) => {
const { png, screenshot } = input;
// Spitting out a lot of output to help debug https://github.com/elastic/kibana/issues/19563. Once that is
// fixed, this should probably get pared down.
logger.debug(`Output dimensions is ${JSON.stringify(outputSize)}`);
logger.debug(`Input png w: ${png.width} and h: ${png.height}`);
logger.debug(`Creating output png with ${JSON.stringify(screenshot.rectangle)}`);
const { rectangle } = screenshot;
png.bitblt(output, 0, 0, rectangle.width, rectangle.height, rectangle.x, rectangle.y);
return output;
}, new PNG({ width: outputSize.width, height: outputSize.height }))
);
return output$.pipe(
tap(png => png.pack()),
switchMap<PNG, Buffer>($streamToObservable),
toArray(),
map((chunks: Buffer[]) => Buffer.concat(chunks).toString('base64'))
);
}


@ -1,107 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { toArray } from 'rxjs/operators';
import { $getClips } from './get_clips';
import { Rectangle } from './types';
function getClipsTest(
description: string,
input: { rectangle: Rectangle; max: number },
expectedClips: { clips: Rectangle[] }
) {
test(description, async () => {
const clips = await $getClips(input.rectangle, input.max)
.pipe(toArray())
.toPromise();
expect(clips.length).toBe(expectedClips.clips.length);
for (let i = 0; i < clips.length; ++i) {
expect(clips[i]).toEqual(expectedClips.clips[i]);
}
});
}
getClipsTest(
`creates one rect if 0, 0`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 0, width: 0 },
},
{
clips: [{ x: 0, y: 0, height: 0, width: 0 }],
}
);
getClipsTest(
`creates one rect if smaller than max`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 99, width: 99 },
},
{
clips: [{ x: 0, y: 0, height: 99, width: 99 }],
}
);
getClipsTest(
`create one rect if equal to max`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 100, width: 100 },
},
{
clips: [{ x: 0, y: 0, height: 100, width: 100 }],
}
);
getClipsTest(
`creates two rects if width is 1.5 * max`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 100, width: 150 },
},
{
clips: [{ x: 0, y: 0, height: 100, width: 100 }, { x: 100, y: 0, height: 100, width: 50 }],
}
);
getClipsTest(
`creates two rects if height is 1.5 * max`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 150, width: 100 },
},
{
clips: [{ x: 0, y: 0, height: 100, width: 100 }, { x: 0, y: 100, height: 50, width: 100 }],
}
);
getClipsTest(
  `creates four rects if height and width is 1.5 * max`,
{
max: 100,
rectangle: { x: 0, y: 0, height: 150, width: 150 },
},
{
clips: [
{ x: 0, y: 0, height: 100, width: 100 },
{ x: 100, y: 0, height: 100, width: 50 },
{ x: 0, y: 100, height: 50, width: 100 },
{ x: 100, y: 100, height: 50, width: 50 },
],
}
);
getClipsTest(
  `creates one rect if height and width is equal to max and there's a y equal to the max`,
{
max: 100,
rectangle: { x: 0, y: 100, height: 100, width: 100 },
},
{
clips: [{ x: 0, y: 100, height: 100, width: 100 }],
}
);


@ -1,34 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import * as Rx from 'rxjs';
import { Rectangle } from './types';
/**
* Takes one large rectangle and breaks it down into an array of smaller rectangles,
* that if stitched together would create the original rectangle.
* @param largeRectangle - A big rectangle that might be broken down into smaller rectangles
* @param max - Maximum width or height any single clip should have
*/
export function $getClips(largeRectangle: Rectangle, max: number): Rx.Observable<Rectangle> {
const rectanglesGenerator = function*(): IterableIterator<Rectangle> {
const columns = Math.ceil(largeRectangle.width / max) || 1;
const rows = Math.ceil(largeRectangle.height / max) || 1;
for (let row = 0; row < rows; ++row) {
for (let column = 0; column < columns; ++column) {
yield {
height: row === rows - 1 ? largeRectangle.height - row * max : max,
width: column === columns - 1 ? largeRectangle.width - column * max : max,
x: column * max + largeRectangle.x,
y: row * max + largeRectangle.y,
};
}
}
};
return Rx.from(rectanglesGenerator());
}
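The generator above walks the rectangle row by row, emitting `max`-sized tiles with a smaller remainder tile in the last row and column. A plain-JS restatement of the same tiling rule (a hypothetical helper for illustration, not the exported observable):

```javascript
// Split a rectangle into clips no larger than `max` on either side,
// keeping the original x/y offset of each tile.
function getClips(rect, max) {
  const columns = Math.ceil(rect.width / max) || 1;
  const rows = Math.ceil(rect.height / max) || 1;
  const clips = [];
  for (let row = 0; row < rows; ++row) {
    for (let column = 0; column < columns; ++column) {
      clips.push({
        // last row/column gets the remainder instead of a full `max` tile
        height: row === rows - 1 ? rect.height - row * max : max,
        width: column === columns - 1 ? rect.width - column * max : max,
        x: column * max + rect.x,
        y: row * max + rect.y,
      });
    }
  }
  return clips;
}
```

For a 150x150 rectangle with `max` 100 this yields the four clips asserted in the tests below: one full 100x100 tile plus 50-wide and 50-tall remainders.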


@ -1,188 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { promisify } from 'bluebird';
import fs from 'fs';
import path from 'path';
import { screenshotStitcher } from './index';
const loggerMock = {
debug: () => {
return;
},
error: () => {
return;
},
warning: () => {
return;
},
};
const fsp = {
readFile: promisify(fs.readFile),
};
const readPngFixture = async (filename: string) => {
const buffer = await fsp.readFile(path.join(__dirname, 'fixtures', filename));
return buffer.toString('base64');
};
const getSingleWhitePixel = () => {
return readPngFixture('single-white-pixel.png');
};
const getSingleBlackPixel = () => {
return readPngFixture('single-black-pixel.png');
};
const get2x1Checkerboard = () => {
return readPngFixture('2x1-checkerboard.png');
};
const get2x2White = () => {
return readPngFixture('2x2-white.png');
};
const get2x2Black = () => {
return readPngFixture('2x2-black.png');
};
const get4x4Checkerboard = () => {
return readPngFixture('4x4-checkerboard.png');
};
test(`single screenshot`, async () => {
const clip = {
height: 1,
width: 1,
x: 0,
y: 0,
};
const fn = jest.fn();
fn.mockReturnValueOnce(getSingleWhitePixel());
const data = await screenshotStitcher(clip, 1, 1, fn, loggerMock);
expect(fn.mock.calls.length).toBe(1);
expect(fn.mock.calls[0][0]).toEqual({ x: 0, y: 0, width: 1, height: 1 });
const expectedData = await getSingleWhitePixel();
expect(data).toEqual(expectedData);
});
test(`single screenshot, when zoom creates partial pixel we round up`, async () => {
const clip = {
height: 1,
width: 1,
x: 0,
y: 0,
};
const fn = jest.fn();
fn.mockReturnValueOnce(get2x2White());
const data = await screenshotStitcher(clip, 2, 1, fn, loggerMock);
expect(fn.mock.calls.length).toBe(1);
expect(fn.mock.calls[0][0]).toEqual({ x: 0, y: 0, width: 1, height: 1 });
const expectedData = await get2x2White();
expect(data).toEqual(expectedData);
});
test(`two screenshots, no zoom`, async () => {
const clip = {
height: 1,
width: 2,
x: 0,
y: 0,
};
const fn = jest.fn();
fn.mockReturnValueOnce(getSingleWhitePixel());
fn.mockReturnValueOnce(getSingleBlackPixel());
const data = await screenshotStitcher(clip, 1, 1, fn, loggerMock);
expect(fn.mock.calls.length).toBe(2);
expect(fn.mock.calls[0][0]).toEqual({ x: 0, y: 0, width: 1, height: 1 });
expect(fn.mock.calls[1][0]).toEqual({ x: 1, y: 0, width: 1, height: 1 });
const expectedData = await get2x1Checkerboard();
expect(data).toEqual(expectedData);
});
test(`four screenshots, zoom`, async () => {
const clip = {
height: 2,
width: 2,
x: 0,
y: 0,
};
const fn = jest.fn();
fn.mockReturnValueOnce(get2x2White());
fn.mockReturnValueOnce(get2x2Black());
fn.mockReturnValueOnce(get2x2Black());
fn.mockReturnValueOnce(get2x2White());
const data = await screenshotStitcher(clip, 2, 1, fn, loggerMock);
expect(fn.mock.calls.length).toBe(4);
expect(fn.mock.calls[0][0]).toEqual({ x: 0, y: 0, width: 1, height: 1 });
expect(fn.mock.calls[1][0]).toEqual({ x: 1, y: 0, width: 1, height: 1 });
expect(fn.mock.calls[2][0]).toEqual({ x: 0, y: 1, width: 1, height: 1 });
expect(fn.mock.calls[3][0]).toEqual({ x: 1, y: 1, width: 1, height: 1 });
const expectedData = await get4x4Checkerboard();
expect(data).toEqual(expectedData);
});
test(`four screenshots, zoom and offset`, async () => {
const clip = {
height: 2,
width: 2,
x: 1,
y: 1,
};
const fn = jest.fn();
fn.mockReturnValueOnce(get2x2White());
fn.mockReturnValueOnce(get2x2Black());
fn.mockReturnValueOnce(get2x2Black());
fn.mockReturnValueOnce(get2x2White());
const data = await screenshotStitcher(clip, 2, 1, fn, loggerMock);
expect(fn.mock.calls.length).toBe(4);
expect(fn.mock.calls[0][0]).toEqual({ x: 1, y: 1, width: 1, height: 1 });
expect(fn.mock.calls[1][0]).toEqual({ x: 2, y: 1, width: 1, height: 1 });
expect(fn.mock.calls[2][0]).toEqual({ x: 1, y: 2, width: 1, height: 1 });
expect(fn.mock.calls[3][0]).toEqual({ x: 2, y: 2, width: 1, height: 1 });
const expectedData = await get4x4Checkerboard();
expect(data).toEqual(expectedData);
});


@ -1,8 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export { screenshotStitcher } from './screenshot_stitcher';
export { CapturePngSizeError } from './combine';


@ -1,91 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { map, mergeMap, switchMap, toArray } from 'rxjs/operators';
import { $combine } from './combine';
import { $getClips } from './get_clips';
import { Logger, Rectangle, Screenshot } from './types';
const scaleRect = (rect: Rectangle, scale: number): Rectangle => {
return {
height: rect.height * scale,
width: rect.width * scale,
x: rect.x * scale,
y: rect.y * scale,
};
};
/**
* Returns a stream of data that should be of the size outputClip.width * zoom x outputClip.height * zoom
* @param outputClip - The dimensions the final image should take.
* @param zoom - Determines the resolution we want the final screenshot to have.
* @param maxDimensionPerClip - The maximum dimension, in any direction (width or height), that we should allow per
* screenshot clip. If zoom is 10 and maxDimensionPerClip is anything less than or
* equal to 10, each clip taken, before being zoomed in, should be no bigger than 1 x 1.
* If zoom is 10 and maxDimensionPerClip is 20, then each clip taken before being zoomed in should be no bigger than 2 x 2.
* @param captureScreenshotFn - a function to take a screenshot from the page using the dimensions given. The data
* returned should have dimensions not of the clip passed in, but of the clip passed in multiplied by zoom.
* @param logger
*/
export async function screenshotStitcher(
outputClip: Rectangle,
zoom: number,
maxDimensionPerClip: number,
captureScreenshotFn: (rect: Rectangle) => Promise<string>,
logger: Logger
): Promise<string> {
// We have to divide the max by the zoom because we will be multiplying each clip's dimensions
// later by zoom, and we don't want those dimensions to end up larger than max.
const maxDimensionBeforeZoom = Math.ceil(maxDimensionPerClip / zoom);
const screenshotClips$ = $getClips(outputClip, maxDimensionBeforeZoom);
const screenshots$ = screenshotClips$.pipe(
mergeMap(clip => captureScreenshotFn(clip), (clip, data) => ({ clip, data }), 1)
);
// when we take the screenshots we don't have to scale the rects
// but the PNGs don't know about the zoom, so we have to scale them
const screenshotPngRects$ = screenshots$.pipe(
map(({ data, clip }) => {
// At this point we don't care about the offset - the screenshots have been taken.
// We need to adjust the x & y values so they all are adjusted for the top-left most
// clip being at 0, 0.
const x = clip.x - outputClip.x;
const y = clip.y - outputClip.y;
const scaledScreenshotRects = scaleRect(
{
height: clip.height,
width: clip.width,
x,
y,
},
zoom
);
return {
data,
rectangle: scaledScreenshotRects,
};
})
);
const scaledOutputRects = scaleRect(outputClip, zoom);
return screenshotPngRects$
.pipe(
toArray(),
switchMap<Screenshot[], string>((screenshots: Screenshot[]) =>
$combine(
screenshots,
{
height: scaledOutputRects.height,
width: scaledOutputRects.width,
},
logger
)
)
)
.toPromise<string>();
}
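Because each captured clip comes back `zoom` times larger than the rectangle requested from the page, every rectangle must be rescaled before stitching. The `scaleRect` helper above, restated standalone:

```javascript
// Multiply every dimension and offset of a rectangle by `scale`, so that
// clip coordinates taken in unzoomed page space line up with the zoomed PNGs.
const scaleRect = (rect, scale) => ({
  height: rect.height * scale,
  width: rect.width * scale,
  x: rect.x * scale,
  y: rect.y * scale,
});
```

With zoom 2, a clip at (1, 1) of size 2x2 maps to a 4x4 bitmap positioned at (2, 2) in the combined output.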

View file

@ -1,28 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface Rectangle {
width: number;
height: number;
x: number;
y: number;
}
export interface Size {
width: number;
height: number;
}
export interface Screenshot {
data: string;
rectangle: Rectangle;
}
export interface Logger {
debug: (message: string) => void;
error: (message: string) => void;
warning: (message: string) => void;
}


@ -1,29 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { transform as babelTransform } from 'babel-core';
import { memoize } from 'lodash';
const safeWrap = (obj) => {
const code = obj.toString();
return new Function(`return (${code}).apply(null, arguments);`);
};
const transform = (code) => {
const result = babelTransform(code, {
ast: false,
babelrc: false,
presets: [
[ require.resolve('babel-preset-es2015'), { 'modules': false } ]
]
});
return result.code;
};
export const transformFn = memoize((fn) => {
const code = transform(safeWrap(fn));
return safeWrap(code);
});
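`transformFn` round-trips a function through its source text: `safeWrap` serializes it, Babel transpiles the source, and `safeWrap` rebuilds a callable from the result so it can be shipped to the headless browser. The wrapping step on its own (without the Babel pass):

```javascript
// Rebuild a callable from a function's source text via the Function
// constructor; arguments are forwarded with apply.
const safeWrap = (fn) => {
  const code = fn.toString();
  return new Function(`return (${code}).apply(null, arguments);`);
};
```

The rebuilt function behaves like the original for its arguments, but loses its closure — which is exactly why code evaluated inside the browser must not capture server-side variables.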


@ -4,7 +4,25 @@
* you may not use this file except in compliance with the Elastic License.
*/
interface Opts {
userDataDir: string;
viewport: { width: number; height: number };
disableSandbox: boolean;
proxyConfig: {
enabled: boolean;
server: string;
bypass?: string[];
};
verboseLogging: boolean;
}
export const args = ({
userDataDir,
viewport,
disableSandbox,
proxyConfig,
verboseLogging,
}: Opts) => {
const flags = [
// Disable built-in Google Translate service
'--disable-translate',
@ -30,7 +48,6 @@ export const args = ({ userDataDir, bridgePort, viewport, disableSandbox, proxyC
'--headless',
'--hide-scrollbars',
`--window-size=${Math.floor(viewport.width)},${Math.floor(viewport.height)}`,
];
if (proxyConfig.enabled) {


@ -1,47 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import http from 'http';
// See https://github.com/elastic/kibana/issues/19351 for why this is necessary. Long story short, on certain
// linux platforms (fwiw, we have only experienced this on jenkins agents) the first bootup of chromium takes
// a long time doing something with fontconfig packages loading up a cache. The cdp command will timeout
// if we don't wait for this manually. Note that this may still timeout based on the value of
// xpack.reporting.queue.timeout. Subsequent runs should be fast because the cache will already be
// initialized.
/**
*
* @param {string} port
* @param {Object} logger
* @return {Promise}
*/
export async function ensureChromiumIsListening(port, logger) {
const options = {
port,
hostname: '127.0.0.1',
timeout: 120000,
path: '/json',
};
return new Promise((resolve, reject) => {
http.get(
options,
res => {
res.on('data', function (chunk) {
logger.debug(`Response from chromium: ${chunk}`);
});
res.on('end', function () {
logger.debug(`Chromium response complete`);
resolve();
});
})
.on('error', e => {
logger.error(`Ensure chromium is listening failed with error ${e.message}`);
reject(e);
});
});
}


@ -6,16 +6,13 @@
import fs from 'fs';
import os from 'os';
import path from 'path';
import puppeteer from 'puppeteer-core';
import rimraf from 'rimraf';
import * as Rx from 'rxjs';
import { map, share, mergeMap, filter, partition } from 'rxjs/operators';
import { HeadlessChromiumDriver } from '../driver';
import { args } from './args';
import { safeChildProcess } from '../../safe_child_process';
const compactWhitespace = (str) => {
  return str.replace(/\s+/g, ' ');
@ -30,57 +27,61 @@ export class HeadlessChromiumDriverFactory {
  type = 'chromium';
  create({ viewport }) {
    return Rx.Observable.create(async observer => {
      const userDataDir = fs.mkdtempSync(path.join(os.tmpdir(), 'chromium-'));
      const chromiumArgs = args({
        userDataDir,
        viewport,
        verboseLogging: this.logger.isVerbose,
        disableSandbox: this.browserConfig.disableSandbox,
        proxyConfig: this.browserConfig.proxy,
      });
      this.logger.debug(`spawning chromium process at ${this.binaryPath} with arguments ${chromiumArgs}`);
      let chromium;
      let page;
      try {
        chromium = await puppeteer.launch({
          userDataDir,
          executablePath: this.binaryPath,
          ignoreHTTPSErrors: true,
          args: chromiumArgs,
        });
        page = await chromium.newPage();
      } catch (err) {
        observer.error(new Error(`Caught error spawning Chromium`));
        return;
      }
      safeChildProcess({
        async kill() {
          await chromium.close();
        }
      }, observer);
      const stderr$ = Rx.fromEvent(page, 'console').pipe(
        filter(line => line._type === 'error'),
        map(line => line._text),
        share()
      );
      const [consoleMessage$, message$] = stderr$.pipe(
        partition(msg => msg.match(/\[\d+\/\d+.\d+:\w+:CONSOLE\(\d+\)\]/))
      );
      const driver$ = Rx.of(new HeadlessChromiumDriver(page, {
        maxScreenshotDimension: this.browserConfig.maxScreenshotDimension,
        logger: this.logger
      }));
      const processError$ = Rx.fromEvent(page, 'error').pipe(
        map((err) => this.logger.error(err)),
        mergeMap(() => Rx.throwError(new Error(`Unable to spawn Chromium`))),
      );
      const processExit$ = Rx.fromEvent(chromium, 'disconnected').pipe(
        mergeMap((err) => Rx.throwError(new Error(`Chromium exited with code: ${err}. ${JSON.stringify(err)}`)))
      );
const nssError$ = message$.pipe(


@ -11,18 +11,18 @@ export const paths = {
baseUrl: 'https://s3.amazonaws.com/headless-shell/',
packages: [{
platforms: ['darwin', 'freebsd', 'openbsd'],
    archiveFilename: 'chromium-4747cc2-darwin.zip',
    archiveChecksum: '3f509e2fa994da3a1399d18d03b6eef7',
    binaryRelativePath: 'headless_shell-darwin/headless_shell',
  }, {
    platforms: ['linux'],
    archiveFilename: 'chromium-4747cc2-linux.zip',
    archiveChecksum: '8f361042d0fc8a84d60cd01777ec260f',
    binaryRelativePath: 'headless_shell-linux/headless_shell'
  }, {
    platforms: ['win32'],
    archiveFilename: 'chromium-4747cc2-windows.zip',
    archiveChecksum: 'fac0967cd54bb2492a5a858fbefdf983',
    binaryRelativePath: 'headless_shell-windows\\headless_shell.exe'
}]
};


@ -27,8 +27,9 @@ const fsp = {
export async function installBrowser(logger, browserConfig, browserType, installsPath) {
const browser = BROWSERS_BY_TYPE[browserType];
const pkg = browser.paths.packages.find(p => p.platforms.includes(process.platform));
if (!pkg) {
    throw new Error(`Unsupported platform: ${JSON.stringify(browser, null, 2)}`);
}
const binaryPath = path.join(installsPath, pkg.binaryRelativePath);


@ -1,8 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
require('@kbn/plugin-helpers').babelRegister();
require('../build_chromium/cli');


@ -6,10 +6,8 @@
require('@kbn/plugin-helpers').babelRegister();
require('@kbn/test').runTestsCli([
require.resolve('../test/reporting/configs/chromium_api.js'),
require.resolve('../test/reporting/configs/chromium_functional.js'),
require.resolve('../test/reporting/configs/phantom_api.js'),
require.resolve('../test/reporting/configs/phantom_functional.js'),
require.resolve('../test/functional/config.js'),


@ -138,6 +138,11 @@ export function ReportingPageProvider({ getService, getPageObjects }) {
await testSubjects.click('downloadCompletedReportButton', timeout);
}
async clearToastNotifications() {
const toasts = await testSubjects.findAll('toastCloseButton');
await Promise.all(toasts.map(t => t.click()));
}
async getUnsavedChangesWarningExists() {
return await testSubjects.exists('unsavedChangesReportingWarning');
}


@ -12,9 +12,9 @@ import mkdirp from 'mkdirp';
import { PNG } from 'pngjs';
import { PDFImage } from 'pdf-image';
import PDFJS from 'pdfjs-dist';
const mkdirAsync = promisify(mkdirp);
function comparePngs(actualPath, expectedPath, diffPath, log) {
  log.debug(`comparePngs: ${actualPath} vs ${expectedPath}`);
@ -62,12 +64,14 @@ export async function checkIfPdfsMatch(actualPdfPath, baselinePdfPath, screensho
  // don't want to start causing failures for other devs working on OS's which are lacking snapshots. We have
  // mac and linux covered which is better than nothing for now.
  try {
    log.debug(`writeFileSync: ${baselineCopyPath}`);
    fs.writeFileSync(baselineCopyPath, fs.readFileSync(baselinePdfPath));
  } catch (error) {
    log.error(`No baseline pdf found at ${baselinePdfPath}`);
    return 0;
  }
  log.debug(`writeFileSync: ${actualCopyPath}`);
  fs.writeFileSync(actualCopyPath, fs.readFileSync(actualPdfPath));
const convertOptions = {
'-density': '300',
@ -75,39 +77,30 @@ export async function checkIfPdfsMatch(actualPdfPath, baselinePdfPath, screensho
const actualPdfImage = new PDFImage(actualCopyPath, { convertOptions });
const expectedPdfImage = new PDFImage(baselineCopyPath, { convertOptions });
  log.debug(`Calculating numberOfPages`);
  const actualDoc = await PDFJS.getDocument(actualCopyPath);
  const expectedDoc = await PDFJS.getDocument(baselineCopyPath);
  const actualPages = actualDoc.numPages;
  const expectedPages = expectedDoc.numPages;
  if (actualPages !== expectedPages) {
    throw new Error(
      `Expected ${expectedPages} pages but got ${actualPages} in PDFs expected: "${baselineCopyPath}" actual: "${actualCopyPath}".`
    );
  }
  let diffTotal = 0;
  for (let pageNum = 0; pageNum <= expectedPages; ++pageNum) {
    log.debug(`Converting expected pdf page ${pageNum} to png`);
    const expectedPagePng = await expectedPdfImage.convertPage(pageNum);
    log.debug(`Converting actual pdf page ${pageNum} to png`);
    const actualPagePng = await actualPdfImage.convertPage(pageNum);
    const diffPngPath = path.resolve(failureDirectoryPath, `${baselinePdfFileName}-${pageNum}.png`);
    diffTotal += await comparePngs(actualPagePng, expectedPagePng, diffPngPath, log);
  }
return diffTotal;
}


@ -66,6 +66,8 @@ export default function ({ getService, getPageObjects }) {
};
describe('Dashboard', () => {
beforeEach(() => PageObjects.reporting.clearToastNotifications());
describe('Print PDF button', () => {
it('is not available if new', async () => {
await PageObjects.common.navigateToApp('dashboard');


@@ -226,12 +226,22 @@ acorn@^5.0.0, acorn@^5.1.2:
version "5.3.0"
resolved "https://registry.yarnpkg.com/acorn/-/acorn-5.3.0.tgz#7446d39459c54fb49a80e6ee6478149b940ec822"
agent-base@^4.1.0:
version "4.2.1"
resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-4.2.1.tgz#d89e5999f797875674c07d87f260fc41e83e8ca9"
dependencies:
es6-promisify "^5.0.0"
agentkeepalive@^3.4.1:
version "3.4.1"
resolved "https://registry.yarnpkg.com/agentkeepalive/-/agentkeepalive-3.4.1.tgz#aa95aebc3a749bca5ed53e3880a09f5235b48f0c"
dependencies:
humanize-ms "^1.2.1"
ajv-keywords@^3.1.0:
version "3.2.0"
resolved "https://registry.yarnpkg.com/ajv-keywords/-/ajv-keywords-3.2.0.tgz#e86b819c602cf8821ad637413698f1dec021847a"
ajv@^4.9.1:
version "4.11.8"
resolved "https://registry.yarnpkg.com/ajv/-/ajv-4.11.8.tgz#82ffb02b29e662ae53bdc20af15947706739c536"
@@ -248,6 +258,15 @@ ajv@^5.1.0:
fast-json-stable-stringify "^2.0.0"
json-schema-traverse "^0.3.0"
ajv@^6.1.0:
version "6.5.3"
resolved "https://registry.yarnpkg.com/ajv/-/ajv-6.5.3.tgz#71a569d189ecf4f4f321224fecb166f071dd90f9"
dependencies:
fast-deep-equal "^2.0.1"
fast-json-stable-stringify "^2.0.0"
json-schema-traverse "^0.4.1"
uri-js "^4.2.2"
align-text@^0.1.1, align-text@^0.1.3:
version "0.1.4"
resolved "https://registry.yarnpkg.com/align-text/-/align-text-0.1.4.tgz#0cd90a561093f35d0a99256c22b7069433fad117"
@@ -483,10 +502,6 @@ assert-plus@^0.2.0:
version "0.2.0"
resolved "https://registry.yarnpkg.com/assert-plus/-/assert-plus-0.2.0.tgz#d74e1b87e7affc0db8aadb7021f3fe48101ab234"
assertion-error@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/assertion-error/-/assertion-error-1.0.0.tgz#c7f85438fdd466bc7ca16ab90c81513797a5d23b"
assign-symbols@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/assign-symbols/-/assign-symbols-1.0.0.tgz#59667f41fadd4f20ccbc2bb96b8d4f7f78ec0367"
@@ -1040,6 +1055,10 @@ beeper@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/beeper/-/beeper-1.1.1.tgz#e6d5ea8c5dad001304a70b22638447f69cb2f809"
big.js@^3.1.3:
version "3.2.0"
resolved "https://registry.yarnpkg.com/big.js/-/big.js-3.2.0.tgz#a5fc298b81b9e0dca2e458824784b65c52ba588e"
bl@^1.0.0:
version "1.2.1"
resolved "https://registry.yarnpkg.com/bl/-/bl-1.2.1.tgz#cac328f7bee45730d404b692203fcb590e172d5e"
@@ -1188,6 +1207,10 @@ buffer-equal@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/buffer-equal/-/buffer-equal-1.0.0.tgz#59616b498304d556abd466966b22eeda3eca5fbe"
buffer-from@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/buffer-from/-/buffer-from-1.1.1.tgz#32713bc028f75c02fdb710d7c7bcec1f2c6070ef"
buffer@^3.0.1:
version "3.6.0"
resolved "https://registry.yarnpkg.com/buffer/-/buffer-3.6.0.tgz#a72c936f77b96bf52f5f7e7b467180628551defb"
@@ -1296,13 +1319,6 @@ center-align@^0.1.1:
align-text "^0.1.3"
lazy-cache "^1.0.3"
chai@~1.9.2:
version "1.9.2"
resolved "https://registry.yarnpkg.com/chai/-/chai-1.9.2.tgz#3f1a20f82b0b9d7437577d24d6f12b1a69d3b590"
dependencies:
assertion-error "1.0.0"
deep-eql "0.1.3"
chalk@^1.0.0, chalk@^1.1.1, chalk@^1.1.3:
version "1.1.3"
resolved "https://registry.yarnpkg.com/chalk/-/chalk-1.1.3.tgz#a8115c55e4a702fe4d150abd3872822a7e09fc98"
@@ -1354,13 +1370,6 @@ chownr@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.0.1.tgz#e2a75042a9551908bebd25b8523d5f9769d79181"
chrome-remote-interface@0.24.2:
version "0.24.2"
resolved "https://registry.yarnpkg.com/chrome-remote-interface/-/chrome-remote-interface-0.24.2.tgz#43a05440a1fa60b73769e72f3e7892ac11d66eba"
dependencies:
commander "2.1.x"
ws "2.0.x"
ci-info@^1.0.0:
version "1.1.2"
resolved "https://registry.yarnpkg.com/ci-info/-/ci-info-1.1.2.tgz#03561259db48d0474c8bdc90f5b47b068b6bbfb4"
@@ -1506,10 +1515,6 @@ commander@0.6.1:
version "0.6.1"
resolved "https://registry.yarnpkg.com/commander/-/commander-0.6.1.tgz#fa68a14f6a945d54dbbe50d8cdb3320e9e3b1a06"
commander@2.1.x:
version "2.1.0"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.1.0.tgz#d121bbae860d9992a3d517ba96f56588e47c6781"
commander@2.12.2, commander@^2.9.0:
version "2.12.2"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.12.2.tgz#0f5946c427ed9ec0d91a46bb9def53e54650e555"
@@ -1548,6 +1553,15 @@ concat-stream@1.5.1:
readable-stream "~2.0.0"
typedarray "~0.0.5"
concat-stream@1.6.2:
version "1.6.2"
resolved "https://registry.yarnpkg.com/concat-stream/-/concat-stream-1.6.2.tgz#904bdf194cd3122fc675c77fc4ac3d4ff0fd1a34"
dependencies:
buffer-from "^1.0.0"
inherits "^2.0.3"
readable-stream "^2.2.2"
typedarray "^0.0.6"
concat-stream@^1.4.7, concat-stream@~1.6.0:
version "1.6.0"
resolved "https://registry.yarnpkg.com/concat-stream/-/concat-stream-1.6.0.tgz#0aac662fd52be78964d5532f694784e70110acf7"
@@ -1824,7 +1838,7 @@ debug@2.6.0:
dependencies:
ms "0.7.2"
debug@^2.2.0, debug@^2.3.3, debug@^2.6.8:
debug@2.6.9, debug@^2.2.0, debug@^2.3.3, debug@^2.6.8:
version "2.6.9"
resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f"
dependencies:
@@ -1854,12 +1868,6 @@ dedent@^0.7.0:
version "0.7.0"
resolved "https://registry.yarnpkg.com/dedent/-/dedent-0.7.0.tgz#2495ddbaf6eb874abb0e1be9df22d2e5a544326c"
deep-eql@0.1.3:
version "0.1.3"
resolved "https://registry.yarnpkg.com/deep-eql/-/deep-eql-0.1.3.tgz#ef558acab8de25206cd713906d74e56930eb69f2"
dependencies:
type-detect "0.1.1"
deep-equal@^1.0.0, deep-equal@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-1.0.1.tgz#f5d260292b660e084eff4cdbc9f08ad3247448b5"
@@ -2087,6 +2095,10 @@ elasticsearch@^14.1.0:
lodash.isempty "^4.4.0"
lodash.trimend "^4.5.1"
emojis-list@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/emojis-list/-/emojis-list-2.1.0.tgz#4daa4d9db00f9819880c79fa457ae5b09a1fd389"
encoding@^0.1.11:
version "0.1.12"
resolved "https://registry.yarnpkg.com/encoding/-/encoding-0.1.12.tgz#538b66f3ee62cd1ab51ec323829d1f9480c74beb"
@@ -2175,9 +2187,15 @@ es-to-primitive@^1.1.1:
is-date-object "^1.0.1"
is-symbol "^1.0.1"
es6-promise@~2.0.0:
version "2.0.1"
resolved "https://registry.yarnpkg.com/es6-promise/-/es6-promise-2.0.1.tgz#ccc4963e679f0ca9fb187c777b9e583d3c7573c2"
es6-promise@^4.0.3, es6-promise@~4.2.4:
version "4.2.4"
resolved "https://registry.yarnpkg.com/es6-promise/-/es6-promise-4.2.4.tgz#dc4221c2b16518760bd8c39a52d8f356fc00ed29"
es6-promisify@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/es6-promisify/-/es6-promisify-5.0.0.tgz#5109d62f3e56ea967c4b63505aef08291c8a5203"
dependencies:
es6-promise "^4.0.3"
escape-string-regexp@1.0.2:
version "1.0.2"
@@ -2409,6 +2427,15 @@ extract-zip@1.5.0:
mkdirp "0.5.0"
yauzl "2.4.1"
extract-zip@^1.6.6:
version "1.6.7"
resolved "https://registry.yarnpkg.com/extract-zip/-/extract-zip-1.6.7.tgz#a840b4b8af6403264c8db57f4f1a74333ef81fe9"
dependencies:
concat-stream "1.6.2"
debug "2.6.9"
mkdirp "0.5.1"
yauzl "2.4.1"
extsprintf@1.3.0:
version "1.3.0"
resolved "https://registry.yarnpkg.com/extsprintf/-/extsprintf-1.3.0.tgz#96918440e3041a7a414f8c52e3c574eb3c3e1e05"
@@ -2438,6 +2465,10 @@ fast-deep-equal@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-1.0.0.tgz#96256a3bc975595eb36d82e9929d060d893439ff"
fast-deep-equal@^2.0.1:
version "2.0.1"
resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-2.0.1.tgz#7b05218ddf9667bf7f370bf7fdb2cb15fdd0aa49"
fast-json-stable-stringify@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/fast-json-stable-stringify/-/fast-json-stable-stringify-2.0.0.tgz#d5142c0caee6b1189f87d3a76111064f86c8bbf2"
@@ -3453,6 +3484,13 @@ http-signature@~1.2.0:
jsprim "^1.2.2"
sshpk "^1.7.0"
https-proxy-agent@^2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-2.2.1.tgz#51552970fa04d723e04c56d04178c3f92592bbc0"
dependencies:
agent-base "^4.1.0"
debug "^3.1.0"
humanize-ms@^1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/humanize-ms/-/humanize-ms-1.2.1.tgz#c46e3159a293f6b896da29316d8b6fe8bb79bbed"
@@ -4366,6 +4404,10 @@ json-schema-traverse@^0.3.0:
version "0.3.1"
resolved "https://registry.yarnpkg.com/json-schema-traverse/-/json-schema-traverse-0.3.1.tgz#349a6d44c53a51de89b40805c5d5e59b417d3340"
json-schema-traverse@^0.4.1:
version "0.4.1"
resolved "https://registry.yarnpkg.com/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz#69f6a87d9513ab8bb8fe63bdb0979c448e684660"
json-schema@0.2.3:
version "0.2.3"
resolved "https://registry.yarnpkg.com/json-schema/-/json-schema-0.2.3.tgz#b480c892e59a2f05954ce727bd3f2a4e882f9e13"
@@ -4384,7 +4426,7 @@ json3@3.3.2:
version "3.3.2"
resolved "https://registry.yarnpkg.com/json3/-/json3-3.3.2.tgz#3c0434743df93e2f5c42aee7b19bcb483575f4e1"
json5@^0.5.1:
json5@^0.5.0, json5@^0.5.1:
version "0.5.1"
resolved "https://registry.yarnpkg.com/json5/-/json5-0.5.1.tgz#1eade7acc012034ad84e2396767ead9fa5495821"
@@ -4533,6 +4575,14 @@ load-json-file@^1.0.0, load-json-file@^1.1.0:
pinkie-promise "^2.0.0"
strip-bom "^2.0.0"
loader-utils@^1.0.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/loader-utils/-/loader-utils-1.1.0.tgz#c98aef488bcceda2ffb5e2de646d6a754429f5cd"
dependencies:
big.js "^3.1.3"
emojis-list "^2.0.0"
json5 "^0.5.0"
locate-path@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/locate-path/-/locate-path-2.0.0.tgz#2b568b265eec944c6d9c0de9c3dbbbca0354cd8e"
@@ -4980,6 +5030,10 @@ mime@^1.4.1:
version "1.6.0"
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
mime@^2.0.3:
version "2.3.1"
resolved "https://registry.yarnpkg.com/mime/-/mime-2.3.1.tgz#b1621c54d63b97c47d3cfe7f7215f7d64517c369"
mimic-fn@^1.0.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-1.1.0.tgz#e667783d92e89dbd342818b5230b9d62a672ad18"
@@ -5222,6 +5276,10 @@ nise@^1.2.0:
path-to-regexp "^1.7.0"
text-encoding "^0.6.4"
node-ensure@^0.0.0:
version "0.0.0"
resolved "https://registry.yarnpkg.com/node-ensure/-/node-ensure-0.0.0.tgz#ecae764150de99861ec5c810fd5d096b183932a7"
node-fetch@^1.0.1, node-fetch@^1.3.3:
version "1.7.3"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-1.7.3.tgz#980f6f72d85211a5347c6b2bc18c5b84c3eb47ef"
@@ -5743,12 +5801,18 @@ path-type@^1.0.0:
pify "^2.0.0"
pinkie-promise "^2.0.0"
pdf-image@1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/pdf-image/-/pdf-image-1.1.0.tgz#2ddc0397dcf0f2007e40519cdddd7ba64fa4d367"
pdf-image@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/pdf-image/-/pdf-image-2.0.0.tgz#f134296876c3d5aacb6bb5805ad15d0a46775fd5"
dependencies:
chai "~1.9.2"
es6-promise "~2.0.0"
es6-promise "~4.2.4"
pdfjs-dist@^2.0.489:
version "2.0.489"
resolved "https://registry.yarnpkg.com/pdfjs-dist/-/pdfjs-dist-2.0.489.tgz#63e54b292a86790a454697eb44d4347b8fbfad27"
dependencies:
node-ensure "^0.0.0"
worker-loader "^1.1.1"
pdfkit@^0.8.3:
version "0.8.3"
@@ -5944,6 +6008,10 @@ process@~0.5.1:
version "0.5.2"
resolved "https://registry.yarnpkg.com/process/-/process-0.5.2.tgz#1638d8a8e34c2f440a91db95ab9aeb677fc185cf"
progress@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/progress/-/progress-2.0.0.tgz#8a1be366bf8fc23db2bd23f10c6fe920b4389d1f"
promise@^7.1.1:
version "7.3.1"
resolved "https://registry.yarnpkg.com/promise/-/promise-7.3.1.tgz#064b72602b18f90f29192b8b1bc418ffd1ebd3bf"
@@ -5972,6 +6040,10 @@ prop-types@^15.6.1:
loose-envify "^1.3.1"
object-assign "^4.1.1"
proxy-from-env@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/proxy-from-env/-/proxy-from-env-1.0.0.tgz#33c50398f70ea7eb96d21f7b817630a55791c7ee"
proxyquire@1.7.11:
version "1.7.11"
resolved "https://registry.yarnpkg.com/proxyquire/-/proxyquire-1.7.11.tgz#13b494eb1e71fb21cc3ebe3699e637d3bec1af9e"
@@ -6038,6 +6110,19 @@ punycode@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.1.0.tgz#5f863edc89b96db09074bad7947bf09056ca4e7d"
puppeteer-core@^1.7.0:
version "1.7.0"
resolved "https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-1.7.0.tgz#c10f660983e9a4faacf6b8e50861c7739871c752"
dependencies:
debug "^3.1.0"
extract-zip "^1.6.6"
https-proxy-agent "^2.2.1"
mime "^2.0.3"
progress "^2.0.0"
proxy-from-env "^1.0.0"
rimraf "^2.6.1"
ws "^5.1.1"
qs@^6.5.1, qs@~6.5.1:
version "6.5.1"
resolved "https://registry.yarnpkg.com/qs/-/qs-6.5.1.tgz#349cdf6eef89ec45c12d7d5eb3fc0c870343a6d8"
@@ -6918,6 +7003,13 @@ sax@>=0.6.0, sax@^1.2.1:
version "1.2.4"
resolved "https://registry.yarnpkg.com/sax/-/sax-1.2.4.tgz#2816234e2378bddc4e5354fab5caa895df7100d9"
schema-utils@^0.4.0:
version "0.4.7"
resolved "https://registry.yarnpkg.com/schema-utils/-/schema-utils-0.4.7.tgz#ba74f597d2be2ea880131746ee17d0a093c68187"
dependencies:
ajv "^6.1.0"
ajv-keywords "^3.1.0"
scroll-into-view@^1.3.0:
version "1.9.1"
resolved "https://registry.yarnpkg.com/scroll-into-view/-/scroll-into-view-1.9.1.tgz#90c3b338422f9fddaebad90e6954790940dc9c1e"
@@ -7764,10 +7856,6 @@ type-check@~0.3.2:
dependencies:
prelude-ls "~1.1.2"
type-detect@0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/type-detect/-/type-detect-0.1.1.tgz#0ba5ec2a885640e470ea4e8505971900dac58822"
type-detect@^4.0.5:
version "4.0.8"
resolved "https://registry.yarnpkg.com/type-detect/-/type-detect-4.0.8.tgz#7646fb5f18871cfbb7749e69bd39a6388eb7450c"
@@ -7805,10 +7893,6 @@ uid-number@^0.0.6:
version "0.0.6"
resolved "https://registry.yarnpkg.com/uid-number/-/uid-number-0.0.6.tgz#0ea10e8035e8eb5b8e4449f06da1c730663baa81"
ultron@~1.1.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/ultron/-/ultron-1.1.1.tgz#9fe1536a10a664a65266a1e3ccf85fd36302bc9c"
unbzip2-stream@1.0.9:
version "1.0.9"
resolved "https://registry.yarnpkg.com/unbzip2-stream/-/unbzip2-stream-1.0.9.tgz#9d107697a8d539d7bfdb9378a1cd832836bb7f8f"
@@ -7873,6 +7957,12 @@ unset-value@^1.0.0:
has-value "^0.3.1"
isobject "^3.0.0"
uri-js@^4.2.2:
version "4.2.2"
resolved "https://registry.yarnpkg.com/uri-js/-/uri-js-4.2.2.tgz#94c540e1ff772956e2299507c010aea6c8838eb0"
dependencies:
punycode "^2.1.0"
urix@^0.1.0, urix@~0.1.0:
version "0.1.0"
resolved "https://registry.yarnpkg.com/urix/-/urix-0.1.0.tgz#da937f7a62e21fec1fd18d49b35c2935067a6c72"
@@ -8149,6 +8239,13 @@ wordwrap@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
worker-loader@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/worker-loader/-/worker-loader-1.1.1.tgz#920d74ddac6816fc635392653ed8b4af1929fd92"
dependencies:
loader-utils "^1.0.0"
schema-utils "^0.4.0"
wrap-ansi@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-2.1.0.tgz#d8fc3d284dd05794fe84973caecdd1cf824fdd85"
@@ -8175,12 +8272,6 @@ write-file-atomic@^2.1.0:
imurmurhash "^0.1.4"
signal-exit "^3.0.2"
ws@2.0.x:
version "2.0.3"
resolved "https://registry.yarnpkg.com/ws/-/ws-2.0.3.tgz#532fd499c3f7d7d720e543f1f807106cfc57d9cb"
dependencies:
ultron "~1.1.0"
ws@^4.1.0:
version "4.1.0"
resolved "https://registry.yarnpkg.com/ws/-/ws-4.1.0.tgz#a979b5d7d4da68bf54efe0408967c324869a7289"
@@ -8188,6 +8279,12 @@ ws@^4.1.0:
async-limiter "~1.0.0"
safe-buffer "~5.1.0"
ws@^5.1.1:
version "5.2.2"
resolved "https://registry.yarnpkg.com/ws/-/ws-5.2.2.tgz#dffef14866b8e8dc9133582514d1befaf96e980f"
dependencies:
async-limiter "~1.0.0"
xml-crypto@^0.10.1:
version "0.10.1"
resolved "https://registry.yarnpkg.com/xml-crypto/-/xml-crypto-0.10.1.tgz#f832f74ccf56f24afcae1163a1fcab44d96774a8"


@@ -2720,13 +2720,6 @@ chownr@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.0.1.tgz#e2a75042a9551908bebd25b8523d5f9769d79181"
chrome-remote-interface@0.24.2:
version "0.24.2"
resolved "https://registry.yarnpkg.com/chrome-remote-interface/-/chrome-remote-interface-0.24.2.tgz#43a05440a1fa60b73769e72f3e7892ac11d66eba"
dependencies:
commander "2.1.x"
ws "2.0.x"
chromedriver@2.41.0:
version "2.41.0"
resolved "https://registry.yarnpkg.com/chromedriver/-/chromedriver-2.41.0.tgz#2709d3544bc0c288b4738a6925a64c02a98a921f"
@@ -2997,10 +2990,6 @@ commander@2, commander@^2.12.1, commander@^2.9.0:
version "2.15.1"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.15.1.tgz#df46e867d0fc2aec66a34662b406a9ccafff5b0f"
commander@2.1.x:
version "2.1.0"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.1.0.tgz#d121bbae860d9992a3d517ba96f56588e47c6781"
commander@2.8.1, commander@~2.8.1:
version "2.8.1"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.8.1.tgz#06be367febfda0c330aa1e2a072d3dc9762425d4"
@@ -4968,7 +4957,7 @@ extract-zip@1.5.0:
mkdirp "0.5.0"
yauzl "2.4.1"
extract-zip@^1.6.7:
extract-zip@^1.6.6, extract-zip@^1.6.7:
version "1.6.7"
resolved "https://registry.yarnpkg.com/extract-zip/-/extract-zip-1.6.7.tgz#a840b4b8af6403264c8db57f4f1a74333ef81fe9"
dependencies:
@@ -8853,6 +8842,10 @@ mime@^1.2.11, mime@^1.3.4, mime@^1.4.1:
version "1.6.0"
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
mime@^2.0.3:
version "2.3.1"
resolved "https://registry.yarnpkg.com/mime/-/mime-2.3.1.tgz#b1621c54d63b97c47d3cfe7f7215f7d64517c369"
mimic-fn@^1.0.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-1.2.0.tgz#820c86a39334640e99516928bd03fca88057d022"
@@ -10468,7 +10461,7 @@ proto-list@~1.2.1:
version "1.2.4"
resolved "https://registry.yarnpkg.com/proto-list/-/proto-list-1.2.4.tgz#212d5bfe1318306a420f6402b8e26ff39647a849"
proxy-from-env@1.0.0:
proxy-from-env@1.0.0, proxy-from-env@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/proxy-from-env/-/proxy-from-env-1.0.0.tgz#33c50398f70ea7eb96d21f7b817630a55791c7ee"
@@ -10657,6 +10650,19 @@ punycode@^1.2.4, punycode@^1.4.1:
version "1.4.1"
resolved "https://registry.yarnpkg.com/punycode/-/punycode-1.4.1.tgz#c0d5a63b2718800ad8e1eb0fa5269c84dd41845e"
puppeteer-core@^1.7.0:
version "1.7.0"
resolved "https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-1.7.0.tgz#c10f660983e9a4faacf6b8e50861c7739871c752"
dependencies:
debug "^3.1.0"
extract-zip "^1.6.6"
https-proxy-agent "^2.2.1"
mime "^2.0.3"
progress "^2.0.0"
proxy-from-env "^1.0.0"
rimraf "^2.6.1"
ws "^5.1.1"
q@^1.0.1, q@^1.1.2:
version "1.5.1"
resolved "https://registry.yarnpkg.com/q/-/q-1.5.1.tgz#7e32f75b41381291d04611f1bf14109ac00651d7"
@@ -13383,10 +13389,6 @@ ultron@1.0.x:
version "1.0.2"
resolved "https://registry.yarnpkg.com/ultron/-/ultron-1.0.2.tgz#ace116ab557cd197386a4e88f4685378c8b2e4fa"
ultron@~1.1.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/ultron/-/ultron-1.1.1.tgz#9fe1536a10a664a65266a1e3ccf85fd36302bc9c"
unbzip2-stream@1.0.9:
version "1.0.9"
resolved "https://registry.yarnpkg.com/unbzip2-stream/-/unbzip2-stream-1.0.9.tgz#9d107697a8d539d7bfdb9378a1cd832836bb7f8f"
@@ -14373,12 +14375,6 @@ ws@1.1.2:
options ">=0.0.5"
ultron "1.0.x"
ws@2.0.x:
version "2.0.3"
resolved "https://registry.yarnpkg.com/ws/-/ws-2.0.3.tgz#532fd499c3f7d7d720e543f1f807106cfc57d9cb"
dependencies:
ultron "~1.1.0"
ws@^4.0.0, ws@^4.1.0:
version "4.1.0"
resolved "https://registry.yarnpkg.com/ws/-/ws-4.1.0.tgz#a979b5d7d4da68bf54efe0408967c324869a7289"
@@ -14386,6 +14382,12 @@ ws@^4.0.0, ws@^4.1.0:
async-limiter "~1.0.0"
safe-buffer "~5.1.0"
ws@^5.1.1:
version "5.2.2"
resolved "https://registry.yarnpkg.com/ws/-/ws-5.2.2.tgz#dffef14866b8e8dc9133582514d1befaf96e980f"
dependencies:
async-limiter "~1.0.0"
wtf-8@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/wtf-8/-/wtf-8-1.0.0.tgz#392d8ba2d0f1c34d1ee2d630f15d0efb68e1048a"