[Reporting] Update Puppeteer to version 8.0.0 and Chromium to r856583 (#98688)

* Update Puppeteer to 8.0.0

Updates Chromium to r856583
Links to new build of Linux headless_shell in the Kibana team GCS bucket
Links to main download site of Chromium for Mac and Windows
Removes Mac and Windows compatibility from the Chromium build scripts

* add functional tests for large dashboard

* ensure png comparison is working

* add test for large dashboard pdf

* update arm64 binary checksum

* update README

* more readme update

* Update x-pack/build_chromium/README.md

Co-authored-by: Jean-Louis Leysens <jloleysens@gmail.com>

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
Co-authored-by: Jean-Louis Leysens <jloleysens@gmail.com>
Tim Sullivan authored on 2021-05-07 09:54:33 -07:00, committed by GitHub
commit f73da420ff
parent 518da4daa1
25 changed files with 519 additions and 434 deletions


@ -314,7 +314,7 @@
"proxy-from-env": "1.0.0",
"proxyquire": "1.8.0",
"puid": "1.0.7",
"puppeteer": "npm:@elastic/puppeteer@5.4.1-patch.1",
"puppeteer": "^8.0.0",
"query-string": "^6.13.2",
"raw-loader": "^3.1.0",
"rbush": "^3.0.1",
@ -586,7 +586,6 @@
"@types/pretty-ms": "^5.0.0",
"@types/prop-types": "^15.7.3",
"@types/proper-lockfile": "^3.0.1",
"@types/puppeteer": "^5.4.1",
"@types/rbush": "^3.0.0",
"@types/reach__router": "^1.2.6",
"@types/react": "^16.9.36",


@ -3,37 +3,24 @@
We ship our own headless build of Chromium which is significantly smaller than
the standard binaries shipped by Google. The scripts in this folder take a
commit hash from the Chromium repository, initialize the build environment,
and run the build on Ubuntu Linux.
## Why do we do this
By default, Puppeteer will download a zip file containing the Chromium browser for any
OS. This creates problems on Linux, because Chromium has a dependency on X11, which
is often not installed in a server environment. We don't want to make X11 a
requirement for running Kibana on Linux. To work around this, we create our own Chromium
build, using the
[`headless_shell`](https://chromium.googlesource.com/chromium/src/+/5cf4b8b13ed518472038170f8de9db2f6c258fe4/headless)
build target. There are no (trustworthy) sources of these builds available elsewhere.
Fortunately, creating the custom builds is only necessary for Linux. When you have a build
of Kibana for Linux, or if you use a Linux desktop to develop Kibana, you have a copy of
`headless_shell` bundled inside. When you have a Windows or Mac build of Kibana, or use
either of those for development, you have a copy of the full build of Chromium, which
was downloaded from the main [Chromium download
location](https://commondatastorage.googleapis.com/chromium-browser-snapshots/index.html).
## Build Script Usage
@ -63,161 +50,65 @@ the Chromium repo to be cloned successfully. If checking out the Chromium fails
with "early EOF" errors, the instance could be low on memory or disk space.
## Getting the Commit Hash
If you need to bump the version of Puppeteer, you must get the new git commit hash for Chromium that corresponds to the Puppeteer version.
```
node x-pack/dev-tools/chromium_version.js [PuppeteerVersion]
```
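For example, to look up the commit for the Puppeteer version in this upgrade (8.0.0):

```sh
node x-pack/dev-tools/chromium_version.js 8.0.0
```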
When bumping the Puppeteer version, make sure you also update the `ChromiumArchivePaths.revision` variable in
`x-pack/plugins/reporting/server/browsers/chromium/paths.ts`.
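As a quick sanity check after the bump (a minimal sketch, run from the Kibana root; the expected value is the revision pinned in this PR):

```sh
grep -n "revision" x-pack/plugins/reporting/server/browsers/chromium/paths.ts
# expect: public readonly revision = '856583';
```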
## Build args
A good how-to on building Chromium from source is
[here](https://chromium.googlesource.com/chromium/src/+/master/docs/get_the_code.md).
There are documents for each OS that explain how to customize arguments
for the build using the `gn` tool. Those instructions do not apply to the
Kibana Chromium build. Our `build.py` script ensures that the correct `args.gn`
file is used for build arguments.
We have a `linux/args.gn` file that is automatically copied to the build target directory.
To get a list of the build arguments that are enabled, install `depot_tools` and run
`gn args out/headless --list`. It prints out all of the flags and their
settings, including the defaults.
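For example, a minimal sketch, assuming `depot_tools` is on your `PATH` and `out/headless` has been generated by `build.py`:

```sh
cd ~/chromium/chromium/src
# Print every gn arg (including defaults) for the headless build directory
gn args out/headless --list > /tmp/headless_args.txt
less /tmp/headless_args.txt
```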
Some build flags are documented
[here](https://www.chromium.org/developers/gn-build-configuration).
**NOTE:** Please make sure you consult @elastic/kibana-security before you change, remove, or add any of the build flags.
## Directions for Elasticians
If you wish to use a remote VM to build, you'll need access to our GCP account.
To get the Chromium code, refer to the [documentation](https://chromium.googlesource.com/chromium/src/+/master/docs/get_the_code.md).
Install `depot_tools` as suggested, since it comes with useful scripts. Use the
`fetch` command to clone the chromium repository. To set up and run the build,
use the Kibana `build.py` script (in this directory).
**NOTE:** The builds should be done in Ubuntu on x64 architecture. ARM builds
are created in x64 using cross-compiling. CentOS is not supported for building Chromium.
It's recommended that you create a working directory for the chromium source
code and all the build tools, and run the commands from there:
```sh
mkdir ~/chromium && cd ~/chromium
cp -r ~/path/to/kibana/x-pack/build_chromium .
python ./build_chromium/init.py [arch_name]
python ./build_chromium/build.py <commit_id>
```
## Building
Note: In Linux, you should run the build command in tmux so that if your ssh session disconnects, the build can keep going. To do this, just type `tmux` into your terminal to hop into a tmux session. If you get disconnected, you can hop back in like so:
- SSH into the server
- Run `tmux list-sessions`
- Run `tmux attach -t {session_id}`, replacing {session_id} with a value from the list-sessions output
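The same flow as commands (a minimal sketch; the session id comes from your own `list-sessions` output):

```sh
tmux                 # start a session, then kick off the build inside it
# ...after reconnecting over ssh:
tmux list-sessions   # e.g. "0: 1 windows (created ...)"
tmux attach -t 0     # reattach to session 0
```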
To run the build, replace the sha in the following commands with the sha that you wish to build:
- Linux x64: `python ~/chromium/build_chromium/build.py 312d84c8ce62810976feda0d3457108a6dfff9e6`
- Linux arm64: `python ~/chromium/build_chromium/build.py 312d84c8ce62810976feda0d3457108a6dfff9e6 arm64`

If you need a machine to run the build on, provision a GCP VM:
1. Login to Google Cloud Console
2. Click the "Compute Engine" tab.
3. Create a Linux VM (the more cores the better, as the build makes effective use of each):
- 8 CPU
- 30GB memory
- 80GB free space on disk (Try `ncdu /home` to see where space is used.)
- git
- python2 (`python` must link to `python2`)
- lsb_release
- tmux is recommended in case your ssh session is interrupted
- "Cloud API access scopes": must have **read / write** scope for the Storage API
4. Install [Google Cloud SDK](https://cloud.google.com/sdk) locally to ssh into the GCP instance. You can get the exact `gcloud` ssh command in the [GCP console](https://console.cloud.google.com/): VM instances → your-vm-name → SSH → "View gcloud command".
5. Copy the `build_chromium` directory between the Kibana repo and the `headless_shell_staging` bucket with `gsutil rsync` (the command below pushes the local scripts to the bucket; reverse the arguments to pull them onto a VM):
```sh
# This shows a preview of what would change by synchronizing the source scripts with the destination GCS bucket.
# Remove the `-n` flag to enact the changes
gsutil -m rsync -n -r x-pack/build_chromium gs://headless_shell_staging/build_chromium
```
## Artifacts
After the build completes, there will be a .zip file and a .md5 file in `~/chromium/chromium/src/out/headless`. These are named like so: `chromium-{first_7_of_SHA}-{platform}-{arch}`, for example: `chromium-4747cc2-linux-x64`.
The zip files and md5 files are copied to a staging bucket in GCP storage.
The zip files need to be deployed to GCP Storage. For testing, drop them into `headless-shell-dev`; for production, they need to be in `headless-shell`. The `x-pack/plugins/reporting/server/browsers/chromium/paths.ts` file then needs to be updated to have the correct `archiveChecksum`, `archiveFilename`, `binaryChecksum` and `baseUrl`:

- `archiveChecksum`: The contents of the `.md5` file, which is the `md5` checksum of the zip file.
- `binaryChecksum`: The `md5` checksum of the `headless_shell` binary itself.
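For example, to compute both values for a Linux x64 artifact (a sketch, assuming the example artifact name from the Artifacts section above and the archive layout given by `binaryRelativePath` in `paths.ts`):

```sh
cd ~/chromium/chromium/src/out/headless
md5sum chromium-4747cc2-linux-x64.zip        # archiveChecksum, also written to the .md5 file
unzip -q chromium-4747cc2-linux-x64.zip -d /tmp/chromium-check
md5sum /tmp/chromium-check/headless_shell-linux_x64/headless_shell   # binaryChecksum
```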
*If you're building in the cloud, don't forget to turn off your VM after retrieving the build artifacts!*

## Testing

Search the Puppeteer GitHub repo for known issues that could affect our use case, and make sure to test anywhere that is affected.

Here are the steps to test a Puppeteer upgrade; run these tests on Mac, Windows, Linux x64, and Linux arm64:

- Make sure the Reporting plugin is fetching the correct version of the browser
  at start-up time, and that it can successfully unzip it and copy the files to
  `x-pack/plugins/reporting/chromium`.
- Make sure there are no errors when using the **Reporting diagnostic tool**.
- All functional and API tests that generate PDF and PNG files should pass.
- Use a VM to run Kibana in a low-memory environment and try to generate a PNG of a dashboard that outputs as a 4MB file. Document the minimum requirements in the PR.

## Diagnosing runtime failures

After getting the build to pass, the resulting binaries often failed to run or would hang. You can run the headless browser manually to see what errors it is generating:

**Mac / Linux**

`headless_shell --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`

**Windows**

`headless_shell.exe --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`

On Windows, you can use IE to open `http://localhost:9221` and see if the page loads. On Mac/Linux you can just curl the JSON endpoints: `curl http://localhost:9221/json/list`.
## Resources
@ -225,8 +116,6 @@ The following links provide helpful context about how the Chromium build works,
- Tools for Chromium version information: https://omahaproxy.appspot.com/
- https://www.chromium.org/developers/how-tos/get-the-code/working-with-release-branches
- https://chromium.googlesource.com/chromium/src/+/HEAD/docs/linux/build_instructions.md
- Some build-flag descriptions: https://www.chromium.org/developers/gn-build-configuration
- The serverless Chromium project was indispensable: https://github.com/adieuadieu/serverless-chrome/blob/b29445aa5a96d031be2edd5d1fc8651683bf262c/packages/lambda/builds/chromium/build/build.sh


@ -6,7 +6,7 @@ from build_util import (
md5_file,
)
# This file builds Chromium headless on Windows, Mac, and Linux.
# This file builds Chromium headless on Linux.
# Verify that we have an argument, and if not print instructions
if (len(sys.argv) < 2):
@ -76,7 +76,7 @@ print('Setting up build directory')
runcmd('rm -rf out/headless')
runcmd('mkdir out/headless')
# Copy build args/{Linux | Darwin | Windows}.gn from the root of our directory to out/headless/args.gn,
# Copy args.gn from the root of our directory to out/headless/args.gn,
# add the target_cpu for cross-compilation
print('Adding target_cpu to args')
argsgn_file_out = path.abspath('out/headless/args.gn')
@ -89,7 +89,7 @@ runcmd('gn gen out/headless')
print('Compiling... this will take a while')
runcmd('autoninja -C out/headless headless_shell')
# Optimize the output on Linux x64 and Mac by stripping inessentials from the binary
# Optimize the output on Linux x64 by stripping inessentials from the binary
# ARM must be cross-compiled from Linux, and the x64 build host can not read the ARM binary in order to strip it
if platform.system() != 'Windows' and arch_name != 'arm64':
print('Optimizing headless_shell')
@ -112,30 +112,10 @@ def archive_file(name):
archive.write(from_path, to_path)
return to_path
# Each platform has slightly different requirements for what dependencies
# must be bundled with the Chromium executable.
if platform.system() == 'Linux':
archive_file('headless_shell')
archive_file(path.join('swiftshader', 'libEGL.so'))
archive_file(path.join('swiftshader', 'libGLESv2.so'))
if arch_name == 'arm64':
archive_file(path.join('swiftshader', 'libEGL.so'))
elif platform.system() == 'Windows':
archive_file('headless_shell.exe')
archive_file('dbghelp.dll')
archive_file('icudtl.dat')
archive_file(path.join('swiftshader', 'libEGL.dll'))
archive_file(path.join('swiftshader', 'libEGL.dll.lib'))
archive_file(path.join('swiftshader', 'libGLESv2.dll'))
archive_file(path.join('swiftshader', 'libGLESv2.dll.lib'))
elif platform.system() == 'Darwin':
archive_file('headless_shell')
archive_file('libswiftshader_libEGL.dylib')
archive_file('libswiftshader_libGLESv2.dylib')
archive_file(path.join('Helpers', 'chrome_crashpad_handler'))
# Add dependencies that must be bundled with the Chromium executable.
archive_file('headless_shell')
archive_file(path.join('swiftshader', 'libEGL.so'))
archive_file(path.join('swiftshader', 'libGLESv2.so'))
archive.close()


@ -1,33 +0,0 @@
# Based on //build/headless.gn
# Embed resource.pak into binary to simplify deployment.
headless_use_embedded_resources = true
# In order to simplify deployment we build ICU data file
# into binary.
icu_use_data_file = false
# Use embedded data instead external files for headless in order
# to simplify deployment.
v8_use_external_startup_data = false
enable_nacl = false
enable_print_preview = false
enable_basic_printing = false
enable_remoting = false
use_alsa = false
use_cups = false
use_dbus = false
use_gio = false
use_libpci = false
use_pulseaudio = false
use_udev = false
is_debug = false
symbol_level = 0
is_component_build = false
# Please, consult @elastic/kibana-security before changing/removing this option.
use_kerberos = false
# target_cpu is appended before build: "x64" or "arm64"


@ -1,5 +0,0 @@
#!/bin/bash
# Launch the cross-platform init script using a relative path
# from this script's location.
python "`dirname "$0"`/../init.py"


@ -3,11 +3,9 @@ from os import path
from build_util import runcmd, mkdir
# This is a cross-platform initialization script which should only be run
# once per environment, and isn't intended to be run directly. You should
# run the appropriate platform init script (e.g. Linux/init.sh) which will
# call this once the platform-specific initialization has completed.
# once per environment.
# Set to "arm" to build for ARM on Linux
# Set to "arm" to build for ARM
arch_name = sys.argv[1] if len(sys.argv) >= 2 else 'undefined'
build_path = path.abspath(os.curdir)
src_path = path.abspath(path.join(build_path, 'chromium', 'src'))
@ -23,7 +21,6 @@ runcmd('git config --global branch.autosetuprebase always')
runcmd('git config --global core.compression 0')
# Grab Chromium's custom build tools, if they aren't already installed
# (On Windows, they are installed before this Python script is run)
# Put depot_tools on the path so we can properly run the fetch command
if not path.isdir('depot_tools'):
print('Installing depot_tools...')


@ -1,13 +0,0 @@
#!/bin/bash
# Initializes a Linux environment. This need only be done once per
# machine. The OS needs to be a flavor that supports apt get, such as Ubuntu.
if ! [ -x "$(command -v python)" ]; then
echo "Installing Python"
sudo apt-get --assume-yes install python
fi
# Launch the cross-platform init script using a relative path
# from this script's location.
python "`dirname "$0"`/../init.py" $1


@ -1,29 +0,0 @@
# Based on //build/headless.gn
# Embed resource.pak into binary to simplify deployment.
headless_use_embedded_resources = true
# Use embedded data instead external files for headless in order
# to simplify deployment.
v8_use_external_startup_data = false
enable_nacl = false
enable_print_preview = false
enable_basic_printing = false
enable_remoting = false
use_alsa = false
use_cups = false
use_dbus = false
use_gio = false
use_libpci = false
use_pulseaudio = false
use_udev = false
is_debug = false
symbol_level = 0
is_component_build = false
# Please, consult @elastic/kibana-security before changing/removing this option.
use_kerberos = false
# target_cpu is appended before build: "x64" or "arm64"


@ -1,32 +0,0 @@
: This only needs to be run once per environment to set it up.
: This requires a GUI, as the VS installation is graphical.
: If initialization fails, you can simply run the `install_vs.exe` installer again
@echo off
: Install Visual Studio (this requires user interaction, and takes quite a while)
: Most of the subsequent commands can be run in parallel with this (downloading, unzipping,
: grabbing the source, etc). This must be completed before building, though.
@echo "Installing Visual Studio"
powershell -command "& {iwr -outf c:\chromium\install_vs.exe https://download.visualstudio.microsoft.com/download/pr/f9c35424-ffad-4b44-bb8f-d4e3968e90ce/f75403c967456e32e758ef558957f345/vs_community.exe}"
install_vs.exe --add Microsoft.VisualStudio.Workload.NativeDesktop --add Microsoft.VisualStudio.Component.VC.ATLMFC --includeRecommended
: Install Chromium's custom build tools
@echo "Installing Chromium build tools"
powershell -command "& {iwr -outf %~dp0../../depot_tools.zip https://storage.googleapis.com/chrome-infra/depot_tools.zip}"
powershell -command "& {Expand-Archive %~dp0../../depot_tools.zip -DestinationPath %~dp0../../depot_tools}"
: Set the environment variables required by depot_tools
@echo "When Visual Studio is installed, you need to enable the Windows SDK in Control Panel. After that, press <enter> here to continue initialization"
pause
SETX PATH "%~dp0..\..\depot_tools;%path%"
SETX DEPOT_TOOLS_WIN_TOOLCHAIN 0
call gclient
python %~dp0../init.py


@ -302,7 +302,7 @@ export class HeadlessChromiumDriver {
// Even though 3xx redirects go through our request
// handler, we should probably inspect responses just to
// avoid being bamboozled by some malicious request
this.page.on('response', (interceptedResponse: puppeteer.Response) => {
this.page.on('response', (interceptedResponse: puppeteer.HTTPResponse) => {
const interceptedUrl = interceptedResponse.url();
const allowed = !interceptedUrl.startsWith('file://');


@ -46,6 +46,8 @@ export const args = ({ userDataDir, viewport, disableSandbox, proxy: proxyConfig
// The viewport may later need to be resized depending on the position of the clip area.
// These numbers come from the job parameters, so this is a close guess.
`--window-size=${Math.floor(viewport.width)},${Math.floor(viewport.height)}`,
// allow screenshot clip region to go outside of the viewport
`--mainFrameClipsContent=false`,
];
if (proxyConfig.enabled) {


@ -89,7 +89,7 @@ export class HeadlessChromiumDriverFactory {
const versionInfo = await client.send('Browser.getVersion');
logger.debug(`Browser version: ${JSON.stringify(versionInfo)}`);
await page.emulateTimezone(browserTimezone ?? null);
await page.emulateTimezone(browserTimezone);
// Set the default timeout for all navigation methods to the openUrl timeout (30 seconds)
// All waitFor methods have their own timeout config passed in to them
@ -173,7 +173,7 @@ export class HeadlessChromiumDriverFactory {
})
);
const pageRequestFailed$ = Rx.fromEvent<puppeteer.Request>(page, 'requestfailed').pipe(
const pageRequestFailed$ = Rx.fromEvent<puppeteer.HTTPRequest>(page, 'requestfailed').pipe(
map((req) => {
const failure = req.failure && req.failure();
if (failure) {

View file

@ -99,8 +99,9 @@ export const browserStartLogs = (
);
const error$ = fromEvent(browserProcess, 'error').pipe(
map(() => {
map((err) => {
logger.error(`Browser process threw an error on startup`);
logger.error(err as string | Error);
return i18n.translate('xpack.reporting.diagnostic.browserErrored', {
defaultMessage: `Browser process threw an error on startup`,
});


@ -16,44 +16,62 @@ interface PackageInfo {
binaryRelativePath: string;
}
// We download zip files from a Kibana team GCS bucket named `headless_shell`
enum BaseUrl {
// see https://www.chromium.org/getting-involved/download-chromium
common = 'https://commondatastorage.googleapis.com/chromium-browser-snapshots',
// A GCS bucket under the Kibana team
custom = 'https://storage.googleapis.com/headless_shell',
}
interface CustomPackageInfo extends PackageInfo {
location: 'custom';
}
interface CommonPackageInfo extends PackageInfo {
location: 'common';
archivePath: string;
}
export class ChromiumArchivePaths {
public readonly packages: PackageInfo[] = [
public readonly revision = '856583';
public readonly packages: Array<CustomPackageInfo | CommonPackageInfo> = [
{
platform: 'darwin',
architecture: 'x64',
archiveFilename: 'chromium-ef768c9-darwin_x64.zip',
archiveChecksum: 'd87287f6b2159cff7c64babac873cc73',
binaryChecksum: '8d777b3380a654e2730fc36afbfb11e1',
binaryRelativePath: 'headless_shell-darwin_x64/headless_shell',
archiveFilename: 'chrome-mac.zip',
archiveChecksum: '6aad6fa5a26d83e24e2f0d52de5230bf',
binaryChecksum: '2dc7a7250d849df4cab60f3b4a70c1ea',
binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
location: 'common',
archivePath: 'Mac',
},
{
platform: 'linux',
architecture: 'x64',
archiveFilename: 'chromium-ef768c9-linux_x64.zip',
archiveChecksum: '85575e8fd56849f4de5e3584e05712c0',
binaryChecksum: '38c4d849c17683def1283d7e5aa56fe9',
archiveFilename: 'chromium-d163fd7-linux_x64.zip',
archiveChecksum: 'fba0a240d409228a3494aef415c300fc',
binaryChecksum: '99cfab472d516038b94ef86649e52871',
binaryRelativePath: 'headless_shell-linux_x64/headless_shell',
location: 'custom',
},
{
platform: 'linux',
architecture: 'arm64',
archiveFilename: 'chromium-ef768c9-linux_arm64.zip',
archiveChecksum: '20b09b70476bea76a276c583bf72eac7',
binaryChecksum: 'dcfd277800c1a5c7d566c445cbdc225c',
archiveFilename: 'chromium-d163fd7-linux_arm64.zip',
archiveChecksum: '29834735bc2f0e0d9134c33bc0580fb6',
binaryChecksum: '13baccf2e5c8385cb9d9588db6a9e2c2',
binaryRelativePath: 'headless_shell-linux_arm64/headless_shell',
location: 'custom',
},
{
platform: 'win32',
architecture: 'x64',
archiveFilename: 'chromium-ef768c9-windows_x64.zip',
archiveChecksum: '33301c749b5305b65311742578c52f15',
binaryChecksum: '9f28dd56c7a304a22bf66f0097fa4de9',
binaryRelativePath: 'headless_shell-windows_x64\\headless_shell.exe',
archiveFilename: 'chrome-win.zip',
archiveChecksum: '64999a384bfb6c96c50c4cb6810dbc05',
binaryChecksum: '13b8bbb4a12f9036b8cc3b57b3a71fec',
binaryRelativePath: 'chrome-win\\chrome.exe',
location: 'common',
archivePath: 'Win',
},
];
@ -72,8 +90,11 @@ export class ChromiumArchivePaths {
return this.packages.map((p) => this.resolvePath(p));
}
public getDownloadUrl(p: PackageInfo) {
return BaseUrl.custom + `/${p.archiveFilename}`;
public getDownloadUrl(p: CustomPackageInfo | CommonPackageInfo) {
if (p.location === 'common') {
return `${BaseUrl.common}/${p.archivePath}/${this.revision}/${p.archiveFilename}`;
}
return BaseUrl.custom + '/' + p.archiveFilename;
}
public getBinaryPath(p: PackageInfo) {
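To illustrate how `getDownloadUrl` assembles a URL for a `common` package, here is the Mac archive for this revision (a sketch; the URL pieces are the enum value, `archivePath`, `revision`, and `archiveFilename` shown above):

```sh
# BaseUrl.common + / + archivePath + / + revision + / + archiveFilename
curl -I "https://commondatastorage.googleapis.com/chromium-browser-snapshots/Mac/856583/chrome-mac.zip"
```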


@ -40,16 +40,14 @@ export function installBrowser(
if (binaryChecksum !== pkg.binaryChecksum) {
await ensureBrowserDownloaded(logger);
await del(chromiumPath);
const archive = path.join(paths.archivesPath, pkg.archiveFilename);
logger.info(`Extracting [${archive}] to [${binaryPath}]`);
await del(chromiumPath);
logger.info(`Extracting [${archive}] to [${chromiumPath}]`);
await extract(archive, chromiumPath);
}
logger.debug(`Browser executable: ${binaryPath}`);
logger.info(`Browser executable: ${binaryPath}`);
binaryPath$.next(binaryPath); // subscribers wait for download and extract to complete
};


@ -9,7 +9,6 @@ import { APP_WRAPPER_CLASS } from '../../../../../../src/core/server';
export const DEFAULT_PAGELOAD_SELECTOR = `.${APP_WRAPPER_CLASS}`;
export const CONTEXT_GETNUMBEROFITEMS = 'GetNumberOfItems';
export const CONTEXT_GETBROWSERDIMENSIONS = 'GetBrowserDimensions';
export const CONTEXT_INJECTCSS = 'InjectCss';
export const CONTEXT_WAITFORRENDER = 'WaitForRender';
export const CONTEXT_GETTIMERANGE = 'GetTimeRange';


@ -10,54 +10,6 @@ import { LevelLogger, startTrace } from '../';
import { HeadlessChromiumDriver } from '../../browsers';
import { LayoutInstance } from '../layouts';
import { ElementsPositionAndAttribute, Screenshot } from './';
import { CONTEXT_GETBROWSERDIMENSIONS } from './constants';
// In Puppeteer 5.4+, the viewport size limits what the screenshot can take, even if a clip is specified. The clip area must
// be visible in the viewport. This workaround resizes the viewport to the actual content height and width.
// NOTE: this will fire a window resize event
const resizeToClipArea = async (
item: ElementsPositionAndAttribute,
browser: HeadlessChromiumDriver,
zoom: number,
logger: LevelLogger
) => {
// Check current viewport size
const { width, height, left, top } = item.position.boundingClientRect; // the "unscaled" pixel sizes
const [viewWidth, viewHeight] = await browser.evaluate(
{
fn: () => [document.body.clientWidth, document.body.clientHeight],
args: [],
},
{ context: CONTEXT_GETBROWSERDIMENSIONS },
logger
);
logger.debug(`Browser viewport: width=${viewWidth} height=${viewHeight}`);
// Resize the viewport if the clip area is not visible
if (viewWidth < width + left || viewHeight < height + top) {
logger.debug(`Item's position is not within the viewport.`);
// add left and top margin to unscaled measurements
const newWidth = width + left;
const newHeight = height + top;
logger.debug(
`Resizing browser viewport to: width=${newWidth} height=${newHeight} zoom=${zoom}`
);
await browser.setViewport(
{
width: newWidth,
height: newHeight,
zoom,
},
logger
);
}
logger.debug(`Capturing item: width=${width} height=${height} left=${left} top=${top}`);
};
export const getScreenshots = async (
browser: HeadlessChromiumDriver,
@ -77,7 +29,6 @@ export const getScreenshots = async (
const endTrace = startTrace('get_screenshots', 'read');
const item = elementsPositionAndAttributes[i];
await resizeToClipArea(item, browser, layout.getBrowserZoom(), logger);
const base64EncodedData = await browser.screenshot(item.position);
if (!base64EncodedData) {


@ -341,8 +341,6 @@ describe('Screenshot Observable Pipeline', () => {
if (mockCall === contexts.CONTEXT_ELEMENTATTRIBUTES) {
return Promise.resolve(null);
} else if (mockCall === contexts.CONTEXT_GETBROWSERDIMENSIONS) {
return Promise.resolve([800, 600]);
} else {
return Promise.resolve();
}


@ -65,9 +65,6 @@ mockBrowserEvaluate.mockImplementation(() => {
if (mockCall === contexts.CONTEXT_GETNUMBEROFITEMS) {
return Promise.resolve(1);
}
if (mockCall === contexts.CONTEXT_GETBROWSERDIMENSIONS) {
return Promise.resolve([600, 800]);
}
if (mockCall === contexts.CONTEXT_INJECTCSS) {
return Promise.resolve();
}


@ -40,8 +40,7 @@ export async function checkIfPngsMatch(
log.debug(`writeFile: ${baselineCopyPath}`);
await fs.writeFile(baselineCopyPath, await fs.readFile(baselinepngPath));
} catch (error) {
log.error(`No baseline png found at ${baselinepngPath}`);
return 0;
throw new Error(`No baseline png found at ${baselinepngPath}`);
}
log.debug(`writeFile: ${actualCopyPath}`);
await fs.writeFile(actualCopyPath, await fs.readFile(actualpngPath));

Binary file not shown (new image, 6.2 MiB).


@ -120,21 +120,21 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
});
describe('PNG Layout', () => {
it('downloads a PNG file', async function () {
const writeSessionReport = async (name: string, rawPdf: Buffer, reportExt: string) => {
const sessionDirectory = path.resolve(REPORTS_FOLDER, 'session');
await mkdirAsync(sessionDirectory, { recursive: true });
const sessionReportPath = path.resolve(sessionDirectory, `${name}.${reportExt}`);
await writeFileAsync(sessionReportPath, rawPdf);
return sessionReportPath;
};
const getBaselineReportPath = (fileName: string, reportExt: string) => {
const baselineFolder = path.resolve(REPORTS_FOLDER, 'baseline');
const fullPath = path.resolve(baselineFolder, `${fileName}.${reportExt}`);
log.debug(`getBaselineReportPath (${fullPath})`);
return fullPath;
};
const writeSessionReport = async (name: string, rawPdf: Buffer, reportExt: string) => {
const sessionDirectory = path.resolve(REPORTS_FOLDER, 'session');
await mkdirAsync(sessionDirectory, { recursive: true });
const sessionReportPath = path.resolve(sessionDirectory, `${name}.${reportExt}`);
await writeFileAsync(sessionReportPath, rawPdf);
return sessionReportPath;
};
const getBaselineReportPath = (fileName: string, reportExt: string) => {
const baselineFolder = path.resolve(REPORTS_FOLDER, 'baseline');
const fullPath = path.resolve(baselineFolder, `${fileName}.${reportExt}`);
log.debug(`getBaselineReportPath (${fullPath})`);
return fullPath;
};
it('downloads a PNG file: small dashboard', async function () {
this.timeout(300000);
await PageObjects.common.navigateToApp('dashboard');
@ -146,7 +146,31 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
const url = await PageObjects.reporting.getReportURL(60000);
const reportData = await PageObjects.reporting.getRawPdfReportData(url);
const reportFileName = 'dashboard_preserve_layout';
const reportFileName = 'small_dashboard_preserve_layout';
const sessionReportPath = await writeSessionReport(reportFileName, reportData, 'png');
const percentDiff = await checkIfPngsMatch(
sessionReportPath,
getBaselineReportPath(reportFileName, 'png'),
config.get('screenshots.directory'),
log
);
expect(percentDiff).to.be.lessThan(0.09);
});
it('downloads a PNG file: large dashboard', async function () {
this.timeout(300000);
await PageObjects.common.navigateToApp('dashboard');
await PageObjects.dashboard.loadSavedDashboard('Large Dashboard');
await PageObjects.reporting.openPngReportingPanel();
await PageObjects.reporting.forceSharedItemsContainerSize({ width: 1405 });
await PageObjects.reporting.clickGenerateReportButton();
await PageObjects.reporting.removeForceSharedItemsContainerSize();
const url = await PageObjects.reporting.getReportURL(200000);
const reportData = await PageObjects.reporting.getRawPdfReportData(url);
const reportFileName = 'large_dashboard_preserve_layout';
const sessionReportPath = await writeSessionReport(reportFileName, reportData, 'png');
const percentDiff = await checkIfPngsMatch(
sessionReportPath,
@ -160,9 +184,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
});
describe('Preserve Layout', () => {
it('downloads a PDF file', async function () {
// Generating and then comparing reports can take longer than the default 60s timeout because the comparePngs
// function is taking about 15 seconds per comparison in jenkins.
it('downloads a PDF file: small dashboard', async function () {
this.timeout(300000);
await PageObjects.common.navigateToApp('dashboard');
await PageObjects.dashboard.loadSavedDashboard('Ecom Dashboard');
@ -176,10 +198,22 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
expect(res.get('content-type')).to.equal('application/pdf');
});
it('downloads a PDF file: large dashboard', async function () {
this.timeout(300000);
await PageObjects.common.navigateToApp('dashboard');
await PageObjects.dashboard.loadSavedDashboard('Large Dashboard');
await PageObjects.reporting.openPdfReportingPanel();
await PageObjects.reporting.clickGenerateReportButton();
const url = await PageObjects.reporting.getReportURL(60000);
const res = await PageObjects.reporting.getResponse(url);
expect(res.status).to.equal(200);
expect(res.get('content-type')).to.equal('application/pdf');
});
it('downloads a PDF file with saved search given EuiDataGrid enabled', async function () {
await kibanaServer.uiSettings.replace({ 'doc_table:legacy': false });
// Generating and then comparing reports can take longer than the default 60s timeout because the comparePngs
// function is taking about 15 seconds per comparison in jenkins.
this.timeout(300000);
await PageObjects.common.navigateToApp('dashboard');
await PageObjects.dashboard.loadSavedDashboard('Ecom Dashboard');

File diff suppressed because one or more lines are too long


@ -5599,13 +5599,6 @@
resolved "https://registry.yarnpkg.com/@types/proper-lockfile/-/proper-lockfile-3.0.1.tgz#dd770a2abce3adbcce3bd1ed892ce2f5f17fbc86"
integrity sha512-ODOjqxmaNs0Zkij+BJovsNJRSX7BJrr681o8ZnNTNIcTermvVFzLpz/XFtfg3vNrlPVTJY1l4e9h2LvHoxC1lg==
"@types/puppeteer@^5.4.1":
version "5.4.1"
resolved "https://registry.yarnpkg.com/@types/puppeteer/-/puppeteer-5.4.1.tgz#8d0075ad7705e8061b06df6a9a3abc6ca5fb7cd9"
integrity sha512-mEytIRrqvsFgs16rHOa5jcZcoycO/NSjg1oLQkFUegj3HOHeAP1EUfRi+eIsJdGrx2oOtfN39ckibkRXzs+qXA==
dependencies:
"@types/node" "*"
"@types/q@^1.5.1":
version "1.5.4"
resolved "https://registry.yarnpkg.com/@types/q/-/q-1.5.4.tgz#15925414e0ad2cd765bfef58842f7e26a7accb24"
@ -11654,16 +11647,16 @@ detective@^5.0.2, detective@^5.2.0:
defined "^1.0.0"
minimist "^1.1.1"
devtools-protocol@0.0.809251:
version "0.0.809251"
resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.809251.tgz#300b3366be107d5c46114ecb85274173e3999518"
integrity sha512-pf+2OY6ghMDPjKkzSWxHMq+McD+9Ojmq5XVRYpv/kPd9sTMQxzEt21592a31API8qRjro0iYYOc3ag46qF/1FA==
devtools-protocol@0.0.818844:
version "0.0.818844"
resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.818844.tgz#d1947278ec85b53e4c8ca598f607a28fa785ba9e"
integrity sha512-AD1hi7iVJ8OD0aMLQU5VK0XH9LDlA1+BcPIgrAxPfaibx2DbWucuyOhc4oyQCbnvDDO68nN6/LcKfqTP343Jjg==
devtools-protocol@0.0.854822:
version "0.0.854822"
resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.854822.tgz#eac3a5260a6b3b4e729a09fdc0c77b0d322e777b"
integrity sha512-xd4D8kHQtB0KtWW0c9xBZD5LVtm9chkMOfs/3Yn01RhT/sFIsVtzTtypfKoFfWBaL+7xCYLxjOLkhwPXaX/Kcg==
dezalgo@^1.0.0:
version "1.0.3"
resolved "https://registry.yarnpkg.com/dezalgo/-/dezalgo-1.0.3.tgz#7f742de066fc748bc8db820569dddce49bf0d456"
@ -22454,19 +22447,19 @@ puppeteer@^5.3.1:
unbzip2-stream "^1.3.3"
ws "^7.2.3"
"puppeteer@npm:@elastic/puppeteer@5.4.1-patch.1":
version "5.4.1-patch.1"
resolved "https://registry.yarnpkg.com/@elastic/puppeteer/-/puppeteer-5.4.1-patch.1.tgz#61af43ec7df47d1042c8708c386cfa7af76e08f7"
integrity sha512-I4JbNmQHZkE72TPNdipND8GnsEBnqzuksxPSAT25qvudShuuzdY9TwNBQ65IJwPD/pjlpx7fUIUmFyvTHwlxhQ==
puppeteer@^8.0.0:
version "8.0.0"
resolved "https://registry.yarnpkg.com/puppeteer/-/puppeteer-8.0.0.tgz#a236669118aa795331c2d0ca19877159e7664705"
integrity sha512-D0RzSWlepeWkxPPdK3xhTcefj8rjah1791GE82Pdjsri49sy11ci/JQsAO8K2NRukqvwEtcI+ImP5F4ZiMvtIQ==
dependencies:
debug "^4.1.0"
devtools-protocol "0.0.809251"
devtools-protocol "0.0.854822"
extract-zip "^2.0.0"
https-proxy-agent "^4.0.0"
https-proxy-agent "^5.0.0"
node-fetch "^2.6.1"
pkg-dir "^4.2.0"
progress "^2.0.1"
proxy-from-env "^1.0.0"
proxy-from-env "^1.1.0"
rimraf "^3.0.2"
tar-fs "^2.0.0"
unbzip2-stream "^1.3.3"