rename advanced setting ml:fileDataVisualizerMaxFileSize to fileUpload:maxFileSize and increase max geojson upload size to 1GB (#92620) (#93358)

* rename ml:fileDataVisualizerMaxFileSize to fileUpload:maxFileSize

* add saved object migration

* file preview

* importing status

* remove console statement

* import complete view

* fix geojson_importer test

* tslint

* i18n fixes

* cleanup

* update documentation for advanced setting rename

* advanced settings usage_collection

* remove ml:fileDataVisualizerMaxFileSize from schemas and types

* add copy buttons for import response and fix geojson upload functional tests

* tslint

* remove clipboard-read check

* return early if env does not support reading from clipboard

* fix reporting tests

* review feedback

* update GeoJsonFileSource to support showing results trimmed icon and tooltip

* add fileUpload to useMlKibana context and replace dependencyCache with useMlKibana

* tslint

* review feedback

* lower case file name

* default to selecting geo_shape when file contains both points and shapes

* fix wizard onError callback to not advance to next step

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>

This commit is contained in:
Nathan Reese 2021-03-02 21:23:49 -07:00 committed by GitHub
parent e7195f482a
commit b7bfbeff31
47 changed files with 1446 additions and 958 deletions


@ -83,6 +83,10 @@ specific dashboard, application, or saved object as they enter each space.
[[fields-popularlimit]]`fields:popularLimit`::
The top N most popular fields to show.
[[fileupload-maxfilesize]]`fileUpload:maxFileSize`::
Sets the file size limit when importing files. The default
value is `100MB`. The highest supported value for this setting is `1GB`.
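The setting accepts human-readable size strings. As a hedged illustration (the helper below is hypothetical, not part of this commit), a parser for such values under the plugin's binary-unit convention (`MB = 2^20` bytes) might look like:

```typescript
// Hypothetical helper: converts a fileUpload:maxFileSize setting string
// such as '100MB' or '1GB' into a byte count, using binary units to match
// the plugin's MB constant (2 ** 20).
const UNITS: Record<string, number> = {
  KB: 2 ** 10,
  MB: 2 ** 20,
  GB: 2 ** 30,
};

function maxFileSizeToBytes(setting: string): number {
  const match = setting.trim().match(/^(\d+)\s*(KB|MB|GB)$/i);
  if (!match) {
    throw new Error(`Unrecognized file size: ${setting}`);
  }
  return parseInt(match[1], 10) * UNITS[match[2].toUpperCase()];
}

// The documented default and maximum:
// maxFileSizeToBytes('100MB') === 104857600
// maxFileSizeToBytes('1GB')   === 1073741824
```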
[[filtereditor-suggestvalues]]`filterEditor:suggestValues`::
Set this property to `false` to prevent the filter editor from suggesting values
for fields.
@ -258,7 +262,7 @@ Hides the "Time" column in *Discover* and in all saved searches on dashboards.
Highlights results in *Discover* and saved searches on dashboards. Highlighting
slows requests when working on big documents.
[[doctable-legacy]]`doc_table:legacy`::
[[doctable-legacy]]`doc_table:legacy`::
Controls the way the document table looks and works. Set this property to `true` to revert to the legacy implementation.
[[discover-searchFieldsFromSource]]`discover:searchFieldsFromSource`::
@ -282,10 +286,6 @@ must contain `from` and `to` values (see
{ref}/common-options.html#date-math[accepted formats]). It is ignored unless
`ml:anomalyDetection:results:enableTimeDefaults` is enabled.
[[ml-filedatavisualizermaxfilesize]]`ml:fileDataVisualizerMaxFileSize`::
Sets the file size limit when importing data in the {data-viz}. The default
value is `100MB`. The highest supported value for this setting is `1GB`.
[float]
[[kibana-notification-settings]]


@ -20,10 +20,10 @@ image::user/ml/images/ml-data-visualizer-sample.jpg[{data-viz} for sample flight
experimental[] You can also upload a CSV, NDJSON, or log file. The *{data-viz}*
identifies the file format and field mappings. You can then optionally import
that data into an {es} index. To change the default file size limit, see
<<kibana-ml-settings,Machine learning advanced settings>>.
<<kibana-general-settings, fileUpload:maxFileSize advanced settings>>.
If {stack-security-features} are enabled, users must have the necessary
privileges to use {ml-features}. Refer to
privileges to use {ml-features}. Refer to
{ml-docs}/setup.html#setup-privileges[Set up {ml-features}].
NOTE: There are limitations in {ml-features} that affect {kib}. For more information, refer to {ml-docs}/ml-limitations.html[Machine learning].
@ -40,15 +40,15 @@ false positives. {anomaly-detect-cap} runs in and scales with {es}, and
includes an intuitive UI on the {kib} *Machine Learning* page for creating
{anomaly-jobs} and understanding results.
If you have a license that includes the {ml-features}, you can
If you have a license that includes the {ml-features}, you can
create {anomaly-jobs} and manage jobs and {dfeeds} from the *Job Management*
pane:
pane:
[role="screenshot"]
image::user/ml/images/ml-job-management.png[Job Management]
You can use the *Settings* pane to create and edit
{ml-docs}/ml-calendars.html[calendars] and the filters that are used in
You can use the *Settings* pane to create and edit
{ml-docs}/ml-calendars.html[calendars] and the filters that are used in
{ml-docs}/ml-rules.html[custom rules]:
[role="screenshot"]
@ -69,13 +69,13 @@ occurring in your operational environment at that time:
image::user/ml/images/ml-annotations-list.png[Single Metric Viewer with annotations]
In some circumstances, annotations are also added automatically. For example, if
the {anomaly-job} detects that there is missing data, it annotates the affected
time period. For more information, see
{ml-docs}/ml-delayed-data-detection.html[Handling delayed data]. The
the {anomaly-job} detects that there is missing data, it annotates the affected
time period. For more information, see
{ml-docs}/ml-delayed-data-detection.html[Handling delayed data]. The
*Job Management* pane shows the full list of annotations for each job.
NOTE: The {kib} {ml-features} use pop-ups. You must configure your web
browser so that it does not block pop-up windows or create an exception for your
NOTE: The {kib} {ml-features} use pop-ups. You must configure your web
browser so that it does not block pop-up windows or create an exception for your
{kib} URL.
For more information about the {anomaly-detect} feature, see
@ -89,7 +89,7 @@ experimental[]
The Elastic {ml} {dfanalytics} feature enables you to analyze your data using
{classification}, {oldetection}, and {regression} algorithms and generate new
indices that contain the results alongside your source data.
indices that contain the results alongside your source data.
If you have a license that includes the {ml-features}, you can create
{dfanalytics-jobs} and view their results on the *Data Frame Analytics* page in
@ -98,5 +98,5 @@ If you have a license that includes the {ml-features}, you can create
[role="screenshot"]
image::user/ml/images/outliers.png[{oldetection-cap} results in {kib}]
For more information about the {dfanalytics} feature, see
{ml-docs}/ml-dfanalytics.html[{ml-cap} {dfanalytics}].
For more information about the {dfanalytics} feature, see
{ml-docs}/ml-dfanalytics.html[{ml-cap} {dfanalytics}].


@ -44,3 +44,37 @@ describe('ui_settings 7.9.0 migrations', () => {
});
});
});
describe('ui_settings 7.13.0 migrations', () => {
const migration = migrations['7.13.0'];
test('returns doc on empty object', () => {
expect(migration({} as SavedObjectUnsanitizedDoc)).toEqual({
references: [],
});
});
test('properly renames ml:fileDataVisualizerMaxFileSize to fileUpload:maxFileSize', () => {
const doc = {
type: 'config',
id: '8.0.0',
attributes: {
buildNum: 9007199254740991,
'ml:fileDataVisualizerMaxFileSize': '250MB',
},
references: [],
updated_at: '2020-06-09T20:18:20.349Z',
migrationVersion: {},
};
expect(migration(doc)).toEqual({
type: 'config',
id: '8.0.0',
attributes: {
buildNum: 9007199254740991,
'fileUpload:maxFileSize': '250MB',
},
references: [],
updated_at: '2020-06-09T20:18:20.349Z',
migrationVersion: {},
});
});
});


@ -28,4 +28,23 @@ export const migrations = {
}),
references: doc.references || [],
}),
'7.13.0': (doc: SavedObjectUnsanitizedDoc<any>): SavedObjectSanitizedDoc<any> => ({
...doc,
...(doc.attributes && {
attributes: Object.keys(doc.attributes).reduce(
(acc, key) =>
key === 'ml:fileDataVisualizerMaxFileSize'
? {
...acc,
['fileUpload:maxFileSize']: doc.attributes[key],
}
: {
...acc,
[key]: doc.attributes[key],
},
{}
),
}),
references: doc.references || [],
}),
};
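The reduce above rebuilds the attributes object key by key so the renamed setting keeps its position and every other attribute survives untouched. A standalone sketch of the same technique (the `renameKey` helper is illustrative, not part of the commit):

```typescript
// Rebuild an object, swapping oldKey for newKey while preserving all other
// entries and their insertion order -- the core of the 7.13.0 migration.
function renameKey(
  attributes: Record<string, unknown>,
  oldKey: string,
  newKey: string
): Record<string, unknown> {
  return Object.keys(attributes).reduce<Record<string, unknown>>(
    (acc, key) => ({
      ...acc,
      [key === oldKey ? newKey : key]: attributes[key],
    }),
    {}
  );
}

const migrated = renameKey(
  { buildNum: 9007199254740991, 'ml:fileDataVisualizerMaxFileSize': '250MB' },
  'ml:fileDataVisualizerMaxFileSize',
  'fileUpload:maxFileSize'
);
// migrated['fileUpload:maxFileSize'] === '250MB'; the old key is gone.
```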


@ -220,7 +220,7 @@ export const stackManagementSchema: MakeSchemaFrom<UsageStats> = {
type: 'boolean',
_meta: { description: 'Non-default value of setting.' },
},
'ml:fileDataVisualizerMaxFileSize': {
'fileUpload:maxFileSize': {
type: 'keyword',
_meta: { description: 'Non-default value of setting.' },
},


@ -74,7 +74,7 @@ export interface UsageStats {
'discover:sort:defaultOrder': string;
'context:step': number;
'accessibility:disableAnimations': boolean;
'ml:fileDataVisualizerMaxFileSize': string;
'fileUpload:maxFileSize': string;
'ml:anomalyDetection:results:enableTimeDefaults': boolean;
'ml:anomalyDetection:results:timeDefaults': string;
'truncate:maxHeight': number;


@ -7742,7 +7742,7 @@
"description": "Non-default value of setting."
}
},
"ml:fileDataVisualizerMaxFileSize": {
"fileUpload:maxFileSize": {
"type": "keyword",
"_meta": {
"description": "Non-default value of setting."


@ -5,6 +5,8 @@
* 2.0.
*/
export const UI_SETTING_MAX_FILE_SIZE = 'fileUpload:maxFileSize';
export const MB = Math.pow(2, 20);
export const MAX_FILE_SIZE = '100MB';
export const MAX_FILE_SIZE_BYTES = 104857600; // 100MB


@ -4,5 +4,6 @@
"kibanaVersion": "kibana",
"server": true,
"ui": true,
"requiredPlugins": ["data", "usageCollection"]
"requiredPlugins": ["data", "usageCollection"],
"requiredBundles": ["kibanaReact"]
}


@ -12,6 +12,8 @@ import type { IImporter, ImportFactoryOptions } from '../importer';
export interface FileUploadStartApi {
getFileUploadComponent(): Promise<React.ComponentType<FileUploadComponentProps>>;
importerFactory(format: string, options: ImportFactoryOptions): Promise<IImporter | undefined>;
getMaxBytes(): number;
getMaxBytesFormatted(): string;
}
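The two new methods let consuming plugins (such as Maps) read the limit without reaching into file-upload internals. A minimal sketch with a stubbed implementation (the stub values are assumptions; the real plugin resolves them from the `fileUpload:maxFileSize` advanced setting):

```typescript
// Stubbed start-contract consumer, for illustration only.
interface FileUploadSizeApi {
  getMaxBytes(): number;
  getMaxBytesFormatted(): string;
}

const fakeFileUploadStart: FileUploadSizeApi = {
  getMaxBytes: () => 104857600, // assumed default of 100MB in bytes
  getMaxBytesFormatted: () => '100MB',
};

// A consumer can size its file-picker help text from the contract:
const helpText = `Max size: ${fakeFileUploadStart.getMaxBytesFormatted()}`;
// helpText === 'Max size: 100MB'
```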
export async function getFileUploadComponent(): Promise<


@ -0,0 +1,155 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Component } from 'react';
import { EuiFilePicker, EuiFormRow } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { MB } from '../../common';
import { getMaxBytesFormatted } from '../get_max_bytes';
import { validateFile } from '../importer';
import { GeoJsonImporter, GeoJsonPreview, GEOJSON_FILE_TYPES } from '../importer/geojson_importer';
interface Props {
onSelect: ({
features,
hasPoints,
hasShapes,
importer,
indexName,
previewCoverage,
}: GeoJsonPreview & {
indexName: string;
importer: GeoJsonImporter;
}) => void;
onClear: () => void;
}
interface State {
error: string | null;
isLoadingPreview: boolean;
previewSummary: string | null;
}
export class GeoJsonFilePicker extends Component<Props, State> {
private _isMounted = false;
state: State = {
error: null,
isLoadingPreview: false,
previewSummary: null,
};
async componentDidMount() {
this._isMounted = true;
}
componentWillUnmount() {
this._isMounted = false;
}
_onFileSelect = (files: FileList | null) => {
this.props.onClear();
this.setState({
error: null,
isLoadingPreview: false,
previewSummary: null,
});
if (files && files.length) {
this._loadFilePreview(files[0]);
}
};
async _loadFilePreview(file: File) {
this.setState({ isLoadingPreview: true });
let importer: GeoJsonImporter | null = null;
let previewError: string | null = null;
let preview: GeoJsonPreview | null = null;
try {
validateFile(file, GEOJSON_FILE_TYPES);
importer = new GeoJsonImporter(file);
preview = await importer.previewFile(10000, MB * 3);
if (preview.features.length === 0) {
previewError = i18n.translate('xpack.fileUpload.geojsonFilePicker.noFeaturesDetected', {
defaultMessage: 'No GeoJson features found in selected file.',
});
}
} catch (error) {
previewError = error.message;
}
if (!this._isMounted) {
return;
}
this.setState({
error: previewError,
isLoadingPreview: false,
previewSummary:
!previewError && preview
? i18n.translate('xpack.fileUpload.geojsonFilePicker.previewSummary', {
defaultMessage: 'Previewing {numFeatures} features, {previewCoverage}% of file.',
values: {
numFeatures: preview.features.length,
previewCoverage: preview.previewCoverage,
},
})
: null,
});
if (importer && preview) {
this.props.onSelect({
...preview,
importer,
indexName: file.name.split('.')[0].toLowerCase(),
});
}
}
_renderHelpText() {
return this.state.previewSummary !== null ? (
this.state.previewSummary
) : (
<span>
{i18n.translate('xpack.fileUpload.geojsonFilePicker.acceptedFormats', {
defaultMessage: 'Formats accepted: {fileTypes}',
values: { fileTypes: GEOJSON_FILE_TYPES.join(', ') },
})}
<br />
{i18n.translate('xpack.fileUpload.geojsonFilePicker.maxSize', {
defaultMessage: 'Max size: {maxFileSize}',
values: { maxFileSize: getMaxBytesFormatted() },
})}
<br />
{i18n.translate('xpack.fileUpload.geojsonFilePicker.acceptedCoordinateSystem', {
defaultMessage: 'Coordinates must be in EPSG:4326 coordinate reference system.',
})}
</span>
);
}
render() {
return (
<EuiFormRow
isInvalid={!!this.state.error}
error={!!this.state.error ? [this.state.error] : []}
helpText={this._renderHelpText()}
>
<EuiFilePicker
initialPromptText={i18n.translate('xpack.fileUpload.geojsonFilePicker.filePicker', {
defaultMessage: 'Select or drag and drop a file',
})}
onChange={this._onFileSelect}
accept={GEOJSON_FILE_TYPES.join(',')}
isLoading={this.state.isLoadingPreview}
/>
</EuiFormRow>
);
}
}


@ -0,0 +1,159 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Component, Fragment } from 'react';
import { i18n } from '@kbn/i18n';
import {
EuiButtonIcon,
EuiCallOut,
EuiCopy,
EuiFlexGroup,
EuiFlexItem,
EuiSpacer,
EuiText,
EuiTitle,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { CodeEditor, KibanaContextProvider } from '../../../../../src/plugins/kibana_react/public';
import { getHttp, getUiSettings } from '../kibana_services';
import { ImportResponse } from '../../common';
const services = {
uiSettings: getUiSettings(),
};
interface Props {
importResp?: ImportResponse;
indexPatternResp?: object;
}
export class ImportCompleteView extends Component<Props, {}> {
_renderCodeEditor(json: object | undefined, title: string, copyButtonDataTestSubj: string) {
if (!json) {
return null;
}
const jsonAsString = JSON.stringify(json, null, 2);
return (
<Fragment>
<EuiFlexGroup justifyContent="spaceBetween" alignItems="flexEnd">
<EuiFlexItem grow={false}>
<EuiTitle size="xxs">
<h4>{title}</h4>
</EuiTitle>
</EuiFlexItem>
<EuiFlexItem grow={false}>
<EuiCopy textToCopy={jsonAsString}>
{(copy) => (
<EuiButtonIcon
size="s"
onClick={copy}
iconType="copy"
color="text"
data-test-subj={copyButtonDataTestSubj}
aria-label={i18n.translate('xpack.fileUpload.copyButtonAriaLabel', {
defaultMessage: 'Copy to clipboard',
})}
/>
)}
</EuiCopy>
</EuiFlexItem>
</EuiFlexGroup>
<div style={{ height: '200px' }}>
<CodeEditor
languageId="json"
value={jsonAsString}
onChange={() => {}}
options={{
readOnly: true,
lineNumbers: 'off',
fontSize: 12,
minimap: {
enabled: false,
},
scrollBeyondLastLine: false,
wordWrap: 'on',
wrappingIndent: 'indent',
automaticLayout: true,
}}
/>
</div>
<EuiSpacer size="m" />
</Fragment>
);
}
_getStatusMsg() {
if (!this.props.importResp || !this.props.importResp.success) {
return i18n.translate('xpack.fileUpload.uploadFailureMsg', {
defaultMessage: 'File upload failed.',
});
}
const successMsg = i18n.translate('xpack.fileUpload.uploadSuccessMsg', {
defaultMessage: 'File upload complete: indexed {numFeatures} features.',
values: {
numFeatures: this.props.importResp.docCount,
},
});
const failedFeaturesMsg = this.props.importResp.failures.length
? i18n.translate('xpack.fileUpload.failedFeaturesMsg', {
defaultMessage: 'Unable to index {numFailures} features.',
values: {
numFailures: this.props.importResp.failures.length,
},
})
: '';
return `${successMsg} ${failedFeaturesMsg}`;
}
render() {
return (
<KibanaContextProvider services={services}>
<EuiText>
<p>{this._getStatusMsg()}</p>
</EuiText>
{this._renderCodeEditor(
this.props.importResp,
i18n.translate('xpack.fileUpload.jsonImport.indexingResponse', {
defaultMessage: 'Import response',
}),
'indexRespCopyButton'
)}
{this._renderCodeEditor(
this.props.indexPatternResp,
i18n.translate('xpack.fileUpload.jsonImport.indexPatternResponse', {
defaultMessage: 'Index pattern response',
}),
'indexPatternRespCopyButton'
)}
<EuiCallOut>
<div>
<FormattedMessage
id="xpack.fileUpload.jsonImport.indexModsMsg"
defaultMessage="Further index modifications can be made using "
/>
<a
data-test-subj="indexManagementNewIndexLink"
target="_blank"
href={getHttp().basePath.prepend('/app/management/kibana/indexPatterns')}
>
<FormattedMessage
id="xpack.fileUpload.jsonImport.indexMgmtLink"
defaultMessage="Index Management"
/>
</a>
</div>
</EuiCallOut>
</KibanaContextProvider>
);
}
}


@ -118,6 +118,7 @@ export class IndexSettings extends Component {
text: indexType,
value: indexType,
}))}
value={this.props.selectedIndexType}
onChange={({ target }) => setSelectedIndexType(target.value)}
/>
</EuiFormRow>


@ -1,135 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Fragment, Component } from 'react';
import { i18n } from '@kbn/i18n';
import { EuiCodeBlock, EuiSpacer, EuiText, EuiTitle, EuiProgress, EuiCallOut } from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { getHttp } from '../kibana_services';
export class JsonImportProgress extends Component {
state = {
indexDataJson: null,
indexPatternJson: null,
indexName: '',
importStage: '',
};
componentDidUpdate(prevProps, prevState) {
this._setIndex(this.props);
this._formatIndexDataResponse({ ...this.state, ...this.props });
this._formatIndexPatternResponse({ ...this.state, ...this.props });
if (prevState.importStage !== this.props.importStage) {
this.setState({
importStage: this.props.importStage,
});
}
}
// Retain last index for UI purposes
_setIndex = ({ indexName }) => {
if (indexName && !this.state.indexName) {
this.setState({ indexName });
}
};
// Format json responses
_formatIndexDataResponse = ({ indexDataResp, indexDataJson }) => {
if (indexDataResp && !indexDataJson) {
this.setState({ indexDataJson: JSON.stringify(indexDataResp, null, 2) });
}
};
_formatIndexPatternResponse = ({ indexPatternResp, indexPatternJson }) => {
if (indexPatternResp && !indexPatternJson) {
this.setState({ indexPatternJson: JSON.stringify(indexPatternResp, null, 2) });
}
};
render() {
const { complete } = this.props;
const { indexPatternJson, indexDataJson, indexName, importStage } = this.state;
const importMessage = complete ? importStage : `${importStage}: ${indexName}`;
return (
<Fragment>
{!complete ? <EuiProgress size="xs" color="accent" position="absolute" /> : null}
<EuiTitle size="xs">
<h3>
<FormattedMessage
id="xpack.fileUpload.jsonImport.indexingStatus"
defaultMessage="Indexing status"
/>
</h3>
</EuiTitle>
<EuiText>{importMessage && <p>{importMessage}</p>}</EuiText>
<EuiSpacer size="m" />
{complete ? (
<Fragment>
{indexDataJson ? (
<Fragment>
<EuiTitle size="xxs">
<h4>
<FormattedMessage
id="xpack.fileUpload.jsonImport.indexingResponse"
defaultMessage="Indexing response"
/>
</h4>
</EuiTitle>
<EuiCodeBlock
data-test-subj="indexRespCodeBlock"
paddingSize="s"
overflowHeight={200}
>
{indexDataJson}
</EuiCodeBlock>
<EuiSpacer size="m" />
</Fragment>
) : null}
{indexPatternJson ? (
<Fragment>
<EuiTitle size="xxs">
<h4>
<FormattedMessage
id="xpack.fileUpload.jsonImport.indexPatternResponse"
defaultMessage="Index pattern response"
/>
</h4>
</EuiTitle>
<EuiCodeBlock
data-test-subj="indexPatternRespCodeBlock"
paddingSize="s"
overflowHeight={200}
>
{indexPatternJson}
</EuiCodeBlock>
<EuiSpacer size="m" />
</Fragment>
) : null}
<EuiCallOut>
<div>
{i18n.translate('xpack.fileUpload.jsonImport.indexModsMsg', {
defaultMessage: 'Further index modifications can be made using\n',
})}
<a
data-test-subj="indexManagementNewIndexLink"
target="_blank"
href={getHttp().basePath.prepend('/app/management/kibana/indexPatterns')}
>
{i18n.translate('xpack.fileUpload.jsonImport.indexMgmtLink', {
defaultMessage: 'Index Management',
})}
</a>
.
</div>
</EuiCallOut>
</Fragment>
) : null}
</Fragment>
);
}
}


@ -1,267 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Fragment, Component } from 'react';
import { EuiFilePicker, EuiFormRow, EuiProgress } from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { i18n } from '@kbn/i18n';
const MAX_FILE_SIZE = 52428800;
const ACCEPTABLE_FILETYPES = ['json', 'geojson'];
const acceptedFileTypeString = ACCEPTABLE_FILETYPES.map((type) => `.${type}`).join(',');
const acceptedFileTypeStringMessage = ACCEPTABLE_FILETYPES.map((type) => `.${type}`).join(', ');
export class JsonIndexFilePicker extends Component {
state = {
fileUploadError: '',
percentageProcessed: 0,
featuresProcessed: 0,
fileParseActive: false,
currentFileTracker: null,
};
async componentDidMount() {
this._isMounted = true;
}
componentWillUnmount() {
this._isMounted = false;
}
isFileParseActive = () => this._isMounted && this.state.fileParseActive;
_fileHandler = (fileList) => {
const fileArr = Array.from(fileList);
this.props.resetFileAndIndexSettings();
this.setState({
fileUploadError: '',
percentageProcessed: 0,
featuresProcessed: 0,
});
if (fileArr.length === 0) {
// Remove
this.setState({
fileParseActive: false,
});
return;
}
const file = fileArr[0];
this.setState(
{
fileParseActive: true,
currentFileTracker: Symbol(),
},
() => this._parseFile(file)
);
};
_getFileNameAndCheckType({ name }) {
let fileNameOnly;
try {
if (!name) {
throw new Error(
i18n.translate('xpack.fileUpload.jsonIndexFilePicker.noFileNameError', {
defaultMessage: 'No file name provided',
})
);
}
const splitNameArr = name.split('.');
const fileType = splitNameArr.pop();
if (!ACCEPTABLE_FILETYPES.includes(fileType)) {
//should only occur if browser does not accept the <File> accept parameter
throw new Error(
i18n.translate('xpack.fileUpload.jsonIndexFilePicker.acceptableTypesError', {
defaultMessage: 'File is not one of acceptable types: {types}',
values: {
types: ACCEPTABLE_FILETYPES.join(', '),
},
})
);
}
fileNameOnly = splitNameArr[0];
} catch (error) {
this.setState({
fileUploadError: i18n.translate(
'xpack.fileUpload.jsonIndexFilePicker.fileProcessingError',
{
defaultMessage: 'File processing error: {errorMessage}',
values: {
errorMessage: error.message,
},
}
),
});
return;
}
return fileNameOnly.toLowerCase();
}
setFileProgress = ({ featuresProcessed, bytesProcessed, totalBytes }) => {
const percentageProcessed = parseInt((100 * bytesProcessed) / totalBytes);
if (this.isFileParseActive()) {
this.setState({ featuresProcessed, percentageProcessed });
}
};
async _parseFile(file) {
const { currentFileTracker } = this.state;
const { setFileRef, setParsedFile, resetFileAndIndexSettings } = this.props;
if (file.size > MAX_FILE_SIZE) {
this.setState({
fileUploadError: i18n.translate('xpack.fileUpload.jsonIndexFilePicker.acceptableFileSize', {
defaultMessage: 'File size {fileSize} exceeds maximum file size of {maxFileSize}',
values: {
fileSize: bytesToSize(file.size),
maxFileSize: bytesToSize(MAX_FILE_SIZE),
},
}),
});
resetFileAndIndexSettings();
return;
}
const defaultIndexName = this._getFileNameAndCheckType(file);
if (!defaultIndexName) {
resetFileAndIndexSettings();
return;
}
const fileResult = await this.props.geojsonImporter
.readFile(file, this.setFileProgress, this.isFileParseActive)
.catch((err) => {
if (this._isMounted) {
this.setState({
fileParseActive: false,
percentageProcessed: 0,
featuresProcessed: 0,
fileUploadError: (
<FormattedMessage
id="xpack.fileUpload.jsonIndexFilePicker.unableParseFile"
defaultMessage="Unable to parse file: {error}"
values={{
error: err.message,
}}
/>
),
});
resetFileAndIndexSettings();
return;
}
});
if (!this._isMounted) {
return;
}
// If another file is replacing this one, leave file parse active
this.setState({
percentageProcessed: 0,
featuresProcessed: 0,
fileParseActive: currentFileTracker !== this.state.currentFileTracker,
});
if (!fileResult) {
resetFileAndIndexSettings();
return;
}
if (fileResult.errors.length) {
this.setState({
fileUploadError: (
<FormattedMessage
id="xpack.fileUpload.jsonIndexFilePicker.fileParseError"
defaultMessage="File parse error(s) detected: {error}"
values={{ error: fileResult.errors[0] }}
/>
),
});
}
setFileRef(file);
setParsedFile(fileResult, defaultIndexName);
}
render() {
const { fileUploadError, percentageProcessed, featuresProcessed } = this.state;
return (
<Fragment>
{percentageProcessed ? (
<EuiProgress
value={percentageProcessed}
max={100}
size="xs"
color="accent"
position="absolute"
/>
) : null}
<EuiFormRow
label={
<FormattedMessage
id="xpack.fileUpload.jsonIndexFilePicker.filePickerLabel"
defaultMessage="Select a file to upload"
/>
}
isInvalid={fileUploadError !== ''}
error={[fileUploadError]}
helpText={
percentageProcessed ? (
i18n.translate('xpack.fileUpload.jsonIndexFilePicker.parsingFile', {
defaultMessage: '{featuresProcessed} features parsed...',
values: {
featuresProcessed,
},
})
) : (
<span>
{i18n.translate('xpack.fileUpload.jsonIndexFilePicker.formatsAccepted', {
defaultMessage: 'Formats accepted: {acceptedFileTypeStringMessage}',
values: {
acceptedFileTypeStringMessage,
},
})}{' '}
<br />
<FormattedMessage
id="xpack.fileUpload.jsonIndexFilePicker.maxSize"
defaultMessage="Max size: {maxFileSize}"
values={{
maxFileSize: bytesToSize(MAX_FILE_SIZE),
}}
/>
<br />
{i18n.translate('xpack.fileUpload.jsonIndexFilePicker.coordinateSystemAccepted', {
defaultMessage: 'Coordinates must be in EPSG:4326 coordinate reference system.',
})}{' '}
</span>
)
}
>
<EuiFilePicker
initialPromptText={
<FormattedMessage
id="xpack.fileUpload.jsonIndexFilePicker.filePicker"
defaultMessage="Upload file"
/>
}
onChange={this._fileHandler}
accept={acceptedFileTypeString}
/>
</EuiFormRow>
</Fragment>
);
}
}
function bytesToSize(bytes) {
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
if (bytes === 0) return 'n/a';
const i = parseInt(Math.floor(Math.log(bytes) / Math.log(1024)), 10);
if (i === 0) return `${bytes} ${sizes[i]})`;
return `${(bytes / 1024 ** i).toFixed(1)} ${sizes[i]}`;
}
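Note that the `i === 0` branch of the deleted helper above emits a stray closing parenthesis after the unit label. For reference, a cleaned-up version of the same helper (illustrative only, since the file is being removed in this commit):

```typescript
// Corrected bytesToSize: drops the stray ')' in the i === 0 branch and uses
// Math.floor directly instead of parseInt on a number.
function bytesToSize(bytes: number): string {
  const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
  if (bytes === 0) return 'n/a';
  const i = Math.floor(Math.log(bytes) / Math.log(1024));
  if (i === 0) return `${bytes} ${sizes[i]}`;
  return `${(bytes / 1024 ** i).toFixed(1)} ${sizes[i]}`;
}

// bytesToSize(500)      === '500 Bytes' (no trailing parenthesis)
// bytesToSize(52428800) === '50.0 MB'   (the old MAX_FILE_SIZE limit)
```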


@ -7,65 +7,41 @@
import React, { Component, Fragment } from 'react';
import { i18n } from '@kbn/i18n';
import { EuiForm } from '@elastic/eui';
import { EuiForm, EuiProgress, EuiText } from '@elastic/eui';
import PropTypes from 'prop-types';
import { IndexSettings } from './index_settings';
import { JsonIndexFilePicker } from './json_index_file_picker';
import { JsonImportProgress } from './json_import_progress';
import _ from 'lodash';
import { GeoJsonImporter } from '../importer/geojson_importer';
import { ES_FIELD_TYPES } from '../../../../../src/plugins/data/public';
import { getIndexPatternService } from '../kibana_services';
import { GeoJsonFilePicker } from './geojson_file_picker';
import { ImportCompleteView } from './import_complete_view';
import { ES_FIELD_TYPES } from '../../../../../src/plugins/data/public';
const INDEXING_STAGE = {
INDEXING_STARTED: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted', {
defaultMessage: 'Data indexing started',
}),
WRITING_TO_INDEX: i18n.translate('xpack.fileUpload.jsonUploadAndParse.writingToIndex', {
defaultMessage: 'Writing to index',
}),
INDEXING_COMPLETE: i18n.translate('xpack.fileUpload.jsonUploadAndParse.indexingComplete', {
defaultMessage: 'Indexing complete',
}),
CREATING_INDEX_PATTERN: i18n.translate(
'xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern',
{ defaultMessage: 'Creating index pattern' }
),
INDEX_PATTERN_COMPLETE: i18n.translate(
'xpack.fileUpload.jsonUploadAndParse.indexPatternComplete',
{ defaultMessage: 'Index pattern complete' }
),
INDEXING_ERROR: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingError', {
defaultMessage: 'Data indexing error',
}),
INDEX_PATTERN_ERROR: i18n.translate('xpack.fileUpload.jsonUploadAndParse.indexPatternError', {
defaultMessage: 'Index pattern error',
}),
const PHASE = {
CONFIGURE: 'CONFIGURE',
IMPORT: 'IMPORT',
COMPLETE: 'COMPLETE',
};
function getWritingToIndexMsg(progress) {
return i18n.translate('xpack.fileUpload.jsonUploadAndParse.writingToIndex', {
defaultMessage: 'Writing to index: {progress}% complete',
values: { progress },
});
}
export class JsonUploadAndParse extends Component {
geojsonImporter = new GeoJsonImporter();
state = {
// File state
fileRef: null,
parsedFile: null,
indexedFile: null,
// Index state
indexTypes: [],
selectedIndexType: '',
indexName: '',
indexRequestInFlight: false,
indexPatternRequestInFlight: false,
hasIndexErrors: false,
isIndexReady: false,
// Progress-tracking state
showImportProgress: false,
currentIndexingStage: INDEXING_STAGE.INDEXING_STARTED,
indexDataResp: '',
indexPatternResp: '',
importStatus: '',
phase: PHASE.CONFIGURE,
importResp: undefined,
indexPatternResp: undefined,
};
componentDidMount() {
@ -74,102 +50,36 @@ export class JsonUploadAndParse extends Component {
componentWillUnmount() {
this._isMounted = false;
}
_resetFileAndIndexSettings = () => {
if (this.props.onFileRemove && this.state.fileRef) {
this.props.onFileRemove(this.state.fileRef);
if (this._geojsonImporter) {
this._geojsonImporter.destroy();
this._geojsonImporter = null;
}
this.setState({
indexTypes: [],
selectedIndexType: '',
indexName: '',
indexedFile: null,
parsedFile: null,
fileRef: null,
});
};
}
componentDidUpdate() {
this._updateIndexType();
this._setIndexReady({ ...this.state, ...this.props });
this._indexData({ ...this.state, ...this.props });
if (this.props.isIndexingTriggered && !this.state.showImportProgress && this._isMounted) {
this.setState({ showImportProgress: true });
this._setIndexReady();
if (this.props.isIndexingTriggered && this.state.phase === PHASE.CONFIGURE) {
this._import();
}
}
_updateIndexType() {
let nextIndexTypes = [];
if (this.state.parsedFile) {
nextIndexTypes =
this.state.parsedFile.geometryTypes.includes('Point') ||
this.state.parsedFile.geometryTypes.includes('MultiPoint')
? [ES_FIELD_TYPES.GEO_POINT, ES_FIELD_TYPES.GEO_SHAPE]
: [ES_FIELD_TYPES.GEO_SHAPE];
}
if (!_.isEqual(nextIndexTypes, this.state.indexTypes)) {
this.setState({ indexTypes: nextIndexTypes });
}
if (!this.state.selectedIndexType && nextIndexTypes.length) {
// auto select index type
this.setState({ selectedIndexType: nextIndexTypes[0] });
} else if (
this.state.selectedIndexType &&
!nextIndexTypes.includes(this.state.selectedIndexType)
) {
// unselected indexType if selected type is not longer an option
this.setState({ selectedIndexType: null });
}
}
_setIndexReady = ({
parsedFile,
selectedIndexType,
indexName,
hasIndexErrors,
indexRequestInFlight,
onIndexReady,
}) => {
_setIndexReady = () => {
const isIndexReady =
!!parsedFile &&
!!selectedIndexType &&
!!indexName &&
!hasIndexErrors &&
!indexRequestInFlight;
this._geojsonImporter !== undefined &&
!!this.state.selectedIndexType &&
!!this.state.indexName &&
!this.state.hasIndexErrors &&
this.state.phase === PHASE.CONFIGURE;
if (isIndexReady !== this.state.isIndexReady) {
this.setState({ isIndexReady });
if (onIndexReady) {
onIndexReady(isIndexReady);
}
this.props.onIndexReady(isIndexReady);
}
};
_indexData = async ({
indexedFile,
parsedFile,
indexRequestInFlight,
indexName,
selectedIndexType,
isIndexingTriggered,
isIndexReady,
onIndexingComplete,
onIndexingError,
}) => {
// Check index ready
const filesAreEqual = _.isEqual(indexedFile, parsedFile);
if (!isIndexingTriggered || filesAreEqual || !isIndexReady || indexRequestInFlight) {
return;
}
this.setState({
indexRequestInFlight: true,
currentIndexingStage: INDEXING_STAGE.WRITING_TO_INDEX,
});
this.geojsonImporter.setDocs(parsedFile.parsedGeojson, selectedIndexType);
// initialize import
_import = async () => {
//
// create index
//
const settings = {
number_of_shards: 1,
};
@@ -181,8 +91,16 @@ export class JsonUploadAndParse extends Component {
},
};
const ingestPipeline = {};
const initializeImportResp = await this.geojsonImporter.initializeImport(
indexName,
this.setState({
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted', {
defaultMessage: 'Creating index: {indexName}',
values: { indexName: this.state.indexName },
}),
phase: PHASE.IMPORT,
});
this._geojsonImporter.setGeoFieldType(this.state.selectedIndexType);
const initializeImportResp = await this._geojsonImporter.initializeImport(
this.state.indexName,
settings,
mappings,
ingestPipeline
@@ -192,131 +110,180 @@ export class JsonUploadAndParse extends Component {
}
if (initializeImportResp.index === undefined || initializeImportResp.id === undefined) {
this.setState({
indexRequestInFlight: false,
currentIndexingStage: INDEXING_STAGE.INDEXING_ERROR,
phase: PHASE.COMPLETE,
});
this._resetFileAndIndexSettings();
onIndexingError();
this.props.onIndexingError();
return;
}
//
// import file
const importResp = await this.geojsonImporter.import(
//
this.setState({
importStatus: getWritingToIndexMsg(0),
});
const importResp = await this._geojsonImporter.import(
initializeImportResp.id,
indexName,
this.state.indexName,
initializeImportResp.pipelineId,
() => {}
(progress) => {
if (this._isMounted) {
this.setState({
importStatus: getWritingToIndexMsg(progress),
});
}
}
);
if (!this._isMounted) {
return;
}
if (!importResp.success) {
this.setState({
indexDataResp: importResp,
indexRequestInFlight: false,
currentIndexingStage: INDEXING_STAGE.INDEXING_ERROR,
importResp,
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingError', {
defaultMessage: 'Data indexing error',
}),
phase: PHASE.COMPLETE,
});
this._resetFileAndIndexSettings();
onIndexingError();
this.props.onIndexingError();
return;
}
this.setState({
indexDataResp: importResp,
indexedFile: parsedFile,
currentIndexingStage: INDEXING_STAGE.INDEXING_COMPLETE,
});
//
// create index pattern
//
this.setState({
indexPatternRequestInFlight: true,
currentIndexingStage: INDEXING_STAGE.CREATING_INDEX_PATTERN,
importResp,
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern', {
defaultMessage: 'Creating index pattern: {indexName}',
values: { indexName: this.state.indexName },
}),
});
let indexPattern;
try {
indexPattern = await getIndexPatternService().createAndSave(
{
title: indexName,
title: this.state.indexName,
},
true
);
} catch (error) {
if (this._isMounted) {
this.setState({
indexPatternRequestInFlight: false,
currentIndexingStage: INDEXING_STAGE.INDEX_PATTERN_ERROR,
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.indexPatternError', {
defaultMessage: 'Index pattern error',
}),
phase: PHASE.COMPLETE,
});
this._resetFileAndIndexSettings();
onIndexingError();
this.props.onIndexingError();
}
return;
}
if (!this._isMounted) {
return;
}
//
// Successful import
//
this.setState({
indexPatternResp: {
success: true,
id: indexPattern.id,
fields: indexPattern.fields,
},
indexPatternRequestInFlight: false,
phase: PHASE.COMPLETE,
importStatus: '',
});
this.setState({
currentIndexingStage: INDEXING_STAGE.INDEX_PATTERN_COMPLETE,
});
this._resetFileAndIndexSettings();
onIndexingComplete({
this.props.onIndexingComplete({
indexDataResp: importResp,
indexPattern,
});
};
render() {
const {
currentIndexingStage,
indexDataResp,
indexPatternResp,
fileRef,
_onFileSelect = ({ features, hasPoints, hasShapes, importer, indexName, previewCoverage }) => {
this._geojsonImporter = importer;
const geoFieldTypes = hasPoints
? [ES_FIELD_TYPES.GEO_POINT, ES_FIELD_TYPES.GEO_SHAPE]
: [ES_FIELD_TYPES.GEO_SHAPE];
const newState = {
indexTypes: geoFieldTypes,
indexName,
indexTypes,
showImportProgress,
} = this.state;
};
if (!this.state.selectedIndexType) {
// auto select index type
newState.selectedIndexType =
hasPoints && !hasShapes ? ES_FIELD_TYPES.GEO_POINT : ES_FIELD_TYPES.GEO_SHAPE;
} else if (
this.state.selectedIndexType &&
!geoFieldTypes.includes(this.state.selectedIndexType)
) {
// unselect indexType if selected type is no longer an option
newState.selectedIndexType = '';
}
this.setState(newState);
this.props.onFileUpload(
{
type: 'FeatureCollection',
features,
},
indexName,
previewCoverage
);
};
_onFileClear = () => {
if (this._geojsonImporter) {
this._geojsonImporter.destroy();
this._geojsonImporter = undefined;
}
this.props.onFileRemove();
this.setState({
indexTypes: [],
selectedIndexType: '',
indexName: '',
});
};
render() {
if (this.state.phase === PHASE.IMPORT) {
return (
<Fragment>
<EuiProgress size="xs" color="accent" position="absolute" />
<EuiText>
<p>{this.state.importStatus}</p>
</EuiText>
</Fragment>
);
}
if (this.state.phase === PHASE.COMPLETE) {
return (
<ImportCompleteView
importResp={this.state.importResp}
indexPatternResp={this.state.indexPatternResp}
/>
);
}
return (
<EuiForm>
{showImportProgress ? (
<JsonImportProgress
importStage={currentIndexingStage}
indexDataResp={indexDataResp}
indexPatternResp={indexPatternResp}
complete={
currentIndexingStage === INDEXING_STAGE.INDEX_PATTERN_COMPLETE ||
currentIndexingStage === INDEXING_STAGE.INDEXING_ERROR
}
indexName={indexName}
/>
) : (
<Fragment>
<JsonIndexFilePicker
fileRef={fileRef}
setFileRef={(fileRef) => this.setState({ fileRef })}
setParsedFile={(parsedFile, indexName) => {
this.setState({ parsedFile, indexName });
this.props.onFileUpload(parsedFile.parsedGeojson, indexName);
}}
resetFileAndIndexSettings={this._resetFileAndIndexSettings}
geojsonImporter={this.geojsonImporter}
/>
<IndexSettings
disabled={!fileRef}
indexName={indexName}
setIndexName={(indexName) => this.setState({ indexName })}
indexTypes={indexTypes}
setSelectedIndexType={(selectedIndexType) => this.setState({ selectedIndexType })}
setHasIndexErrors={(hasIndexErrors) => this.setState({ hasIndexErrors })}
/>
</Fragment>
)}
<GeoJsonFilePicker onSelect={this._onFileSelect} onClear={this._onFileClear} />
<IndexSettings
disabled={this._geojsonImporter === undefined}
indexName={this.state.indexName}
setIndexName={(indexName) => this.setState({ indexName })}
indexTypes={this.state.indexTypes}
selectedIndexType={this.state.selectedIndexType}
setSelectedIndexType={(selectedIndexType) => this.setState({ selectedIndexType })}
setHasIndexErrors={(hasIndexErrors) => this.setState({ hasIndexErrors })}
/>
</EuiForm>
);
}

View file

@@ -0,0 +1,31 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
// @ts-ignore
import numeral from '@elastic/numeral';
import {
MAX_FILE_SIZE,
MAX_FILE_SIZE_BYTES,
ABSOLUTE_MAX_FILE_SIZE_BYTES,
FILE_SIZE_DISPLAY_FORMAT,
UI_SETTING_MAX_FILE_SIZE,
} from '../common';
import { getUiSettings } from './kibana_services';
export function getMaxBytes() {
const maxFileSize = getUiSettings().get(UI_SETTING_MAX_FILE_SIZE, MAX_FILE_SIZE);
// @ts-ignore
const maxBytes = numeral(maxFileSize.toUpperCase()).value();
if (maxBytes < MAX_FILE_SIZE_BYTES) {
return MAX_FILE_SIZE_BYTES;
}
return maxBytes <= ABSOLUTE_MAX_FILE_SIZE_BYTES ? maxBytes : ABSOLUTE_MAX_FILE_SIZE_BYTES;
}
export function getMaxBytesFormatted() {
return numeral(getMaxBytes()).format(FILE_SIZE_DISPLAY_FORMAT);
}

View file

@@ -5,7 +5,7 @@
* 2.0.
*/
import { GeoJsonImporter } from './geojson_importer';
import { GeoJsonImporter, toEsDocs } from './geojson_importer';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
import '@loaders.gl/polyfills';
@@ -25,9 +25,7 @@ const FEATURE_COLLECTION = {
],
};
describe('readFile', () => {
const setFileProgress = jest.fn((a) => a);
describe('previewFile', () => {
const FILE_WITH_FEATURE_COLLECTION = new File(
[JSON.stringify(FEATURE_COLLECTION)],
'testfile.json',
@@ -39,38 +37,26 @@ describe('readFile', () => {
jest.restoreAllMocks();
});
test('should throw error if no file provided', async () => {
const importer = new GeoJsonImporter();
await importer
.readFile(null, setFileProgress, () => {
return true;
})
.catch((e) => {
expect(e.message).toMatch('Error, no file provided');
});
});
test('should abort if file parse is cancelled', async () => {
const importer = new GeoJsonImporter();
const results = await importer.readFile(FILE_WITH_FEATURE_COLLECTION, setFileProgress, () => {
return false;
test('should stop reading when importer is destroyed', async () => {
const importer = new GeoJsonImporter(FILE_WITH_FEATURE_COLLECTION);
importer.destroy();
const results = await importer.previewFile();
expect(results).toEqual({
features: [],
previewCoverage: 0,
hasPoints: false,
hasShapes: false,
});
expect(results).toBeNull();
});
test('should read features from feature collection', async () => {
const importer = new GeoJsonImporter();
const results = await importer.readFile(FILE_WITH_FEATURE_COLLECTION, setFileProgress, () => {
return true;
});
expect(setFileProgress).toHaveBeenCalled();
const importer = new GeoJsonImporter(FILE_WITH_FEATURE_COLLECTION);
const results = await importer.previewFile();
expect(results).toEqual({
errors: [],
geometryTypes: ['Point'],
parsedGeojson: FEATURE_COLLECTION,
previewCoverage: 100,
hasPoints: true,
hasShapes: false,
features: FEATURE_COLLECTION.features,
});
});
@@ -99,20 +85,14 @@ describe('readFile', () => {
{ type: 'text/json' }
);
const importer = new GeoJsonImporter();
const results = await importer.readFile(
fileWithFeaturesWithoutGeometry,
setFileProgress,
() => {
return true;
}
);
const importer = new GeoJsonImporter(fileWithFeaturesWithoutGeometry);
const results = await importer.previewFile();
expect(setFileProgress).toHaveBeenCalled();
expect(results).toEqual({
errors: ['2 features without geometry omitted'],
geometryTypes: ['Point'],
parsedGeojson: FEATURE_COLLECTION,
previewCoverage: 100,
hasPoints: true,
hasShapes: false,
features: FEATURE_COLLECTION.features,
});
});
@@ -134,20 +114,18 @@ describe('readFile', () => {
{ type: 'text/json' }
);
const importer = new GeoJsonImporter();
const results = await importer.readFile(fileWithUnwrapedFeature, setFileProgress, () => {
return true;
});
const importer = new GeoJsonImporter(fileWithUnwrapedFeature);
const results = await importer.previewFile();
expect(setFileProgress).toHaveBeenCalled();
expect(results).toEqual({
errors: [],
geometryTypes: ['Point'],
parsedGeojson: FEATURE_COLLECTION,
previewCoverage: 100,
hasPoints: true,
hasShapes: false,
features: FEATURE_COLLECTION.features,
});
});
test('should throw if no features', async () => {
test('should return empty feature collection if no features', async () => {
const fileWithNoFeatures = new File(
[
JSON.stringify({
@@ -159,17 +137,18 @@ describe('readFile', () => {
{ type: 'text/json' }
);
const importer = new GeoJsonImporter();
await importer
.readFile(fileWithNoFeatures, setFileProgress, () => {
return true;
})
.catch((e) => {
expect(e.message).toMatch('Error, no features detected');
});
const importer = new GeoJsonImporter(fileWithNoFeatures);
const results = await importer.previewFile();
expect(results).toEqual({
previewCoverage: 100,
hasPoints: false,
hasShapes: false,
features: [],
});
});
test('should throw if no features with geometry', async () => {
test('should return empty feature collection if no features with geometry', async () => {
const fileWithFeaturesWithNoGeometry = new File(
[
JSON.stringify({
@@ -186,22 +165,22 @@ describe('readFile', () => {
{ type: 'text/json' }
);
const importer = new GeoJsonImporter();
await importer
.readFile(fileWithFeaturesWithNoGeometry, setFileProgress, () => {
return true;
})
.catch((e) => {
expect(e.message).toMatch('Error, no features detected');
});
const importer = new GeoJsonImporter(fileWithFeaturesWithNoGeometry);
const results = await importer.previewFile();
expect(results).toEqual({
previewCoverage: 100,
hasPoints: false,
hasShapes: false,
features: [],
});
});
});
describe('setDocs', () => {
describe('toEsDocs', () => {
test('should convert features to geo_point ES documents', () => {
const importer = new GeoJsonImporter();
importer.setDocs(FEATURE_COLLECTION, ES_FIELD_TYPES.GEO_POINT);
expect(importer.getDocs()).toEqual([
const esDocs = toEsDocs(FEATURE_COLLECTION.features, ES_FIELD_TYPES.GEO_POINT);
expect(esDocs).toEqual([
{
coordinates: [-112.0372, 46.608058],
population: 200,
@@ -210,9 +189,8 @@ });
});
test('should convert features to geo_shape ES documents', () => {
const importer = new GeoJsonImporter();
importer.setDocs(FEATURE_COLLECTION, ES_FIELD_TYPES.GEO_SHAPE);
expect(importer.getDocs()).toEqual([
const esDocs = toEsDocs(FEATURE_COLLECTION.features, ES_FIELD_TYPES.GEO_SHAPE);
expect(esDocs).toEqual([
{
coordinates: {
type: 'point',

View file

@@ -7,7 +7,6 @@
import {
Feature,
FeatureCollection,
Point,
MultiPoint,
LineString,
@@ -18,15 +17,237 @@
import { i18n } from '@kbn/i18n';
// @ts-expect-error
import { JSONLoader, loadInBatches } from './loaders';
import { CreateDocsResponse } from '../types';
import { Importer } from '../importer';
import { CreateDocsResponse, ImportResults } from '../types';
import { callImportRoute, Importer, IMPORT_RETRIES } from '../importer';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
// @ts-expect-error
import { geoJsonCleanAndValidate } from './geojson_clean_and_validate';
import { ImportFailure, ImportResponse, MB } from '../../../common';
const IMPORT_CHUNK_SIZE_MB = 10 * MB;
export const GEOJSON_FILE_TYPES = ['.json', '.geojson'];
export interface GeoJsonPreview {
features: Feature[];
hasPoints: boolean;
hasShapes: boolean;
previewCoverage: number;
}
export class GeoJsonImporter extends Importer {
constructor() {
private _file: File;
private _isActive = true;
private _iterator?: Iterator<unknown>;
private _hasNext = true;
private _features: Feature[] = [];
private _totalBytesProcessed = 0;
private _unimportedBytesProcessed = 0;
private _totalFeatures = 0;
private _geometryTypesMap = new Map<string, boolean>();
private _invalidFeatures: ImportFailure[] = [];
private _prevBatchLastFeature?: Feature;
private _geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE =
ES_FIELD_TYPES.GEO_SHAPE;
constructor(file: File) {
super();
this._file = file;
}
public destroy() {
this._isActive = false;
}
public async previewFile(rowLimit?: number, sizeLimit?: number): Promise<GeoJsonPreview> {
await this._readUntil(rowLimit, sizeLimit);
return {
features: [...this._features],
previewCoverage: this._hasNext
? Math.round((this._unimportedBytesProcessed / this._file.size) * 100)
: 100,
hasPoints: this._geometryTypesMap.has('Point') || this._geometryTypesMap.has('MultiPoint'),
hasShapes:
this._geometryTypesMap.has('LineString') ||
this._geometryTypesMap.has('MultiLineString') ||
this._geometryTypesMap.has('Polygon') ||
this._geometryTypesMap.has('MultiPolygon'),
};
}
public setGeoFieldType(geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE) {
this._geoFieldType = geoFieldType;
}
public async import(
id: string,
index: string,
pipelineId: string,
setImportProgress: (progress: number) => void
): Promise<ImportResults> {
if (!id || !index) {
return {
success: false,
error: i18n.translate('xpack.fileUpload.import.noIdOrIndexSuppliedErrorMessage', {
defaultMessage: 'no ID or index supplied',
}),
};
}
let success = true;
const failures: ImportFailure[] = [...this._invalidFeatures];
let error;
while ((this._features.length > 0 || this._hasNext) && this._isActive) {
await this._readUntil(undefined, IMPORT_CHUNK_SIZE_MB);
if (!this._isActive) {
return {
success: false,
failures,
docCount: this._totalFeatures,
};
}
let retries = IMPORT_RETRIES;
let resp: ImportResponse = {
success: false,
failures: [],
docCount: 0,
id: '',
index: '',
pipelineId: '',
};
const data = toEsDocs(this._features, this._geoFieldType);
const progress = Math.round((this._totalBytesProcessed / this._file.size) * 100);
this._features = [];
this._unimportedBytesProcessed = 0;
while (resp.success === false && retries > 0) {
try {
resp = await callImportRoute({
id,
index,
data,
settings: {},
mappings: {},
ingestPipeline: {
id: pipelineId,
},
});
if (retries < IMPORT_RETRIES) {
// eslint-disable-next-line no-console
console.log(`Retrying import ${IMPORT_RETRIES - retries}`);
}
retries--;
} catch (err) {
resp.success = false;
resp.error = err;
retries = 0;
}
}
failures.push(...resp.failures);
if (!resp.success) {
success = false;
error = resp.error;
break;
}
setImportProgress(progress);
}
const result: ImportResults = {
success,
failures,
docCount: this._totalFeatures,
};
if (success) {
setImportProgress(100);
} else {
result.error = error;
}
return result;
}
private async _readUntil(rowLimit?: number, sizeLimit?: number) {
while (
this._isActive &&
this._hasNext &&
(rowLimit === undefined || this._features.length < rowLimit) &&
(sizeLimit === undefined || this._unimportedBytesProcessed < sizeLimit)
) {
await this._next();
}
}
private async _next() {
if (this._iterator === undefined) {
this._iterator = await loadInBatches(this._file, JSONLoader, {
json: {
jsonpaths: ['$.features'],
_rootObjectBatches: true,
},
});
}
if (!this._isActive || !this._iterator) {
return;
}
const { value: batch, done } = await this._iterator.next();
if (!this._isActive || done) {
this._hasNext = false;
return;
}
if ('bytesUsed' in batch) {
const bytesRead = batch.bytesUsed - this._totalBytesProcessed;
this._unimportedBytesProcessed += bytesRead;
this._totalBytesProcessed = batch.bytesUsed;
}
const rawFeatures: unknown[] = this._prevBatchLastFeature ? [this._prevBatchLastFeature] : [];
this._prevBatchLastFeature = undefined;
const isLastBatch = batch.batchType === 'root-object-batch-complete';
if (isLastBatch) {
// Handle single feature geoJson
if (this._totalFeatures === 0) {
rawFeatures.push(batch.container);
}
} else {
rawFeatures.push(...batch.data);
}
for (let i = 0; i < rawFeatures.length; i++) {
const rawFeature = rawFeatures[i] as Feature;
if (!isLastBatch && i === rawFeatures.length - 1) {
// Do not process last feature until next batch is read, features on batch boundary may be incomplete.
this._prevBatchLastFeature = rawFeature;
continue;
}
this._totalFeatures++;
if (!rawFeature.geometry || !rawFeature.geometry.type) {
this._invalidFeatures.push({
item: this._totalFeatures,
reason: i18n.translate('xpack.fileUpload.geojsonImporter.noGeometry', {
defaultMessage: 'Feature does not contain required field "geometry"',
}),
doc: rawFeature,
});
} else {
if (!this._geometryTypesMap.has(rawFeature.geometry.type)) {
this._geometryTypesMap.set(rawFeature.geometry.type, true);
}
this._features.push(geoJsonCleanAndValidate(rawFeature));
}
}
}
public read(data: ArrayBuffer): { success: boolean } {
@@ -36,143 +257,34 @@
protected _createDocs(text: string): CreateDocsResponse {
throw new Error('_createDocs not implemented.');
}
}
public getDocs() {
return this._docArray;
}
public setDocs(
featureCollection: FeatureCollection,
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE
) {
this._docArray = [];
for (let i = 0; i < featureCollection.features.length; i++) {
const feature = featureCollection.features[i];
const geometry = feature.geometry as
| Point
| MultiPoint
| LineString
| MultiLineString
| Polygon
| MultiPolygon;
const coordinates =
geoFieldType === ES_FIELD_TYPES.GEO_SHAPE
? {
type: geometry.type.toLowerCase(),
coordinates: geometry.coordinates,
}
: geometry.coordinates;
const properties = feature.properties ? feature.properties : {};
this._docArray.push({
coordinates,
...properties,
});
}
}
public async readFile(
file: File,
setFileProgress: ({
featuresProcessed,
bytesProcessed,
totalBytes,
}: {
featuresProcessed: number;
bytesProcessed: number;
totalBytes: number;
}) => void,
isFileParseActive: () => boolean
): Promise<{
errors: string[];
geometryTypes: string[];
parsedGeojson: FeatureCollection;
} | null> {
if (!file) {
throw new Error(
i18n.translate('xpack.fileUpload.fileParser.noFileProvided', {
defaultMessage: 'Error, no file provided',
})
);
}
return new Promise(async (resolve, reject) => {
const batches = await loadInBatches(file, JSONLoader, {
json: {
jsonpaths: ['$.features'],
_rootObjectBatches: true,
},
});
const rawFeatures: unknown[] = [];
for await (const batch of batches) {
if (!isFileParseActive()) {
break;
}
if (batch.batchType === 'root-object-batch-complete') {
// Handle single feature geoJson
if (rawFeatures.length === 0) {
rawFeatures.push(batch.container);
export function toEsDocs(
features: Feature[],
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE
) {
const esDocs = [];
for (let i = 0; i < features.length; i++) {
const feature = features[i];
const geometry = feature.geometry as
| Point
| MultiPoint
| LineString
| MultiLineString
| Polygon
| MultiPolygon;
const coordinates =
geoFieldType === ES_FIELD_TYPES.GEO_SHAPE
? {
type: geometry.type.toLowerCase(),
coordinates: geometry.coordinates,
}
} else {
rawFeatures.push(...batch.data);
}
setFileProgress({
featuresProcessed: rawFeatures.length,
bytesProcessed: batch.bytesUsed,
totalBytes: file.size,
});
}
if (!isFileParseActive()) {
resolve(null);
return;
}
if (rawFeatures.length === 0) {
reject(
new Error(
i18n.translate('xpack.fileUpload.fileParser.noFeaturesDetected', {
defaultMessage: 'Error, no features detected',
})
)
);
return;
}
const features: Feature[] = [];
const geometryTypesMap = new Map<string, boolean>();
let invalidCount = 0;
for (let i = 0; i < rawFeatures.length; i++) {
const rawFeature = rawFeatures[i] as Feature;
if (!rawFeature.geometry || !rawFeature.geometry.type) {
invalidCount++;
} else {
if (!geometryTypesMap.has(rawFeature.geometry.type)) {
geometryTypesMap.set(rawFeature.geometry.type, true);
}
features.push(geoJsonCleanAndValidate(rawFeature));
}
}
const errors: string[] = [];
if (invalidCount > 0) {
errors.push(
i18n.translate('xpack.fileUpload.fileParser.featuresOmitted', {
defaultMessage: '{invalidCount} features without geometry omitted',
values: { invalidCount },
})
);
}
resolve({
errors,
geometryTypes: Array.from(geometryTypesMap.keys()),
parsedGeojson: {
type: 'FeatureCollection',
features,
},
});
: geometry.coordinates;
const properties = feature.properties ? feature.properties : {};
esDocs.push({
coordinates,
...properties,
});
}
return esDocs;
}
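The two document shapes produced by `toEsDocs` differ only in the `coordinates` field: `geo_shape` keeps a lowercased geometry type alongside the coordinates, while `geo_point` stores the bare coordinate array. A simplified sketch of that conversion (plain string literals stand in for the `ES_FIELD_TYPES` enum, and the types are pared down):

```typescript
// Simplified sketch of the geo_point vs geo_shape documents built by toEsDocs.
interface SketchFeature {
  geometry: { type: string; coordinates: unknown };
  properties?: Record<string, unknown> | null;
}

function toEsDocsSketch(
  features: SketchFeature[],
  geoFieldType: 'geo_point' | 'geo_shape'
): Array<Record<string, unknown>> {
  return features.map(({ geometry, properties }) => ({
    // geo_shape keeps the geometry type; geo_point stores bare coordinates.
    coordinates:
      geoFieldType === 'geo_shape'
        ? { type: geometry.type.toLowerCase(), coordinates: geometry.coordinates }
        : geometry.coordinates,
    // Feature properties are spread into the top level of the document.
    ...(properties ?? {}),
  }));
}
```

This mirrors the unit tests above: the same Point feature yields `coordinates: [-112.0372, 46.608058]` for `geo_point` and a `{ type: 'point', coordinates: [...] }` object for `geo_shape`.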

View file

@@ -5,4 +5,4 @@
* 2.0.
*/
export { GeoJsonImporter } from './geojson_importer';
export { GeoJsonImporter, GeoJsonPreview, GEOJSON_FILE_TYPES } from './geojson_importer';

View file

@@ -22,7 +22,7 @@ import { CreateDocsResponse, IImporter, ImportResults } from './types';
const CHUNK_SIZE = 5000;
const MAX_CHUNK_CHAR_COUNT = 1000000;
const IMPORT_RETRIES = 5;
export const IMPORT_RETRIES = 5;
const STRING_CHUNKS_MB = 100;
export abstract class Importer implements IImporter {
@@ -232,7 +232,7 @@ function createDocumentChunks(docArray: ImportDoc[]) {
return chunks;
}
function callImportRoute({
export function callImportRoute({
id,
index,
data,

View file

@@ -6,4 +6,5 @@
*/
export { importerFactory } from './importer_factory';
export { validateFile } from './validate_file';
export * from './types';

View file

@@ -0,0 +1,52 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { i18n } from '@kbn/i18n';
import { getMaxBytes, getMaxBytesFormatted } from '../get_max_bytes';
export function validateFile(file: File, types: string[]) {
if (file.size > getMaxBytes()) {
throw new Error(
i18n.translate('xpack.fileUpload.fileSizeError', {
defaultMessage: 'File size {fileSize} exceeds maximum file size of {maxFileSize}',
values: {
fileSize: bytesToSize(file.size),
maxFileSize: getMaxBytesFormatted(),
},
})
);
}
if (!file.name) {
throw new Error(
i18n.translate('xpack.fileUpload.noFileNameError', {
defaultMessage: 'File name not provided',
})
);
}
const nameSplit = file.name.split('.');
const fileType = nameSplit.pop();
if (!types.includes(`.${fileType}`)) {
throw new Error(
i18n.translate('xpack.fileUpload.fileTypeError', {
defaultMessage: 'File is not one of acceptable types: {types}',
values: {
types: types.join(', '),
},
})
);
}
}
function bytesToSize(bytes: number) {
if (bytes === 0) return 'n/a';
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
const i = Math.floor(Math.log(bytes) / Math.log(1024));
if (i === 0) return `${bytes} ${sizes[i]}`;
return `${(bytes / 1024 ** i).toFixed(1)} ${sizes[i]}`;
}
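The extension check in `validateFile` compares the final `.`-suffix of the file name against the accepted list (for GeoJSON uploads, `GEOJSON_FILE_TYPES` is `['.json', '.geojson']`). A hypothetical stand-alone version of just that check:

```typescript
// Hypothetical helper mirroring the extension check inside validateFile above.
function hasAcceptedType(fileName: string, types: string[]): boolean {
  // Take everything after the last dot and compare with a leading dot re-added.
  const fileType = fileName.split('.').pop();
  return types.includes(`.${fileType}`);
}
```

Note that a name with no extension simply fails the check, since `split('.').pop()` returns the whole name and `.name` is never in the accepted list.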

View file

@@ -18,3 +18,4 @@ export function setStartServices(core: CoreStart, plugins: FileUploadStartDepend
export const getIndexPatternService = () => pluginsStart.data.indexPatterns;
export const getHttp = () => coreStart.http;
export const getSavedObjectsClient = () => coreStart.savedObjects.client;
export const getUiSettings = () => coreStart.uiSettings;

View file

@@ -12,7 +12,7 @@ import { IImporter, ImportFactoryOptions, ImportResults } from '../importer';
export interface FileUploadComponentProps {
isIndexingTriggered: boolean;
onFileUpload: (geojsonFile: FeatureCollection, name: string) => void;
onFileUpload: (geojsonFile: FeatureCollection, name: string, previewCoverage: number) => void;
onFileRemove: () => void;
onIndexReady: (indexReady: boolean) => void;
onIndexingComplete: (results: {

View file

@@ -9,6 +9,7 @@ import { CoreStart, Plugin } from '../../../../src/core/public';
import { FileUploadStartApi, getFileUploadComponent, importerFactory } from './api';
import { setStartServices } from './kibana_services';
import { DataPublicPluginStart } from '../../../../src/plugins/data/public';
import { getMaxBytes, getMaxBytesFormatted } from './get_max_bytes';
// eslint-disable-next-line @typescript-eslint/no-empty-interface
export interface FileUploadSetupDependencies {}
@@ -34,6 +35,8 @@ export class FileUploadPlugin
return {
getFileUploadComponent,
importerFactory,
getMaxBytes,
getMaxBytesFormatted,
};
}
}

View file

@@ -5,10 +5,13 @@
* 2.0.
*/
import { i18n } from '@kbn/i18n';
import { CoreSetup, CoreStart, Plugin } from 'src/core/server';
import { schema } from '@kbn/config-schema';
import { fileUploadRoutes } from './routes';
import { initFileUploadTelemetry } from './telemetry';
import { UsageCollectionSetup } from '../../../../src/plugins/usage_collection/server';
import { UI_SETTING_MAX_FILE_SIZE, MAX_FILE_SIZE } from '../common';
interface SetupDeps {
usageCollection: UsageCollectionSetup;
@@ -18,6 +21,26 @@ export class FileUploadPlugin implements Plugin {
async setup(coreSetup: CoreSetup, plugins: SetupDeps) {
fileUploadRoutes(coreSetup.http.createRouter());
coreSetup.uiSettings.register({
[UI_SETTING_MAX_FILE_SIZE]: {
name: i18n.translate('xpack.fileUpload.maxFileSizeUiSetting.name', {
defaultMessage: 'Maximum file upload size',
}),
value: MAX_FILE_SIZE,
description: i18n.translate('xpack.fileUpload.maxFileSizeUiSetting.description', {
defaultMessage:
'Sets the file size limit when importing files. The highest supported value for this setting is 1GB.',
}),
schema: schema.string(),
validation: {
regexString: '\\d+[mMgG][bB]',
message: i18n.translate('xpack.fileUpload.maxFileSizeUiSetting.error', {
defaultMessage: 'Should be a valid data size. e.g. 200MB, 1GB',
}),
},
},
});
initFileUploadTelemetry(coreSetup, plugins.usageCollection);
}
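The `regexString` registered above (`'\d+[mMgG][bB]'`) accepts values such as `200MB` or `1gb` and rejects units the setting does not support. A quick sketch of that pattern (anchors are added here for a whole-string test; how Kibana applies the regex internally may differ):

```typescript
// Pattern from the uiSettings validation above, anchored for illustration.
const maxFileSizePattern = /^\d+[mMgG][bB]$/;

function isValidMaxFileSize(value: string): boolean {
  // Digits followed by a case-insensitive MB or GB unit, nothing else.
  return maxFileSizePattern.test(value);
}
```

So `200MB` and `1GB` pass, while `1TB` and `500KB` trigger the "Should be a valid data size" validation message.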

View file

@@ -174,6 +174,8 @@ export type InlineFieldDescriptor = {
export type GeojsonFileSourceDescriptor = {
__fields?: InlineFieldDescriptor[];
__featureCollection: FeatureCollection;
areResultsTrimmed: boolean;
tooltipContent: string | null;
name: string;
type: string;
};

View file

@@ -5,6 +5,7 @@
* 2.0.
*/
import { i18n } from '@kbn/i18n';
import React, { Component } from 'react';
import { FeatureCollection } from 'geojson';
import { EuiPanel } from '@elastic/eui';
@@ -70,7 +71,7 @@ export class ClientFileCreateSourceEditor extends Component<RenderWizardArgument
}
}
_onFileUpload = (geojsonFile: FeatureCollection, name: string) => {
_onFileUpload = (geojsonFile: FeatureCollection, name: string, previewCoverage: number) => {
if (!this._isMounted) {
return;
}
@@ -80,8 +81,19 @@ export class ClientFileCreateSourceEditor extends Component<RenderWizardArgument
return;
}
const areResultsTrimmed = previewCoverage < 100;
const sourceDescriptor = GeoJsonFileSource.createDescriptor({
__featureCollection: geojsonFile,
areResultsTrimmed,
tooltipContent: areResultsTrimmed
? i18n.translate('xpack.maps.fileUpload.trimmedResultsMsg', {
defaultMessage: `Results limited to {numFeatures} features, {previewCoverage}% of file.`,
values: {
numFeatures: geojsonFile.features.length,
previewCoverage,
},
})
: null,
name,
});
const layerDescriptor = VectorLayer.createDescriptor(
@@ -130,7 +142,8 @@ export class ClientFileCreateSourceEditor extends Component<RenderWizardArgument
return;
}
this.props.advanceToNextStep();
this.props.stopStepLoading();
this.props.disableNextBtn();
this.setState({ indexingStage: INDEXING_STAGE.ERROR });
};

View file

@@ -48,6 +48,9 @@ export class GeoJsonFileSource extends AbstractVectorSource {
type: SOURCE_TYPES.GEOJSON_FILE,
__featureCollection: getFeatureCollection(descriptor.__featureCollection),
__fields: descriptor.__fields || [],
areResultsTrimmed:
descriptor.areResultsTrimmed !== undefined ? descriptor.areResultsTrimmed : false,
tooltipContent: descriptor.tooltipContent ? descriptor.tooltipContent : null,
name: descriptor.name || 'Features',
};
}
@@ -121,6 +124,13 @@ export class GeoJsonFileSource extends AbstractVectorSource {
canFormatFeatureProperties() {
return true;
}
getSourceTooltipContent() {
return {
tooltipContent: (this._descriptor as GeojsonFileSourceDescriptor).tooltipContent,
areResultsTrimmed: (this._descriptor as GeojsonFileSourceDescriptor).areResultsTrimmed,
};
}
}
registerSource({

View file

@@ -5,7 +5,6 @@
* 2.0.
*/
export const FILE_DATA_VISUALIZER_MAX_FILE_SIZE = 'ml:fileDataVisualizerMaxFileSize';
export const ANOMALY_DETECTION_ENABLE_TIME_RANGE = 'ml:anomalyDetection:results:enableTimeDefaults';
export const ANOMALY_DETECTION_DEFAULT_TIME_RANGE = 'ml:anomalyDetection:results:timeDefaults';

View file

@@ -82,6 +82,7 @@ const App: FC<AppProps> = ({ coreStart, deps, appMountParams }) => {
embeddable: deps.embeddable,
maps: deps.maps,
triggersActionsUi: deps.triggersActionsUi,
fileUpload: deps.fileUpload,
...coreStart,
};

View file

@@ -18,6 +18,7 @@ import { MlServicesContext } from '../../app';
import { IStorageWrapper } from '../../../../../../../src/plugins/kibana_utils/public';
import type { EmbeddableStart } from '../../../../../../../src/plugins/embeddable/public';
import type { MapsStartApi } from '../../../../../maps/public';
import type { FileUploadPluginStart } from '../../../../../file_upload/public';
import type { LensPublicStart } from '../../../../../lens/public';
import { TriggersAndActionsUIPublicPluginStart } from '../../../../../triggers_actions_ui/public';
@@ -30,6 +31,7 @@ interface StartPlugins {
maps?: MapsStartApi;
lens?: LensPublicStart;
triggersActionsUi?: TriggersAndActionsUIPublicPluginStart;
fileUpload?: FileUploadPluginStart;
}
export type StartServices = CoreStart &
StartPlugins & {

View file

@@ -25,9 +25,7 @@ import { i18n } from '@kbn/i18n';
import { FormattedMessage } from '@kbn/i18n/react';
import { isFullLicense } from '../license';
import { useTimefilter, useMlKibana, useNavigateToPath } from '../contexts/kibana';
import { NavigationMenu } from '../components/navigation_menu';
import { getMaxBytesFormatted } from './file_based/components/utils';
import { HelpMenu } from '../components/help_menu';
function startTrialDescription() {
@@ -58,8 +56,10 @@ export const DatavisualizerSelector: FC = () => {
licenseManagement,
http: { basePath },
docLinks,
fileUpload,
},
} = useMlKibana();
const helpLink = docLinks.links.ml.guide;
const navigateToPath = useNavigateToPath();
@@ -68,7 +68,12 @@
licenseManagement.enabled === true &&
isFullLicense() === false;
const maxFileSize = getMaxBytesFormatted();
if (fileUpload === undefined) {
// eslint-disable-next-line no-console
console.error('File upload plugin not available');
return null;
}
const maxFileSize = fileUpload.getMaxBytesFormatted();
return (
<Fragment>
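The pattern added in this file (and in `WelcomeContent` below) is that `fileUpload` is declared as an optional start plugin, so consumers must guard against its absence before calling `getMaxBytesFormatted`. A minimal sketch of that guard, assuming a stand-in `FileUploadLike` interface (hypothetical, not the plugin's real public type):

```typescript
// FileUploadLike is a stand-in interface for this sketch, not the
// file_upload plugin's actual exported type.
interface FileUploadLike {
  getMaxBytesFormatted(): string;
}

// Returns the formatted size limit, or null when the plugin is unavailable,
// mirroring the early-return guards added in the components above.
function renderMaxFileSize(fileUpload?: FileUploadLike): string | null {
  if (fileUpload === undefined) {
    return null; // plugin not wired up; render nothing rather than crash
  }
  return fileUpload.getMaxBytesFormatted();
}
```

Because `StartPlugins` marks `fileUpload` optional, skipping this check would be a type error under `strictNullChecks`, which is what makes the guard hard to forget.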

View file

@@ -20,7 +20,8 @@ import {
} from '@elastic/eui';
import { ExperimentalBadge } from '../experimental_badge';
import { getMaxBytesFormatted } from '../utils';
import { useMlKibana } from '../../../../contexts/kibana';
export const WelcomeContent: FC = () => {
const toolTipContent = i18n.translate(
@@ -30,7 +31,16 @@
}
);
const maxFileSize = getMaxBytesFormatted();
const {
services: { fileUpload },
} = useMlKibana();
if (fileUpload === undefined) {
// eslint-disable-next-line no-console
console.error('File upload plugin not available');
return null;
}
const maxFileSize = fileUpload.getMaxBytesFormatted();
return (
<EuiFlexGroup gutterSize="xl" alignItems="center">

View file

@@ -22,12 +22,12 @@ import { ExplanationFlyout } from '../explanation_flyout';
import { ImportView } from '../import_view';
import {
DEFAULT_LINES_TO_SAMPLE,
getMaxBytes,
readFile,
createUrlOverrides,
processResults,
hasImportPermission,
} from '../utils';
import { getFileUpload } from '../../../../util/dependency_cache';
import { MODE } from './constants';
@@ -60,7 +60,7 @@ export class FileDataVisualizerView extends Component {
this.originalSettings = {
linesToSample: DEFAULT_LINES_TO_SAMPLE,
};
this.maxFileUploadBytes = getMaxBytes();
this.maxFileUploadBytes = getFileUpload().getMaxBytes();
}
async componentDidMount() {

View file

@@ -10,7 +10,5 @@ export {
hasImportPermission,
processResults,
readFile,
getMaxBytes,
getMaxBytesFormatted,
DEFAULT_LINES_TO_SAMPLE,
} from './utils';

View file

@@ -6,19 +6,9 @@
*/
import { isEqual } from 'lodash';
// @ts-ignore
import numeral from '@elastic/numeral';
import { ml } from '../../../../services/ml_api_service';
import { AnalysisResult, InputOverrides } from '../../../../../../common/types/file_datavisualizer';
import {
MB,
MAX_FILE_SIZE,
MAX_FILE_SIZE_BYTES,
ABSOLUTE_MAX_FILE_SIZE_BYTES,
FILE_SIZE_DISPLAY_FORMAT,
} from '../../../../../../../file_upload/public';
import { getUiSettings } from '../../../../util/dependency_cache';
import { FILE_DATA_VISUALIZER_MAX_FILE_SIZE } from '../../../../../../common/constants/settings';
import { MB } from '../../../../../../../file_upload/public';
export const DEFAULT_LINES_TO_SAMPLE = 1000;
const UPLOAD_SIZE_MB = 5;
@@ -66,20 +56,6 @@ export function readFile(file: File) {
});
}
export function getMaxBytes() {
const maxFileSize = getUiSettings().get(FILE_DATA_VISUALIZER_MAX_FILE_SIZE, MAX_FILE_SIZE);
// @ts-ignore
const maxBytes = numeral(maxFileSize.toUpperCase()).value();
if (maxBytes < MAX_FILE_SIZE_BYTES) {
return MAX_FILE_SIZE_BYTES;
}
return maxBytes <= ABSOLUTE_MAX_FILE_SIZE_BYTES ? maxBytes : ABSOLUTE_MAX_FILE_SIZE_BYTES;
}
export function getMaxBytesFormatted() {
return numeral(getMaxBytes()).format(FILE_SIZE_DISPLAY_FORMAT);
}
export function createUrlOverrides(overrides: InputOverrides, originalSettings: InputOverrides) {
const formattedOverrides: InputOverrides = {};
for (const o in overrideDefaults) {
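The `getMaxBytes` helper removed here clamped the configured limit between the default and the absolute ceiling; that logic now lives in the `file_upload` plugin. A sketch of the clamping shape, with byte values assumed from the documented 100MB default and 1GB maximum (the removed code derived them via `numeral`, which may compute slightly different numbers):

```typescript
// Assumed byte values: 100MB default limit, 1GB hard ceiling.
const MAX_FILE_SIZE_BYTES = 100 * 1024 * 1024;
const ABSOLUTE_MAX_FILE_SIZE_BYTES = 1024 * 1024 * 1024;

// Mirrors the removed getMaxBytes: a configured value never goes below the
// default and never above the absolute ceiling.
function clampMaxBytes(configuredBytes: number): number {
  if (configuredBytes < MAX_FILE_SIZE_BYTES) {
    return MAX_FILE_SIZE_BYTES;
  }
  return configuredBytes <= ABSOLUTE_MAX_FILE_SIZE_BYTES
    ? configuredBytes
    : ABSOLUTE_MAX_FILE_SIZE_BYTES;
}
```

For example, a misconfigured `fileUpload:maxFileSize` of `2GB` would effectively behave as `1GB`, and anything below the default is raised back to `100MB`.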

View file

@@ -9,34 +9,14 @@ import { CoreSetup } from 'kibana/server';
import { i18n } from '@kbn/i18n';
import { schema } from '@kbn/config-schema';
import {
FILE_DATA_VISUALIZER_MAX_FILE_SIZE,
ANOMALY_DETECTION_DEFAULT_TIME_RANGE,
ANOMALY_DETECTION_ENABLE_TIME_RANGE,
DEFAULT_AD_RESULTS_TIME_FILTER,
DEFAULT_ENABLE_AD_RESULTS_TIME_FILTER,
} from '../../common/constants/settings';
import { MAX_FILE_SIZE } from '../../../file_upload/common';
export function registerKibanaSettings(coreSetup: CoreSetup) {
coreSetup.uiSettings.register({
[FILE_DATA_VISUALIZER_MAX_FILE_SIZE]: {
name: i18n.translate('xpack.ml.maxFileSizeSettingsName', {
defaultMessage: 'File Data Visualizer maximum file upload size',
}),
value: MAX_FILE_SIZE,
description: i18n.translate('xpack.ml.maxFileSizeSettingsDescription', {
defaultMessage:
'Sets the file size limit when importing data in the File Data Visualizer. The highest supported value for this setting is 1GB.',
}),
category: ['machineLearning'],
schema: schema.string(),
validation: {
regexString: '\\d+[mMgG][bB]',
message: i18n.translate('xpack.ml.maxFileSizeSettingsError', {
defaultMessage: 'Should be a valid data size. e.g. 200MB, 1GB',
}),
},
},
[ANOMALY_DETECTION_ENABLE_TIME_RANGE]: {
name: i18n.translate('xpack.ml.advancedSettings.enableAnomalyDetectionDefaultTimeRangeName', {
defaultMessage: 'Enable time filter defaults for anomaly detection results',
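The uiSettings entry deleted above validated input with the pattern `\d+[mMgG][bB]`, accepting sizes such as `200MB` or `1GB`. A quick sketch of what such a pattern matches; the anchored form here is an assumption for illustration, since Kibana's `regexString` validation applies its own matching semantics:

```typescript
// Pattern from the removed maxFileSize validation; anchors added here as an
// assumption so partial matches like '1GBx' are rejected in this sketch.
const sizePattern = /^\d+[mMgG][bB]$/;

// Whole-number megabyte/gigabyte strings match, case-insensitively on units.
const allValid = ['200MB', '1GB', '500mb'].every((v) => sizePattern.test(v));

// Missing digits, unsupported units, and fractional values do not match.
const falsePositives = ['MB', '1TB', '1.5GB'].filter((v) => sizePattern.test(v));
```

Note the pattern has no provision for fractional sizes, which is why the error message suggests whole values like `200MB` or `1GB`.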

View file

@@ -7429,8 +7429,6 @@
"xpack.features.savedObjectsManagementFeatureName": "保存されたオブジェクトの管理",
"xpack.features.visualizeFeatureName": "可視化",
"xpack.fileUpload.enterIndexName": "インデックス名を入力",
"xpack.fileUpload.fileParser.noFeaturesDetected": "エラー、機能が検出されませんでした",
"xpack.fileUpload.fileParser.noFileProvided": "エラー、ファイルが提供されていません",
"xpack.fileUpload.httpService.fetchError": "フェッチ実行エラー:{error}",
"xpack.fileUpload.httpService.noUrl": "URLが指定されていません",
"xpack.fileUpload.indexNameReqField": "インデックス名、必須フィールド",
@@ -7446,29 +7444,11 @@
"xpack.fileUpload.indexSettings.indexNameContainsIllegalCharactersErrorMessage": "インデックス名に許可されていない文字が含まれています",
"xpack.fileUpload.indexSettings.indexNameGuidelines": "インデックス名ガイドライン",
"xpack.fileUpload.jsonImport.indexingResponse": "インデックス応答",
"xpack.fileUpload.jsonImport.indexingStatus": "インデックスステータス",
"xpack.fileUpload.jsonImport.indexMgmtLink": "インデックス管理",
"xpack.fileUpload.jsonImport.indexModsMsg": "次を使用すると、その他のインデックス修正を行うことができます。\n",
"xpack.fileUpload.jsonImport.indexPatternResponse": "インデックスパターン応答",
"xpack.fileUpload.jsonIndexFilePicker.acceptableFileSize": "ファイルサイズ {fileSize} は最大ファイルサイズの{maxFileSize} を超えています",
"xpack.fileUpload.jsonIndexFilePicker.acceptableTypesError": "ファイルは使用可能なタイプのいずれかではありません。{types}",
"xpack.fileUpload.jsonIndexFilePicker.coordinateSystemAccepted": "座標は EPSG:4326 座標参照系でなければなりません。",
"xpack.fileUpload.jsonIndexFilePicker.fileParseError": "ファイル解析エラーが検出されました:{error}",
"xpack.fileUpload.jsonIndexFilePicker.filePicker": "ファイルをアップロード",
"xpack.fileUpload.jsonIndexFilePicker.filePickerLabel": "アップロードするファイルを選択",
"xpack.fileUpload.jsonIndexFilePicker.fileProcessingError": "ファイル処理エラー: {errorMessage}",
"xpack.fileUpload.jsonIndexFilePicker.formatsAccepted": "許可されているフォーマット:{acceptedFileTypeStringMessage}",
"xpack.fileUpload.jsonIndexFilePicker.maxSize": "最大サイズ:{maxFileSize}",
"xpack.fileUpload.jsonIndexFilePicker.noFileNameError": "ファイル名が指定されていません",
"xpack.fileUpload.jsonIndexFilePicker.parsingFile": "{featuresProcessed} 機能が解析されました...",
"xpack.fileUpload.jsonIndexFilePicker.unableParseFile": "ファイルをパースできません。{error}",
"xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern": "インデックスパターンを作成中です",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingError": "データインデックスエラー",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted": "データインデックスが開始しました",
"xpack.fileUpload.jsonUploadAndParse.indexingComplete": "インデックス完了",
"xpack.fileUpload.jsonUploadAndParse.indexPatternComplete": "インデックスパターンの完了",
"xpack.fileUpload.jsonUploadAndParse.indexPatternError": "インデックスパターンエラー",
"xpack.fileUpload.jsonUploadAndParse.writingToIndex": "インデックスに書き込み中",
"xpack.fleet.agentBulkActions.agentsSelected": "{count, plural, other {#個のエージェント}}が選択されました",
"xpack.fleet.agentBulkActions.clearSelection": "選択した項目をクリア",
"xpack.fleet.agentBulkActions.reassignPolicy": "新しいポリシーに割り当てる",
@@ -13427,9 +13407,6 @@
"xpack.ml.management.syncSavedObjectsFlyout.sync.error": "一部のジョブを同期できません。",
"xpack.ml.management.syncSavedObjectsFlyout.sync.success": "{successCount} {successCount, plural, other {件のジョブ}}が同期されました",
"xpack.ml.management.syncSavedObjectsFlyout.syncButton": "同期",
"xpack.ml.maxFileSizeSettingsDescription": "ファイルデータビジュアライザーでデータをインポートするときのファイルサイズ上限を設定します。この設定でサポートされている最大値は1 GBです。",
"xpack.ml.maxFileSizeSettingsError": "200 MB、1 GBなどの有効なデータサイズにしてください。",
"xpack.ml.maxFileSizeSettingsName": "ファイルデータビジュアライザーの最大ファイルアップロードサイズ",
"xpack.ml.models.jobService.allOtherRequestsCancelledDescription": " 他のすべてのリクエストはキャンセルされました。",
"xpack.ml.models.jobService.categorization.messages.failureToGetTokens": "フィールド値の例のサンプルをトークン化することができませんでした。{message}",
"xpack.ml.models.jobService.categorization.messages.insufficientPrivileges": "権限が不十分なため、フィールド値の例のトークン化を実行できませんでした。そのため、フィールド値を確認し、カテゴリー分けジョブでの使用が適当かを確認することができません。",

View file

@@ -7448,8 +7448,6 @@
"xpack.features.savedObjectsManagementFeatureName": "已保存对象管理",
"xpack.features.visualizeFeatureName": "Visualize",
"xpack.fileUpload.enterIndexName": "输入索引名称",
"xpack.fileUpload.fileParser.noFeaturesDetected": "错误,未检测到特征",
"xpack.fileUpload.fileParser.noFileProvided": "错误,未提供任何文件",
"xpack.fileUpload.httpService.fetchError": "执行提取时出错:{error}",
"xpack.fileUpload.httpService.noUrl": "未提供 URL",
"xpack.fileUpload.indexNameReqField": "索引名称,必填字段",
@@ -7465,29 +7463,11 @@
"xpack.fileUpload.indexSettings.indexNameContainsIllegalCharactersErrorMessage": "索引名称包含非法字符。",
"xpack.fileUpload.indexSettings.indexNameGuidelines": "索引名称指引",
"xpack.fileUpload.jsonImport.indexingResponse": "索引响应",
"xpack.fileUpload.jsonImport.indexingStatus": "索引状态",
"xpack.fileUpload.jsonImport.indexMgmtLink": "索引管理",
"xpack.fileUpload.jsonImport.indexModsMsg": "要进一步做索引修改,可以使用\n",
"xpack.fileUpload.jsonImport.indexPatternResponse": "索引模式响应",
"xpack.fileUpload.jsonIndexFilePicker.acceptableFileSize": "文件大小 {fileSize} 超过最大文件大小 {maxFileSize}",
"xpack.fileUpload.jsonIndexFilePicker.acceptableTypesError": "文件不是可接受类型之一:{types}",
"xpack.fileUpload.jsonIndexFilePicker.coordinateSystemAccepted": "坐标必须在 EPSG:4326 坐标参考系中。",
"xpack.fileUpload.jsonIndexFilePicker.fileParseError": "检测到文件解析错误:{error}",
"xpack.fileUpload.jsonIndexFilePicker.filePicker": "上传文件",
"xpack.fileUpload.jsonIndexFilePicker.filePickerLabel": "选择文件进行上传",
"xpack.fileUpload.jsonIndexFilePicker.fileProcessingError": "文件处理错误:{errorMessage}",
"xpack.fileUpload.jsonIndexFilePicker.formatsAccepted": "接受的格式:{acceptedFileTypeStringMessage}",
"xpack.fileUpload.jsonIndexFilePicker.maxSize": "最大大小:{maxFileSize}",
"xpack.fileUpload.jsonIndexFilePicker.noFileNameError": "未提供任何文件名称",
"xpack.fileUpload.jsonIndexFilePicker.parsingFile": "已处理 {featuresProcessed} 个特征......",
"xpack.fileUpload.jsonIndexFilePicker.unableParseFile": "无法解析文件:{error}",
"xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern": "正在创建索引模式",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingError": "数据索引错误",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted": "数据索引已启动",
"xpack.fileUpload.jsonUploadAndParse.indexingComplete": "索引完成",
"xpack.fileUpload.jsonUploadAndParse.indexPatternComplete": "索引模式完成",
"xpack.fileUpload.jsonUploadAndParse.indexPatternError": "索引模式错误",
"xpack.fileUpload.jsonUploadAndParse.writingToIndex": "正在写入索引",
"xpack.fleet.agentBulkActions.agentsSelected": "已选择 {count, plural, other {# 个代理}}",
"xpack.fleet.agentBulkActions.clearSelection": "清除所选内容",
"xpack.fleet.agentBulkActions.reassignPolicy": "分配到新策略",
@@ -13459,9 +13439,6 @@
"xpack.ml.management.syncSavedObjectsFlyout.sync.error": "一些作业无法同步。",
"xpack.ml.management.syncSavedObjectsFlyout.sync.success": "{successCount} 个{successCount, plural, other {作业}}已同步",
"xpack.ml.management.syncSavedObjectsFlyout.syncButton": "同步",
"xpack.ml.maxFileSizeSettingsDescription": "设置在文件数据可视化工具中导入数据时的文件大小限制。此设置支持的最高值为 1GB。",
"xpack.ml.maxFileSizeSettingsError": "应为有效的数据大小。如 200MB、1GB",
"xpack.ml.maxFileSizeSettingsName": "文件数据可视化工具最大文件上传大小",
"xpack.ml.models.jobService.allOtherRequestsCancelledDescription": " 所有其他请求已取消。",
"xpack.ml.models.jobService.categorization.messages.failureToGetTokens": "无法对示例字段值样本进行分词。{message}",
"xpack.ml.models.jobService.categorization.messages.insufficientPrivileges": "由于权限不足,无法对字段值示例执行分词。因此,无法检查字段值是否适合用于归类作业。",

View file

@@ -13,6 +13,7 @@ export default function ({ getService, getPageObjects }) {
const PageObjects = getPageObjects(['maps', 'common']);
const log = getService('log');
const security = getService('security');
const browser = getService('browser');
const IMPORT_FILE_PREVIEW_NAME = 'Import File';
const FILE_LOAD_DIR = 'test_upload_files';
@@ -102,6 +103,9 @@
const pointGeojsonFiles = ['point.json', 'multi_point.json'];
pointGeojsonFiles.forEach(async (pointFile) => {
it(`should index with type geo_point for file: ${pointFile}`, async () => {
if (!(await browser.checkBrowserPermission('clipboard-read'))) {
return;
}
await loadFileAndIndex(pointFile);
const indexPatternResults = await PageObjects.maps.getIndexPatternResults();
const coordinatesField = indexPatternResults.fields.find(
@@ -120,6 +124,9 @@
];
nonPointGeojsonFiles.forEach(async (shapeFile) => {
it(`should index with type geo_shape for file: ${shapeFile}`, async () => {
if (!(await browser.checkBrowserPermission('clipboard-read'))) {
return;
}
await loadFileAndIndex(shapeFile);
const indexPatternResults = await PageObjects.maps.getIndexPatternResults();
const coordinatesField = indexPatternResults.fields.find(

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View file

@@ -423,7 +423,7 @@ export function GisPageProvider({ getService, getPageObjects }: FtrProviderConte
async importLayerReadyForAdd() {
log.debug(`Wait until import complete`);
await testSubjects.find('indexRespCodeBlock', 5000);
await testSubjects.find('indexRespCopyButton', 5000);
let layerAddReady = false;
await retry.waitForWithTimeout('Add layer button ready', 2000, async () => {
layerAddReady = await this.importFileButtonEnabled();
@@ -454,21 +454,20 @@
);
}
async getCodeBlockParsedJson(dataTestSubjName: string) {
log.debug(`Get parsed code block for ${dataTestSubjName}`);
const indexRespCodeBlock = await testSubjects.find(`${dataTestSubjName}`);
const indexRespJson = await indexRespCodeBlock.getAttribute('innerText');
return JSON.parse(indexRespJson);
async clickCopyButton(dataTestSubj: string): Promise<string> {
log.debug(`Click ${dataTestSubj} copy button`);
await testSubjects.click(dataTestSubj);
return await browser.getClipboardValue();
}
async getIndexResults() {
log.debug('Get index results');
return await this.getCodeBlockParsedJson('indexRespCodeBlock');
return JSON.parse(await this.clickCopyButton('indexRespCopyButton'));
}
async getIndexPatternResults() {
log.debug('Get index pattern results');
return await this.getCodeBlockParsedJson('indexPatternRespCodeBlock');
return JSON.parse(await this.clickCopyButton('indexPatternRespCopyButton'));
}
async setLayerQuery(layerName: string, query: string) {