[doc] modified refresh statement #1952

---
{
"title": "CANCEL LOAD",
"title": "CANCEL-LOAD",
"language": "en"
}
---

<!--
specific language governing permissions and limitations
under the License.
-->


## Description

This statement is used to cancel an import job with a specified `label`, or to cancel import jobs in batches through fuzzy matching.

## Syntax

```sql
CANCEL LOAD
[FROM <db_name>]
WHERE [LABEL = "<load_label>" | LABEL like "<label_pattern>" | STATE = { "PENDING" | "ETL" | "LOADING" } ]
```

## Required Parameters

**1. `<db_name>`**

> The name of the database where the import job to be cancelled resides.

## Optional Parameters

**1. `<load_label>`**

> If `LABEL = "<load_label>"` is used, it precisely matches the specified label.

**2. `<label_pattern>`**

> If `LABEL LIKE "<label_pattern>"` is used, it matches import tasks whose labels contain the `label_pattern`.

**3. `PENDING` | `ETL` | `LOADING`**

> Specifying `PENDING` means cancelling jobs with the `STATE = "PENDING"` status. The same applies to other statuses.

## Access Control Requirements

Users executing this SQL command must have at least the following permissions:

| Privilege | Object | Notes |
| :---------------- | :------------- | :---------------------------- |
| LOAD_PRIV | Database | Import permissions for the database tables are required. |
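
The privilege above can be satisfied with a database-level grant. A hedged illustration follows; the user name `load_user` is a placeholder, not part of the original documentation.

```sql
-- Grant database-level import privileges so the user can cancel load jobs in example_db.
GRANT LOAD_PRIV ON example_db.* TO 'load_user'@'%';
```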

## Usage Notes

- Cancelling jobs based on the `STATE` is supported starting from version 1.2.0.
- Only incomplete import jobs in the `PENDING`, `ETL`, or `LOADING` states can be cancelled.
- When performing batch cancellation, Doris does not guarantee that all corresponding import jobs will be cancelled atomically. That is, only some import jobs may be cancelled successfully. Users can check the job status using the `SHOW LOAD` statement and try to execute the `CANCEL LOAD` statement again.
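
A hedged sketch of that check-and-retry flow, reusing the database and label pattern from the examples below:

```sql
-- Inspect matching jobs and their states; only PENDING, ETL, or LOADING jobs can still be cancelled.
SHOW LOAD FROM example_db WHERE LABEL LIKE "example_";

-- Re-issue the cancellation for any jobs that were not cancelled on the first attempt.
CANCEL LOAD FROM example_db WHERE LABEL LIKE "example_";
```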

## Examples

1. Cancel the import job with the label `example_db_test_load_label` in the database `example_db`.

```sql
CANCEL LOAD
FROM example_db
WHERE LABEL = "example_db_test_load_label";
```

2. Cancel all import jobs containing `example_` in the database `example_db`.

```sql
CANCEL LOAD
FROM example_db
WHERE LABEL like "example_";
```

3. Cancel import jobs in the `LOADING` state.

```sql
CANCEL LOAD
FROM example_db
WHERE STATE = "loading";
```

---
{
"title": "MYSQL LOAD",
"title": "MYSQL-LOAD",
"language": "en"
}
---
specific language governing permissions and limitations
under the License.
-->



## Description

Use the MySQL client to import local data files into Doris. MySQL Load is a synchronous import method, which returns the import result immediately after execution. You can determine whether the import is successful based on the return result of the `LOAD DATA` statement. MySQL Load can ensure the atomicity of a batch of import tasks, meaning that either all imports succeed or all fail.

## Syntax

```sql
LOAD DATA
[ LOCAL ]
INFILE "<file_name>"
INTO TABLE "<tbl_name>"
[ PARTITION (<partition_name> [, ... ]) ]
[ COLUMNS TERMINATED BY "<column_separator>" ]
[ LINES TERMINATED BY "<line_delimiter>" ]
[ IGNORE <number> {LINES | ROWS} ]
[ (<col_name_or_user_var> [, ... ] ) ]
[ SET (<col_name> = {<expr> | DEFAULT} [, ... ]) ]
[ PROPERTIES ("key1" = "value1" [ , ... ]) ]
```

## Required Parameters

**1. `<file_name>`**

> Specify the path of the local file, which can be either a relative or an absolute path. Currently, only a single file is supported, and multiple files are not supported.

**2. `<tbl_name>`**

> The table name can include the database name, as shown in the examples. If the database name is omitted, the current user's database will be used.

## Optional Parameters

**1. `LOCAL`**

> Specifying `LOCAL` indicates reading files from the client. Omitting it means reading files from the local storage of the FE server. The function of importing files from the FE server is disabled by default. You need to set `mysql_load_server_secure_path` on the FE node to specify a secure path to enable this function.

**2. `<partition_name>`**

> Multiple partitions can be specified for import, separated by commas.

**3. `<column_separator>`**

> Specify the column separator.

**4. `<line_delimiter>`**

> Specify the line delimiter.

**5. `IGNORE <number> { LINES | ROWS }`**

> Skip the first `<number>` lines of the file (for example, a CSV header). `LINES` and `ROWS` are interchangeable.

**6. `<col_name_or_user_var>`**

> Column mapping syntax. For specific parameters, refer to the column mapping section of [Data Transformation during Import](../../../../data-operate/import/import-way/mysql-load-manual.md).

**7. `<properties>`**

| Parameter | Description |
| ---------------------- | ------------------------------------------------------------ |
| max_filter_ratio | The maximum tolerable ratio of data that may be filtered out (for reasons such as data irregularities). The default is zero tolerance. |
| timeout | The import timeout, in seconds. The default is 600 seconds, and the valid range is 1 to 259,200 seconds. |
| strict_mode | Whether to enable strict mode for this import. Disabled by default. |
| timezone | The time zone used for this import. The default is UTC+8 (the East Eight Zone). This parameter affects the results of all time-zone-related functions involved in the import. |
| exec_mem_limit | The import memory limit, in bytes. The default is 2 GB. |
| trim_double_quotes | Boolean, `false` by default. When set to `true`, the outermost double quotes of each field in the imported file are trimmed. |
| enclose | Enclosure character. When a CSV field contains a line separator or column separator, a single-byte enclosure character can be specified to prevent accidental truncation. For example, if the column separator is `,`, the enclosure character is `'`, and the data is `a,'b,c'`, then `b,c` is parsed as a single field. Note: when the enclosure character is the double quote `"`, `trim_double_quotes` must be set to `true`. |
| escape | Escape character, used to escape occurrences of the enclosure character inside a CSV field. For example, if the data is `a,'b,'c'`, the enclosure character is `'`, and you want `b,'c` to be parsed as a single field, you need to specify a single-byte escape character such as `\` and change the data to `a,'b,\'c'`. |
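
The examples further below do not exercise `enclose` or `escape`, so here is a minimal hedged sketch based on the `a,'b,c'` case from the table above. The file name `quoted.csv` is a placeholder, and writing the backslash escape character as `"\\"` inside the SQL string literal is an assumption.

```sql
LOAD DATA LOCAL
INFILE 'quoted.csv'
INTO TABLE testDb.testTbl
COLUMNS TERMINATED BY ','
-- With enclose set to the single quote, the row  a,'b,c'  is parsed as two fields: a and b,c.
-- The escape character allows an enclosed field such as 'b,\'c' to contain a literal enclosure character.
PROPERTIES ("enclose"="'", "escape"="\\", "timeout"="100")
```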

## Access Control Requirements

Users executing this SQL command must have at least the following permissions:

| Privilege | Object | Notes |
| :---------------- | :------------- | :---------------------------- |
| LOAD_PRIV | Table | Import permissions for the specified database table. |

## Usage Notes

- The MySQL Load statement starts with the syntax `LOAD DATA` and does not require specifying a LABEL.

## Examples

1. Import data from the client's local file `testData` into the table `testTbl` in the database `testDb`. Specify a timeout of 100 seconds.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PROPERTIES ("timeout"="100")
```

2. Import data from the server's local file `/root/testData` (you need to set the FE configuration `mysql_load_server_secure_path` to `/root`) into the table `testTbl` in the database `testDb`. Specify a timeout of 100 seconds.

```sql
LOAD DATA
INFILE '/root/testData'
INTO TABLE testDb.testTbl
PROPERTIES ("timeout"="100")
```

3. Import data from the client's local file `testData` into the table `testTbl` in the database `testDb`, allowing an error rate of 20%.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PROPERTIES ("max_filter_ratio"="0.2")
```

4. Import data from the client's local file `testData` into the table `testTbl` in the database `testDb`, allowing an error rate of 20%, and specify the column names of the file.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
(k1, k2, v1)  -- hypothetical column names; use the actual column order of the file
PROPERTIES ("max_filter_ratio"="0.2")
```

5. Import data from the local file `testData` into partitions `p1` and `p2` of the table `testTbl` in the database `testDb`, allowing an error rate of 20%.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PARTITION (p1, p2)
PROPERTIES ("max_filter_ratio"="0.2")
```

6. Import data from the local CSV file `testData` with a line separator of `0102` and a column separator of `0304` into the table `testTbl` in the database `testDb`.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
COLUMNS TERMINATED BY '0304'
LINES TERMINATED BY '0102'
```

7. Import data from the local file `testData` into partitions `p1` and `p2` of the table `testTbl` in the database `testDb` and skip the first 3 lines.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PARTITION (p1, p2)
IGNORE 3 LINES
```

8. Import data with strict mode filtering and set the time zone to `Africa/Abidjan`.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PROPERTIES ("strict_mode"="true", "timezone"="Africa/Abidjan")
```

9. Limit the import memory to 10GB and set a timeout of 10 minutes for the data import.

```sql
LOAD DATA LOCAL
INFILE 'testData'
INTO TABLE testDb.testTbl
PROPERTIES ("exec_mem_limit"="10737418240", "timeout"="600")
```

---
{
"title": "SHOW CREATE LOAD",
"title": "SHOW-CREATE-LOAD",
"language": "en"
}
---
specific language governing permissions and limitations
under the License.
-->




## Description

This statement is used to display the creation statement of an import job.

## Syntax

```sql
SHOW CREATE LOAD FOR <load_name>;
```

## Required Parameters

**`<load_name>`**

> The name of the import job.

## Access Control Requirements

Users executing this SQL command must have at least the following permissions:

| Privilege | Object | Notes |
| :---------------- | :------------- | :---------------------------- |
| ADMIN/OPERATOR | Database | Cluster administrator privileges are required. |

## Return Value

Returns the creation statement of the specified import job.

## Examples

- Display the creation statement of the specified import job in the default database.

```sql
SHOW CREATE LOAD FOR test_load
```