Commit
Merge branch 'master' into minaasham/upgrade-replace-avro-python3
mina-asham authored Jan 21, 2025
2 parents 2bddaa7 + 9ba5afc commit 8d2543b
Showing 414 changed files with 15,049 additions and 1,823 deletions.
1 change: 0 additions & 1 deletion .github/workflows/nightly-trigger.yml
@@ -33,7 +33,6 @@ jobs:
- master
- release-1.20
- release-1.19
- release-1.18
runs-on: ubuntu-latest
steps:
- name: Trigger Workflow
4 changes: 2 additions & 2 deletions docs/config.toml
@@ -34,11 +34,11 @@ pygmentsUseClasses = true
# we change the version for the complete docs when forking of a release branch
# etc.
# The full version string as referenced in Maven (e.g. 1.2.1)
Version = "2.0-SNAPSHOT"
Version = "2.1-SNAPSHOT"

# For stable releases, leave the bugfix version out (e.g. 1.2). For snapshot
# release this should be the same as the regular version
VersionTitle = "2.0-SNAPSHOT"
VersionTitle = "2.1-SNAPSHOT"

# The branch for this version of Apache Flink
Branch = "master"
2 changes: 1 addition & 1 deletion docs/content.zh/docs/deployment/cli.md
@@ -576,7 +576,7 @@ related options. Here's an overview of all the Python related options for the ac
<td>
Specify the path of the python interpreter used to execute the python UDF worker
(e.g.: --pyExecutable /usr/local/bin/python3).
The python UDF worker depends on Python 3.8+, Apache Beam (version == 2.43.0),
The python UDF worker depends on Python 3.8+, Apache Beam (version >= 2.54.0, <= 2.61.0),
Pip (version >= 20.3) and SetupTools (version >= 37.0.0).
Please ensure that the specified environment meets the above requirements.
</td>
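The documented requirements above (Apache Beam >= 2.54.0 and <= 2.61.0, Pip >= 20.3, SetupTools >= 37.0.0) can be checked mechanically. A minimal sketch of such a check, assuming dotted numeric version strings; this is an illustration, not code from this commit:

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '2.57.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def beam_version_ok(v: str, lo: str = "2.54.0", hi: str = "2.61.0") -> bool:
    """Check a Beam version against the range documented for the UDF worker."""
    return parse_version(lo) <= parse_version(v) <= parse_version(hi)
```

For instance, `beam_version_ok("2.57.0")` holds, while the previously pinned `2.43.0` now falls outside the accepted range.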
2 changes: 1 addition & 1 deletion docs/content.zh/docs/dev/table/sqlClient.md
@@ -322,7 +322,7 @@ Mode "embedded" (default) submits Flink jobs from the local machine.
/usr/local/bin/python3). The
python UDF worker depends on
Python 3.8+, Apache Beam
(version == 2.43.0), Pip
(version >= 2.54.0, <= 2.61.0), Pip
(version >= 20.3) and SetupTools
(version >= 37.0.0). Please
ensure that the specified
56 changes: 48 additions & 8 deletions docs/content.zh/docs/ops/metrics.md
@@ -1545,31 +1545,33 @@ Certain RocksDB native metrics are available but disabled by default, you can fi
<table class="table table-bordered">
<thead>
<tr>
<th class="text-left" style="width: 18%">Scope</th>
<th class="text-left" style="width: 26%">Metrics</th>
<th class="text-left" style="width: 48%">Description</th>
<th class="text-left" style="width: 8%">Type</th>
<th class="text-left" style="width: 15%">Scope</th>
<th class="text-left" style="width: 15%">Infix</th>
<th class="text-left" style="width: 15%">Metrics</th>
<th class="text-left" style="width: 50%">Description</th>
<th class="text-left" style="width: 5%">Type</th>
</tr>
</thead>
<tbody>
<tr>
<th rowspan="4"><strong>Task/Operator</strong></th>
<td>forst.fileCache.hit</td>
<td rowspan="4">forst.fileCache</td>
<td>hit</td>
<td>The hit count of ForSt state backend cache.</td>
<td>Counter</td>
</tr>
<tr>
<td>forst.fileCache.miss</td>
<td>miss</td>
<td>The miss count of ForSt state backend cache.</td>
<td>Counter</td>
</tr>
<tr>
<td>forst.fileCache.usedBytes</td>
<td>usedBytes</td>
<td>The bytes cached in ForSt state backend cache.</td>
<td>Gauge</td>
</tr>
<tr>
<td>forst.fileCache.remainingBytes</td>
<td>remainingBytes</td>
<td>The remaining space in the volume for the configured cache. Only available when 'state.backend.forst.cache.reserve-size' is set above 0. </td>
<td>Gauge</td>
</tr>
@@ -2280,6 +2282,44 @@ logged by `SystemResourcesMetricsInitializer` during the startup.
</tbody>
</table>

### Async State Processing

<table class="table table-bordered">
<thead>
<tr>
<th class="text-left" style="width: 15%">Scope</th>
<th class="text-left" style="width: 10%">Infix</th>
<th class="text-left" style="width: 20%">Metrics</th>
<th class="text-left" style="width: 50%">Description</th>
<th class="text-left" style="width: 5%">Type</th>
</tr>
</thead>
<tbody>
<tr>
<th rowspan="4"><strong>Operator</strong></th>
<td rowspan="4">asyncStateProcessing</td>
<td>numInFlightRecords</td>
<td>The number of in-flight records in the async execution controller's buffers.</td>
<td>Gauge</td>
</tr>
<tr>
<td>activeBufferSize</td>
<td>The number of records pending processing.</td>
<td>Gauge</td>
</tr>
<tr>
<td>blockingBufferSize</td>
<td>The number of records that are blocked by ongoing records.</td>
<td>Gauge</td>
</tr>
<tr>
<td>numBlockingKeys</td>
<td>The number of distinct keys that are blocked in the async execution controller.</td>
<td>Gauge</td>
</tr>
</tbody>
</table>
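The four gauges above all derive from the buffers of the async execution controller. A toy accounting model of how such gauge values relate to each other (illustrative only; the class and method names are hypothetical, not Flink's `AsyncExecutionController`):

```python
from collections import defaultdict, deque

class AsyncBufferTracker:
    """Toy model of per-key record buffering behind the gauges above."""

    def __init__(self):
        self.active = deque()               # (key, record) pairs ready to process
        self.blocking = defaultdict(deque)  # key -> records waiting on the same key

    def add(self, key, record):
        # A record is blocked if another record with the same key is already in flight.
        if key in self.blocking or any(k == key for k, _ in self.active):
            self.blocking[key].append(record)
        else:
            self.active.append((key, record))

    # Gauge values corresponding to the metric names in the table:
    def num_in_flight_records(self):
        return len(self.active) + self.blocking_buffer_size()

    def active_buffer_size(self):
        return len(self.active)

    def blocking_buffer_size(self):
        return sum(len(q) for q in self.blocking.values())

    def num_blocking_keys(self):
        return len(self.blocking)
```

Under this model, `numInFlightRecords` is always the sum of the active and blocking buffer sizes, which is a useful sanity check when reading the metrics.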

## End-to-End latency tracking

Flink allows tracking the latency of records travelling through the system. This feature is disabled by default.
2 changes: 1 addition & 1 deletion docs/content/docs/deployment/cli.md
@@ -574,7 +574,7 @@ related options. Here's an overview of all the Python related options for the ac
<td>
Specify the path of the python interpreter used to execute the python UDF worker
(e.g.: --pyExecutable /usr/local/bin/python3).
The python UDF worker depends on Python 3.8+, Apache Beam (version == 2.43.0),
The python UDF worker depends on Python 3.8+, Apache Beam (version >= 2.54.0, <= 2.61.0),
Pip (version >= 20.3) and SetupTools (version >= 37.0.0).
Please ensure that the specified environment meets the above requirements.
</td>
2 changes: 1 addition & 1 deletion docs/content/docs/dev/table/sqlClient.md
@@ -260,7 +260,7 @@ Mode "embedded" (default) submits Flink jobs from the local machine.
/usr/local/bin/python3). The
python UDF worker depends on
Python 3.8+, Apache Beam
(version == 2.43.0), Pip
(version >= 2.54.0, <= 2.61.0), Pip
(version >= 20.3) and SetupTools
(version >= 37.0.0). Please
ensure that the specified
56 changes: 48 additions & 8 deletions docs/content/docs/ops/metrics.md
@@ -1535,31 +1535,33 @@ Certain RocksDB native metrics are available but disabled by default, you can fi
<table class="table table-bordered">
<thead>
<tr>
<th class="text-left" style="width: 18%">Scope</th>
<th class="text-left" style="width: 26%">Metrics</th>
<th class="text-left" style="width: 48%">Description</th>
<th class="text-left" style="width: 8%">Type</th>
<th class="text-left" style="width: 15%">Scope</th>
<th class="text-left" style="width: 15%">Infix</th>
<th class="text-left" style="width: 15%">Metrics</th>
<th class="text-left" style="width: 50%">Description</th>
<th class="text-left" style="width: 5%">Type</th>
</tr>
</thead>
<tbody>
<tr>
<th rowspan="4"><strong>Task/Operator</strong></th>
<td>forst.fileCache.hit</td>
<td rowspan="4">forst.fileCache</td>
<td>hit</td>
<td>The hit count of ForSt state backend cache.</td>
<td>Counter</td>
</tr>
<tr>
<td>forst.fileCache.miss</td>
<td>miss</td>
<td>The miss count of ForSt state backend cache.</td>
<td>Counter</td>
</tr>
<tr>
<td>forst.fileCache.usedBytes</td>
<td>usedBytes</td>
<td>The bytes cached in ForSt state backend cache.</td>
<td>Gauge</td>
</tr>
<tr>
<td>forst.fileCache.remainingBytes</td>
<td>remainingBytes</td>
<td>The remaining space in the volume for the configured cache. Only available when 'state.backend.forst.cache.reserve-size' is set above 0. </td>
<td>Gauge</td>
</tr>
@@ -2230,6 +2232,44 @@ Metrics below can be used to measure the effectiveness of speculative execution.
</tbody>
</table>

### Async State Processing

<table class="table table-bordered">
<thead>
<tr>
<th class="text-left" style="width: 15%">Scope</th>
<th class="text-left" style="width: 10%">Infix</th>
<th class="text-left" style="width: 20%">Metrics</th>
<th class="text-left" style="width: 50%">Description</th>
<th class="text-left" style="width: 5%">Type</th>
</tr>
</thead>
<tbody>
<tr>
<th rowspan="4"><strong>Operator</strong></th>
<td rowspan="4">asyncStateProcessing</td>
<td>numInFlightRecords</td>
<td>The number of in-flight records in the async execution controller's buffers.</td>
<td>Gauge</td>
</tr>
<tr>
<td>activeBufferSize</td>
<td>The number of records pending processing.</td>
<td>Gauge</td>
</tr>
<tr>
<td>blockingBufferSize</td>
<td>The number of records that are blocked by ongoing records.</td>
<td>Gauge</td>
</tr>
<tr>
<td>numBlockingKeys</td>
<td>The number of distinct keys that are blocked in the async execution controller.</td>
<td>Gauge</td>
</tr>
</tbody>
</table>

## End-to-End latency tracking

Flink allows tracking the latency of records travelling through the system. This feature is disabled by default.
@@ -24,7 +24,7 @@
<td><h5>python.executable</h5></td>
<td style="word-wrap: break-word;">"python"</td>
<td>String</td>
<td>Specify the path of the python interpreter used to execute the python UDF worker. The python UDF worker depends on Python 3.8+, Apache Beam (version == 2.43.0), Pip (version &gt;= 20.3) and SetupTools (version &gt;= 37.0.0). Please ensure that the specified environment meets the above requirements. The option is equivalent to the command line option "-pyexec".</td>
<td>Specify the path of the python interpreter used to execute the python UDF worker. The python UDF worker depends on Python 3.8+, Apache Beam (version &gt;= 2.54.0, &lt;= 2.61.0), Pip (version &gt;= 20.3) and SetupTools (version &gt;= 37.0.0). Please ensure that the specified environment meets the above requirements. The option is equivalent to the command line option "-pyexec".</td>
</tr>
<tr>
<td><h5>python.execution-mode</h5></td>
15 changes: 13 additions & 2 deletions docs/setup_hugo.sh
100644 → 100755
@@ -18,8 +18,19 @@
################################################################################

# setup hugo
HUGO_REPO=https://github.com/gohugoio/hugo/releases/download/v0.110.0/hugo_extended_0.110.0_Linux-64bit.tar.gz
HUGO_ARTIFACT=hugo_extended_0.110.0_Linux-64bit.tar.gz

# Detect Operating System
OS="Linux"
[[ "$OSTYPE" == "darwin"* ]] && OS="Mac"

# Setup Hugo based on OS
if [ "$OS" = "Mac" ]; then
HUGO_ARTIFACT="hugo_extended_0.110.0_darwin-universal.tar.gz"
else
HUGO_ARTIFACT="hugo_extended_0.110.0_Linux-64bit.tar.gz"
fi

HUGO_REPO="https://github.com/gohugoio/hugo/releases/download/v0.110.0/${HUGO_ARTIFACT}"
if ! curl --fail -OL $HUGO_REPO ; then
echo "Failed to download Hugo binary"
exit 1
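The OS detection added to `setup_hugo.sh` maps the platform to a Hugo release artifact name. The same selection logic, sketched in Python for clarity (the function name and parameter are illustrative, not part of the commit):

```python
import sys

HUGO_VERSION = "0.110.0"

def hugo_artifact(ostype: str = sys.platform) -> str:
    """Mirror setup_hugo.sh: pick the macOS universal build for darwin,
    otherwise fall back to the Linux 64-bit build."""
    if ostype.startswith("darwin"):
        return f"hugo_extended_{HUGO_VERSION}_darwin-universal.tar.gz"
    return f"hugo_extended_{HUGO_VERSION}_Linux-64bit.tar.gz"
```

Note that, like the shell script, any non-darwin platform (including Windows) falls through to the Linux artifact; only Linux and macOS are actually handled.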
2 changes: 1 addition & 1 deletion flink-annotations/pom.xml
@@ -25,7 +25,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parent</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-annotations</artifactId>
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-architecture-tests</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-architecture-tests-base</artifactId>
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-architecture-tests</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-architecture-tests-production</artifactId>
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-architecture-tests</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-architecture-tests-test</artifactId>
2 changes: 1 addition & 1 deletion flink-architecture-tests/pom.xml
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parent</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-architecture-tests</artifactId>
2 changes: 1 addition & 1 deletion flink-clients/pom.xml
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parent</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-clients</artifactId>
@@ -275,7 +275,7 @@ public class CliFrontendParser {
true,
"Specify the path of the python interpreter used to execute the python UDF worker "
+ "(e.g.: --pyExecutable /usr/local/bin/python3). "
+ "The python UDF worker depends on Python 3.8+, Apache Beam (version == 2.43.0), "
+ "The python UDF worker depends on Python 3.8+, Apache Beam (version >= 2.54.0, <= 2.61.0), "
+ "Pip (version >= 20.3) and SetupTools (version >= 37.0.0). "
+ "Please ensure that the specified environment meets the above requirements.");

@@ -27,4 +27,9 @@ private StandaloneClusterId() {}
public static StandaloneClusterId getInstance() {
return INSTANCE;
}

@Override
public String toString() {
return "StandaloneClusterId";
}
}
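The `toString` override added here gives the singleton a stable, human-readable name in logs instead of the default `ClassName@hashcode` form. The same pattern sketched in Python (the class name is hypothetical; this is not Flink code):

```python
class StandaloneClusterIdSketch:
    """A singleton whose string form is a fixed, readable name."""
    _instance = None

    def __new__(cls):
        # Always hand back the single shared instance.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __str__(self):
        return "StandaloneClusterId"
```

Every construction returns the same object, and `str()` yields the fixed name rather than a memory-address-based default.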
2 changes: 1 addition & 1 deletion flink-connectors/flink-connector-base/pom.xml
@@ -25,7 +25,7 @@
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connectors</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-connector-base</artifactId>
2 changes: 1 addition & 1 deletion flink-connectors/flink-connector-datagen-test/pom.xml
@@ -25,7 +25,7 @@
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connectors</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-connector-datagen-tests</artifactId>
2 changes: 1 addition & 1 deletion flink-connectors/flink-connector-datagen/pom.xml
@@ -25,7 +25,7 @@
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connectors</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-connector-datagen</artifactId>
2 changes: 1 addition & 1 deletion flink-connectors/flink-connector-files/pom.xml
@@ -26,7 +26,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connectors</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-connector-files</artifactId>
@@ -6,9 +6,9 @@ The Apache Software Foundation (http://www.apache.org/).

This project bundles the following dependencies under the Apache Software License 2.0. (http://www.apache.org/licenses/LICENSE-2.0.txt)

- org.apache.parquet:parquet-hadoop:1.13.1
- org.apache.parquet:parquet-column:1.13.1
- org.apache.parquet:parquet-common:1.13.1
- org.apache.parquet:parquet-encoding:1.13.1
- org.apache.parquet:parquet-format-structures:1.13.1
- org.apache.parquet:parquet-jackson:1.13.1
- org.apache.parquet:parquet-hadoop:1.14.4
- org.apache.parquet:parquet-column:1.14.4
- org.apache.parquet:parquet-common:1.14.4
- org.apache.parquet:parquet-encoding:1.14.4
- org.apache.parquet:parquet-format-structures:1.14.4
- org.apache.parquet:parquet-jackson:1.14.4
2 changes: 1 addition & 1 deletion flink-connectors/flink-file-sink-common/pom.xml
@@ -25,7 +25,7 @@ under the License.
<parent>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connectors</artifactId>
<version>2.0-SNAPSHOT</version>
<version>2.1-SNAPSHOT</version>
</parent>

<artifactId>flink-file-sink-common</artifactId>
