
Commit d59cc77

cloudpickle support is no longer experimental.

1 parent c1d0fa4 commit d59cc77

File tree

1 file changed: +1 −1 lines changed


website/www/site/content/en/documentation/sdks/python-pipeline-dependencies.md

Lines changed: 1 addition & 1 deletion
@@ -163,7 +163,7 @@ Dataflow, see [Pre-building the python SDK custom container image with extra dep
 
 When the Python SDK submits the pipeline for execution to a remote runner, the pipeline contents, such as transform user code, is serialized (or pickled) into a bytecode using
 libraries that perform the serialization (also called picklers). The default pickler library used by Beam is `dill`.
-To use the `cloudpickle` pickler, supply the `--pickle_library=cloudpickle` pipeline option. The `cloudpickle` support is currently [experimental](https://github.com/apache/beam/issues/21298).
+To use the `cloudpickle` pickler, supply the `--pickle_library=cloudpickle` pipeline option.
 
 By default, global imports, functions, and variables defined in the main pipeline module are not saved during the serialization of a Beam job.
 Thus, one might encounter an unexpected `NameError` when running a `DoFn` on any remote runner. To resolve this, supply the main session content with the pipeline by
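The doc text above says Beam pickles transform user code with a pickler library such as `dill` or `cloudpickle`. As a minimal sketch of why a third-party pickler is needed at all (not Beam's own code, just an illustration using only the standard library): the stdlib pickler serializes functions by qualified name, so an anonymous lambda like the ones commonly passed to `beam.Map` cannot be pickled by it.

```python
import pickle

# The standard-library pickler stores functions by reference (module +
# qualified name). A lambda's qualname is "<lambda>", so the lookup fails
# and pickling raises an error. Picklers such as dill and cloudpickle
# instead serialize the function body, which is what allows user code to
# be shipped to remote workers.
double = lambda x: x * 2

try:
    pickle.dumps(double)
    picklable = True
except Exception:
    picklable = False

print(picklable)  # the lambda is rejected by the stdlib pickler
```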

0 commit comments