2 changes: 1 addition & 1 deletion docs/develop/python/converters-and-encryption.mdx
@@ -189,7 +189,7 @@ If you aren't yet able to upgrade from Pydantic v1, see https://github.com/tempo
### Custom Type Data Conversion

When converting from JSON, Workflow and Activity type hints are taken into account to convert to the proper types.
All common Python typings including `Optional`, `Union`, all forms of iterables and mappings, and `NewType` are supported in addition the regular JSON values mentioned before.
All common Python typings including `Optional`, `Union`, all forms of iterables and mappings, and `NewType` are supported in addition to the regular JSON values mentioned before.

In Python, Data Converters contain a reference to a Payload Converter class that is used to convert input and output payloads.
By default, the Payload Converter is a `CompositePayloadConverter` which contains multiple `EncodingPayloadConverter`s to try to serialize/deserialize payloads.
4 changes: 2 additions & 2 deletions docs/develop/python/core-application.mdx
@@ -220,7 +220,7 @@ _deterministic_.

One of the primary things that Workflows do is orchestrate the execution of Activities. An Activity is a normal function
or method execution that's intended to execute a single, well-defined action (either short or long-running), such as
querying a database, calling a third-party API, or transcoding a media file. An Activity can interact with world outside
querying a database, calling a third-party API, or transcoding a media file. An Activity can interact with the world outside
the Temporal Platform or use a Temporal Client to interact with a Temporal Service. For the Workflow to be able to
execute the Activity, we must define the [Activity Definition](/activity-definition).

@@ -244,7 +244,7 @@ serially, defeating the entire purpose of using `asyncio`. This can also lead to
behavior that causes tasks to be unable to execute. Debugging these issues can be difficult and time consuming, as
locating the source of the blocking call might not always be immediately obvious.

Due to this, consider not make blocking calls from within an asynchronous Activity, or use an async safe library to
Due to this, avoid making blocking calls from within an asynchronous Activity, or use an async-safe library to
perform these actions. If you must use a blocking library, consider using a synchronous Activity instead.

:::
2 changes: 1 addition & 1 deletion docs/develop/python/message-passing.mdx
@@ -797,7 +797,7 @@ A dynamic Activity is a stand-in implementation.
It's used when an Activity Task with an unknown Activity type is received by the Worker.

To participate, your Activity must opt into dynamic access.
Adding `dynamic=True` to the `@activity.defn` decorator makes the Workflow Definition eligible to participate in dynamic invocation.
Adding `dynamic=True` to the `@activity.defn` decorator makes the Activity Definition eligible to participate in dynamic invocation.
You must register the Activity with the [Worker](https://python.temporal.io/temporalio.worker.html) before it can be invoked.

The Activity Definition must then accept a single argument of type `Sequence[temporalio.common.RawValue]`.
2 changes: 1 addition & 1 deletion docs/develop/python/observability.mdx
@@ -184,7 +184,7 @@ Use the [list_workflows()](https://python.temporal.io/temporalio.client.Client.h

### How to set custom Search Attributes {#custom-search-attributes}

After you've created custom Search Attributes in your Temporal Service (using `temporal operator search-attribute create`or the Cloud UI), you can set the values of the custom Search Attributes when starting a Workflow.
After you've created custom Search Attributes in your Temporal Service (using `temporal operator search-attribute create` or the Cloud UI), you can set the values of the custom Search Attributes when starting a Workflow.

Use `SearchAttributeKey` to create your Search Attributes. Then, when starting a Workflow execution using `client.start_workflow()`, include the Custom Search Attributes by passing instances of `SearchAttributePair()` containing each of your keys and starting values to a parameter called `search_attributes`.
If you had Custom Search Attributes `CustomerId` of type `Keyword` and `MiscData` of type `Text`, you could provide these starting values:
6 changes: 3 additions & 3 deletions docs/develop/python/python-sdk-sandbox.mdx
@@ -71,7 +71,7 @@ A default set of restrictions that prevents most dangerous standard library call

The following techniques aren't recommended, but they allow you to avoid, skip, or break through the sandbox environment.

Skipping Workflow Sandboxing results in a lack of determinism checks. Using the Workflow Sandboxing environment helps to preventing non-determinism errors but doesn't completely negate the risk.
Skipping Workflow Sandboxing results in a lack of determinism checks. Using the Workflow Sandboxing environment helps prevent non-determinism errors but doesn't completely negate the risk.

### Skip Sandboxing for a block of code

@@ -202,9 +202,9 @@ The sandbox's import notification policy specifies how the sandbox behaves when
A dynamic import occurs when a module is imported after the Workflow is loaded into the sandbox. These imports are often invisible and, even if they don't do anything restricted by the sandbox, add memory overhead. By default, the [`WARN_ON_DYNAMIC_IMPORT`](https://python.temporal.io/temporalio.workflow.SandboxImportNotificationPolicy.html#WARN_ON_DYNAMIC_IMPORT) policy setting is enabled and a warning is emitted when a module that is not in the [passthrough modules](#passthrough-modules) list is dynamically imported.

The other notable policy settings apply when a module that was not passed through is imported into the sandbox. These settings are disabled by default and must be explicitly turned on.
The [`WARN_ON_UNINTENTIONAL_PASSTHROUGH`](https://python.temporal.io/temporalio.workflow.SandboxImportNotificationPolicy.html#WARN_ON_UNINTENTIONAL_PASSTHROUGH) setting emits a warning when a module not included in the [passthrough modules](#passthrough-modules) list.
The [`WARN_ON_UNINTENTIONAL_PASSTHROUGH`](https://python.temporal.io/temporalio.workflow.SandboxImportNotificationPolicy.html#WARN_ON_UNINTENTIONAL_PASSTHROUGH) setting emits a warning when a module not included in the [passthrough modules](#passthrough-modules) list is imported.

Similarly, the [`RAISE_ON_UNINTENTIONAL_PASSTHROUGH`](https://python.temporal.io/temporalio.workflow.SandboxImportNotificationPolicy.html#RAISE_ON_UNINTENTIONAL_PASSTHROUGH) setting will raise an error when an non-passed-through module is imported.
Similarly, the [`RAISE_ON_UNINTENTIONAL_PASSTHROUGH`](https://python.temporal.io/temporalio.workflow.SandboxImportNotificationPolicy.html#RAISE_ON_UNINTENTIONAL_PASSTHROUGH) setting will raise an error when a non-passed-through module is imported.

The import notification policy can be set for specific imports by using the [`sandbox_import_notification_policy`](https://python.temporal.io/temporalio.workflow.unsafe.html#sandbox_import_notification_policy) context manager.

2 changes: 1 addition & 1 deletion docs/develop/python/python-sdk-sync-vs-async.mdx
@@ -212,7 +212,7 @@ asynchronous Activity Definition.
It makes
a call to a microservice, accessed through HTTP, to request this
greeting in Spanish. This Activity uses the `aiohttp` library to make an async
safe HTTP request. Using the `requests` library here would have resulting in
safe HTTP request. Using the `requests` library here would have resulted in
blocking code within the async event loop, which would have blocked the entire async
event loop. For more in-depth information about this issue, refer to the
[Python asyncio documentation](https://docs.python.org/3/library/asyncio-dev.html#running-blocking-code).
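The difference can be seen without Temporal at all; a plain `asyncio` sketch:

```python
import asyncio
import time

async def blocking_call() -> None:
    # BAD: time.sleep() blocks the whole thread, freezing every task on the loop
    time.sleep(0.05)

async def async_safe_call() -> None:
    # GOOD: asyncio.sleep() suspends only this task and lets other tasks run
    await asyncio.sleep(0.05)

async def main() -> None:
    # Both tasks wait concurrently, so this completes in roughly 0.05s, not 0.1s
    await asyncio.gather(async_safe_call(), async_safe_call())

asyncio.run(main())
```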
4 changes: 2 additions & 2 deletions docs/develop/python/schedules.mdx
@@ -42,9 +42,9 @@ This page shows how to do the following:

**How to Schedule a Workflow Execution**

Scheduling Workflows is a crucial aspect of any automation process, especially when dealing with time-sensitive tasks. By scheduling a Workflow, you can automate repetitive tasks, reduce the need for manual intervention, and ensure timely execution of your business processes
Scheduling Workflows is a crucial aspect of any automation process, especially when dealing with time-sensitive tasks. By scheduling a Workflow, you can automate repetitive tasks, reduce the need for manual intervention, and ensure timely execution of your business processes.

Use any of the following action to help Schedule a Workflow Execution and take control over your automation process.
Use any of the following actions to help Schedule a Workflow Execution and take control over your automation process.

### Create a Scheduled Workflow {#create}

3 changes: 0 additions & 3 deletions docs/develop/python/temporal-client.mdx
@@ -294,9 +294,6 @@ overrides the option set in the configuration file.
For example, the following TOML configuration file defines a `staging` profile with the necessary connection options to
connect to Temporal Cloud via an API key:

For example, the following TOML configuration file defines a `staging` profile with the necessary connection options to
connect to Temporal Cloud via an API key:

```toml
# Cloud profile for Temporal Cloud
[profile.staging]
4 changes: 2 additions & 2 deletions docs/develop/python/versioning.mdx
@@ -117,7 +117,7 @@ Patching is a three-step process:

1. Patch in any new, updated code using the `patched()` function. Run the new patched code alongside old code.
2. Remove old code and use `deprecate_patch()` to mark a particular patch as deprecated.
3. Once there are no longer any open Worklow Executions of the previous version of the code, remove `deprecate_patch()`.
3. Once there are no longer any open Workflow Executions of the previous version of the code, remove `deprecate_patch()`.
Let's walk through this process in sequence.

### Patching in new code {#using-patched-for-workflow-history-markers}
@@ -260,4 +260,4 @@ This method also does not provide a way to version any still-running Workflows -

### Testing a Workflow for replay safety

To determine whether your Workflow your needs a patch, or that you've patched it successfully, you should incorporate [Replay Testing](/develop/python/testing-suite#replay).
To determine whether your Workflow needs a patch, or that you've patched it successfully, you should incorporate [Replay Testing](/develop/python/testing-suite#replay).
2 changes: 1 addition & 1 deletion docs/develop/python/worker-versioning-legacy.mdx
@@ -23,7 +23,7 @@ See the [Pre-release README](https://github.com/temporalio/temporal/blob/main/do
:::

A Build ID corresponds to a deployment. If you don't already have one, we recommend a hash of the code--such as a Git SHA--combined with a human-readable timestamp.
To use Worker Versioning, you need to pass a Build ID to your Java Worker and opt in to Worker Versioning.
To use Worker Versioning, you need to pass a Build ID to your Python Worker and opt in to Worker Versioning.

### Assign a Build ID to your Worker and opt in to Worker Versioning
