
Commit 9c31624

Zhangxunmt, kolchfa-aws, and natebower authored
remove experimental for batch_predict and deprecate the async batch ingestion API in ml-commons (#9768)
* remove experimental for batch_predict and deprecate the async batch ingestion API in ml-commons

* address comments

* add warning tag

* Update _ml-commons-plugin/remote-models/async-batch-ingestion.md (co-authored by kolchfa-aws)

* Update _ml-commons-plugin/api/async-batch-ingest.md (co-authored by kolchfa-aws)

* Update _ml-commons-plugin/remote-models/async-batch-ingestion.md (co-authored by kolchfa-aws)

* Update _ml-commons-plugin/api/async-batch-ingest.md (co-authored by kolchfa-aws)

* Update _ml-commons-plugin/remote-models/async-batch-ingestion.md (co-authored by Nathan Bower)

* Update _ml-commons-plugin/api/async-batch-ingest.md (co-authored by Nathan Bower)

Signed-off-by: Xun Zhang <[email protected]>
Co-authored-by: kolchfa-aws <[email protected]>
Co-authored-by: Nathan Bower <[email protected]>
1 parent 2082dae commit 9c31624

File tree

3 files changed: +12, -7 lines

_ml-commons-plugin/api/async-batch-ingest.md

Lines changed: 6 additions & 2 deletions

@@ -8,8 +8,12 @@ nav_order: 35
 ---
 
 # Asynchronous batch ingestion
-**Introduced 2.17**
-{: .label .label-purple }
+**Deprecated 3.0**
+{: .label .label-red }
+
+This feature is deprecated. For similar functionality, use [OpenSearch Data Prepper]({{site.url}}{{site.baseurl}}/data-prepper/). If you'd like to see this feature reinstated, [create an issue](https://github.com/opensearch-project/ml-commons/issues) in the ML Commons repository.
+{: .warning}
 
 Use the Asynchronous Batch Ingestion API to ingest data into your OpenSearch cluster from your files on remote file servers, such as Amazon Simple Storage Service (Amazon S3) or OpenAI. For detailed configuration steps, see [Asynchronous batch ingestion]({{site.url}}{{site.baseurl}}/ml-commons-plugin/remote-models/async-batch-ingestion/).
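
For reference, a request to this now-deprecated API looks roughly like the following sketch, based on the `POST /_plugins/_ml/_batch_ingestion` endpoint documented on the page being edited. The index name, JSONPath field mappings, credential keys, and S3 URI are illustrative placeholders, not values from this commit:

```json
POST /_plugins/_ml/_batch_ingestion
{
  "index_name": "my-nlp-index",
  "field_map": {
    "chapter": "$.content[0]",
    "chapter_embedding": "$.SageMakerOutput[0]",
    "_id": "$.id"
  },
  "credential": {
    "region": "us-east-1",
    "access_key": "<your-access-key>",
    "secret_key": "<your-secret-key>"
  },
  "data_source": {
    "type": "s3",
    "source": ["s3://offlinebatch/output/sagemaker_batch.json.out"]
  }
}
```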

_ml-commons-plugin/api/model-apis/batch-predict.md

Lines changed: 0 additions & 3 deletions

@@ -8,9 +8,6 @@ nav_order: 65
 
 # Batch predict
 
-This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/ml-commons/issues/2488).
-{: .warning}
-
 ML Commons can perform inference on large datasets in an offline asynchronous mode using a model deployed on external model servers. To use the Batch Predict API, you must provide the `model_id` for an externally hosted model. Amazon SageMaker, Cohere, and OpenAI are currently the only verified external servers that support this API.
 
 For information about user access for this API, see [Model access control considerations]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/index/#model-access-control-considerations).
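
As a quick orientation for the feature this commit promotes to GA, a minimal Batch Predict call looks roughly like the following sketch of the documented `POST /_plugins/_ml/models/<model_id>/_batch_predict` endpoint. The model ID and the `parameters` body are placeholders; the exact parameters depend on the external provider (here an OpenAI-style embedding model is assumed):

```json
POST /_plugins/_ml/models/<model_id>/_batch_predict
{
  "parameters": {
    "model": "text-embedding-ada-002"
  }
}
```

Per the docs, the call runs asynchronously, so the response identifies a task whose status and results you retrieve separately rather than returning predictions inline.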

_ml-commons-plugin/remote-models/async-batch-ingestion.md

Lines changed: 6 additions & 2 deletions

@@ -8,8 +8,12 @@ grand_parent: Integrating ML models
 
 
 # Asynchronous batch ingestion
-**Introduced 2.17**
-{: .label .label-purple }
+**Deprecated 3.0**
+{: .label .label-red }
+
+This feature is deprecated. For similar functionality, use [OpenSearch Data Prepper]({{site.url}}{{site.baseurl}}/data-prepper/). If you'd like to see this feature reinstated, [create an issue](https://github.com/opensearch-project/ml-commons/issues) in the ML Commons repository.
+{: .warning}
 
 [Batch ingestion]({{site.url}}{{site.baseurl}}/ml-commons-plugin/remote-models/batch-ingestion/) configures an ingest pipeline, which processes documents one by one. For each document, batch ingestion calls an externally hosted model to generate text embeddings from the document text and then ingests the document, including text and embeddings, into an OpenSearch index.
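
For contrast with the deprecated asynchronous API, the per-document flow described in the context paragraph above is driven by an ingest pipeline. A minimal sketch using the `text_embedding` processor might look like the following; the pipeline name, model ID, and field names are hypothetical placeholders:

```json
PUT /_ingest/pipeline/nlp-ingest-pipeline
{
  "description": "Generate embeddings from document text using an externally hosted model",
  "processors": [
    {
      "text_embedding": {
        "model_id": "<externally-hosted-model-id>",
        "field_map": {
          "text": "passage_embedding"
        }
      }
    }
  ]
}
```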
