Adds warning to Create inference API page (elastic#118073)
kosabogi authored Dec 5, 2024
1 parent 1fecab1 commit 9d35053
Showing 1 changed file with 8 additions and 1 deletion.
docs/reference/inference/put-inference.asciidoc
@@ -10,7 +10,6 @@ Creates an {infer} endpoint to perform an {infer} task.
* For built-in models and models uploaded through Eland, the {infer} APIs offer an alternative way to use and manage trained models. However, if you do not plan to use the {infer} APIs to use these models or if you want to use non-NLP models, use the <<ml-df-trained-models-apis>>.
====


[discrete]
[[put-inference-api-request]]
==== {api-request-title}
@@ -47,6 +46,14 @@ Refer to the service list in the <<put-inference-api-desc,API description section>>.

The create {infer} API enables you to create an {infer} endpoint and configure a {ml} model to perform a specific {infer} task.
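As a minimal sketch of such a request (the `sparse_embedding` task type, the `elser` service, the endpoint ID `my-elser-endpoint`, and the allocation settings are illustrative; substitute the service and settings you need):

[source,console]
------------------------------------------------------------
PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}
------------------------------------------------------------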

[IMPORTANT]
====
* When you create an {infer} endpoint, the associated {ml} model is automatically deployed if it is not already running.
* After creating the endpoint, wait for the model deployment to complete before using it. You can verify the deployment status by using the <<get-trained-models-stats, Get trained model statistics>> API. In the response, look for `"state": "fully_allocated"` and ensure that the `"allocation_count"` matches the `"target_allocation_count"`.
* Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
====
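The status check described above can be sketched as follows (the model ID `.elser_model_2` is illustrative; use the ID of the model backing your endpoint):

[source,console]
------------------------------------------------------------
GET _ml/trained_models/.elser_model_2/_stats
------------------------------------------------------------

In the response, the `deployment_stats.allocation_status` object reports the `state` along with the `allocation_count` and `target_allocation_count` values to compare.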


The following services are available through the {infer} API.
You can find the available task types next to the service name.
Click the links to review the configuration details of the services:
