Streaming
When streaming is enabled, the endpoint emits “answer” events (the answer of the model) and an “endpoint_response” event (the full response of the endpoint).
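As a rough sketch of how a client might consume these events, assuming the stream is delivered as Server-Sent Events and the endpoint lives at a hypothetical path such as /agents/{agent_id}/query (the URL, field names, and header values below are placeholders, not confirmed by this reference):

import json
import requests  # third-party: pip install requests

# Hypothetical endpoint URL and token; substitute your real values.
url = "https://api.example.com/agents/AGENT_ID/query"
headers = {"Authorization": "Bearer YOUR_AUTH_TOKEN"}

# Enable streaming in the request body (field name "stream" is an assumption).
payload = {"query": "What are your opening hours?", "stream": True}

with requests.post(url, json=payload, headers=headers, stream=True) as resp:
    resp.raise_for_status()
    event = None
    for raw in resp.iter_lines(decode_unicode=True):
        if not raw:
            event = None  # a blank line ends one SSE message
            continue
        if raw.startswith("event:"):
            event = raw[len("event:"):].strip()
        elif raw.startswith("data:"):
            data = raw[len("data:"):].strip()
            if event == "answer":
                print(data, end="", flush=True)   # incremental model answer
            elif event == "endpoint_response":
                full = json.loads(data)           # full response of the endpoint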
Authorizations
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
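For example, in Python the headers could be built as follows (the token value is a placeholder):

headers = {
    "Authorization": "Bearer YOUR_AUTH_TOKEN",  # replace with your auth token
    "Content-Type": "application/json",
}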
Path Parameters
ID of the agent
Body
application/json
This is the query you want to ask your agent.
ID of the conversation (if not provided, a new conversation is created)
ID of the participant sending the query (if not provided, a new ID is created)
Temperature of the model (min 0.0, max 1.0)
Enable streaming
Set the prompt type for this query
Available options: raw, customer_support
Set the prompt template for this query
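Putting the body parameters together, a minimal non-streaming request might look like the sketch below. The JSON field names (query, conversation_id, participant_id, temperature, stream, prompt_type, prompt_template) and the endpoint path are assumptions based on the descriptions above, not confirmed names.

import requests

url = "https://api.example.com/agents/AGENT_ID/query"  # hypothetical path
headers = {"Authorization": "Bearer YOUR_AUTH_TOKEN"}

payload = {
    "query": "How do I reset my password?",  # the question for your agent
    "conversation_id": None,                 # omit to start a new conversation
    "participant_id": None,                  # omit to have a new ID created
    "temperature": 0.2,                      # between 0.0 and 1.0
    "stream": False,                         # disable streaming for this call
    "prompt_type": "customer_support",       # "raw" or "customer_support"
    "prompt_template": None,                 # optional custom prompt template
}

# Drop unset optional fields before sending.
payload = {k: v for k, v in payload.items() if v is not None}

resp = requests.post(url, json=payload, headers=headers)
resp.raise_for_status()
print(resp.json())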