POST /agents/query/{id}
This endpoint lets you send a query to a specific agent.
curl --request POST \
  --url https://api.chaindesk.ai/agents/query/{id} \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
  "query": "<string>",
  "conversationId": "<string>",
  "visitorId": "<string>",
  "temperature": 123,
  "streaming": true,
  "promptType": "raw",
  "promptTemplate": "<string>",
  "filters": {
    "custom_ids": [
      "<string>"
    ],
    "datasource_ids": [
      "<string>"
    ]
  }
}'
{
  "answer": "<string>",
  "conversationId": "<string>",
  "visitorId": "<string>",
  "sources": [
    {}
  ]
}
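The cURL call above can also be sketched in TypeScript. `buildQueryRequest` below is an illustrative helper, not part of any Chaindesk SDK; it just assembles the URL, headers, and JSON body described on this page:

```typescript
// Sketch: build the request for POST /agents/query/{id}.
// buildQueryRequest is an illustrative helper, not part of the Chaindesk API or SDK.
interface AgentQueryBody {
  query: string;
  conversationId?: string;
  visitorId?: string;
  temperature?: number;
  streaming?: boolean;
}

function buildQueryRequest(agentId: string, token: string, body: AgentQueryBody) {
  return {
    url: `https://api.chaindesk.ai/agents/query/${agentId}`,
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(body),
    },
  };
}

// Usage (performs a real HTTP request; requires a valid token and agent ID):
// const { url, init } = buildQueryRequest('<agent-id>', '<token>', { query: 'Hello!' });
// const res = await fetch(url, init);
// const { answer, conversationId } = await res.json();
```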

Streaming

When streaming is enabled, the endpoint emits two event types: "answer" (chunks of the model's answer) and "endpoint_response" (the full response of the endpoint).
import {
  EventStreamContentType,
  fetchEventSource,
} from '@microsoft/fetch-event-source';

let buffer = '';
let bufferEndpointResponse = '';
const ctrl = new AbortController();

await fetchEventSource(queryAgentURL, {
  method: 'POST',
  headers: {
    Authorization: 'Bearer <token>',
    'Content-Type': 'application/json',
  },
  signal: ctrl.signal,
  body: JSON.stringify({
    streaming: true,
    query,
    conversationId,
    visitorId,
  }),

  async onopen(response) {
    if (response.status === 402) {
      // ApiError / ApiErrorType are application-specific; substitute your own error handling
      throw new ApiError(ApiErrorType.USAGE_LIMIT);
    }
  },
  onmessage: (event) => {
    if (event.data === '[DONE]') {
      // End of stream: close the connection and parse the full endpoint response
      ctrl.abort();

      try {
        const { sources, conversationId, visitorId } = JSON.parse(
          bufferEndpointResponse
        ) as ChatResponse;
      } catch {}
    } else if (event.data?.startsWith('[ERROR]')) {
      // Handle server-side error payloads
    } else if (event.event === 'endpoint_response') {
      bufferEndpointResponse += event.data;
    } else if (event.event === 'answer') {
      // Append the streamed answer chunk
      buffer += event.data;
    }
  },
});

Authorizations

Authorization
string, header, required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.

Path Parameters

id
string, required

ID of the agent

Body

application/json
query
string, required

This is the query you want to ask your agent.

conversationId
string

ID of the conversation (if not provided, a new conversation is created)

visitorId
string

ID of the participant that's sending the query (if not provided, a new ID is created)

temperature
number

Temperature of the model (min 0.0, max 1.0)

streaming
boolean

Enable streaming

promptType
enum<string>

Set the prompt type for this query

Available options: raw, customer_support

promptTemplate
string

Set the prompt template for this query

filters
object

Filters restricting which datasource chunks are searched, as shown in the request body: custom_ids (string[]) and datasource_ids (string[])
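As a sketch, a request body that uses filters to restrict retrieval to specific datasources might look like this (the IDs below are placeholders, not real values):

```typescript
// Illustrative request body: restrict retrieval to specific datasources/chunks.
const body = {
  query: 'What is your refund policy?',
  filters: {
    // Placeholder IDs; substitute your own datasource and custom IDs
    datasource_ids: ['<datasource-id>'],
    custom_ids: ['<custom-id>'],
  },
};
```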

Response

Success

answer
string

The answer of the agent.

conversationId
string

ID of the conversation

visitorId
string

ID of the participant that's sending the query

sources
object[]

Datasource chunks that were used to generate the answer