Chats
The /chats route allows you to create conversational search experiences using LLM technology.
The /chats route enables AI-powered conversational search by integrating Large Language Models (LLMs) with your Meilisearch data. This feature allows users to ask questions in natural language and receive contextual answers based on your indexed content.
This is an experimental feature. Use the Meilisearch Cloud UI or the experimental features endpoint to activate it.
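As a sketch, the activation request against the experimental features endpoint can be built as follows. The `chatCompletions` flag name and the `/experimental-features` path are assumptions here; verify them against the experimental features documentation for your Meilisearch version.

```python
import json

# Hypothetical values: adjust the host and key for your deployment.
MEILI_URL = "http://localhost:7700"
API_KEY = "MASTER_KEY"

def build_enable_chat_request():
    """Build the PATCH request that turns on the experimental chat feature."""
    return {
        "method": "PATCH",
        "url": f"{MEILI_URL}/experimental-features",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        # `chatCompletions` is the assumed flag name for this feature.
        "body": json.dumps({"chatCompletions": True}),
    }

request = build_enable_chat_request()
```

Send the resulting request with any HTTP client; the same shape with `False` disables the feature again.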
Chat completions workspace object
Name | Type | Description |
---|---|---|
uid | String | Unique identifier for the chat completions workspace |
Update the chat workspace settings
Configure the LLM provider and settings for a chat workspace.
Path parameters
Name | Type | Description |
---|---|---|
workspace | String | The workspace identifier |
Settings parameters
Name | Type | Description |
---|---|---|
source | String | LLM source: "openAi", "azureOpenAi", "mistral", "gemini", or "vLlm" |
orgId | String | Organization ID for the LLM provider (required for azureOpenAi) |
projectId | String | Project ID for the LLM provider |
apiVersion | String | API version for the LLM provider (required for azureOpenAi) |
deploymentId | String | Deployment ID for the LLM provider (required for azureOpenAi) |
baseUrl | String | Base URL for the provider (required for azureOpenAi and vLlm) |
apiKey | String | API key for the LLM provider (optional for vLlm) |
prompts | Object | Prompts object containing system prompts and other configuration |
The prompts object
Name | Type | Description |
---|---|---|
system | String | A prompt added to the start of the conversation to guide the LLM |
searchDescription | String | A prompt to explain what the internal search function does |
searchQParam | String | A prompt to explain what the q parameter of the search function does and how to use it |
searchIndexUidParam | String | A prompt to explain what the indexUid parameter of the search function does and how to use it |
Request body
All fields are optional. Only provided fields will be updated.
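A minimal sketch of building a partial settings payload from the tables above. The helper and its validation are illustrative, not part of Meilisearch; since only provided fields are updated, omitted fields are simply left out of the payload.

```python
# Valid values for `source`, per the settings parameters table.
ALLOWED_SOURCES = {"openAi", "azureOpenAi", "mistral", "gemini", "vLlm"}

def build_settings_payload(source, api_key=None, system_prompt=None):
    """Build a partial settings update; omitted fields stay unchanged server-side."""
    if source not in ALLOWED_SOURCES:
        raise ValueError(f"unknown source: {source}")
    payload = {"source": source}
    if api_key is not None:
        # apiKey is write-only: it is accepted here but never echoed in responses.
        payload["apiKey"] = api_key
    if system_prompt is not None:
        payload["prompts"] = {"system": system_prompt}
    return payload

# Example: configure an openAi workspace with a custom system prompt.
payload = build_settings_payload(
    "openAi",
    api_key="sk-...",
    system_prompt="Answer using only the indexed documentation.",
)
```

Send this body in a PATCH request to the workspace settings route to apply the update.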
Response: 200 OK
Returns the updated settings object. Note that apiKey is write-only and will not be returned in the response.
Examples
Chat completions
Create a chat completion using the OpenAI-compatible interface. The endpoint searches relevant indexes and generates responses based on the retrieved content.
Path parameters
Name | Type | Description |
---|---|---|
workspace | String | The chat completion workspace unique identifier (uid) |
Request body
Name | Type | Required | Description |
---|---|---|---|
model | String | Yes | Model to use; must correspond to the source LLM configured in the workspace settings |
messages | Array | Yes | Array of message objects with role and content |
stream | Boolean | No | Enable streaming responses (default: true) |
Currently, only streaming responses (stream: true) are supported. Non-streaming responses will be available in a future release.
Message object
Name | Type | Description |
---|---|---|
role | String | Message role: "system", "user", or "assistant" |
content | String | Message content |
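Putting the request body and message object tables together, a request body can be sketched like this. The model name `gpt-3.5-turbo` is a placeholder; use whatever model your workspace's configured source offers.

```python
def build_chat_request(model, user_message, history=None):
    """Build a streaming body for POST /chats/{workspace}/chat/completions."""
    messages = list(history or [])
    messages.append({"role": "user", "content": user_message})
    return {
        "model": model,    # must match a model offered by the workspace's source LLM
        "messages": messages,
        "stream": True,    # only streaming responses are currently supported
    }

body = build_chat_request(
    "gpt-3.5-turbo",  # hypothetical model name
    "How do I filter search results by facet?",
    history=[{"role": "system", "content": "You are a documentation assistant."}],
)
```

Passing the previous turn's messages in `history` is how context is carried across requests, since the endpoint itself is stateless.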
Response
The response follows the OpenAI chat completions format. For streaming responses, the endpoint returns Server-Sent Events (SSE).
Streaming response example
Example
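The exact chunk payloads depend on the configured provider, but the SSE framing itself (`data: <json>` lines, terminated by `data: [DONE]` in the OpenAI-compatible format) can be parsed with a sketch like this:

```python
import json

def parse_sse_stream(lines):
    """Yield decoded chunk objects from an OpenAI-style SSE stream."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            return
        yield json.loads(data)

# Hand-written chunks for illustration; real ones come from the endpoint.
stream = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(c["choices"][0]["delta"]["content"] for c in parse_sse_stream(stream))
print(text)  # Hello
```

In a real client, `lines` would be the response body read line by line; chunks carrying tool calls instead of `content` deltas need their own branch.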
Get chat settings
Retrieve the current settings for a chat workspace.
Path parameters
Name | Type | Description |
---|---|---|
workspace | String | The workspace identifier |
Response: 200 OK
Returns the settings object without the apiKey field.
Example
List chat workspaces
List all available chat workspaces. Results can be paginated using query parameters.
Query parameters
Query parameter | Description | Default value |
---|---|---|
offset | Number of workspaces to skip | 0 |
limit | Number of workspaces to return | 20 |
Response
Name | Type | Description |
---|---|---|
results | Array | An array of chat workspace objects |
offset | Integer | Number of workspaces skipped |
limit | Integer | Number of workspaces returned |
total | Integer | Total number of workspaces |
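Given the `offset`/`limit`/`total` fields above, a client can page through all workspaces. This sketch uses a stubbed fetch function in place of real GET /chats calls:

```python
def list_all_workspaces(fetch_page, limit=20):
    """Collect every workspace by advancing offset until total is reached."""
    results, offset = [], 0
    while True:
        page = fetch_page(offset=offset, limit=limit)
        results.extend(page["results"])
        offset += page["limit"]
        if offset >= page["total"] or not page["results"]:
            return results

# Stub standing in for GET /chats?offset=..&limit=..
def fake_fetch(offset, limit):
    all_ws = [{"uid": f"ws-{i}"} for i in range(45)]
    return {
        "results": all_ws[offset:offset + limit],
        "offset": offset,
        "limit": limit,
        "total": len(all_ws),
    }

workspaces = list_all_workspaces(fake_fetch)
print(len(workspaces))  # 45
```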
Example
Response: 200 OK
Authentication
The chat feature integrates with Meilisearch's authentication system:
- Default Chat API Key: A new default key is created when chat is enabled, with permissions to access chat endpoints
- Tenant tokens: Fully supported for multi-tenant applications
- Index visibility: Chat searches only indexes accessible with the provided API key
Tool calling
The chat feature uses internal tool calling to search your indexes and enhance the user experience. For optimal performance, declare three special tools in your chat completion requests. These tools are handled internally by Meilisearch and provide real-time feedback about search operations, conversation context, and source documents.
Overview of special tools
- `_meiliSearchProgress`: Reports real-time search progress and operations
- `_meiliAppendConversationMessage`: Maintains conversation context for better responses
- `_meiliSearchSources`: Provides source documents used in generating responses
Tool declaration
Include these tools in your request's tools array to enable enhanced functionality:
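As a sketch, the three special tools can be declared in the standard OpenAI-style function-tool shape. The parameter schemas below are illustrative, derived from the argument descriptions that follow; Meilisearch intercepts these names and never forwards them to the LLM provider.

```python
def special_tool(name, description, properties):
    """Wrap a Meilisearch special tool in the OpenAI function-tool shape."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {"type": "object", "properties": properties},
        },
    }

MEILI_TOOLS = [
    special_tool(
        "_meiliSearchProgress",
        "Reports real-time progress of internal search operations",
        {"call_id": {"type": "string"},
         "function_name": {"type": "string"},
         "function_parameters": {"type": "string"}},
    ),
    special_tool(
        "_meiliAppendConversationMessage",
        "Requests the client to append a message to the conversation history",
        {"role": {"type": "string"},
         "content": {"type": "string"},
         "tool_calls": {"type": "array"},
         "tool_call_id": {"type": "string"}},
    ),
    special_tool(
        "_meiliSearchSources",
        "Provides the source documents used to generate the response",
        {"call_id": {"type": "string"},
         "documents": {"type": "object"}},
    ),
]
```

Attach `MEILI_TOOLS` as the `tools` array of the chat completions request body.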
Tool functions explained
_meiliSearchProgress
This tool reports real-time progress of internal search operations. When declared, Meilisearch will call this function whenever search operations are performed in the background.
Purpose: Provides transparency about search operations and reduces perceived latency by showing users what's happening behind the scenes.
Arguments:

- `call_id`: Unique identifier to track the search operation
- `function_name`: Name of the internal function being executed (e.g., `_meiliSearchInIndex`)
- `function_parameters`: JSON-encoded string containing search parameters like `q` (query) and `index_uid`
Example Response:
_meiliAppendConversationMessage
Since the /chats/{workspace}/chat/completions endpoint is stateless, this tool helps maintain conversation context by requesting the client to append internal messages to the conversation history.
Purpose: Maintains conversation context for better response quality in subsequent requests by preserving tool calls and results.
Arguments:

- `role`: Message author role ("user" or "assistant")
- `content`: Message content (for tool results)
- `tool_calls`: Array of tool calls made by the assistant
- `tool_call_id`: ID of the tool call this message responds to
Example Response:
_meiliSearchSources
This tool provides the source documents that were used by the LLM to generate responses, enabling transparency and allowing users to verify information sources.
Purpose: Shows users which documents were used to generate responses, improving trust and enabling source verification.
Arguments:

- `call_id`: Matches the `call_id` from `_meiliSearchProgress` to associate queries with results
- `documents`: JSON object containing the source documents with only displayed attributes
Example Response:
Implementation best practices
- Always declare all three tools for the best user experience
- Handle progress updates by displaying search status to users during streaming
- Append conversation messages as requested to maintain context for future requests
- Display source documents to users for transparency and verification
- Use the `call_id` to associate progress updates with their corresponding source results
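The practices above can be sketched as a small client-side dispatcher that routes each intercepted tool call by name and uses `call_id` to pair progress events with their sources. The field names follow the argument lists above; the class itself is a hypothetical helper, not a Meilisearch API.

```python
import json

class ChatToolHandler:
    """Route Meilisearch special tool calls; pair progress and sources by call_id."""

    def __init__(self):
        self.progress = {}  # call_id -> decoded search parameters
        self.sources = {}   # call_id -> source documents
        self.history = []   # messages to replay in the next stateless request

    def handle(self, name, arguments):
        """Dispatch one tool call; `arguments` is the JSON-encoded argument string."""
        args = json.loads(arguments)
        if name == "_meiliSearchProgress":
            # function_parameters is itself a JSON-encoded string.
            self.progress[args["call_id"]] = json.loads(args["function_parameters"])
        elif name == "_meiliAppendConversationMessage":
            self.history.append(args)
        elif name == "_meiliSearchSources":
            self.sources[args["call_id"]] = args["documents"]

handler = ChatToolHandler()
handler.handle("_meiliSearchProgress", json.dumps({
    "call_id": "c1",
    "function_name": "_meiliSearchInIndex",
    "function_parameters": json.dumps({"q": "facets", "index_uid": "docs"}),
}))
handler.handle("_meiliSearchSources", json.dumps({
    "call_id": "c1",
    "documents": {"hits": [{"title": "Faceted search"}]},
}))
print(handler.progress["c1"]["q"])  # facets
```

A UI would display `progress` entries as they arrive during streaming, show `sources` next to the final answer, and send `history` back with the next request.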
These special tools are handled internally by Meilisearch and are not forwarded to the LLM provider. They serve as a communication mechanism between Meilisearch and your application to provide enhanced user experience features.