Streaming delivers chat responses incrementally, giving users immediate feedback instead of waiting for the full response to generate. Meilisearch uses Server-Sent Events (SSE) to stream responses from the chat completions endpoint.
In code examples, replace WORKSPACE_NAME with the name of your workspace. On Meilisearch Cloud, the default workspace name is cloud.

Send a streaming request

Send a POST request to the chat completions endpoint with "stream": true. Non-streaming requests are not yet supported and return a 501 Not Implemented error.
curl -N \
  -X POST 'MEILISEARCH_URL/chats/WORKSPACE_NAME/chat/completions' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "model": "PROVIDER_MODEL_UID",
    "stream": true,
    "messages": [
      {
        "role": "user",
        "content": "What is Meilisearch?"
      }
    ],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "_meiliSearchProgress",
          "description": "Provides information about the current Meilisearch search operation",
          "parameters": {
            "type": "object",
            "properties": {
              "call_id": { "type": "string" },
              "function_name": { "type": "string" },
              "function_parameters": { "type": "string" }
            },
            "required": ["call_id", "function_name", "function_parameters"],
            "additionalProperties": false
          },
          "strict": true
        }
      },
      {
        "type": "function",
        "function": {
          "name": "_meiliSearchSources",
          "description": "Provides sources of the search",
          "parameters": {
            "type": "object",
            "properties": {
              "call_id": { "type": "string" },
              "documents": { "type": "array", "items": { "type": "object" } }
            },
            "required": ["call_id", "documents"],
            "additionalProperties": false
          },
          "strict": true
        }
      },
      {
        "type": "function",
        "function": {
          "name": "_meiliAppendConversationMessage",
          "description": "Append a new message to the conversation based on what happened internally",
          "parameters": {
            "type": "object",
            "properties": {
              "role": { "type": "string" },
              "content": { "type": "string" },
              "tool_calls": { "type": ["array", "null"] },
              "tool_call_id": { "type": ["string", "null"] }
            },
            "required": ["role", "content", "tool_calls", "tool_call_id"],
            "additionalProperties": false
          },
          "strict": true
        }
      }
    ]
  }'
The -N flag in the cURL example disables output buffering, so you see each chunk as it arrives.

Understand the SSE response format

Meilisearch streams responses as Server-Sent Events (SSE) over a persistent HTTP connection. The wire format is deliberately OpenAI-compatible so you can point the official OpenAI SDKs, the Vercel AI SDK, or any other SSE-aware client at the endpoint without custom parsing. Concretely, each event on the wire follows three rules:
  • Every event is a line that begins with the literal prefix data: .
  • The payload after data: is a single JSON object shaped like an OpenAI chat.completion.chunk (same id, object, choices[].delta structure as /v1/chat/completions).
  • The stream terminates with the sentinel line data: [DONE]. The [DONE] marker is a literal string, not JSON, so parsers must check for it before calling JSON.parse.
Events are separated by blank lines. After consuming [DONE], close the reader and treat the connection as complete.
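
The three rules above can be captured in a small helper. This is a sketch, not part of any Meilisearch SDK: it classifies a single SSE line as a JSON chunk, the [DONE] sentinel, or noise to skip.

```javascript
function parseSSELine(line) {
  // Rule 1: only lines with the literal "data: " prefix carry events.
  if (!line.startsWith('data: ')) return { type: 'skip' };
  const payload = line.slice('data: '.length);
  // Rule 3: check for the sentinel before calling JSON.parse.
  if (payload === '[DONE]') return { type: 'done' };
  // Rule 2: everything else is a chat.completion.chunk object.
  return { type: 'chunk', chunk: JSON.parse(payload) };
}
```

A stream consumer can feed each line through this helper and stop reading as soon as it sees the `done` result.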

Content chunks

Regular content chunks contain the AI-generated text. Each chunk includes a small piece of the response in choices[0].delta.content:
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"content":"Meilisearch"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"content":" is"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"content":" a"},"finish_reason":null}]}
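
Rebuilding the full answer is a matter of concatenating each fragment. A minimal sketch using the four chunks above, reduced to the fields that matter here:

```javascript
// The chunks shown above, with only the delta fields kept.
const chunks = [
  { choices: [{ delta: { role: 'assistant', content: '' } }] },
  { choices: [{ delta: { content: 'Meilisearch' } }] },
  { choices: [{ delta: { content: ' is' } }] },
  { choices: [{ delta: { content: ' a' } }] },
];

// Concatenate every delta.content fragment; absent fields default to ''.
let text = '';
for (const chunk of chunks) {
  text += chunk.choices[0]?.delta?.content ?? '';
}
// text is now 'Meilisearch is a'
```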

Tool call chunks

When you include Meilisearch tools in your request, the stream also contains tool call chunks. These appear in choices[0].delta.tool_calls and carry search progress and source information:
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_abc123","type":"function","function":{"name":"_meiliSearchProgress","arguments":""}}]},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\"call_id\":\"abc\",\"function_name\":\"_meiliSearchInIndex\",\"function_parameters\":\"{\\\"index_uid\\\":\\\"movies\\\",\\\"q\\\":\\\"search engine\\\"}\"}"}}]},"finish_reason":null}]}
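
Two details matter when consuming these chunks: arguments arrives as string fragments spread across multiple chunks, and function_parameters inside it is itself a JSON-encoded string that needs a second parse. A sketch of decoding the payload above (the fragment boundary is illustrative):

```javascript
// Argument fragments as they might arrive across two chunks.
const fragments = [
  '{"call_id":"abc","function_name":"_meiliSearchInIndex",',
  '"function_parameters":"{\\"index_uid\\":\\"movies\\",\\"q\\":\\"search engine\\"}"}',
];

// First parse: the concatenated arguments string.
const args = JSON.parse(fragments.join(''));

// Second parse: function_parameters is a nested JSON string.
const params = JSON.parse(args.function_parameters);
// params.index_uid is 'movies', params.q is 'search engine'
```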

End of stream

The stream ends with a finish_reason of "stop" followed by the [DONE] marker:
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"gpt-4o","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]

Handle streaming in JavaScript

Use the Fetch API to process the SSE stream in a browser or Node.js application:
async function streamChat(query) {
  const response = await fetch(
    'MEILISEARCH_URL/chats/WORKSPACE_NAME/chat/completions',
    {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer MEILISEARCH_KEY',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'gpt-4o',
        stream: true,
        messages: [{ role: 'user', content: query }],
        tools: [
          {
            type: 'function',
            function: {
              name: '_meiliSearchProgress',
              description: 'Provides information about the current Meilisearch search operation',
              parameters: {
                type: 'object',
                properties: {
                  call_id: { type: 'string' },
                  function_name: { type: 'string' },
                  function_parameters: { type: 'string' },
                },
                required: ['call_id', 'function_name', 'function_parameters'],
                additionalProperties: false,
              },
              strict: true,
            },
          },
          {
            type: 'function',
            function: {
              name: '_meiliSearchSources',
              description: 'Provides sources of the search',
              parameters: {
                type: 'object',
                properties: {
                  call_id: { type: 'string' },
                  documents: { type: 'array', items: { type: 'object' } },
                },
                required: ['call_id', 'documents'],
                additionalProperties: false,
              },
              strict: true,
            },
          },
          {
            type: 'function',
            function: {
              name: '_meiliAppendConversationMessage',
              description: 'Append a new message to the conversation based on what happened internally',
              parameters: {
                type: 'object',
                properties: {
                  role: { type: 'string' },
                  content: { type: 'string' },
                  tool_calls: { type: ['array', 'null'] },
                  tool_call_id: { type: ['string', 'null'] },
                },
                required: ['role', 'content', 'tool_calls', 'tool_call_id'],
                additionalProperties: false,
              },
              strict: true,
            },
          },
        ],
      }),
    }
  );

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop(); // Keep incomplete line in buffer

    for (const line of lines) {
      if (!line.startsWith('data: ')) continue;

      const data = line.slice(6);
      if (data === '[DONE]') return;

      const chunk = JSON.parse(data);
      const delta = chunk.choices[0]?.delta;

      if (delta?.content) {
        // Append text content to your UI
        process.stdout.write(delta.content);
      }

      if (delta?.tool_calls) {
        // Handle tool calls (search progress, sources)
        for (const toolCall of delta.tool_calls) {
          handleToolCall(toolCall);
        }
      }
    }
  }
}
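
The handleToolCall helper above is left to you. One possible sketch buffers argument fragments per tool-call index and reports a call once its accumulated arguments parse as valid JSON; the return shape here is an assumption for illustration, not part of the Meilisearch API.

```javascript
// One buffer per tool-call index; a chunk carrying an id starts a new call.
const argBuffers = {};

// Returns null while a call's arguments are still streaming, and
// { name, args } once the accumulated fragments form valid JSON.
function handleToolCall(toolCall) {
  const key = toolCall.index ?? 0;
  if (toolCall.id) {
    argBuffers[key] = { name: toolCall.function.name, args: '' };
  }
  const buf = argBuffers[key];
  if (!buf || !toolCall.function?.arguments) return null;
  buf.args += toolCall.function.arguments;
  try {
    return { name: buf.name, args: JSON.parse(buf.args) }; // call complete
  } catch {
    return null; // incomplete JSON: wait for more fragments
  }
}
```

When a call completes, route _meiliSearchProgress to a progress indicator and _meiliSearchSources to a sources panel in your UI.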

Maintain conversation context

The chat completions endpoint is stateless. To maintain conversation history across multiple exchanges, accumulate messages and send the full history with each request.
const messages = [];

async function sendMessage(userMessage) {
  messages.push({ role: 'user', content: userMessage });

  const response = await fetch(
    'MEILISEARCH_URL/chats/WORKSPACE_NAME/chat/completions',
    {
      method: 'POST',
      headers: {
        Authorization: 'Bearer MEILISEARCH_KEY',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'PROVIDER_MODEL_UID',
        stream: true,
        messages,
        tools: MEILISEARCH_TOOLS, // See the complete example in the chat interface guide
      }),
    }
  );

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let assistantMessage = '';
  let buffer = '';
  const pendingToolCalls = {};

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop(); // Keep incomplete line in buffer

    for (const line of lines) {
      if (!line.startsWith('data: ') || line === 'data: [DONE]') continue;
      const delta = JSON.parse(line.slice(6)).choices[0]?.delta;
      if (delta?.content) assistantMessage += delta.content;
      for (const toolCall of delta?.tool_calls ?? []) {
        if (toolCall.id) pendingToolCalls[toolCall.id] = { name: toolCall.function.name, args: '' };
        const pending = toolCall.id ? pendingToolCalls[toolCall.id] : Object.values(pendingToolCalls).at(-1);
        if (pending && toolCall.function?.arguments) pending.args += toolCall.function.arguments;
      }
    }
  }

  for (const call of Object.values(pendingToolCalls)) {
    if (call.name === '_meiliAppendConversationMessage') {
      messages.push(JSON.parse(call.args)); // Preserve search context for follow-ups
    }
  }

  messages.push({ role: 'assistant', content: assistantMessage });
}
For a complete example combining all tools with progress, sources, and history, see the chat interface guide.

Next steps