Getting started with conversational search
Learn how to implement AI-powered conversational search in your application
This guide walks you through implementing Meilisearch’s chat completions feature to create conversational search experiences in your application.
The chat completions feature is experimental and must be enabled before use. See experimental features for activation instructions.
Prerequisites
Before starting, ensure you have:
- A Meilisearch instance running v1.15.1 or later
- An API key from an LLM provider (OpenAI, Azure OpenAI, Mistral, Gemini, or access to a vLLM server)
- At least one index with searchable content
- The chat completions experimental feature enabled
Quick start
Enable the chat completions feature
First, enable the chat completions experimental feature:
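A minimal sketch of what this looks like over HTTP, assuming a local instance at `http://localhost:7700` and a placeholder master key (the helper name `build_enable_request` is ours, not part of any SDK):

```python
# Sketch: enable the chatCompletions experimental feature via
# PATCH /experimental-features. Assumes a local instance and a
# placeholder master key.
import json
import urllib.request

MEILI_URL = "http://localhost:7700"
MASTER_KEY = "MASTER_KEY"  # placeholder; use your own master key

def build_enable_request() -> urllib.request.Request:
    """Build (but do not send) the PATCH request that turns the flag on."""
    body = json.dumps({"chatCompletions": True}).encode("utf-8")
    return urllib.request.Request(
        f"{MEILI_URL}/experimental-features",
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {MASTER_KEY}",
            "Content-Type": "application/json",
        },
    )

# To actually send it:
# with urllib.request.urlopen(build_enable_request()) as resp:
#     print(resp.read().decode("utf-8"))
```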
Configure a chat completions workspace
Create a workspace with your LLM provider settings. Here are examples for different providers:
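As a sketch, assuming the settings route is `PATCH /chats/{workspace}/settings` and accepts fields like `source`, `apiKey`, and `prompts` (verify the exact field names against your version's API reference; the helper below is illustrative):

```python
# Sketch: configure a chat workspace. Route and field names are
# assumptions based on the chat settings shape; check your version's
# API reference before relying on them.
import json
import urllib.request

MEILI_URL = "http://localhost:7700"
MASTER_KEY = "MASTER_KEY"  # placeholder master key

def build_workspace_request(workspace: str, settings: dict) -> urllib.request.Request:
    """Build (but do not send) the settings request for one workspace."""
    return urllib.request.Request(
        f"{MEILI_URL}/chats/{workspace}/settings",
        data=json.dumps(settings).encode("utf-8"),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {MASTER_KEY}",
            "Content-Type": "application/json",
        },
    )

# An OpenAI-backed workspace:
openai_settings = {
    "source": "openAi",
    "apiKey": "<your-openai-api-key>",
    "prompts": {"system": "You are a helpful search assistant."},
}

# Switching provider is mostly a matter of changing the source and key,
# e.g. for Mistral:
mistral_settings = {
    **openai_settings,
    "source": "mistral",
    "apiKey": "<your-mistral-api-key>",
}
```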
Send your first chat completions request
Now you can start a conversation:
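A sketch of the request, assuming the OpenAI-compatible route `POST /chats/{workspace}/chat/completions` and that responses are streamed (`stream: true`); the model name is passed through to whichever provider the workspace is configured with:

```python
# Sketch: build an OpenAI-style chat completions request for a
# workspace. Route and streaming requirement are assumptions; confirm
# against your version's API reference.
import json
import urllib.request

MEILI_URL = "http://localhost:7700"
API_KEY = "MASTER_KEY"  # a Meilisearch API key, not an LLM provider key

def build_chat_request(workspace: str, question: str) -> urllib.request.Request:
    """Build (but do not send) a streaming chat completions request."""
    body = {
        "model": "gpt-3.5-turbo",  # forwarded to the configured provider
        "messages": [{"role": "user", "content": question}],
        "stream": True,  # the endpoint streams its responses
    }
    return urllib.request.Request(
        f"{MEILI_URL}/chats/{workspace}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# To send it and read the raw event stream:
# req = build_chat_request("support", "How do I reset my password?")
# with urllib.request.urlopen(req) as resp:
#     for line in resp:
#         print(line.decode("utf-8"), end="")
```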
Understanding workspaces
Workspaces allow you to create isolated chat configurations for different use cases:
- Customer support: Configure with support-focused prompts
- Product search: Optimize for e-commerce queries
- Documentation: Tune for technical Q&A
Each workspace maintains its own:
- LLM provider configuration
- System prompt
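As an illustration of that isolation, the same settings shape can carry a different system prompt per workspace; the workspace names and prompt texts below are made up for this example:

```python
# Illustrative per-use-case workspace configurations; names and
# prompts are invented for this sketch.
workspaces = {
    "customer-support": {
        "prompts": {
            "system": "You are a support agent. Answer from the help-center "
                      "index and suggest escalation when unsure.",
        },
    },
    "product-search": {
        "prompts": {
            "system": "You help shoppers find products. Keep answers concise "
                      "and include product names.",
        },
    },
    "documentation": {
        "prompts": {
            "system": "You answer technical questions from the documentation "
                      "index, quoting exact option names.",
        },
    },
}
```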
Building a chat interface with OpenAI SDK
Since Meilisearch’s chat endpoint is OpenAI-compatible, you can use the official OpenAI SDK:
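A minimal sketch with the official OpenAI Python SDK, assuming the base URL takes the form `http://localhost:7700/chats/{workspace}` (check the exact path for your version) and that the API key is a Meilisearch key:

```python
# Sketch: point the official OpenAI SDK (pip install openai) at a
# Meilisearch chat workspace. The base_url shape is an assumption;
# verify it against your version's documentation.
def chat_with_meilisearch(workspace: str, question: str) -> None:
    from openai import OpenAI  # deferred so the sketch loads without the SDK

    client = OpenAI(
        base_url=f"http://localhost:7700/chats/{workspace}",
        api_key="MASTER_KEY",  # a Meilisearch key, not an OpenAI key
    )
    stream = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
        stream=True,
    )
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)

# Usage (requires a running instance with a configured workspace):
# chat_with_meilisearch("support", "How do I reset my password?")
```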
Error handling
When you use the official OpenAI SDK against Meilisearch’s chat completions endpoint, errors in the streamed responses surface through the SDK’s built-in exception types, so no additional configuration is required:
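For example, a sketch that wraps a streamed call in the SDK’s standard exception types (`openai.APIConnectionError`, `openai.APIStatusError`); the base URL shape and function name are our assumptions:

```python
# Sketch: catch the OpenAI SDK's standard exceptions around a streamed
# chat completions call to Meilisearch. base_url shape is an assumption.
def safe_chat(workspace: str, question: str) -> str:
    import openai  # deferred so the sketch loads without the SDK

    client = openai.OpenAI(
        base_url=f"http://localhost:7700/chats/{workspace}",
        api_key="MASTER_KEY",  # a Meilisearch key
    )
    try:
        stream = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": question}],
            stream=True,
        )
        # Concatenate the streamed deltas into one answer string.
        return "".join(chunk.choices[0].delta.content or "" for chunk in stream)
    except openai.APIConnectionError as err:
        return f"Could not reach the server: {err}"
    except openai.APIStatusError as err:
        return f"Server returned status {err.status_code}"
```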
Next steps
- Explore advanced chat API features
- Learn about conversational search concepts
- Review security best practices