This is currently only possible in the self-hosted version of Rowboat.
Using custom LLM providers
By default, Rowboat uses OpenAI LLMs (gpt-4o, gpt-4.1, etc.) for both agents and copilot when you export your OPENAI_API_KEY. However, you can also configure custom LLM providers (e.g., LiteLLM, OpenRouter) to use any of the hundreds of available LLMs beyond OpenAI, such as Claude, DeepSeek, or Ollama-served models.

1. Set up your LLM provider
Configure your environment variables to point to your preferred LLM backend. Example using LiteLLM:

Notes:
- Rowboat uses gpt-4.1 as the default model for agents and copilot. You can override these.
- Copilot is optimized for gpt-4o/gpt-4.1. We strongly recommend using these models for best results.
- You can use different models for the copilot and each agent, but all must be from the same provider (e.g., LiteLLM).
- Rowboat is provider-agnostic: any backend implementing the OpenAI messages format should work.
- OpenAI-specific tools (like web_search) will not function with non-OpenAI providers. Remove such tools to avoid errors.
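The environment-variable example referenced above was not preserved here. As a sketch, assuming LiteLLM is running as a local proxy and that Rowboat reads provider settings from variables like the ones below (the variable names and port are illustrative; check Rowboat's .env reference for the exact keys):

```shell
# Illustrative only: point Rowboat at a LiteLLM proxy instead of OpenAI.
# Variable names are assumptions; confirm the exact keys in Rowboat's docs.
export PROVIDER_BASE_URL=http://localhost:4000/   # LiteLLM proxy endpoint
export PROVIDER_API_KEY=sk-your-proxy-key         # key your proxy expects
export PROVIDER_DEFAULT_MODEL=gpt-4.1             # default model for agents and copilot
```

Because LiteLLM exposes an OpenAI-compatible endpoint, the same pattern works for other compatible backends by swapping the base URL and model name.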
2. Clone the repository and start Rowboat Docker
Clone the Rowboat repo and spin it up locally:
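The commands for this step did not survive extraction. A typical sequence, assuming the public rowboatlabs/rowboat repository and a Docker Compose setup:

```shell
# Repository path and compose invocation are assumptions; adjust to the
# official Rowboat README if they differ.
git clone https://github.com/rowboatlabs/rowboat.git
cd rowboat
export OPENAI_API_KEY=your-openai-key   # or your custom-provider settings
docker-compose up --build
```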
3. Access the app
Once Docker is running, navigate to: http://localhost:3000
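For reference, the "OpenAI messages format" mentioned in the notes is the chat-completions request shape: a model name plus a list of role/content messages. A minimal sketch of the JSON body that any compatible backend (LiteLLM, OpenRouter, etc.) accepts at its chat-completions endpoint; the model name here is only an example:

```python
import json

def build_chat_request(model: str, user_text: str,
                       system_text: str = "You are a helpful agent.") -> str:
    """Build a chat-completions request body in the OpenAI messages format."""
    body = {
        "model": model,  # e.g. "gpt-4.1" or any provider-routed model name
        "messages": [
            {"role": "system", "content": system_text},
            {"role": "user", "content": user_text},
        ],
    }
    return json.dumps(body)

payload = build_chat_request("gpt-4.1", "Hello!")
print(payload)
```

Because every provider listed above speaks this same shape, switching providers is a matter of changing the base URL and model name, not the request structure.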