This is currently only possible in the self-hosted version of Rowboat.

Using custom LLM providers

By default, Rowboat uses OpenAI LLMs (gpt-4o, gpt-4.1, etc.) for both agents and copilot when you export your OPENAI_API_KEY. However, you can also configure a custom LLM provider (e.g., LiteLLM, OpenRouter) to use any of the hundreds of available LLMs beyond OpenAI, such as Claude, DeepSeek, or Ollama-served models.
1. Set up your LLM provider

Configure your environment variables to point to your preferred LLM backend. Example using LiteLLM:
export PROVIDER_BASE_URL=http://host.docker.internal:4000/
export PROVIDER_API_KEY=sk-1234
Rowboat uses gpt-4.1 as the default model for agents and copilot. You can override these:
export PROVIDER_DEFAULT_MODEL=claude-3-7-sonnet-latest
export PROVIDER_COPILOT_MODEL=gpt-4o
Notes:
  • Copilot is optimized for gpt-4o/gpt-4.1. We strongly recommend using these models for best results.
  • You can use different models for the copilot and each agent, but all must be from the same provider (e.g., LiteLLM).
  • Rowboat is provider-agnostic — any backend implementing the OpenAI messages format should work.
  • OpenAI-specific tools (like web_search) will not function with non-OpenAI providers. Remove such tools to avoid errors.
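Since any OpenAI-compatible backend should work, you can sanity-check your provider before starting Rowboat. The sketch below assumes the LiteLLM proxy from the example above is running on port 4000 and serves a model named claude-3-7-sonnet-latest (adjust both to your setup):

```shell
# Hypothetical sanity check: send a minimal OpenAI-format chat completion
# request to the provider configured above. If the proxy is not running,
# the fallback message is printed instead of a response.
curl -s http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer $PROVIDER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"claude-3-7-sonnet-latest","messages":[{"role":"user","content":"ping"}]}' \
  || echo "provider not reachable -- start the proxy first"
```

A JSON response with a `choices` array indicates the backend speaks the OpenAI messages format that Rowboat expects.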
2. Clone the repository and start Rowboat Docker

Clone the Rowboat repo and spin it up locally:
git clone git@github.com:rowboatlabs/rowboat.git
cd rowboat
docker-compose up --build
3. Access the app

Once Docker is running, navigate to: http://localhost:3000
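If you prefer to check from the terminal that the app is up, a quick probe like the following works (assuming the default port 3000; the containers may take a minute to finish starting):

```shell
# Print the HTTP status code from the local Rowboat app; prints 000 and a
# hint if nothing is listening on port 3000 yet.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000 \
  || echo "app not reachable yet -- wait for docker-compose to finish"
```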