
Connect to OpenAI, OpenRouter, and Ollama; the list of available models is synced automatically from each provider's API. You can also use GGUF models from Hugging Face.

Use this section to connect to OpenAI-compatible providers such as OpenAI and OpenRouter.
Configure the provider, add your API key, and select your preferred default model.
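The reason one configuration flow covers multiple providers is that they all expose the same request shape. The sketch below builds such a request with the standard library only; the base URL, key placeholder, and model name are illustrative, not values the app uses internally.

```python
import json

# Placeholder values -- substitute your provider's base URL and API key.
BASE_URL = "https://api.openai.com"  # e.g. "https://openrouter.ai/api" for OpenRouter
API_KEY = "YOUR_API_KEY"             # never commit a real key

def chat_request(model: str, prompt: str):
    """Build the URL, headers, and JSON body for an OpenAI-compatible
    chat completion call; the same shape works across compatible providers."""
    url = f"{BASE_URL}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = chat_request("gpt-4o-mini", "Hello")
print(url)
```

Because only the base URL and key change between providers, switching providers does not change the request code.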

Connect to a local Ollama instance. Enter the base URL if it differs from the default, then click Test connection to validate and save the configuration.
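A connection test like this can be done with a single request: Ollama serves a REST API on port 11434 by default, and `GET /api/tags` (which lists installed models) doubles as a cheap health check. A minimal sketch, assuming the standard Ollama API:

```python
from urllib.request import urlopen
from urllib.error import URLError

DEFAULT_BASE_URL = "http://localhost:11434"  # Ollama's default address

def tags_url(base_url: str) -> str:
    """GET /api/tags lists locally installed models, so it doubles
    as a lightweight connection test."""
    return base_url.rstrip("/") + "/api/tags"

def check_connection(base_url: str = DEFAULT_BASE_URL) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urlopen(tags_url(base_url), timeout=5) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False
```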

Browse available models and download them locally so they are ready to use.
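Under the hood, downloading a model maps to Ollama's `POST /api/pull` endpoint. A sketch of the request it builds, assuming the standard Ollama API (the model name is an example):

```python
import json
from urllib.request import Request

OLLAMA_BASE_URL = "http://localhost:11434"

def pull_request(model: str) -> Request:
    """Build the POST /api/pull request that downloads a model into
    the local Ollama store. stream=False returns one final response
    instead of incremental progress updates."""
    body = json.dumps({"model": model, "stream": False}).encode()
    return Request(f"{OLLAMA_BASE_URL}/api/pull", data=body,
                   headers={"Content-Type": "application/json"})

req = pull_request("llama3.2")  # example model name
print(req.full_url)
```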

Remove models you no longer need to free up disk space.
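Removal is the mirror image of downloading: Ollama exposes `DELETE /api/delete`, which unregisters the model and frees its disk space. A sketch of that request, again assuming the standard Ollama API:

```python
import json
from urllib.request import Request

OLLAMA_BASE_URL = "http://localhost:11434"

def delete_request(model: str) -> Request:
    """Build the DELETE /api/delete request that removes a local model
    and reclaims the disk space it occupied (model name is an example)."""
    body = json.dumps({"model": model}).encode()
    return Request(f"{OLLAMA_BASE_URL}/api/delete", data=body,
                   headers={"Content-Type": "application/json"},
                   method="DELETE")

req = delete_request("llama3.2")
print(req.get_method())
```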
Download GGUF models from Hugging Face (see the GGUF models catalog). To use one with Ollama, create a model from the GGUF file via a Modelfile. These models can be large, so make sure you have enough disk space before downloading.
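A Modelfile for this purpose only needs a `FROM` line pointing at the downloaded file; the filename below is an example, not a specific model you must use.

```
# Modelfile -- point Ollama at a locally downloaded GGUF file (example path)
FROM ./llama-3.2-1b-instruct-q4_k_m.gguf
```

Then register and run it with `ollama create my-gguf-model -f Modelfile` followed by `ollama run my-gguf-model` (the model name here is also just an example).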