CodeBuddy Configuration (models.json)

This page describes how to configure models.json in CodeBuddy CLI to control which models are defined and which appear in the model dropdown.

Overview

models.json lets you customize the model list and control what appears in the dropdown. It supports two levels:

  • User-level: ~/.codebuddy/models.json (global, applies to all projects)
  • Project-level: <workspace>/.codebuddy/models.json (project-specific, higher priority)

File locations

User-level

~/.codebuddy/models.json

Project-level

<project-root>/.codebuddy/models.json

Precedence

Merge order from high to low:

  1. Project-level models.json
  2. User-level models.json
  3. Built-in defaults

Project-level config overrides models with the same id. availableModels at the project level fully overrides the user-level list (no merge).
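
As an illustration, assume a user-level file that sets a model's temperature and a project-level file that redefines the same id (the values here are illustrative, not shipped defaults):

~/.codebuddy/models.json:

```json
{
  "models": [{ "id": "claude-sonnet-4.5", "temperature": 0.7 }]
}
```

<project-root>/.codebuddy/models.json:

```json
{
  "models": [{ "id": "claude-sonnet-4.5", "temperature": 0.2 }]
}
```

After merging, claude-sonnet-4.5 runs with temperature 0.2: the project-level entry replaces the user-level one because the ids match.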

Configuration structure

```json
{
  "models": [
    {
      "id": "claude-sonnet-4.5",
      "name": "Claude Sonnet 4.5",
      "vendor": "OPE.AI",
      "apiKey": "sk-xxxx",
      "maxInputTokens": 200000,
      "maxOutputTokens": 8192,
      "url": "https://api.platform.ope.ai/v1/chat/completions",
      "temperature": 0.7,
      "supportsToolCall": true,
      "supportsImages": true,
      "supportsReasoning": true
    }
  ],
  "availableModels": ["claude-sonnet-4.5", "claude-haiku-4.5", "claude-opus-4.5"]
}
```

Field reference

models

Type: Array<LanguageModel>

Used to add custom models or override built-in ones.

LanguageModel fields

  • id (string, required): unique model identifier
  • name (string): display name
  • vendor (string): provider name (e.g., OpenAI, Google)
  • apiKey (string): API key value
  • maxInputTokens (number): max input tokens
  • maxOutputTokens (number): max output tokens
  • url (string): API endpoint URL (must be a full endpoint path, typically ending with /chat/completions)
  • temperature (number): sampling temperature, range 0-2
  • supportsToolCall (boolean): tool-calling support
  • supportsImages (boolean): image input support
  • supportsReasoning (boolean): reasoning mode support

Notes:

  • Only OpenAI-compatible API format is supported
  • url must be a full endpoint path, typically ending with /chat/completions
  • Example: https://api.openai.com/v1/chat/completions, http://localhost:11434/v1/chat/completions
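
As a sketch of a self-hosted setup, a local OpenAI-compatible server (such as Ollama) could be registered like this; the id, name, token limits, and capability flags below are illustrative assumptions, not values shipped with CodeBuddy:

```json
{
  "models": [
    {
      "id": "llama3-local",
      "name": "Llama 3 (local)",
      "vendor": "Ollama",
      "apiKey": "ollama",
      "url": "http://localhost:11434/v1/chat/completions",
      "maxInputTokens": 8192,
      "maxOutputTokens": 4096,
      "supportsToolCall": false,
      "supportsImages": false,
      "supportsReasoning": false
    }
  ]
}
```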

availableModels

Type: Array<string>

Controls which models appear in the dropdown:

  • Not set or empty: show all models
  • Set: only show IDs listed
  • Can include built-in and custom model IDs
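
For instance, this fragment (model ids are illustrative) limits the dropdown to exactly two entries; all other built-in or custom models stay defined but are hidden:

```json
{
  "availableModels": ["claude-sonnet-4.5", "claude-haiku-4.5"]
}
```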

Labeling

Models added via models.json are tagged as custom for UI filtering.

Merge strategy (SmartMerge)

  • A higher-priority entry replaces any model with the same ID
  • Models with new IDs are appended to the list
  • Project-level config takes precedence over user-level
  • availableModels filtering is applied after merging
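
As a worked example of the merge, suppose the built-in defaults include two hypothetical models, model-a and model-b, and models.json defines:

```json
{
  "models": [
    { "id": "model-b", "name": "Model B (tuned)", "temperature": 0.2 },
    { "id": "model-c", "name": "Model C", "url": "https://example.com/v1/chat/completions" }
  ]
}
```

The merged list then contains model-a unchanged, the model-b entry above (same ID, so it overwrites the built-in), and model-c appended as a new custom model. If availableModels is set, it filters this merged list afterwards.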

Hot reload

models.json supports hot reload:

  • File changes are detected automatically
  • A 1-second debounce prevents frequent reloads
  • Updates are synchronized to the app

Watched files:

  • ~/.codebuddy/models.json
  • <workspace>/.codebuddy/models.json
