OpenCode Integration Guide
OpenCode is an open-source AI coding tool for the terminal and an open-source alternative to Claude Code. It can use any model available on the OPEAI Platform.
Installing OpenCode
macOS / Linux
curl -fsSL https://opencode.ai/install | bash
Install with Go
go install github.com/opencode-ai/opencode@latest
Verify Installation
opencode --version
Configuring OPEAI Platform
Method 1: Environment Variables (Recommended)
Add the following environment variables to your shell configuration file:
~/.zshrc or ~/.bashrc
export OPENAI_API_KEY=<Your OPEAI API Key>
export OPENAI_BASE_URL=https://api-platform.ope.ai/v1
Reload the configuration file after editing:
source ~/.zshrc # or source ~/.bashrc
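Before moving on, you can sanity-check the two variables with a short POSIX-shell snippet. This is only a sketch: the `check_opeai_env` function name is ours, and the sk- prefix check matches the key format noted in the Troubleshooting section of this guide.

```shell
# Sanity-check the variables exported above (sketch; adjust as needed).
check_opeai_env() {
  if [ -z "$OPENAI_API_KEY" ]; then
    echo "OPENAI_API_KEY is not set"
    return 1
  fi
  case "$OPENAI_API_KEY" in
    sk-*) ;;  # expected key prefix per the Troubleshooting section
    *) echo "OPENAI_API_KEY does not start with sk-"; return 1 ;;
  esac
  if [ -z "$OPENAI_BASE_URL" ]; then
    echo "OPENAI_BASE_URL is not set"
    return 1
  fi
  echo "environment looks OK"
}
```

Source the snippet, then run check_opeai_env; it prints which variable is missing or malformed, if any.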
Method 2: Configuration File
Create or edit the OpenCode configuration file:
~/.config/opencode/config.toml
[providers.opeai]
api_key = "<Your OPEAI API Key>"
base_url = "https://api-platform.ope.ai/v1"
[models.default]
provider = "opeai"
model = "Claude-4.6-Sonnet"
Verify Configuration
Run the following command to test your configuration:
opencode "Hello, please introduce yourself"
If configured correctly, OpenCode will respond using the configured model.
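For a check that bypasses OpenCode itself, you can query the endpoint directly. This sketch assumes the OPEAI Platform follows the common OpenAI-compatible convention of a GET /models route, which is not confirmed by this guide; it only derives the URL, and the network call is left commented out for you to run by hand.

```shell
# Derive the models endpoint from the base URL used in this guide (sketch).
# Assumption: the platform exposes an OpenAI-compatible GET /models route.
BASE_URL="${OPENAI_BASE_URL:-https://api-platform.ope.ai/v1}"
MODELS_URL="$BASE_URL/models"
echo "$MODELS_URL"
# To test your key directly (requires network access):
#   curl -s -H "Authorization: Bearer $OPENAI_API_KEY" "$MODELS_URL"
```

If the curl request returns a JSON list of models, both the key and the base URL are working.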
Recommended Models
| Use Case | Recommended Model | Features |
|---|---|---|
| Daily Coding | Claude-4.6-Sonnet | Strong overall capability, cost-effective |
| Complex Tasks | Claude-4.6-Opus | Top-tier reasoning, ideal for architecture design |
| Quick Response | Claude-4.5-Haiku | Ultra-fast response, low cost |
| Ultra-long Context | GPT-5.4-Pro | 1M context, suitable for large projects |
| Cost Optimization | DeepSeek-V3.2 | Extremely low cost, preferred for high-frequency use |
For the complete model list, please check Model Pricing.
Common Commands
Code Generation
opencode "Write a quicksort algorithm in Python"
Code Explanation
opencode "Explain this code" < script.py
Code Refactoring
opencode "Refactor this function to TypeScript" < function.js
Bug Fixing
opencode "Fix the bug in this file" < buggy_code.py
Interactive Mode
opencode chat
Advanced Configuration
Switching Models
Modify the default model in the configuration file:
~/.config/opencode/config.toml
[models.default]
provider = "opeai"
model = "Claude-4.6-Opus" # Switch to a more powerful model
Or specify via command line parameter:
opencode --model Claude-4.5-Haiku "Quick question"
Custom Temperature Parameters
~/.config/opencode/config.toml
[models.default]
provider = "opeai"
model = "Claude-4.6-Sonnet"
temperature = 0.7
max_tokens = 4096
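If you switch models often, a layout like the following may save editing. Note this is a hypothetical sketch: the [models.fast] profile name and the idea of multiple named model sections are assumptions, not features confirmed by this guide; check the OpenCode documentation for the actual schema.

```toml
# Hypothetical layout -- named profiles beyond [models.default] are an
# assumption here; verify against the official OpenCode docs.
[models.default]
provider = "opeai"
model = "Claude-4.6-Sonnet"
temperature = 0.7

[models.fast]
provider = "opeai"
model = "Claude-4.5-Haiku"
temperature = 0.3
max_tokens = 1024
```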
Troubleshooting
API Key Invalid Error
- Confirm the API Key format is correct (it starts with sk-)
- Check that the environment variable is set correctly: echo $OPENAI_API_KEY
- Verify the configuration file path is correct: ~/.config/opencode/config.toml
- Visit the OPEAI Platform Console to confirm the Key is valid
Model Does Not Exist Error
- Confirm the model name is spelled correctly (note case sensitivity, e.g., Claude-4.6-Sonnet)
- Check that base_url is configured correctly: https://api-platform.ope.ai/v1
- Refer to the recommended models table above for correct model IDs
Slow Response Speed
- Try switching to a faster model (e.g., Claude-4.5-Haiku)
- Check your network connection and confirm you can reach https://api-platform.ope.ai
- Reduce the max_tokens parameter to limit output length
command not found
- Confirm OpenCode is installed successfully: which opencode
- If installed with Go, ensure $GOPATH/bin is in $PATH
- Try reinstalling: go install github.com/opencode-ai/opencode@latest
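If which opencode still finds nothing after a Go install, the usual cause is that Go's bin directory is missing from PATH. A minimal fix, assuming Go's default GOPATH of $HOME/go when the variable is unset:

```shell
# Append Go's bin directory to PATH; $HOME/go is Go's default GOPATH.
GOBIN_DIR="${GOPATH:-$HOME/go}/bin"
export PATH="$PATH:$GOBIN_DIR"
echo "$GOBIN_DIR"
```

Add the export line to ~/.zshrc or ~/.bashrc so it persists across shells.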
Note: OpenCode's specific configuration methods may change with version updates. For detailed information, please refer to the official OpenCode documentation.