Model Configuration
Configure and customize AI models in Alex Sidebar
Alex Sidebar supports multiple AI models to suit different development needs and preferences. This guide explains the available models and how to configure them.
Model Selection
You can switch between models in two ways:
Model Selector Menu
- Click the current model name in the bottom left corner of the chat input view
- Select the model you want to use from the dropdown menu
Keyboard Shortcut
Press Command + / to quickly cycle through your enabled models during a chat session.
Note that the o1 model is limited to 50 credits. To purchase additional credits, join our Discord community and message @DanielEdrisian.
API Key Configuration
You can add your API keys directly in the Model Settings screen. Simply click the settings icon in the top right corner of the sidebar and look for the API key input fields for each provider under the “Model Settings” section.
Your API keys are stored securely and only used to authenticate with the respective AI providers. You can update or remove them at any time from the settings screen.
Custom Model Setup
You can add custom models that expose an OpenAI-compatible API. Follow these steps to configure a custom model:
Add Custom Model
- Navigate to “Settings” by selecting the gear icon in the top right corner of the sidebar
- Select “Models” and scroll to the “Custom Models” section
- Click the “Add New Model” button to create a new custom model configuration
Configure Model Details
- Enter the Model ID (e.g., `qwen2.5-coder-32b-instruct`, `deepseek-chat`)
- Provide the Base URL for your model’s API endpoint
- Add your API Key for authentication
- (Optional) Specify if the model supports image inputs
Example: DeepSeek V3 Model
To run the DeepSeek V3 model:
- Model ID: `deepseek-chat`
- Base URL: `https://api.deepseek.com/v1`
- Enter your DeepSeek API Key in the provided field
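If you want to confirm that an endpoint actually speaks the OpenAI chat completions scheme before adding it, you can send a quick test request outside of Alex Sidebar. Below is a minimal sketch using the `openai` Python package with the DeepSeek values from this example; the environment variable name `DEEPSEEK_API_KEY` is just an illustration.

```python
# Sketch: sanity-check an OpenAI-compatible endpoint before adding it to Alex Sidebar.
# Assumes the `openai` package is installed and DEEPSEEK_API_KEY is set in the
# environment (the variable name is an example, not required by Alex Sidebar).
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepseek.com/v1",   # the Base URL you would enter in Alex Sidebar
    api_key=os.environ["DEEPSEEK_API_KEY"],   # the API Key you would enter in Alex Sidebar
)

response = client.chat.completions.create(
    model="deepseek-chat",                    # the Model ID you would enter in Alex Sidebar
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)

print(response.choices[0].message.content)
```

If this prints a reply, the same Model ID, Base URL, and API Key should work when entered in the custom model fields above.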
Finalize Setup
Go back to the chat screen by clicking the close icon in the top right corner of the sidebar. The custom model will now appear in the model selection options.
Running Local Models
Alex Sidebar supports running local AI models through Ollama, providing a free and privacy-focused alternative to cloud-based models. Here is an example of how to set up a powerful local model such as Qwen2.5-Coder:
Install Prerequisites
- Install Ollama to manage and serve the local model
Set Up the Model
- Pull the model with Ollama (for example, by running `ollama pull qwen2.5-coder:32b` in Terminal)
Configure in Alex Sidebar
Add a custom model with these settings:
- Model ID: `qwen2.5-coder:32b`
- Base URL: your Ollama URL + `/v1` (e.g., `http://localhost:11434/v1` for a default Ollama install)
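Before pointing Alex Sidebar at a local model, it can help to verify that Ollama’s OpenAI-compatible endpoint is responding. The sketch below assumes Ollama is running on its default port (11434) and that the model has already been pulled; Ollama ignores the API key value, so any placeholder string works.

```python
# Sketch: verify a local Ollama model responds via the OpenAI-compatible API.
# Assumes Ollama is running locally on the default port and the model is pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama URL + /v1, same value as in Alex Sidebar
    api_key="ollama",                      # Ollama does not check this, but the field is required
)

response = client.chat.completions.create(
    model="qwen2.5-coder:32b",
    messages=[{"role": "user", "content": "Write a one-line Swift hello world."}],
)

print(response.choices[0].message.content)
```

If this returns a completion, the same Base URL and Model ID can be entered in the custom model settings above.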
Local models may run slower than cloud-based alternatives, especially on less powerful hardware. Consider your performance requirements when choosing between local and cloud models.
Best Practices
Model Selection Tips
• Use Claude 3.5 Sonnet or GPT-4 for complex architectural decisions
• Use Claude 3.5 Haiku or GPT-4o mini for quick code completions
Performance Optimization
• Start new chats for long conversations to maintain accuracy
• Match model capabilities to task complexity
Troubleshooting
If you encounter issues with model responses:
- Check your API key configuration
- Verify your internet connection
- Ensure you’re within the model’s context limit
- Try switching to a different model
- Restart Alex Sidebar if issues persist
Need help? Join our Discord community for support and tips from other developers.
Code Apply View Position
Bottom Position
Keep the code apply interface fixed at the bottom for easy access to changes.
Improved Workflow
Review and apply code changes without scrolling through long conversations.