Supports Local Deployment, Suitable for Small to Medium Scale Applications
Provides Distilled Small Models (1.5B–70B), Suited to Resource-Limited Scenarios
API Features
Cost-Effective, Suitable for General Needs
Supports “DeepThink” Mode for Chain of Thought Output, Ideal for Professional Development
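In the OpenAI-compatible API, DeepSeek-R1 endpoints (including SiliconFlow's) expose this chain of thought as a `reasoning_content` field alongside the usual `content`. A minimal sketch of separating the two, using a mock response for illustration rather than real API output:

```python
# Sketch: splitting chain-of-thought from the final answer in an
# OpenAI-compatible chat completion. The response dict below is a
# mock for illustration, not captured API output.

mock_response = {
    "choices": [{
        "message": {
            "reasoning_content": "First consider the grid size and the move loop...",
            "content": "Here is the Snake game code...",
        }
    }]
}

def split_deepthink(response: dict) -> tuple[str, str]:
    """Return (reasoning, answer); reasoning is empty for non-R1 models."""
    msg = response["choices"][0]["message"]
    return msg.get("reasoning_content", ""), msg["content"]

reasoning, answer = split_deepthink(mock_response)
```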
Roo Code Configuration Tutorial
Step 1: Plugin Installation
Install via the VSCode Extension Marketplace: search for “Roo Code” and install it
Step 2: Obtain API Key
Because the DeepSeek API open platform has recently been unavailable, we’ll use the API service provided by SiliconFlow instead. New users receive 20 million tokens (a 14-yuan platform credit) upon registration
Open VSCode and configure the plugin’s API settings:
```
# 1. Select API Provider
OpenAI Compatible
# 2. Configure the API call address
https://api.siliconflow.cn
# 3. Enter the generated API key
```
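Under the hood, the “OpenAI Compatible” setting means the plugin POSTs to the provider’s `/v1/chat/completions` endpoint with a bearer token. A sketch of the equivalent request, assuming SiliconFlow’s model ID `deepseek-ai/DeepSeek-R1` (other providers name the model differently) and a placeholder key:

```python
# Build (but do not send) the OpenAI-compatible request the plugin makes.
# API_KEY is a placeholder; the model ID is SiliconFlow's naming and is
# an assumption here, not taken from the plugin UI.
import json
import urllib.request

BASE_URL = "https://api.siliconflow.cn"  # the address configured above
API_KEY = "YOUR_API_KEY"                 # placeholder, not a real key

payload = {
    "model": "deepseek-ai/DeepSeek-R1",
    "messages": [{"role": "user", "content": "Write a Snake game in Python"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; omitted so the sketch
# runs without a live key.
```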
Chat test: asking the DeepSeek-R1 model to generate a Snake game
The basic game logic was implemented correctly, though responses took a relatively long time
Apart from confirming command-line commands, feature implementation and bug fixes were mostly automatic
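The core logic of such a game fits in a few lines: the snake as a list of grid cells, head first, and a step function that moves the head and handles growth and collisions. This is an illustrative minimal version, not the code the model actually generated:

```python
# Minimal Snake step logic (illustrative; not the model's output).
# The snake is a list of (x, y) cells, head first; leaving the board
# or running into your own body ends the game.

GRID_W, GRID_H = 10, 10

def step(snake, direction, food):
    """Advance one tick. Returns (new_snake, ate, alive)."""
    dx, dy = direction
    head = (snake[0][0] + dx, snake[0][1] + dy)
    alive = (0 <= head[0] < GRID_W and 0 <= head[1] < GRID_H
             and head not in snake)
    if not alive:
        return snake, False, False
    ate = head == food
    body = snake if ate else snake[:-1]  # grow only when food is eaten
    return [head] + body, ate, True
```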
Other Plugin Configurations
Prompt Configuration
Enhanced Prompt Configuration
Initial Experience and Comparison with Cursor
Both question answering and prompt configuration allow detailed custom prompts; response quality is similar and depends mainly on the underlying model
In terms of response speed, DeepSeek is noticeably slower. With Cursor and the claude-3.5-sonnet model, code modifications were significantly faster; with DeepSeek you must wait for the deep-thinking phase to finish before code changes begin
Using Roo Code + DeepSeek is FREE!!!
When using @ to reference files in the Roo Code plugin, search filtering is not supported, which makes attaching files to a question more cumbersome
The plugin itself still has some bugs, so it feels less polished to use than Cursor
I’ll compare the experience with Roo Code + claude-3.5-sonnet in a follow-up