feat: add app trae-proxy

arch3rPro
2025-09-17 22:30:27 +08:00
parent 616a8f1259
commit 5c4fa38f74
7 changed files with 538 additions and 0 deletions

apps/trae-proxy/README.md Normal file

@@ -0,0 +1,225 @@
An intelligent API proxy tool designed to intercept and redirect OpenAI API requests to custom backend services. It supports multi-backend configuration, dynamic model mapping, and streaming response handling.
## 📢 Introduction
1. Trae IDE currently supports custom model providers, but only the providers on its fixed list; it does not support a custom base_url, so you cannot use your own API service.
2. There are many related issues on GitHub, but the official team has largely left them unaddressed, for example: [Add custom model provider base_url capability](https://github.com/Trae-AI/Trae/issues/1206), [Custom AI API Endpoint](https://github.com/Trae-AI/Trae/issues/963)
3. Because of this, Trae-Proxy was developed to forward OpenAI API requests to custom backends through a proxy, with support for custom model ID mapping and dynamic backend switching.
4. We hope the official team will soon ship custom base_url support, making Trae a truly customizable IDE.

![IDE-Chat](https://github.com/arch3rPro/Trae-Proxy/blob/main/asset/IDE-Chat.png?raw=true)
## 📸 Screenshots
<div align="center">
<table>
<tr>
<td align="center">
<h3>Custom-Model</h3>
<img src="https://raw.githubusercontent.com/arch3rPro/Trae-Proxy/refs/heads/main/asset/Custom-Model.png" alt="Custom-Model" width="330">
<br>
<em>Support for custom OpenAI-compatible APIs</em>
</td>
<td align="center">
<h3>IDE-Builder</h3>
<img src="https://raw.githubusercontent.com/arch3rPro/Trae-Proxy/refs/heads/main/asset/IDE-Chat.png" alt="IDE-Chat" width="290">
<br>
<em>Integration with the Qwen3-Coder-Plus model</em>
</td>
</tr>
</table>
</div>
## ✨ Key Features
- **Intelligent Proxy**: Intercepts OpenAI API requests and forwards them to custom backends
- **Multi-Backend Support**: Configure multiple API backends with dynamic switching
- **Model Mapping**: Custom model ID mapping for seamless model replacement
- **Streaming Responses**: Supports both streaming and non-streaming response modes
- **SSL Certificates**: Automatically generates and manages self-signed certificates
- **Docker Deployment**: One-click containerized deployment, suitable for production environments
## ⚠️ Disclaimer
1. **Trae-Proxy** is a tool that intercepts and redirects OpenAI API requests to custom backend services; it does not modify or reverse engineer the official software in any way.
2. This tool is for learning and research purposes only; users must comply with applicable laws, regulations, and terms of service.
3. In principle, not only Trae IDE but any IDE or client that supports the OpenAI SDK or API can integrate with it seamlessly.
## 🚀 Quick Start
Installing and using Trae-Proxy involves the following steps:
1. Install, configure, and start the Trae-Proxy server
2. On the client, install the self-signed certificate and modify the hosts file so that the OpenAI domain resolves to the proxy
3. In the IDE, add a model provider, select OpenAI, set a custom model ID, and enter the API key
### Using Docker Compose (Recommended)
```bash
# Clone the repository
git clone https://github.com/arch3rpro/trae-proxy.git
cd trae-proxy
# Start the services
docker-compose up -d
# Follow the logs
docker-compose logs -f
```
### Manual Deployment
```bash
# Install dependencies
pip install -r requirements.txt
# Generate certificates
python generate_certs.py
# Start the proxy server
python trae_proxy.py
```
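After generating the certificate, it is worth confirming that the running proxy actually presents a certificate issued for `api.openai.com`, since this is what clients will later be asked to trust. The following is a minimal sketch using only the Python standard library; the CA path and listening address are assumptions based on the project layout and default port, not guarantees from the repo.
```python
# Sanity-check that the proxy presents a certificate for api.openai.com.
# The CA path and listening address are assumptions -- adjust to your setup.
import socket
import ssl

CA_FILE = "ca/api.openai.com.crt"    # CA generated by generate_certs.py (assumed path)
PROXY_ADDR = ("127.0.0.1", 443)      # where trae_proxy.py is assumed to listen

ctx = ssl.create_default_context(cafile=CA_FILE)
with socket.create_connection(PROXY_ADDR, timeout=5) as sock:
    # We dial by IP but validate the certificate against api.openai.com (SNI + hostname check).
    with ctx.wrap_socket(sock, server_hostname="api.openai.com") as tls:
        subject = dict(item[0] for item in tls.getpeercert()["subject"])
        print("certificate issued to:", subject.get("commonName"))
```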
### Configuration File Structure
Trae-Proxy uses a YAML configuration file, `config.yaml`:
```yaml
# Trae-Proxy configuration file
# Proxy domain configuration
domain: api.openai.com
# Backend API configuration list
apis:
- name: "deepseek-r1"
  endpoint: "https://api.deepseek.com"
  custom_model_id: "deepseek-reasoner"
  target_model_id: "deepseek-reasoner"
  stream_mode: null
  active: true
- name: "kimi-k2"
  endpoint: "https://api.moonshot.cn"
  custom_model_id: "kimi-k2-0711-preview"
  target_model_id: "kimi-k2-0711-preview"
  stream_mode: null
  active: true
- name: "qwen3-coder-plus"
  endpoint: "https://dashscope.aliyuncs.com/compatible-mode"
  custom_model_id: "qwen3-coder-plus"
  target_model_id: "qwen3-coder-plus"
  stream_mode: null
  active: true
# Proxy server configuration
server:
  port: 443
  debug: true
```
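To make `custom_model_id` and `target_model_id` concrete: the client keeps calling `api.openai.com` and asks for the custom ID, and the proxy is expected to substitute the target ID before forwarding the request to the matching backend's `endpoint`. The sketch below assumes the official `openai` Python SDK (v1+), that the client-side hosts and certificate steps described below have been completed, and that the API key is passed through to the selected backend; all of these are assumptions for illustration.
```python
# Client-side view of the model mapping (assumes the `openai` Python SDK v1+
# and that the hosts entry for api.openai.com already points at the proxy).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.openai.com/v1",  # resolves to Trae-Proxy via the hosts entry
    api_key="sk-your-backend-key",         # assumed to be passed through to the real backend
)

resp = client.chat.completions.create(
    model="deepseek-reasoner",             # custom_model_id from config.yaml
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```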
## 🖥️ Client Configuration
### 1. Get the Server's Self-Signed Certificate
Copy the CA certificate from the server to your local machine:
```bash
# Copy the CA certificate from the server
scp user@your-server-ip:/path/to/trae-proxy/ca/api.openai.com.crt .
```
### 2. Install the CA Certificate
#### Windows
1. Double-click the `api.openai.com.crt` file
2. Select "Install Certificate"
3. Select "Local Machine"
4. Select "Place all certificates in the following store" → "Browse" → "Trusted Root Certification Authorities"
5. Complete the installation
#### macOS
1. Double-click the `api.openai.com.crt` file; "Keychain Access" will open
2. Add the certificate to the "System" keychain
3. Double-click the imported certificate and expand the "Trust" section
4. Set "When using this certificate" to "Always Trust"
5. Close the window and enter your administrator password to confirm
### 3. Modify the Hosts File
#### Windows
1. Edit `C:\Windows\System32\drivers\etc\hosts` as administrator
2. Add the following line (replace with your server's IP):
```
your-server-ip api.openai.com
```
#### macOS
1. Open Terminal
2. Run `sudo vim /etc/hosts`
3. Add the following line (replace with your server's IP):
```
your-server-ip api.openai.com
```
### 4. Test the Connection
```bash
curl https://api.openai.com/v1/models
```
If everything is configured correctly, you should see the model list returned by the proxy server.
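If you want to verify the setup without touching the system trust store, a quick check from Python can resolve the domain and call the proxy while trusting only the copied CA file. This sketch assumes `requests` is installed, the `api.openai.com.crt` file from step 1 sits in the current directory, and the proxy returns an OpenAI-style `{"data": [...]}` model list.
```python
# Verify the hosts entry and the certificate, then list the models the proxy exposes.
# Assumes `requests` is installed and the CA file from step 1 is in the current directory.
import socket
import requests

print("api.openai.com resolves to:", socket.gethostbyname("api.openai.com"))

resp = requests.get(
    "https://api.openai.com/v1/models",
    verify="api.openai.com.crt",   # trust only the proxy's CA for this request
    timeout=10,
)
resp.raise_for_status()
for model in resp.json().get("data", []):  # OpenAI-style model list (assumed response shape)
    print(model.get("id"))
```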
## 🔧 System Requirements
- **Server**: Python 3.9+, OpenSSL, Docker
- **Client**: Administrator privileges (to modify the hosts file and install the certificate)
## 📁 Project Structure
```
trae-proxy/
├── trae_proxy.py         # Main proxy server
├── trae_proxy_cli.py     # Command-line management tool
├── generate_certs.py     # Certificate generation tool
├── config.yaml           # Configuration file
├── docker-compose.yml    # Docker deployment configuration
├── requirements.txt      # Python dependencies
└── ca/                   # Certificates and keys directory
```
## 🔍 How It Works
```
+------------------+ +--------------+ +------------------+
| | | | | |
| | | | | |
| DeepSeek API +--->+ +--->+ Trae IDE |
| | | | | |
| Moonshot API +--->+ +--->+ VSCode |
| | | | | |
| Aliyun API +--->+ Trae-Proxy +--->+ JetBrains |
| | | | | |
| Self-hosted LLM +--->+ +--->+ OpenAI Clients |
| | | | | |
| Other API Svcs +--->+ | | |
| | | | | |
| | | | | |
+------------------+ +--------------+ +------------------+
Backend Services Proxy Server Client Apps
```
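As a rough illustration of the idea above — one endpoint impersonating `api.openai.com` that rewrites the model ID and relays the request to a real backend — a minimal forwarder could look like the sketch below. This is not the project's actual implementation in `trae_proxy.py`; Flask and the routing choices are assumptions made for brevity.
```python
# Minimal illustration of the forwarding idea -- NOT the code in trae_proxy.py.
# Requires: pip install flask requests pyyaml
import requests
import yaml
from flask import Flask, Response, request

app = Flask(__name__)

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)
BACKENDS = [a for a in cfg["apis"] if a.get("active")]

def pick_backend(model_id):
    # Route by the model id the client asked for; fall back to the first active backend.
    for b in BACKENDS:
        if b["custom_model_id"] == model_id:
            return b
    return BACKENDS[0]

@app.post("/v1/chat/completions")
def chat_completions():
    payload = request.get_json(force=True)
    backend = pick_backend(payload.get("model"))
    payload["model"] = backend["target_model_id"]  # custom_model_id -> target_model_id
    upstream = requests.post(
        backend["endpoint"].rstrip("/") + "/v1/chat/completions",
        json=payload,
        headers={"Authorization": request.headers.get("Authorization", "")},
        stream=True,
        timeout=300,
    )
    # Relay bytes unchanged so SSE streaming and plain JSON responses both pass through.
    return Response(
        upstream.iter_content(chunk_size=None),
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type", "application/json"),
    )

if __name__ == "__main__":
    # The real deployment terminates TLS on port 443 with the generated certificate.
    app.run(host="0.0.0.0", port=8080)
```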
## 💡 Use Cases
- **API Proxy**: Forward OpenAI API requests to privately deployed model services
- **Model Replacement**: Replace official OpenAI models with custom models
- **Load Balancing**: Distribute requests across multiple backend services
- **Development and Testing**: API simulation and testing in local development environments
## 📝 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.


@@ -0,0 +1,230 @@
A high-performance, low-latency API proxy middleware for large language model applications that seamlessly intercepts OpenAI API requests and redirects them to any custom backend service. It supports multi-backend configuration with dynamic switching, model ID mapping, and streaming response handling, letting you work around official API limitations and integrate the LLM services of your choice while staying fully compatible with existing applications.
## 📢 Introduction
1. Trae IDE currently supports custom model providers, but only those fixed in the list, and does not support custom base_url, making it impossible to use your own API service.
2. There are many related issues on Github, but the official response is minimal, such as: [Add custom model provider base_url capability](https://github.com/Trae-AI/Trae/issues/1206), [Custom AI API Endpoint](https://github.com/Trae-AI/Trae/issues/963)
3. Based on this situation, Trae-Proxy was developed to proxy OpenAI API requests to custom backends, while supporting custom model ID mapping and dynamic backend switching.
4. We hope the official team will soon implement custom base_url capability, making Trae a truly customizable IDE.
## 📸 Screenshots
<div align="center">
<table>
<tr>
<td align="center">
<h3>Custom-Model</h3>
<img src="https://raw.githubusercontent.com/arch3rPro/Trae-Proxy/refs/heads/main/asset/Custom-Model.png" alt="Custom-Model" width="330">
<br>
<em>Support for custom OpenAI-compatible APIs</em>
</td>
<td align="center">
<h3>IDE-Builder</h3>
<img src="https://raw.githubusercontent.com/arch3rPro/Trae-Proxy/refs/heads/main/asset/IDE-Chat.png" alt="IDE-Chat" width="290">
<br>
<em>Integration with Qwen3-Coder-Plus model</em>
</td>
</tr>
<tr>
<td align="center" colspan="2">
<h3>SSL-Certificate</h3>
<img src="./asset/SSL-Certificate.png" alt="SSL-Certificate" width="600">
<br>
<em>Setting up trust for self-signed certificates</em>
</td>
</tr>
</table>
</div>
## ✨ Key Features
- **Intelligent Proxy**: Intercept OpenAI API requests and forward them to custom backends
- **Multi-Backend Support**: Configure multiple API backends with dynamic switching
- **Model Mapping**: Custom model ID mapping for seamless model replacement
- **Streaming Response**: Support for both streaming and non-streaming response modes
- **SSL Certificates**: Automatic generation and management of self-signed certificates
- **Docker Deployment**: One-click containerized deployment for production environments
## ⚠️ Disclaimer
1. **Trae-Proxy** is a tool for intercepting and redirecting OpenAI API requests to custom backend services, without modifying or reverse engineering official software.
2. This tool is for learning and research purposes only. Users should comply with relevant laws, regulations, and service terms.
3. In principle, not only Trae IDE but any IDE or client that supports the OpenAI SDK or API can integrate with this tool seamlessly.
## 🚀 Quick Start
Trae-Proxy installation and usage consists of the following steps:
1. Install, configure, and start the Trae-Proxy server
2. Install self-signed certificates on the client and modify hosts mapping (to forward OpenAI domain to the proxy service)
3. Add models in the IDE, select OpenAI as the provider, customize model ID, and enter API key
### Using Docker Compose (Recommended)
```bash
# Clone the repository
git clone https://github.com/arch3rpro/trae-proxy.git
cd trae-proxy
# Start the service
docker-compose up -d
# View logs
docker-compose logs -f
```
### Manual Deployment
```bash
# Install dependencies
pip install -r requirements.txt
# Generate certificates
python generate_certs.py
# Start the proxy server
python trae_proxy.py
```
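Once `generate_certs.py` has run, you can confirm that the proxy is serving a certificate issued for `api.openai.com` — the certificate clients will later be asked to trust. Below is a minimal sketch using only the Python standard library; the CA path and listening address are assumptions based on the project layout and default port, not guarantees from the repo.
```python
# Sanity-check that the proxy presents a certificate for api.openai.com.
# The CA path and listening address are assumptions -- adjust to your setup.
import socket
import ssl

CA_FILE = "ca/api.openai.com.crt"    # CA generated by generate_certs.py (assumed path)
PROXY_ADDR = ("127.0.0.1", 443)      # where trae_proxy.py is assumed to listen

ctx = ssl.create_default_context(cafile=CA_FILE)
with socket.create_connection(PROXY_ADDR, timeout=5) as sock:
    # We dial by IP but validate the certificate against api.openai.com (SNI + hostname check).
    with ctx.wrap_socket(sock, server_hostname="api.openai.com") as tls:
        subject = dict(item[0] for item in tls.getpeercert()["subject"])
        print("certificate issued to:", subject.get("commonName"))
```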
### Configuration File Structure
Trae-Proxy uses a YAML format configuration file `config.yaml`:
```yaml
# Trae-Proxy configuration file
# Proxy domain configuration
domain: api.openai.com
# Backend API configuration list
apis:
- name: "deepseek-r1"
  endpoint: "https://api.deepseek.com"
  custom_model_id: "deepseek-reasoner"
  target_model_id: "deepseek-reasoner"
  stream_mode: null
  active: true
- name: "kimi-k2"
  endpoint: "https://api.moonshot.cn"
  custom_model_id: "kimi-k2-0711-preview"
  target_model_id: "kimi-k2-0711-preview"
  stream_mode: null
  active: true
- name: "qwen3-coder-plus"
  endpoint: "https://dashscope.aliyuncs.com/compatible-mode"
  custom_model_id: "qwen3-coder-plus"
  target_model_id: "qwen3-coder-plus"
  stream_mode: null
  active: true
# Proxy server configuration
server:
  port: 443
  debug: true
```
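Concretely, the mapping means a client keeps requesting the `custom_model_id` from `api.openai.com`, and the proxy is expected to rewrite it to `target_model_id` before forwarding to that backend's `endpoint`. The sketch below assumes the official `openai` Python SDK (v1+), that the client-side hosts and certificate steps below are already done, and that the API key is passed through to the chosen backend — all assumptions for illustration.
```python
# Client-side view of the model mapping (assumes the `openai` Python SDK v1+
# and that the hosts entry for api.openai.com already points at the proxy).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.openai.com/v1",  # resolves to Trae-Proxy via the hosts entry
    api_key="sk-your-backend-key",         # assumed to be passed through to the real backend
)

resp = client.chat.completions.create(
    model="deepseek-reasoner",             # custom_model_id from config.yaml
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```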
## 🖥️ Client Configuration
### 1. Get Server Self-Signed Certificate
Copy the CA certificate from the server to your local machine:
```bash
# Copy CA certificate from server
scp user@your-server-ip:/path/to/trae-proxy/ca/api.openai.com.crt .
```
### 2. Install CA Certificate
#### Windows
1. Double-click the `api.openai.com.crt` file
2. Select "Install Certificate"
3. Select "Local Machine"
4. Select "Place all certificates in the following store" → "Browse" → "Trusted Root Certification Authorities"
5. Complete the installation
#### macOS
1. Double-click the `api.openai.com.crt` file, which will open "Keychain Access"
2. Add the certificate to the "System" keychain
3. Double-click the imported certificate, expand the "Trust" section
4. Set "When using this certificate" to "Always Trust"
5. Close the window and enter your administrator password to confirm
### 3. Modify Hosts File
#### Windows
1. Edit `C:\Windows\System32\drivers\etc\hosts` as administrator
2. Add the following line (replace with your server IP):
```
your-server-ip api.openai.com
```
#### macOS
1. Open Terminal
2. Execute `sudo vim /etc/hosts`
3. Add the following line (replace with your server IP):
```
your-server-ip api.openai.com
```
### 4. Test Connection
```bash
curl https://api.openai.com/v1/models
```
If configured correctly, you should see the model list returned by the proxy server.
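For a check that does not depend on the system trust store, the following Python sketch resolves the domain and queries the proxy while trusting only the copied CA file; it assumes `requests` is installed, `api.openai.com.crt` from step 1 is in the current directory, and the proxy returns an OpenAI-style `{"data": [...]}` list.
```python
# Verify the hosts entry and the certificate, then list the models the proxy exposes.
# Assumes `requests` is installed and the CA file from step 1 is in the current directory.
import socket
import requests

print("api.openai.com resolves to:", socket.gethostbyname("api.openai.com"))

resp = requests.get(
    "https://api.openai.com/v1/models",
    verify="api.openai.com.crt",   # trust only the proxy's CA for this request
    timeout=10,
)
resp.raise_for_status()
for model in resp.json().get("data", []):  # OpenAI-style model list (assumed response shape)
    print(model.get("id"))
```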
## 🔧 System Requirements
- **Server**: Python 3.9+, OpenSSL, Docker
- **Client**: Administrator privileges (for modifying hosts file and installing certificates)
## 📁 Project Structure
```
trae-proxy/
├── trae_proxy.py # Main proxy server
├── trae_proxy_cli.py # Command-line management tool
├── generate_certs.py # Certificate generation tool
├── config.yaml # Configuration file
├── docker-compose.yml # Docker deployment configuration
├── requirements.txt # Python dependencies
└── ca/ # Certificates and keys directory
```
## 🔍 How It Works
```
+------------------+ +--------------+ +------------------+
| | | | | |
| | | | | |
| DeepSeek API +--->+ +--->+ Trae IDE |
| | | | | |
| Moonshot API +--->+ +--->+ VSCode |
| | | | | |
| Aliyun API +--->+ Trae-Proxy +--->+ JetBrains |
| | | | | |
| Self-hosted LLM +--->+ +--->+ OpenAI Clients |
| | | | | |
| Other API Svcs +--->+ | | |
| | | | | |
| | | | | |
+------------------+ +--------------+ +------------------+
Backend Services Proxy Server Client Apps
```
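To illustrate the flow above — a single endpoint impersonating `api.openai.com` that rewrites the model ID and relays the request to the real backend — a minimal forwarder might look like the following. It is a sketch only, not the implementation in `trae_proxy.py`; Flask and the routing choices are assumptions made for brevity.
```python
# Minimal illustration of the forwarding idea -- NOT the code in trae_proxy.py.
# Requires: pip install flask requests pyyaml
import requests
import yaml
from flask import Flask, Response, request

app = Flask(__name__)

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)
BACKENDS = [a for a in cfg["apis"] if a.get("active")]

def pick_backend(model_id):
    # Route by the model id the client asked for; fall back to the first active backend.
    for b in BACKENDS:
        if b["custom_model_id"] == model_id:
            return b
    return BACKENDS[0]

@app.post("/v1/chat/completions")
def chat_completions():
    payload = request.get_json(force=True)
    backend = pick_backend(payload.get("model"))
    payload["model"] = backend["target_model_id"]  # custom_model_id -> target_model_id
    upstream = requests.post(
        backend["endpoint"].rstrip("/") + "/v1/chat/completions",
        json=payload,
        headers={"Authorization": request.headers.get("Authorization", "")},
        stream=True,
        timeout=300,
    )
    # Relay bytes unchanged so SSE streaming and plain JSON responses both pass through.
    return Response(
        upstream.iter_content(chunk_size=None),
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type", "application/json"),
    )

if __name__ == "__main__":
    # The real deployment terminates TLS on port 443 with the generated certificate.
    app.run(host="0.0.0.0", port=8080)
```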
## 💡 Use Cases
- **API Proxy**: Forward OpenAI API requests to privately deployed model services
- **Model Replacement**: Replace official OpenAI models with custom models
- **Load Balancing**: Distribute requests among multiple backend services
- **Development Testing**: API simulation and testing in local development environments
## 📝 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

apps/trae-proxy/data.yml Normal file

@@ -0,0 +1,26 @@
name: Trae-Proxy
tags:
  - 实用工具
  - AI
  - 开发工具
title: 一个智能的API代理工具，专门用于拦截和重定向OpenAI API请求到自定义后端服务
description:
  en: An intelligent API proxy tool designed to intercept and redirect OpenAI API requests.
  zh: 一个智能的API代理工具，专门用于拦截和重定向OpenAI API请求到自定义后端服务
additionalProperties:
  key: trae-proxy
  name: Trae-Proxy
  tags:
    - Tool
    - AI
    - DevTool
  shortDescZh: 一个智能的API代理工具，专门用于拦截和重定向OpenAI API请求到自定义后端服务
  shortDescEn: An intelligent API proxy tool designed to intercept and redirect OpenAI API requests.
  type: tool
  crossVersionUpdate: true
  limit: 0
  website: https://github.com/arch3rPro/Trae-Proxy
  github: https://github.com/arch3rPro/Trae-Proxy
  document: https://github.com/arch3rPro/Trae-Proxy


@@ -0,0 +1,28 @@
# Trae-Proxy configuration file
# Proxy domain configuration
domain: api.openai.com
# Backend API configuration list
apis:
- name: "deepseek-r1"
  endpoint: "https://api.deepseek.com"
  custom_model_id: "deepseek-reasoner"
  target_model_id: "deepseek-reasoner"
  stream_mode: null
  active: true
- name: "kimi-k2"
  endpoint: "https://api.moonshot.cn"
  custom_model_id: "kimi-k2-0711-preview"
  target_model_id: "kimi-k2-0711-preview"
  stream_mode: null
  active: true
- name: "qwen3-coder-plus"
  endpoint: "https://dashscope.aliyuncs.com/compatible-mode"
  custom_model_id: "qwen3-coder-plus"
  target_model_id: "qwen3-coder-plus"
  stream_mode: null
  active: true
# Proxy server configuration
server:
  port: 443
  debug: true


@@ -0,0 +1,11 @@
additionalProperties:
  formFields:
    - default: "443"
      envKey: PANEL_APP_PORT_HTTPS
      label:
        en: Proxy-Port
        zh: 代理端口
      required: true
      type: number
      edit: true
      rule: paramPort


@@ -0,0 +1,18 @@
services:
  trae-proxy:
    image: vuldocker/trae-proxy:latest
    container_name: ${CONTAINER_NAME}
    ports:
      - "${PANEL_APP_PORT_HTTPS}:443"
    volumes:
      - ./ca:/app/ca
      - ./config.yaml:/app/config.yaml
    restart: unless-stopped
    networks:
      - 1panel-network
    labels:
      createdBy: "Apps"
networks:
  1panel-network:
    external: true

apps/trae-proxy/logo.png Normal file

Binary file not shown.
