Mirror of https://github.com/arch3rPro/1Panel-Appstore.git (synced 2026-04-15 00:17:12 +08:00)

Commit: feat: add app Raycast AI OpenRouter Proxy

**README.md** (17 changed lines)
```diff
@@ -162,7 +162,7 @@
 <table>
 <tr>
-<td width="100%" align="center">
+<td width="33%" align="center">
 <a href="./apps/gpt4free/README.md">
 <img src="./apps/gpt4free/logo.png" width="60" height="60" alt="GPT4Free">
@@ -173,6 +173,21 @@
 <kbd>0.5.7.0</kbd> • [官网链接](https://github.com/xtekky/gpt4free)
 </td>
+<td width="33%" align="center">
+<a href="./apps/raycast-ai-openrouter-proxy/README.md">
+<img src="./apps/raycast-ai-openrouter-proxy/logo.png" width="60" height="60" alt="Raycast AI OpenRouter Proxy">
+<br><b>Raycast AI OpenRouter Proxy</b>
+</a>
+🚀 Raycast AI的OpenAI兼容API代理,无需Pro订阅
+<kbd>latest</kbd> • [官网链接](https://github.com/miikkaylisiurunen/raycast-ai-openrouter-proxy)
+</td>
+<td width="33%" align="center">
+</td>
 </tr>
 </table>
```
**apps/raycast-ai-openrouter-proxy/README.md** (new file, 102 lines)
# Raycast AI OpenRouter Proxy

This project provides a proxy server that lets Raycast AI use any OpenAI-compatible API (such as OpenAI, Gemini, or OpenRouter). It brings "Bring Your Own Key" (BYOK) functionality to Raycast AI, meaning you can use your own API key and models from your chosen provider. By default, the proxy is configured to use OpenRouter.





**No Raycast Pro subscription required!** 🎉

This proxy allows using custom models inside Raycast, including **AI Chat**, **AI Commands**, **Quick AI**, and **AI Presets**, giving you Raycast's native AI experience with the flexibility of custom models and your own API key.
## Features

The proxy aims to provide a seamless experience for using custom models within Raycast. Here is what is and is not supported:

### Supported:

- 🧠 **Any model**: access the wide range of models offered by OpenAI-compatible providers. OpenRouter is used by default.
- 👀 **Vision support**: use models capable of processing images.
- 🛠️ **AI Extensions & MCP**: use your favorite AI Extensions and MCP servers.
- 📝 **System instructions**: provide system-level prompts to guide model behavior.
- 📎 **Attachments**: attach all the same things as with the official models.
- 🔨 **Parallel tool use**: make multiple tool calls simultaneously.
- ⚡ **Streaming**: get real-time responses from models.
- 🔤 **Chat title generation**: automatically generate chat titles.
- 🛑 **Stream cancellation**: stop a model's ongoing response.
### Partial support:

- 💭 **Displaying the thinking process**: see the model's reasoning as it responds.
  - Not all providers support this, because the OpenAI API specification defines no standard for it. For example, with OpenRouter the thinking process is always shown by default for supported models; other providers may not send it by default and may require extra setup via the `extra` field in the model configuration.

### Not supported:

- 🌐 **Remote tools**: some AI Extensions are classified as "remote tools" and are not supported. These include web search and image generation, among others. If you want similar tools, you can use MCP servers instead.
## Requirements

- Docker
- An API key for your chosen provider (e.g., OpenRouter)
## Model Configuration

The proxy's behavior is configured primarily through the `models.json` file in the project's root directory. This file defines the models available to Raycast and their settings. Each entry in the JSON array represents one model and can include the following properties:

- `name`: the model name as it will appear in Raycast.
- `id`: the model ID in the format your provider expects.
- `contextLength`: the maximum context length (in tokens) the model supports. This only affects Raycast's UI, not the model itself.
- `capabilities`: an array of strings describing the model's capabilities.
  - `"vision"`: the model can process images.
  - `"tools"`: the model supports AI Extensions and MCP (tool calling).
- `temperature`: (optional) controls the model's creativity; a value between 0 and 2.
- `topP`: (optional) another parameter controlling output randomness; a value between 0 and 1.
- `max_tokens`: (optional) the maximum number of tokens the model may generate in a single response.
- `extra`: (optional) an object for advanced, provider-specific configuration. These options are passed directly to the provider's API.
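As a concrete illustration of the schema above, a minimal entry can be sketched in Python. The model name and ID here are purely hypothetical, not the shipped defaults:

```python
import json

# Hypothetical example entry; the field names follow the schema described
# above, but the model ID and values are illustrative only.
entry = {
    "name": "Example Model",                 # display name shown in Raycast
    "id": "example-provider/example-model",  # ID in the provider's expected format
    "contextLength": 33000,                  # tokens; affects Raycast's UI only
    "capabilities": ["vision", "tools"],     # image input + tool calling
    "temperature": 0.7,                      # optional, between 0 and 2
}

# models.json is a JSON array of such entries.
print(json.dumps([entry], indent=2))
```

Dropping the optional keys (`temperature`, `topP`, `max_tokens`, `extra`) leaves the provider's defaults in effect.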
### Default configuration

The default models.json ships with six OpenRouter models configured: the first four are free models and the last two are paid. You can modify models.json to suit your needs. Browse the official OpenRouter model catalog:



- Moonshotai Kimi-K2
- DeepSeek V3
- Qwen3 Coder
- DeepSeek-R1-0528
- Gemini 2.5 Flash Thinking
- Claude Sonnet 4

After modifying models.json, restart the application or container.
## Usage

1. After installing the app in 1Panel, configure the following parameters:
   - Proxy server port: defaults to 11435; change as needed.
   - API key: your API provider's key.
   - API base URL: defaults to OpenRouter's API endpoint; can be changed to any other OpenAI-compatible API endpoint.

2. In Raycast settings, go to AI > Ollama Host and set the host to `localhost:11435` (or the port you configured).

3. You can now use custom models in Raycast and enjoy its AI features without a Pro subscription!
## FAQ

### How does this compare to the official Raycast BYOK feature?

Raycast shipped a built-in BYOK feature in v1.100.0. The official implementation differs from this proxy in a few ways:

- It only supports Anthropic, Google, and OpenAI; this proxy supports any OpenAI-compatible provider.
- All your messages pass through Raycast's servers.
- Your API keys are sent to Raycast's servers.
- You have less control over the models and their configuration.

### Is a Raycast Pro subscription required?

No. One of the main benefits of this proxy is that it enables custom models in Raycast without a Raycast Pro subscription.

### Do I need to install Ollama?

No, you do not need to install Ollama.
**apps/raycast-ai-openrouter-proxy/README_en.md** (new file, 101 lines)
# Raycast AI OpenRouter Proxy

This project provides a proxy server that allows Raycast AI to utilize models from any OpenAI-compatible API (OpenAI, Gemini, OpenRouter, etc.). This brings "Bring Your Own Key" (BYOK) functionality to Raycast AI, meaning you can use your own API key and models from your chosen provider. By default, the proxy is configured to use OpenRouter.





**No Raycast Pro subscription required!** 🎉

This proxy allows using custom models inside Raycast, including **AI Chat**, **AI Commands**, **Quick AI**, and **AI Presets**, giving you Raycast's native AI experience with the flexibility of custom models and your own API key.
## Features

This proxy aims to provide a seamless experience for using custom models within Raycast. Here's what is supported and what is not:

### Supported:

- 🧠 **Any model**: Access the wide range of models offered by OpenAI-compatible providers. OpenRouter is used by default.
- 👀 **Vision support**: Use models capable of processing images.
- 🛠️ **AI Extensions & MCP**: Use your favorite AI Extensions and MCP servers.
- 📝 **System instructions**: Provide system-level prompts to guide model behavior.
- 📎 **Attachments**: Attach all the same things as with the official models.
- 🔨 **Parallel tool use**: Make multiple tool calls simultaneously.
- ⚡ **Streaming**: Get real-time responses from models.
- 🔤 **Chat title generation**: Automatically generate chat titles.
- 🛑 **Stream cancellation**: Stop ongoing responses from models.
### Partial Support:

- 💭 **Displaying thinking process**: See the model's thinking process.
  - This feature isn't supported by all providers because the OpenAI API specification does not define a standard for it. For example, when using OpenRouter, the thinking process is always shown by default for supported models. Other providers may not send it by default and may require extra setup via the `extra` field in the model's configuration, as described in the provider's documentation.

### Not Supported:

- 🌐 **Remote tools**: Some AI Extensions are classified as "remote tools" and are not supported. These include web search and image generation, as well as some others. You can replace these with MCP servers if you would like similar tools.
## Requirements

- Docker
- API key for your chosen provider (e.g., OpenRouter)
## Model Configuration

The proxy's behavior is primarily configured through a `models.json` file in the root directory of the project. This file defines the models available to Raycast and their specific settings. Each entry in the JSON array represents a model and can include the following properties:

- `name`: The name of the model as it will appear in Raycast.
- `id`: The model ID in the format expected by your provider.
- `contextLength`: The maximum context length (in tokens) the model supports. Only affects Raycast's UI, not the model itself.
- `capabilities`: An array of strings indicating the model's capabilities.
  - `"vision"`: The model can process images.
  - `"tools"`: The model supports AI Extensions and MCP (tool calling).
- `temperature`: (Optional) Controls the creativity of the model. A value between 0 and 2.
- `topP`: (Optional) Another parameter to control the randomness of the output; a value between 0 and 1.
- `max_tokens`: (Optional) The maximum number of tokens the model is allowed to generate in a single response.
- `extra`: (Optional) An object for advanced, provider-specific configurations. These options are passed directly to the provider's API.
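Since a malformed models.json is an easy way to break the proxy, a small validation sketch may help. This checker is not part of the project; it is only a hedged illustration of the schema documented above:

```python
import json

# Required fields and their JSON types, per the schema described above.
REQUIRED = {"name": str, "id": str, "contextLength": int, "capabilities": list}
# Documented ranges for the optional tuning parameters.
OPTIONAL_RANGES = {"temperature": (0, 2), "topP": (0, 1)}

def validate(models_json: str) -> list:
    """Return a list of problems found in a models.json document."""
    problems = []
    for i, entry in enumerate(json.loads(models_json)):
        for field, typ in REQUIRED.items():
            if not isinstance(entry.get(field), typ):
                problems.append(f"entry {i}: missing or invalid '{field}'")
        for cap in entry.get("capabilities", []):
            if cap not in ("vision", "tools"):
                problems.append(f"entry {i}: unknown capability '{cap}'")
        for field, (lo, hi) in OPTIONAL_RANGES.items():
            if field in entry and not (lo <= entry[field] <= hi):
                problems.append(f"entry {i}: '{field}' outside [{lo}, {hi}]")
    return problems

# An out-of-range temperature is flagged.
sample = ('[{"name": "X", "id": "p/x", "contextLength": 8000,'
          ' "capabilities": ["tools"], "temperature": 3}]')
print(validate(sample))
```

Running it against the real file is then just `validate(open("models.json").read())`.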
### Default Configuration

The default models.json has 6 OpenRouter models configured; the first four are free models and the last two are paid. You can modify the models.json configuration according to your needs. Browse the official OpenRouter model catalog:



- Moonshotai Kimi-K2
- DeepSeek V3
- Qwen3 Coder
- DeepSeek-R1-0528
- Gemini 2.5 Flash Thinking
- Claude Sonnet 4

After modifying models.json, restart the application or container.
## Usage

1. After installing the application in 1Panel, configure the following parameters:
   - Proxy Server Port: defaults to 11435; can be changed as needed.
   - API Key: your API provider key.
   - API Base URL: defaults to OpenRouter's API endpoint; can be changed to any other OpenAI-compatible API endpoint.

2. In Raycast settings, go to AI > Ollama Host and set the host to `localhost:11435` (or the port you configured).

3. Now you can use custom models in Raycast and enjoy AI features without a Pro subscription!
## FAQ

### How does this compare to the official Raycast BYOK feature?

Raycast released a built-in BYOK feature in v1.100.0. The official implementation has a few differences compared to this proxy:

- It only supports Anthropic, Google, and OpenAI. This proxy supports any OpenAI-compatible provider.
- All your messages go through Raycast's servers.
- Your API keys are sent to Raycast's servers.
- You have less control over the models and their configurations.

### Is a Raycast Pro subscription required to use this?

No; one of the main benefits of this proxy is that it enables the use of custom models within Raycast without needing a Raycast Pro subscription.

### Do I need to install Ollama?

No, you do not need to install Ollama.
**apps/raycast-ai-openrouter-proxy/data.yml** (new file, 28 lines)
```yaml
name: Raycast AI OpenRouter Proxy
tags:
  - AI / 大模型
  - 开发工具
title: Raycast AI OpenRouter代理
description: 允许Raycast AI使用任何OpenAI兼容API的代理服务器,支持OpenAI、Gemini、OpenRouter等模型,无需Raycast Pro订阅。
additionalProperties:
  key: raycast-ai-openrouter-proxy
  name: Raycast AI OpenRouter Proxy
  tags:
    - AI
    - 开发工具
  shortDescZh: Raycast AI的OpenAI兼容API代理
  shortDescEn: OpenAI-compatible API proxy for Raycast AI
  type: website
  crossVersionUpdate: true
  limit: 0
  recommend: 0
  website: https://github.com/miikkaylisiurunen/raycast-ai-openrouter-proxy
  github: https://github.com/miikkaylisiurunen/raycast-ai-openrouter-proxy
  description:
    en: A proxy server that allows Raycast AI to utilize models from any OpenAI-compatible API without requiring a Raycast Pro subscription
    zh: 允许Raycast AI使用任何OpenAI兼容API的代理服务器,无需Raycast Pro订阅
    zh-Hant: 允許Raycast AI使用任何OpenAI兼容API的代理服務器,無需Raycast Pro訂閱
  memoryRequired: 512
  architectures:
    - amd64
    - arm64
```
**apps/raycast-ai-openrouter-proxy/latest/data.yml** (new file, 35 lines)
```yaml
additionalProperties:
  formFields:
    - default: 11435
      envKey: PANEL_APP_PORT_HTTP
      labelEn: Proxy Server Port
      labelZh: 代理服务器端口
      label:
        en: Proxy Server Port
        zh: 代理服务器端口
        zh-Hant: 代理服務器埠
      required: true
      rule: paramPort
      type: number
    - default: ""
      envKey: PANEL_APP_API_KEY
      labelEn: API Key
      labelZh: API密钥
      label:
        en: API Key
        zh: API密钥
        zh-Hant: API密鑰
      required: true
      rule: ""
      type: password
    - default: "https://openrouter.ai/api/v1"
      envKey: PANEL_APP_BASE_URL
      labelEn: API Base URL
      labelZh: API基础URL
      label:
        en: API Base URL
        zh: API基础URL
        zh-Hant: API基礎URL
      required: true
      rule: ""
      type: text
```
**apps/raycast-ai-openrouter-proxy/latest/docker-compose.yml** (new file, 19 lines)
```yaml
services:
  raycast-ai-proxy:
    image: vuldocker/raycast-ai-openrouter-proxy:latest
    container_name: ${CONTAINER_NAME}
    restart: always
    networks:
      - 1panel-network
    ports:
      - ${PANEL_APP_PORT_HTTP}:3000
    volumes:
      - ./models.json:/app/models.json:ro
    environment:
      - API_KEY=${PANEL_APP_API_KEY}
      - BASE_URL=${PANEL_APP_BASE_URL}
    labels:
      createdBy: "Apps"

networks:
  1panel-network:
    external: true
```
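The `${...}` placeholders in the compose file are filled from the form fields defined in latest/data.yml via their `envKey` values. The mapping can be sketched with Python's `string.Template`, which happens to use the same `${VAR}` syntax; this is a simplified illustration, not how 1Panel actually renders the file, and the form values below are made up:

```python
from string import Template

# Fragment of the compose file above, with its placeholders intact.
compose_fragment = (
    "ports:\n"
    "  - ${PANEL_APP_PORT_HTTP}:3000\n"
    "environment:\n"
    "  - API_KEY=${PANEL_APP_API_KEY}\n"
    "  - BASE_URL=${PANEL_APP_BASE_URL}\n"
)

# Values a user might enter in the 1Panel form (illustrative only),
# keyed by the envKey names from latest/data.yml.
form_values = {
    "PANEL_APP_PORT_HTTP": "11435",
    "PANEL_APP_API_KEY": "sk-example",
    "PANEL_APP_BASE_URL": "https://openrouter.ai/api/v1",
}

rendered = Template(compose_fragment).substitute(form_values)
print(rendered)
```

Note that the container listens on port 3000 internally; only the host-side port is configurable through the form.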
**apps/raycast-ai-openrouter-proxy/latest/models.json** (new file, 41 lines)
```json
[
  {
    "name": "Moonshotai Kimi-K2",
    "id": "moonshotai/kimi-k2:free",
    "contextLength": 33000,
    "capabilities": ["vision", "tools"],
    "temperature": 0
  },
  {
    "name": "DeepSeek V3",
    "id": "deepseek/deepseek-chat-v3-0324:free",
    "contextLength": 33000,
    "capabilities": ["tools"]
  },
  {
    "name": "Qwen3 Coder",
    "id": "qwen/qwen3-coder:free",
    "contextLength": 262000,
    "capabilities": ["tools"]
  },
  {
    "name": "DeepSeek-R1-0528",
    "id": "deepseek/deepseek-r1-0528:free",
    "contextLength": 164000,
    "capabilities": ["vision", "tools"]
  },
  {
    "name": "Gemini 2.5 Flash Thinking",
    "id": "google/gemini-2.5-flash-preview-05-20:thinking",
    "contextLength": 1000000,
    "capabilities": ["vision", "tools"],
    "temperature": 0
  },
  {
    "name": "Claude Sonnet 4",
    "id": "anthropic/claude-sonnet-4",
    "contextLength": 200000,
    "capabilities": ["vision", "tools"],
    "temperature": 0.7
  }
]
```
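The README states that the first four models are free and the last two paid; with OpenRouter this is visible in the model IDs themselves, where free variants carry a `:free` suffix. A short Python sketch over the IDs above:

```python
# The six model IDs from the default models.json above.
model_ids = [
    "moonshotai/kimi-k2:free",
    "deepseek/deepseek-chat-v3-0324:free",
    "qwen/qwen3-coder:free",
    "deepseek/deepseek-r1-0528:free",
    "google/gemini-2.5-flash-preview-05-20:thinking",
    "anthropic/claude-sonnet-4",
]

# OpenRouter marks free model variants with a ':free' suffix on the ID.
free = [m for m in model_ids if m.endswith(":free")]
paid = [m for m in model_ids if not m.endswith(":free")]
print(len(free), len(paid))
```

Swapping a free model for its paid variant is therefore usually just a matter of dropping the `:free` suffix in the `id` field.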
**apps/raycast-ai-openrouter-proxy/logo.png** (new binary file, 18 KiB; binary file not shown)