Merge pull request #18 from 006lp/master
feat: upgrade Llama to 3.3 and add proxy support for ddg-chat
leafmoes authored Feb 11, 2025
2 parents 1be1235 + 69939b4 commit 9c8dd2f
Showing 3 changed files with 22 additions and 11 deletions.
12 changes: 6 additions & 6 deletions api/index.js
@@ -75,7 +75,7 @@ router.get(config.API_PREFIX + '/v1/models', () =>
data: [
{ id: 'gpt-4o-mini', object: 'model', owned_by: 'ddg' },
{ id: 'claude-3-haiku', object: 'model', owned_by: 'ddg' },
{ id: 'llama-3.1-70b', object: 'model', owned_by: 'ddg' },
{ id: 'llama-3.3-70b', object: 'model', owned_by: 'ddg' },
{ id: 'mixtral-8x7b', object: 'model', owned_by: 'ddg' },
{ id: 'o3-mini', object: 'model', owned_by: 'ddg' },
],
@@ -215,9 +215,9 @@ function messagesPrepare(messages) {
if (['user', 'assistant'].includes(role)) {
const contentStr = Array.isArray(message.content)
? message.content
.filter((item) => item.text)
.map((item) => item.text)
.join('') || ''
.filter((item) => item.text)
.map((item) => item.text)
.join('') || ''
: message.content
content += `${role}:${contentStr};\r\n`
}
@@ -247,8 +247,8 @@ function convertModel(inputModel) {
case 'claude-3-haiku':
model = 'claude-3-haiku-20240307'
break
case 'llama-3.1-70b':
model = 'meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo'
case 'llama-3.3-70b':
model = 'meta-llama/Llama-3.3-70B-Instruct-Turbo'
break
case 'mixtral-8x7b':
model = 'mistralai/Mixtral-8x7B-Instruct-v0.1'
10 changes: 10 additions & 0 deletions docker-compose.yml
@@ -7,3 +7,13 @@ services:
restart: unless-stopped
ports:
- "8787:8787"
#environment:
# - HTTP_PROXY=http://<your_proxy_address>:<proxy_port> # HTTP proxy
# - HTTPS_PROXY=http://<your_proxy_address>:<proxy_port> # HTTPS proxy
# - NO_PROXY="localhost,127.0.0.1" # bypass the proxy for local requests
#networks:
# - bridge-network

#networks:
# bridge-network:
# driver: bridge
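
To try these settings, uncomment the environment (and, if needed, networks) block, fill in your proxy address, and recreate the service, for example:

```bash
# Recreate the service so the new environment variables take effect.
docker compose up -d --force-recreate
```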
11 changes: 6 additions & 5 deletions readme.md
@@ -5,7 +5,7 @@

Supports deployment on Vercel, Cloudflare Workers, Docker, Render, and more

Supports the GPT4o mini, Claude 3 Haiku, Llama 3.1 70B, and Mixtral 8x7B models
Supports the o3 mini, GPT 4o mini, Claude 3 Haiku, Llama 3.3 70B, and Mixtral 8x7B models

All models are provided anonymously by DuckDuckGo

@@ -72,15 +72,16 @@ curl --request POST 'https://chatcfapi.r12.top/v1/chat/completions' \

- gpt-4o-mini
- claude-3-haiku
- llama-3.1-70b
- llama-3.3-70b
- mixtral-8x7b
- o3-mini
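
Any of these ids can be passed as the `model` field of a request to the chat completions endpoint shown above. A minimal sketch (the payload follows the OpenAI-style chat completions format this API emulates; the URL is the demo address from the example above):

```bash
# Hedged example request using the upgraded Llama model id.
curl --request POST 'https://chatcfapi.r12.top/v1/chat/completions' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "llama-3.3-70b",
    "messages": [
      { "role": "user", "content": "Hello" }
    ]
  }'
```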

## Manual Deployment

Because the DDG API limits the number of concurrent requests per IP, deploying on Vercel is recommended; if you self-host with Docker or similar, make sure the project runs behind a proxy pool.
To avoid hitting the DDG API's concurrency limits, make sure the project runs behind a proxy pool when self-hosting with Docker or similar.
Also, because Vercel and Cloudflare IPs have been blocked by DDG (possibly due to heavy usage or temporary risk controls), deploying through them is no longer recommended.
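
For Docker-based setups, one way to route the project's traffic through a proxy is to pass the variables from the docker-compose.yml example as environment variables when starting the container. A minimal sketch, assuming a locally built image named `ddg-chat` (placeholder) and your own proxy address:

```bash
# Sketch only: the image name and proxy address are placeholders for your setup.
docker run -d --restart unless-stopped -p 8787:8787 \
  -e HTTP_PROXY=http://your_proxy_address:proxy_port \
  -e HTTPS_PROXY=http://your_proxy_address:proxy_port \
  -e NO_PROXY=localhost,127.0.0.1 \
  ddg-chat
```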

### Vercel
### Vercel (not recommended)

Method 1: Fork the repository and deploy it from the cloud

@@ -110,7 +110,7 @@ npm run publish

[<img src="https://render.com/images/deploy-to-render-button.svg" alt="Deploy on Render" height="30">](https://render.com/deploy)

### Cloudflare Workers
### Cloudflare Workers (not recommended)

Method 1:

