
# CNB Edge Gateway

An EdgeOne Pages Node Functions proxy for the CNB LLM API, fully compatible with the OpenAI standard interface and free of CORS issues.

English | 简体中文

Mirror Repositories: CNB, GitHub

## Introduction

This project proxies the CNB AI Chat Completions API and converts it to the OpenAI standard interface format, so it can be used with any OpenAI-compatible client.

### Original CNB API

```shell
curl --request POST \
  --url https://api.cnb.cool/{repo}/-/ai/chat/completions \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
    "messages": [
      {
        "content": "string",
        "role": "string"
      }
    ],
    "model": "string",
    "stream": true
  }'
```

Authentication: requires a CNB access token with the `repo-code:r` permission, passed via the `Authorization: Bearer <token>` header.

### Proxied Interface

```shell
curl --request POST \
  --url https://your-edge-pages-domain/v1/chat/completions \
  --header 'Authorization: Bearer YOUR_CNB_TOKEN' \
  --header 'Content-Type: application/json' \
  --data '{
    "messages": [
      {
        "content": "string",
        "role": "string"
      }
    ],
    "model": "string",
    "stream": true
  }'
```
| Comparison | Original CNB API | Proxied (OpenAI Standard) |
| --- | --- | --- |
| Endpoint | `POST /{repo}/-/ai/chat/completions` | `POST /v1/chat/completions` |
| Authentication | `Authorization: Bearer <token>` | `Authorization: Bearer <token>` (pass-through) |
| Repository Config | Specified in URL path | Configured via `CNB_REPO` environment variable |
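The table's key difference can be sketched as a small helper: the repository moves out of the URL path and into the `CNB_REPO` environment variable, so the gateway rebuilds the upstream URL itself. This is an illustrative sketch only; the function name is hypothetical, and the upstream host is assumed to match the curl example above.

```javascript
// Hypothetical helper: build the CNB upstream URL from the configured
// repository path and endpoint path (host taken from the example above).
function upstreamUrl(repo, aiPath = '/-/ai/chat/completions') {
  return `https://api.cnb.cool/${repo}${aiPath}`;
}

upstreamUrl('Mintimate/code-nest/cnb-edge-gateway');
// → "https://api.cnb.cool/Mintimate/code-nest/cnb-edge-gateway/-/ai/chat/completions"
```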

## Features

- ✅ Fully compatible with the OpenAI API standard interface
- ✅ Streaming responses (SSE)
- ✅ CORS enabled for all origins
- ✅ Request/response logging
- ✅ Standardized error response format
- ✅ Auto-removes the `sk-` prefix (client-compatibility mechanism)
- ✅ Customizable AI endpoint path
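The `sk-` prefix removal can be sketched as follows. Many OpenAI clients insist that API keys start with `sk-`, so the gateway tolerates the prefix and strips it before forwarding the token to CNB. The helper name here is hypothetical, not the gateway's actual code:

```javascript
// Hypothetical helper illustrating the sk- prefix compatibility mechanism.
function normalizeToken(authHeader) {
  if (!authHeader) return null;
  // Accept both "Bearer <token>" and a bare token.
  const token = authHeader.startsWith('Bearer ')
    ? authHeader.slice('Bearer '.length)
    : authHeader;
  // Strip a leading "sk-" that a client may have required.
  return token.startsWith('sk-') ? token.slice(3) : token;
}

console.log(normalizeToken('Bearer sk-abc123')); // prints "abc123"
```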

## API Reference

| Endpoint | Method | Description |
| --- | --- | --- |
| `/v1/chat/completions` | POST | Chat completions endpoint |
| `/v1/embeddings` | POST | Embeddings endpoint |
| `/v1/models` | GET | Models list endpoint |

## Deployment

### 1. Fork this repository

### 2. Create a project in EdgeOne Pages

Connect your Git repository; EdgeOne Pages will automatically detect the Node Functions.

### 3. Configure environment variables

In the EdgeOne Pages console, go to Project Settings → Environment Variables and add:

| Variable | Description | Example | Required |
| --- | --- | --- | --- |
| `CNB_REPO` | CNB repository path (`owner/project/repo`) | `Mintimate/code-nest/cnb-edge-gateway` | Yes |
| `CNB_AI_PATH` | Custom AI endpoint path; leave empty for the default | `/-/ai/chat/completions` | No |
| `CNB_EMBEDDINGS_PATH` | Embeddings endpoint path; no default | `/-/ai/embeddings` | Only for the embeddings feature |
| `CUSTOM_MODELS` | Custom model list, comma-separated | `model-a,model-b,model-c` | No |

Note:

- `CNB_AI_PATH` defaults vary by endpoint:
  - `/v1/chat/completions` → `/-/ai/chat/completions`
  - `/v1/models` → `/-/ai/models`
- `CNB_EMBEDDINGS_PATH` is used by the embeddings endpoint. It is required if you want to use the embeddings feature (e.g., `/-/ai/embeddings`).
- `CUSTOM_MODELS` controls the model list returned by the `/v1/models` endpoint. If set, `owned_by` will be `custom`; if not set, the default model `hunyuan-2.0-instruct` is used with `owned_by` set to `cnb-default`.
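Based on the notes above, the `/v1/models` response could be assembled roughly like this. This is a sketch of the described behavior, not the gateway's actual code; the function name is hypothetical, and the response shape follows the OpenAI models-list format:

```javascript
// Hypothetical sketch: build the /v1/models payload from CUSTOM_MODELS.
function buildModelList(customModels) {
  const names = (customModels || '')
    .split(',')
    .map((m) => m.trim())
    .filter(Boolean);
  const models = names.length > 0
    ? names.map((id) => ({ id, object: 'model', owned_by: 'custom' }))
    : [{ id: 'hunyuan-2.0-instruct', object: 'model', owned_by: 'cnb-default' }];
  return { object: 'list', data: models };
}
```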

### 4. Deploy

Push code or manually trigger deployment.

## Usage

### Configure an OpenAI-compatible client

| Configuration | Value |
| --- | --- |
| Base URL | `https://your-edge-pages-domain/v1` |
| API Key | Your CNB token (with or without the `sk-` prefix) |

### Python (OpenAI SDK)

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-edge-pages-domain/v1",
    api_key="YOUR_CNB_TOKEN"
)

# Streaming call
stream = client.chat.completions.create(
    model="any",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```

### JavaScript (fetch)

```javascript
const response = await fetch('https://your-edge-pages-domain/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_CNB_TOKEN',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'any',
    stream: true,
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});

// Handle SSE stream
const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value));
}
```
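The loop above prints raw SSE chunks; to recover just the assistant text you typically parse the `data:` lines yourself. A minimal sketch, assuming the standard OpenAI streaming chunk shape (the helper name is illustrative):

```javascript
// Illustrative SSE parser: collect delta text from "data:" lines.
function extractDeltas(sseText) {
  const deltas = [];
  for (const line of sseText.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;       // skip blanks/comments
    const payload = trimmed.slice('data:'.length).trim();
    if (payload === '[DONE]') break;                  // end-of-stream sentinel
    const content = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (content) deltas.push(content);                // skip role/finish chunks
  }
  return deltas.join('');
}
```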

## Error Responses

All error responses follow OpenAI standard format:

```json
{
  "error": {
    "message": "Error description",
    "type": "error_type",
    "param": null,
    "code": null
  }
}
```
| Status Code | Type | Description |
| --- | --- | --- |
| 401 | `authentication_error` | Missing `Authorization` header |
| 500 | `server_error` | Server configuration error or internal error |
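A gateway function can produce the error payload above with a small helper. This is a sketch only; the helper name is hypothetical, and the payload shape matches the JSON shown earlier:

```javascript
// Hypothetical helper: build an OpenAI-style error payload for a status code.
function errorResponse(status, message) {
  const type = status === 401 ? 'authentication_error' : 'server_error';
  return { error: { message, type, param: null, code: null } };
}
```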

## Project Structure

```
cnb-edge-gateway/
├── node-functions/
│   └── v1/
│       ├── chat/
│       │   └── completions/
│       │       └── index.js   # POST /v1/chat/completions
│       ├── embeddings/
│       │   └── index.js       # POST /v1/embeddings
│       └── models/
│           └── index.js       # GET /v1/models
└── README.md
```

## License

MIT