The CNB runtime for https://github.com/mannaandpoem/OpenManus.
docker.cnb.cool/hex/ai/openmanus:latest
```yaml
$:
  vscode:
    - docker:
        image: docker.cnb.cool/hex/ai/openmanus:latest
      runner:
        cpus: 16
        tags: cnb:arch:amd64:gpu
      services:
        - vscode
        - docker
      stages:
        - name: link
          script: ln -s /apps/ /workspace/apps
```
OpenManus needs an LLM API to be configured before use. Set it up as follows:
1. Create a config.toml file in the config directory (you can copy it from the example):

```bash
cp apps/OpenManus/config/config.example.toml apps/OpenManus/config/config.toml
```
2. Edit apps/OpenManus/config/config.toml to add your API keys and customize the settings:

```toml
# Global LLM configuration
[llm]
model = "gpt-4o"
base_url = "https://api.openai.com/v1"
api_key = "sk-..."  # Replace with your real API key
max_tokens = 4096
temperature = 0.0

# Optional configuration for a specific LLM model
[llm.vision]
model = "gpt-4o"
base_url = "https://api.openai.com/v1"
api_key = "sk-..."  # Replace with your real API key
```
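Before launching the agent, it can be useful to sanity-check the configuration. The following is a minimal sketch (not part of OpenManus) that assumes Python 3.11+ for the standard-library tomllib and the config path used above:

```python
# Minimal sketch: verify config.toml parses and the API key placeholder was replaced.
# Assumes Python 3.11+ (tomllib is in the standard library) and the path used above.
import tomllib
from pathlib import Path

CONFIG_PATH = Path("apps/OpenManus/config/config.toml")

with CONFIG_PATH.open("rb") as f:
    config = tomllib.load(f)

llm = config["llm"]
print(f"model={llm['model']}, base_url={llm['base_url']}")

if llm["api_key"].strip() in ("", "sk-..."):
    raise SystemExit("api_key still looks like a placeholder; edit config.toml first")
```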
Then activate the environment and start OpenManus:

```bash
conda activate open_manus
cd apps/OpenManus

# Run OpenManus with a single command:
python main.py

# To try the in-development version, run:
python run_flow.py
```
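If you want to drive a run non-interactively (for example from a CNB pipeline stage), a small wrapper like the sketch below may help. It assumes main.py reads the prompt from standard input and that the open_manus environment is already active; the helper itself is hypothetical and not part of OpenManus:

```python
# Hypothetical helper: run OpenManus once with a prompt piped to stdin.
# Assumes main.py asks for the prompt on stdin; adjust if the entrypoint changes.
import subprocess

def run_openmanus(prompt: str, workdir: str = "apps/OpenManus") -> int:
    """Run main.py in `workdir`, feeding `prompt` on stdin, and return its exit code."""
    result = subprocess.run(
        ["python", "main.py"],
        cwd=workdir,
        input=prompt + "\n",
        text=True,
    )
    return result.returncode

if __name__ == "__main__":
    raise SystemExit(run_openmanus("List the files in the current directory."))
```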