
Run DeepSeek via CNB

Quickly try out DeepSeek in model sizes 1.5b/7b/8b/14b/32b/70b, with no waiting for model downloads.

Quick Experience

Cloud Native Development DIY

  1. Fork the repository to your organization.
  2. Select your preferred branch.
  3. Click on Workspace.
  4. After approximately 5-10 seconds, enter the following command in the remote development terminal to start chatting with DeepSeek:
ollama run $ds
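Step 4 can be sketched as a small shell snippet. The assumption (implied by the command above but not spelled out in this README) is that the workspace environment pre-sets $ds to the model tag matching the chosen branch:

```shell
# $ds is assumed to be exported by the workspace, e.g. "deepseek-r1:14b".
model="${ds:-deepseek-r1:14b}"   # fall back to 14b if $ds is unset (assumption)
echo "Starting chat with: $model"
# Opens an interactive chat prompt; requires ollama on PATH.
if command -v ollama >/dev/null; then
  ollama run "$model"
else
  echo "ollama not found on PATH"
fi
```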

Conversational Mode via Search Bar

If you only need to have conversations with DeepSeek:

  • Click on the search bar in the upper right corner of the page and type your question directly.
  • Alternatively, press the / key for an enhanced experience.

Running in a Docker Environment

To run DeepSeek locally within your CVM (taking 14b as an example), start the container with the server in the background and then attach the chat client:

docker run --rm -it docker.cnb.cool/examples/ecosystem/deepseek/14b:latest \
  sh -c "ollama serve & ollama run deepseek-r1:14b"

Note: The domain docker.cnb.cool is accelerated within Tencent Cloud's intranet, so image pulls incur no traffic fees.
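Beyond the interactive prompt, the served model can also be queried non-interactively over Ollama's REST API. This sketch assumes the container is running with Ollama's default port 11434 published (add -p 11434:11434 to the docker run command); the model tag is an assumption:

```shell
# Build the JSON body for Ollama's /api/generate endpoint.
payload() {
  printf '{"model": "%s", "prompt": "%s", "stream": false}' "$1" "$2"
}

# Only attempt the request if the server is actually reachable on :11434.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  curl -s http://localhost:11434/api/generate -d "$(payload deepseek-r1:14b 'Hello')"
else
  echo "Ollama server not reachable on :11434"
fi
```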

Production Deployment

To develop AI applications based on DeepSeek, deploy the model as a long-running service that your application calls over an API.
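As one common pattern (a sketch, not necessarily the deployment method this README refers to), an application can call a deployed Ollama server over its REST API. The port is Ollama's default; the model tag and helper names are assumptions:

```python
# Minimal sketch: calling a served DeepSeek model via Ollama's REST API.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default port

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming chat request body for the Ollama API."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask(model: str, prompt: str) -> str:
    """Send one prompt and return the model's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

# Usage (requires a running Ollama server):
#   print(ask("deepseek-r1:14b", "Explain what a container image is."))
```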
