Quickly experience DeepSeek, supporting models sized 1.5b/7b/8b/14b/32b/70b, with no need to wait for model downloads.
If you only need to have conversations with DeepSeek:

ollama run $ds

Press the / key during the chat for an enhanced experience.

To run DeepSeek locally within your CVM (taking 14b as an example), follow these steps:
docker run --rm -it docker.cnb.cool/examples/ecosystem/deepseek/14b:latest ollama serve & ollama run deepseek-r1:14b
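Once `ollama serve` is running, applications can reach the model over Ollama's HTTP API (port 11434 by default). A minimal Python sketch, assuming the service is reachable on localhost and the `deepseek-r1:14b` model has been pulled:

```python
import json
from urllib import request

# Ollama's default HTTP endpoint for single-turn generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate API."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str, model: str = "deepseek-r1:14b") -> str:
    """Send a prompt to the locally running model and return its reply text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        # The non-streaming response carries the full reply in "response"
        return json.loads(resp.read())["response"]

# Example (requires the running service): ask("Hello, who are you?")
```

With `stream` set to `False`, the call blocks until generation finishes and returns the complete reply in one response, which keeps the client code simple.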
Note: The domain docker.cnb.cool is accelerated globally over Tencent Cloud's internal network, so pulling the image from within a CVM incurs no traffic fees.
To develop AI applications based on DeepSeek, you can deploy using the following methods: