Your new coding bestie, now available in your favourite terminal.
Your tools, your code, and your workflows, wired into your LLM of choice.
Use a package manager:
# Homebrew
brew install charmbracelet/tap/crush
# NPM
npm install -g @charmland/crush
# Arch Linux (btw)
yay -S crush-bin
# Nix
nix run github:numtide/nix-ai-tools#crush
# FreeBSD
pkg install crush
Windows users:
# Winget
winget install charmbracelet.crush
# Scoop
scoop bucket add charm https://github.com/charmbracelet/scoop-bucket.git
scoop install crush
Crush is available via the official Charm NUR in nur.repos.charmbracelet.crush, which is the most up-to-date way to get Crush in Nix.
You can also try out Crush via the NUR with nix-shell:
# Add the NUR channel.
nix-channel --add https://github.com/nix-community/NUR/archive/main.tar.gz nur
nix-channel --update
# Get Crush in a Nix shell.
nix-shell -p '(import <nur> { pkgs = import <nixpkgs> {}; }).repos.charmbracelet.crush'
Crush provides NixOS and Home Manager modules via NUR. You can use these modules directly in your flake by importing them from NUR. Since the module auto-detects whether it's running in a NixOS or Home Manager context, you can import it the exact same way in both. :)
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    nur.url = "github:nix-community/NUR";
  };

  outputs = { self, nixpkgs, nur, ... }: {
    nixosConfigurations.your-hostname = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        nur.modules.nixos.default
        nur.repos.charmbracelet.modules.crush
        {
          programs.crush = {
            enable = true;
            settings = {
              providers = {
                openai = {
                  id = "openai";
                  name = "OpenAI";
                  base_url = "https://api.openai.com/v1";
                  type = "openai";
                  api_key = "sk-fake123456789abcdef...";
                  models = [
                    {
                      id = "gpt-4";
                      name = "GPT-4";
                    }
                  ];
                };
              };
              lsp = {
                go = {
                  command = "gopls";
                  enabled = true;
                };
                nix = {
                  command = "nil";
                  enabled = true;
                };
              };
              options = {
                context_paths = [ "/etc/nixos/configuration.nix" ];
                tui = {
                  compact_mode = true;
                };
                debug = false;
              };
            };
          };
        }
      ];
    };
  };
}
Debian/Ubuntu:
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
sudo apt update && sudo apt install crush
Fedora/RHEL:
echo '[charm]
name=Charm
baseurl=https://repo.charm.sh/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo
sudo yum install crush
Or, download it: packages are available in Debian and RPM formats, and binaries are available for Linux, macOS, and Windows.
Or just install it with Go:
go install github.com/charmbracelet/crush@latest
WARNING
Productivity may increase when using Crush and you may find yourself nerd sniped when first using the application. If the symptoms persist, join the Discord and nerd snipe the rest of us.
The quickest way to get started is to grab an API key for your preferred provider such as Anthropic, OpenAI, Groq, OpenRouter, or Vercel AI Gateway and just start Crush. You'll be prompted to enter your API key.
That said, you can also set environment variables for preferred providers.
| Environment Variable | Provider |
|---|---|
| ANTHROPIC_API_KEY | Anthropic |
| OPENAI_API_KEY | OpenAI |
| VERCEL_API_KEY | Vercel AI Gateway |
| GEMINI_API_KEY | Google Gemini |
| SYNTHETIC_API_KEY | Synthetic |
| ZAI_API_KEY | Z.ai |
| HF_TOKEN | Hugging Face Inference |
| CEREBRAS_API_KEY | Cerebras |
| OPENROUTER_API_KEY | OpenRouter |
| GROQ_API_KEY | Groq |
| VERTEXAI_PROJECT | Google Cloud VertexAI (Gemini) |
| VERTEXAI_LOCATION | Google Cloud VertexAI (Gemini) |
| AWS_ACCESS_KEY_ID | Amazon Bedrock (Claude) |
| AWS_SECRET_ACCESS_KEY | Amazon Bedrock (Claude) |
| AWS_REGION | Amazon Bedrock (Claude) |
| AWS_PROFILE | Amazon Bedrock (Custom Profile) |
| AWS_BEARER_TOKEN_BEDROCK | Amazon Bedrock |
| AZURE_OPENAI_API_ENDPOINT | Azure OpenAI models |
| AZURE_OPENAI_API_KEY | Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION | Azure OpenAI models |
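For example, to run Crush against Anthropic you might export the key first and then start a session. This is a minimal sketch; the key value is just a placeholder.

# Set the key for your provider of choice, then start Crush.
export ANTHROPIC_API_KEY="sk-ant-..."
crush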
Is there a provider you’d like to see in Crush? Is there an existing model that needs an update?
Crush’s default model listing is managed in Catwalk, a community-supported, open source repository of Crush-compatible models, and you’re welcome to contribute.
Crush runs great with no configuration. That said, if you do need or want to customize Crush, configuration can be added either locally, within the project itself, or globally, with the following priority:
1. .crush.json
2. crush.json
3. $HOME/.config/crush/crush.json

Configuration itself is stored as a JSON object:
{
"this-setting": { "this": "that" },
"that-setting": ["ceci", "cela"]
}
Note that Crush also stores ephemeral data, such as application state, in one additional location:
# Unix
$HOME/.local/share/crush/crush.json
# Windows
%LOCALAPPDATA%\crush\crush.json
TIP
You can override the user and data config locations by setting:
- CRUSH_GLOBAL_CONFIG
- CRUSH_GLOBAL_DATA

Crush can use LSPs for additional context to help inform its decisions, just like you would. LSPs can be added manually like so:
{
"$schema": "https://charm.land/crush.json",
"lsp": {
"go": {
"command": "gopls",
"env": {
"GOTOOLCHAIN": "go1.24.5"
}
},
"typescript": {
"command": "typescript-language-server",
"args": ["--stdio"]
},
"nix": {
"command": "nil"
}
}
}
Crush also supports Model Context Protocol (MCP) servers through three
transport types: stdio for command-line servers, http for HTTP endpoints,
and sse for Server-Sent Events. Environment variable expansion is supported
using $(echo $VAR) syntax.
{
"$schema": "https://charm.land/crush.json",
"mcp": {
"filesystem": {
"type": "stdio",
"command": "node",
"args": ["/path/to/mcp-server.js"],
"timeout": 120,
"disabled": false,
"disabled_tools": ["some-tool-name"],
"env": {
"NODE_ENV": "production"
}
},
"github": {
"type": "http",
"url": "https://api.githubcopilot.com/mcp/",
"timeout": 120,
"disabled": false,
"disabled_tools": ["create_issue", "create_pull_request"],
"headers": {
"Authorization": "Bearer $GH_PAT"
}
},
"streaming-service": {
"type": "sse",
"url": "https://example.com/mcp/sse",
"timeout": 120,
"disabled": false,
"headers": {
"API-Key": "$(echo $API_KEY)"
}
}
}
}
Crush respects .gitignore files by default, but you can also create a
.crushignore file to specify additional files and directories that Crush
should ignore. This is useful for excluding files that you want in version
control but don't want Crush to consider when providing context.
The .crushignore file uses the same syntax as .gitignore and can be placed
in the root of your project or in subdirectories.
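As a sketch, a .crushignore might exclude generated or vendored files that live in version control but only add noise as context. The patterns below are purely illustrative.

# .crushignore uses the same syntax as .gitignore
dist/
vendor/
*.min.js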
By default, Crush will ask you for permission before running tool calls. If you'd like, you can allow tools to be executed without prompting you for permissions. Use this with care.
{
"$schema": "https://charm.land/crush.json",
"permissions": {
"allowed_tools": [
"view",
"ls",
"grep",
"edit",
"mcp_context7_get-library-doc"
]
}
}
You can also skip all permission prompts entirely by running Crush with the
--yolo flag. Be very, very careful with this feature.
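For example:

# Run Crush without any permission prompts. Use at your own risk.
crush --yolo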
If you'd like to prevent Crush from using certain built-in tools entirely, you
can disable them via the options.disabled_tools list. Disabled tools are
completely hidden from the agent.
{
"$schema": "https://charm.land/crush.json",
"options": {
"disabled_tools": [
"bash",
"sourcegraph"
]
}
}
To disable tools from MCP servers, see the MCP config section.
Crush supports the Agent Skills open standard for
extending agent capabilities with reusable skill packages. Skills are folders
containing a SKILL.md file with instructions that Crush can discover and
activate on demand.
Skills are discovered from:
- ~/.config/crush/skills/ on Unix (default, can be overridden with CRUSH_SKILLS_DIR)
- %LOCALAPPDATA%\crush\skills\ on Windows (default, can be overridden with CRUSH_SKILLS_DIR)
- Any additional paths listed in options.skills_paths:

{
  "$schema": "https://charm.land/crush.json",
  "options": {
    "skills_paths": [
      "~/.config/crush/skills",
      // Windows: "%LOCALAPPDATA%\\crush\\skills",
      "./project-skills"
    ]
  }
}
You can get started with example skills from anthropics/skills:
# Unix
mkdir -p ~/.config/crush/skills
cd ~/.config/crush/skills
git clone https://github.com/anthropics/skills.git _temp
mv _temp/skills/* . && rm -rf _temp
# Windows (PowerShell)
mkdir -Force "$env:LOCALAPPDATA\crush\skills"
cd "$env:LOCALAPPDATA\crush\skills"
git clone https://github.com/anthropics/skills.git _temp
mv _temp/skills/* . ; rm -r -force _temp
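For reference, a skill is just a folder containing a SKILL.md. Assuming the standard's YAML-frontmatter layout, a minimal, hypothetical skill (say, ~/.config/crush/skills/commit-helper/SKILL.md) could look something like this:

---
name: commit-helper
description: Helps draft conventional commit messages for this repository.
---

When asked to commit, summarize the staged changes and propose a
conventional commit message before running git commit.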
When you initialize a project, Crush analyzes your codebase and creates
a context file that helps it work more effectively in future sessions.
By default, this file is named AGENTS.md, but you can customize the
name and location with the initialize_as option:
{
"$schema": "https://charm.land/crush.json",
"options": {
"initialize_as": "AGENTS.md"
}
}
This is useful if you prefer a different naming convention or want to
place the file in a specific directory (e.g., CRUSH.md or
docs/LLMs.md). Crush will fill the file with project-specific context
like build commands, code patterns, and conventions it discovered during
initialization.
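For example, to place the context file in a docs directory instead, per the note above:

{
  "$schema": "https://charm.land/crush.json",
  "options": {
    "initialize_as": "docs/LLMs.md"
  }
}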
By default, Crush adds attribution information to Git commits and pull requests
it creates. You can customize this behavior with the attribution option:
{
"$schema": "https://charm.land/crush.json",
"options": {
"attribution": {
"trailer_style": "co-authored-by",
"generated_with": true
}
}
}
trailer_style: Controls the attribution trailer added to commit messages (default: assisted-by)

- assisted-by: Adds Assisted-by: [Model Name] via Crush <crush@charm.land> (includes the model name)
- co-authored-by: Adds Co-Authored-By: Crush <crush@charm.land>
- none: No attribution trailer

generated_with: When true (default), adds a 💘 Generated with Crush line to commit messages and PR descriptions
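For example, to switch attribution off entirely, combining the two options described above:

{
  "$schema": "https://charm.land/crush.json",
  "options": {
    "attribution": {
      "trailer_style": "none",
      "generated_with": false
    }
  }
}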
Crush supports custom provider configurations for both OpenAI-compatible and Anthropic-compatible APIs.
NOTE
Note that we support two "types" for OpenAI. Make sure to choose the right one to ensure the best experience!
- openai should be used when proxying or routing requests through OpenAI.
- openai-compat should be used when using non-OpenAI providers that have OpenAI-compatible APIs.

Here’s an example configuration for Deepseek, which uses an OpenAI-compatible API. Don't forget to set DEEPSEEK_API_KEY in your environment.
{
"$schema": "https://charm.land/crush.json",
"providers": {
"deepseek": {
"type": "openai-compat",
"base_url": "https://api.deepseek.com/v1",
"api_key": "$DEEPSEEK_API_KEY",
"models": [
{
"id": "deepseek-chat",
"name": "Deepseek V3",
"cost_per_1m_in": 0.27,
"cost_per_1m_out": 1.1,
"cost_per_1m_in_cached": 0.07,
"cost_per_1m_out_cached": 1.1,
"context_window": 64000,
"default_max_tokens": 5000
}
]
}
}
}
Custom Anthropic-compatible providers follow this format:
{
"$schema": "https://charm.land/crush.json",
"providers": {
"custom-anthropic": {
"type": "anthropic",
"base_url": "https://api.anthropic.com/v1",
"api_key": "$ANTHROPIC_API_KEY",
"extra_headers": {
"anthropic-version": "2023-06-01"
},
"models": [
{
"id": "claude-sonnet-4-20250514",
"name": "Claude Sonnet 4",
"cost_per_1m_in": 3,
"cost_per_1m_out": 15,
"cost_per_1m_in_cached": 3.75,
"cost_per_1m_out_cached": 0.3,
"context_window": 200000,
"default_max_tokens": 50000,
"can_reason": true,
"supports_attachments": true
}
]
}
}
}
Crush currently supports running Anthropic models through Bedrock, with caching disabled.
- Configure your AWS credentials, i.e. with aws configure
- Crush also expects the AWS_REGION or AWS_DEFAULT_REGION to be set
- To use a custom AWS profile, set AWS_PROFILE in your environment, i.e. AWS_PROFILE=myprofile crush
- Instead of aws configure, you can also just set AWS_BEARER_TOKEN_BEDROCK

Vertex AI will appear in the list of available providers when VERTEXAI_PROJECT and VERTEXAI_LOCATION are set. You will also need to be authenticated:
gcloud auth application-default login
To add specific models to the configuration, configure as such:
{
"$schema": "https://charm.land/crush.json",
"providers": {
"vertexai": {
"models": [
{
"id": "claude-sonnet-4@20250514",
"name": "VertexAI Sonnet 4",
"cost_per_1m_in": 3,
"cost_per_1m_out": 15,
"cost_per_1m_in_cached": 3.75,
"cost_per_1m_out_cached": 0.3,
"context_window": 200000,
"default_max_tokens": 50000,
"can_reason": true,
"supports_attachments": true
}
]
}
}
}
Local models can also be configured via OpenAI-compatible API. Here are two common examples:
{
"providers": {
"ollama": {
"name": "Ollama",
"base_url": "http://localhost:11434/v1/",
"type": "openai-compat",
"models": [
{
"name": "Qwen 3 30B",
"id": "qwen3:30b",
"context_window": 256000,
"default_max_tokens": 20000
}
]
}
}
}
{
"providers": {
"lmstudio": {
"name": "LM Studio",
"base_url": "http://localhost:1234/v1/",
"type": "openai-compat",
"models": [
{
"name": "Qwen 3 30B",
"id": "qwen/qwen3-30b-a3b-2507",
"context_window": 256000,
"default_max_tokens": 20000
}
]
}
}
}
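If you go the Ollama route, make sure the model referenced in the config is actually available locally first, e.g.:

# Pull the model used in the Ollama example above.
ollama pull qwen3:30b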
Sometimes you need to look at logs. Luckily, Crush logs all sorts of
stuff. Logs are stored in ./.crush/logs/crush.log relative to the project.
The CLI also contains some helper commands to make perusing recent logs easier:
# Print the last 1000 lines
crush logs
# Print the last 500 lines
crush logs --tail 500
# Follow logs in real time
crush logs --follow
Want more logging? Run crush with the --debug flag, or enable it in the
config:
{
"$schema": "https://charm.land/crush.json",
"options": {
"debug": true,
"debug_lsp": true
}
}
By default, Crush automatically checks for the latest and greatest list of providers and models from Catwalk, the open source Crush provider database. This means that when new providers and models are available, or when model metadata changes, Crush automatically updates your local configuration.
For those with restricted internet access, or those who prefer to work in air-gapped environments, this might not be what you want, and the feature can be disabled.
To disable automatic provider updates, set disable_provider_auto_update in
your crush.json config:
{
"$schema": "https://charm.land/crush.json",
"options": {
"disable_provider_auto_update": true
}
}
Or set the CRUSH_DISABLE_PROVIDER_AUTO_UPDATE environment variable:
export CRUSH_DISABLE_PROVIDER_AUTO_UPDATE=1
Manually updating providers is possible with the crush update-providers
command:
# Update providers remotely from Catwalk.
crush update-providers
# Update providers from a custom Catwalk base URL.
crush update-providers https://example.com/
# Update providers from a local file.
crush update-providers /path/to/local-providers.json
# Reset providers to the version embedded in Crush at build time.
crush update-providers embedded
# For more info:
crush update-providers --help
Crush records pseudonymous usage metrics (tied to a device-specific hash), which maintainers rely on to inform development and support priorities. The metrics consist solely of usage metadata; prompts and responses are NEVER collected.
Details on exactly what’s collected are in the source code.
You can opt out of metrics collection at any time by setting the following environment variable:
export CRUSH_DISABLE_METRICS=1
Or by setting the following in your config:
{
"options": {
"disable_metrics": true
}
}
Crush also respects the DO_NOT_TRACK convention which can be enabled via
export DO_NOT_TRACK=1.
See the contributing guide.
We’d love to hear your thoughts on this project. Need help? We gotchu. You can find us on:
Part of Charm.
Charm loves open source