statusline-setup AI Agent Skill
Source: b-open-io/prompts
Install: npx skills add b-open-io/prompts --skill statusline-setup
Status Line Setup
Create and customize Claude Code status lines to display contextual information like model name, git branch, token usage, project colors, and more.
Overview
Claude Code supports a custom status line displayed at the bottom of the interface. The status line updates whenever conversation messages change, refreshing at most once every 300ms.
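Mechanically, a status line is just a command that receives session JSON on stdin and prints a single line to stdout. The sketch below illustrates the contract; the payload and the `render` helper name are fabricated for illustration, not real Claude Code output:

```shell
# A status line command reads the session JSON from stdin and prints
# one line; Claude Code displays that line at the bottom of the UI.
# The payload below is hand-written for illustration.
render() {
  jq -r '"[" + .model.display_name + "]"'
}
echo '{"model":{"display_name":"Opus"}}' | render
# prints: [Opus]
```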
Interactive Setup Flow
When setting up a status line, first check for existing configuration and use AskUserQuestion to gather preferences.
Pre-Check: Existing Status Line
# Check for existing configuration
if [[ -f ~/.claude/settings.json ]]; then
  EXISTING=$(jq -r '.statusLine // empty' ~/.claude/settings.json)
  if [[ -n "$EXISTING" ]]; then
    : # User has an existing status line; ask about backing it up
  fi
fi
Setup Questions
- Approach: Custom script (full control), ccstatusline (widget-based TUI), or Simple inline
- Features: Git branch, project colors, token usage, session cost
- Style: Powerline, Minimal, or Match terminal
- Editor: Cursor, VS Code, Sublime, or None (for clickable links)
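If the pre-check finds an existing status line, one reasonable way to honor the backup question is to copy settings.json before anything is overwritten. A minimal sketch; the helper name and the `.bak` suffix are our own convention, not part of the skill:

```shell
# Hypothetical helper: back up settings.json only when a statusLine
# is already configured, mirroring the pre-check above.
backup_statusline() {
  local settings="$1"
  if [[ -f "$settings" ]] && [[ -n "$(jq -r '.statusLine // empty' "$settings")" ]]; then
    cp "$settings" "$settings.bak"
  fi
}
```

Call it as `backup_statusline ~/.claude/settings.json` before writing a new configuration.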
Two Approaches
1. Manual Script (Full Control)
Create a shell script that receives JSON data via stdin and outputs a single line with ANSI colors.
Quick setup:
cat > ~/.claude/statusline.sh << 'EOF'
#!/bin/bash
input=$(cat)
MODEL=$(echo "$input" | jq -r '.model.display_name')
DIR=$(basename "$(echo "$input" | jq -r '.workspace.current_dir')")
echo "[$MODEL] $DIR"
EOF
chmod +x ~/.claude/statusline.sh
Configure in settings:
{
  "statusLine": {
    "type": "command",
    "command": "~/.claude/statusline.sh",
    "padding": 0
  }
}
2. ccstatusline (Widget-Based)
Use the third-party ccstatusline for a widget-based approach with TUI configuration.
Quick setup:
bunx ccstatusline@latest
Configure in settings:
{
  "statusLine": {
    "type": "command",
    "command": "bunx ccstatusline@latest"
  }
}
JSON Input Structure
The status line command receives structured JSON via stdin:
| Field | Description |
|---|---|
| `model.id` | Model identifier (e.g., "claude-opus-4-6") |
| `model.display_name` | Human-readable name (e.g., "Opus") |
| `workspace.current_dir` | Current working directory |
| `workspace.project_dir` | Original project directory |
| `cost.total_cost_usd` | Session cost in USD |
| `cost.total_duration_ms` | Total session duration in milliseconds |
| `context_window.context_window_size` | Maximum context size in tokens |
| `context_window.current_usage` | Current token usage object |
| `transcript_path` | Path to the session transcript JSON |
| `session_id` | Unique session identifier |
Common Patterns
Git Branch Display
if git rev-parse --git-dir > /dev/null 2>&1; then
  BRANCH=$(git branch --show-current 2>/dev/null)
  DIRTY=""
  git diff --quiet HEAD 2>/dev/null || DIRTY="*"
  echo "[$MODEL] $BRANCH$DIRTY"
fi
Context Usage Percentage
USAGE=$(echo "$input" | jq '.context_window.current_usage')
if [ "$USAGE" != "null" ]; then
  TOKENS=$(echo "$USAGE" | jq '.input_tokens + .cache_creation_input_tokens + .cache_read_input_tokens')
  SIZE=$(echo "$input" | jq -r '.context_window.context_window_size')
  PERCENT=$((TOKENS * 100 / SIZE))
  echo "Context: ${PERCENT}%"
fi
Peacock Project Colors
SETTINGS=".vscode/settings.json"
if [[ -f "$SETTINGS" ]]; then
  COLOR=$(jq -r '.["peacock.color"] // empty' "$SETTINGS")
fi
Clickable File Links (OSC 8)
FILE_URL="vscode://file${LAST_FILE}"
echo -e "\033]8;;${FILE_URL}\a${FILENAME}\033]8;;\a"
Reference Files
For detailed implementation guidance, consult:
- references/json-input-schema.md: Complete JSON input documentation with all fields, extraction examples in Bash/Python/Node.js, and null-value handling
- references/scripting-patterns.md: ANSI color codes (256-color and true color), Powerline separators, Git integration patterns, project detection, clickable links (OSC 8), terminal integration, and formatting helpers
- references/ccstatusline-guide.md: Complete widget documentation, installation, configuration options, available widgets, multi-line setup, and troubleshooting
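The common patterns above compose naturally into one script. The sketch below wraps them in a function that takes the JSON payload as an argument (easier to test than stdin); the function name and the null guards are our own additions, since the context_window fields may be absent in some payloads:

```shell
# Sketch combining the patterns above: model, directory, git branch,
# and context percentage. Takes the JSON payload as its first argument.
render_statusline() {
  local input="$1" line model dir
  model=$(echo "$input" | jq -r '.model.display_name // "?"')
  dir=$(basename "$(echo "$input" | jq -r '.workspace.current_dir // "."')")
  line="[$model] $dir"

  # Git branch with dirty marker, only inside a repository
  if git rev-parse --git-dir > /dev/null 2>&1; then
    local branch dirty=""
    branch=$(git branch --show-current 2>/dev/null)
    git diff --quiet HEAD 2>/dev/null || dirty="*"
    [[ -n "$branch" ]] && line="$line $branch$dirty"
  fi

  # Context usage percentage, guarded against missing fields
  local usage tokens size
  usage=$(echo "$input" | jq '.context_window.current_usage // empty')
  if [[ -n "$usage" ]]; then
    tokens=$(echo "$usage" | jq '(.input_tokens // 0) + (.cache_creation_input_tokens // 0) + (.cache_read_input_tokens // 0)')
    size=$(echo "$input" | jq -r '.context_window.context_window_size // 0')
    [[ "$size" -gt 0 ]] && line="$line ctx:$((tokens * 100 / size))%"
  fi
  echo "$line"
}
```

Wire it up by saving the function in a script whose last lines are `input=$(cat)` and `render_statusline "$input"`, then point the settings.json command at that script as shown earlier.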
How to use this skill
Install statusline-setup by running npx skills add b-open-io/prompts --skill statusline-setup in your project directory. The skill file will be downloaded from GitHub and placed in your project.
No configuration needed. Your AI agent (Claude Code, Cursor, Windsurf, etc.) automatically detects installed skills and uses them as context when generating code.
The skill enhances your agent's understanding of statusline-setup, helping it follow established patterns, avoid common mistakes, and produce production-ready output.
What you get
Skills are plain-text instruction files — not executable code. They encode expert knowledge about frameworks, languages, or tools that your AI agent reads to improve its output. This means zero runtime overhead, no dependency conflicts, and full transparency: you can read and review every instruction before installing.
Compatibility
This skill works with any AI coding agent that supports the skills.sh format, including Claude Code (Anthropic), Cursor, Windsurf, Cline, Aider, and other tools that read project-level context files. Skills are framework-agnostic at the transport level — the content inside determines which language or framework it applies to.