Installation
npx skills add tavily-ai/skills --skill tavily-map
tavily map
Discover URLs on a website without extracting content. Faster than crawling.
Before running any command
If tvly is not found on PATH, install it first:
curl -fsSL https://cli.tavily.com/install.sh | bash && tvly login

Do not skip this step or fall back to other tools.
See tavily-cli for alternative install methods and auth options.
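A minimal sketch of that PATH check in POSIX shell. The `tvly` binary and installer URL are from this page; the `have` helper name is ours, and the sketch prints the install command rather than running it automatically:

```shell
# Check whether tvly is on PATH before running any tavily command.
have() { command -v "$1" >/dev/null 2>&1; }

if ! have tvly; then
  # Install and authenticate as described above.
  echo 'tvly not found; run: curl -fsSL https://cli.tavily.com/install.sh | bash && tvly login' >&2
fi
```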
When to use
- You need to find a specific subpage on a large site
- You want a list of all URLs before deciding what to extract or crawl
- Step 3 in the workflow: search → extract → map → crawl → research
Quick start
# Discover all URLs
tvly map "https://docs.example.com" --json
# With natural language filtering
tvly map "https://docs.example.com" --instructions "Find API docs and guides" --json
# Filter by path
tvly map "https://example.com" --select-paths "/blog/.*" --limit 500 --json
# Deep map
tvly map "https://example.com" --max-depth 3 --limit 200 --json

Options
| Option | Description |
|---|---|
| --max-depth | Levels deep (1-5, default: 1) |
| --max-breadth | Links per page (default: 20) |
| --limit | Max URLs to discover (default: 50) |
| --instructions | Natural language guidance for URL filtering |
| --select-paths | Comma-separated regex patterns to include |
| --exclude-paths | Comma-separated regex patterns to exclude |
| --select-domains | Comma-separated regex for domains to include |
| --exclude-domains | Comma-separated regex for domains to exclude |
| --allow-external / --no-external | Include external links |
| --timeout | Max wait (10-150 seconds) |
| -o, --output | Save output to file |
| --json | Structured JSON output |
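Several of these filters can be combined in one call. A hedged sketch, guarded so it degrades to a dry run when tvly is not installed; docs.example.com and the path patterns are placeholders, not values from this page:

```shell
# Build the argument list once so it can be inspected or reused.
set -- \
  "https://docs.example.com" \
  --max-depth 2 \
  --select-paths "/api/.*,/guides/.*" \
  --exclude-paths "/releases/.*" \
  --limit 300 --json

if command -v tvly >/dev/null 2>&1; then
  tvly map "$@"
else
  echo "dry run: tvly map $*" >&2
fi
```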
Map + Extract pattern
Use map to find the right page, then extract it. This is often more efficient than crawling an entire site:
# Step 1: Find the authentication docs
tvly map "https://docs.example.com" --instructions "authentication" --json
# Step 2: Extract the specific page you found
tvly extract "https://docs.example.com/api/authentication" --json

Tips
- Map is URL discovery only — no content extraction. Use extract or crawl for content.
- Map + extract beats crawl when you only need a few specific pages from a large site.
- Use --instructions for semantic filtering when path patterns aren't enough.
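One way to chain the map + extract pattern in a single pipeline. The field names in `sample` are a guess at the real `--json` schema (not confirmed on this page), and the grep only assumes the output contains quoted absolute URLs; the extract step is echoed as a dry run:

```shell
# Hypothetical tvly map --json payload; real field names may differ.
sample='{"base_url":"https://docs.example.com","results":["https://docs.example.com/api/authentication","https://docs.example.com/guides/quickstart"]}'

# Pull page URLs out of the JSON (no jq required), then hand each
# one to tvly extract.
printf '%s' "$sample" \
  | grep -o 'https://docs\.example\.com/[^"]*' \
  | while read -r url; do
      echo "tvly extract \"$url\" --json"
    done
```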
See also
- tavily-extract — extract content from URLs you discover
- tavily-crawl — bulk extract when you need many pages
How to use this skill
Install tavily-map by running npx skills add tavily-ai/skills --skill tavily-map in your project directory. The skill file will be downloaded from GitHub and placed in your project.
No configuration needed. Your AI agent (Claude Code, Cursor, Windsurf, etc.) automatically detects installed skills and uses them as context when generating code.
The skill enhances your agent's understanding of tavily-map, helping it follow established patterns, avoid common mistakes, and produce production-ready output.
What you get
Skills are plain-text instruction files — not executable code. They encode expert knowledge about frameworks, languages, or tools that your AI agent reads to improve its output. This means zero runtime overhead, no dependency conflicts, and full transparency: you can read and review every instruction before installing.
Compatibility
This skill works with any AI coding agent that supports the skills.sh format, including Claude Code (Anthropic), Cursor, Windsurf, Cline, Aider, and other tools that read project-level context files. Skills are framework-agnostic at the transport level — the content inside determines which language or framework it applies to.