Karpathy Compile OpenClaw Skill

Compile raw wiki entries from Phase 1 into structured, distilled knowledge points using an LLM, grouping entries by topic and saving the refined outputs.

v1.0.0 · Updated 5 days ago

Installation

clawhub install karpathy-compile

Requires npm i -g clawhub



Karpathy Compile Skill - Phase 2

Description

Implements Phase 2 of the Karpathy LLM Knowledge Base: compiling wiki entries into knowledge points.

The raw wiki entries produced in Phase 1 are compiled via LLM distillation into structured, distilled knowledge points.

Workflow

Phase 1: user query → wiki entries (raw, multiple)
Phase 2: wiki entries → knowledge points (distilled, structured)
Phase 3: lint → deduplicate / merge / update

Knowledge Point Format

## Knowledge Point: [topic]

**Core concept**: [one-sentence summary]
**Source**: [originating wiki entry]
**Details**: [LLM-generated detailed explanation]
**Tags**: [tag1, tag2]
**Created**: YYYY-MM-DD
**Confidence**: high/medium/low
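The format above maps naturally onto a small data structure. A minimal sketch in Python (field names are assumptions mirroring the template, not the skill's actual code):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class KnowledgePoint:
    """One distilled knowledge point (hypothetical fields mirroring the template)."""
    topic: str
    core_concept: str
    source: str
    detail: str
    tags: list = field(default_factory=list)
    created: str = ""           # YYYY-MM-DD; defaults to today
    confidence: str = "medium"  # high / medium / low

    def to_markdown(self) -> str:
        """Render the point in the Knowledge Point format shown above."""
        created = self.created or date.today().isoformat()
        return "\n".join([
            f"## Knowledge Point: {self.topic}",
            "",
            f"**Core concept**: {self.core_concept}",
            f"**Source**: {self.source}",
            f"**Details**: {self.detail}",
            f"**Tags**: [{', '.join(self.tags)}]",
            f"**Created**: {created}",
            f"**Confidence**: {self.confidence}",
        ])
```

Rendering each point through one function keeps the output files uniform, which matters for the Phase 3 lint step.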

Compile Pipeline

  1. Read the wiki files
  2. Group entries by topic/tag
  3. Run LLM distillation on each group to generate a knowledge point
  4. Save the results to the knowledge-points/ directory

File Structure

karpathy-compile/
├── SKILL.md
├── scripts/
│   ├── __init__.py      # CompilePipeline
│   ├── distiller.py     # LLM distillation
│   ├── parser.py        # wiki file parsing
│   └── test_compile.py  # tests
└── knowledge-points/     # output directory

Dependencies

  • Wiki files from Phase 1
  • M-Flow (stores the compiled knowledge points)
  • Ollama LLM (qwen2.5:14b)
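Since the skill targets a local Ollama model, the distillation call could look like the sketch below. It uses Ollama's standard non-streaming `/api/generate` endpoint; the prompt wording and function names are assumptions, not the skill's actual distiller:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def build_prompt(topic: str, entries: list) -> str:
    """Assemble a distillation prompt from a group of wiki entries (wording is illustrative)."""
    joined = "\n\n---\n\n".join(entries)
    return (f"Distill the following wiki entries about '{topic}' into one "
            f"concise, structured knowledge point:\n\n{joined}")

def distill(topic: str, entries: list, model: str = "qwen2.5:14b") -> str:
    """Send one non-streaming generation request to the local Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(topic, entries),
        "stream": False,  # return the full completion in a single JSON response
    }).encode("utf-8")
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `"stream": False`, Ollama returns one JSON object whose `response` field holds the generated text, which keeps the client code to a single request/parse step.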

Statistics

Downloads: 58
Stars: 0
Current installs: 0
All-time installs: 0
Versions: 1
Comments: 0
Created: Apr 5, 2026
Updated: Apr 5, 2026

Latest Changes

v1.0.0 · Apr 5, 2026

Initial release: Wiki to Knowledge Points compilation pipeline
