# Skill Exporter

Transform Clawdbot skills into standalone, deployable microservices.
## Workflow

```
Clawdbot Skill (tested & working)
        ↓
  skill-exporter
        ↓
Standalone Microservice
        ↓
Railway / Fly.io / Docker
```
## Usage

### Export a skill

```bash
python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service
```
### Options

| Flag | Description | Default |
|---|---|---|
| `--skill` | Path to skill directory | required |
| `--target` | Deployment target: `railway`, `fly`, `docker` | `docker` |
| `--llm` | LLM provider: `anthropic`, `openai`, `none` | `none` |
| `--output` | Output directory | `./<skill-name>-service` |
| `--port` | API port | `8000` |
## Targets

- `railway`: Generates `railway.json`, optimized Dockerfile, health checks
- `fly`: Generates `fly.toml`, multi-region ready
- `docker`: Generic `Dockerfile`, `docker-compose.yml`
## LLM Integration

When `--llm` is set, generates `llm_client.py` with:

- Caption/prompt generation
- Decision-making helpers
- Rate limiting and error handling
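As an illustration of the rate-limiting and error-handling behavior listed above, here is a minimal sketch. The class name, parameters, and stubbed provider call are all hypothetical; the generated `llm_client.py` wraps the chosen provider SDK and may be structured differently.

```python
import time

class RateLimitedClient:
    """Illustrative sketch (not the generated file): enforce a minimum
    interval between provider calls and retry transient failures."""

    def __init__(self, call_fn, min_interval=1.0, max_retries=3, backoff=1.0):
        self.call_fn = call_fn            # e.g. a closure over the provider SDK
        self.min_interval = min_interval  # seconds between calls (rate limiting)
        self.max_retries = max_retries
        self.backoff = backoff            # base delay for exponential backoff
        self._last_call = 0.0

    def generate(self, prompt):
        # Rate limiting: wait until min_interval has elapsed since the last call.
        wait = self.min_interval - (time.monotonic() - self._last_call)
        if wait > 0:
            time.sleep(wait)
        # Error handling: retry with exponential backoff, re-raise on final failure.
        for attempt in range(self.max_retries):
            try:
                self._last_call = time.monotonic()
                return self.call_fn(prompt)
            except Exception:
                if attempt == self.max_retries - 1:
                    raise
                time.sleep(self.backoff * (2 ** attempt))


# Usage with a stubbed provider call:
client = RateLimitedClient(lambda p: f"Caption: {p}", min_interval=0.0)
print(client.generate("sunset over the bay"))
```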
## What Gets Generated

```
<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py
```
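The generated `api.py` wraps the copied scripts behind HTTP endpoints. A sketch of the discovery step underneath such a wrapper, using only the standard library (the function name and layout here are assumptions, not the exporter's actual code; the real file exposes the results via FastAPI):

```python
import importlib.util
import inspect
from pathlib import Path

def load_skill_functions(scripts_dir):
    """Hypothetical sketch: import each script in scripts/ and collect its
    public functions, which an API wrapper could expose as endpoints."""
    functions = {}
    for script in Path(scripts_dir).glob("*.py"):
        spec = importlib.util.spec_from_file_location(script.stem, script)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        for name, fn in inspect.getmembers(module, inspect.isfunction):
            # Keep only functions defined in the script itself.
            if not name.startswith("_") and fn.__module__ == script.stem:
                functions[f"{script.stem}.{name}"] = fn
    return functions
```

This is why the source skill's scripts must expose callable functions (see Requirements): top-level inline code runs once at import and leaves nothing to dispatch.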
## Requirements

The source skill must have:

- `SKILL.md` with valid frontmatter
- At least one script in `scripts/`
- Scripts should be callable (functions, not just inline code)
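To make the "callable" requirement concrete, a minimal export-friendly script might look like this (the file name, function names, and return shape are illustrative assumptions, not a required interface):

```python
# scripts/post.py — illustrative example of an export-friendly skill script.
# Logic lives in named functions a generated API wrapper can call,
# not in top-level code that runs on import.

def build_caption(topic: str) -> str:
    """Pure function: straightforward for an exporter to wrap as an endpoint."""
    return f"Check out this {topic}!"

def post(topic: str) -> dict:
    """Entry point: returns a JSON-friendly dict rather than printing."""
    return {"caption": build_caption(topic), "status": "queued"}

if __name__ == "__main__":
    # Still runnable directly for local testing.
    print(post("sunset"))
```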
## Post-Export

- Copy `.env.example` to `.env` and fill in secrets
- Test locally: `docker-compose up`
- Deploy: `railway up` or `fly deploy`