โ† Back to DevOps & Cloud
DevOps & Cloud by @macstenk


Skill Exporter

Transform Clawdbot skills into standalone, deployable microservices.

Workflow

Clawdbot Skill (tested & working)
         ↓
    skill-exporter
         ↓
Standalone Microservice
         ↓
Railway / Fly.io / Docker

Usage

Export a skill

python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service

Options

Flag       Description                               Default
--skill    Path to skill directory                   required
--target   Deployment target: railway, fly, docker   docker
--llm      LLM provider: anthropic, openai, none     none
--output   Output directory                          ./<skill-name>-service
--port     API port                                  8000
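
The flags above map directly onto a standard command-line interface. A minimal argparse sketch of how export.py might declare them (a hypothetical reconstruction from the flag table, not the tool's actual source):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical reconstruction of export.py's CLI, based on the flag table above.
    parser = argparse.ArgumentParser(
        description="Export a Clawdbot skill as a standalone microservice")
    parser.add_argument("--skill", required=True,
                        help="Path to skill directory")
    parser.add_argument("--target", choices=["railway", "fly", "docker"],
                        default="docker", help="Deployment target")
    parser.add_argument("--llm", choices=["anthropic", "openai", "none"],
                        default="none", help="LLM provider")
    parser.add_argument("--output", default=None,
                        help="Output directory (defaults to ./<skill-name>-service)")
    parser.add_argument("--port", type=int, default=8000, help="API port")
    return parser

args = build_parser().parse_args(
    ["--skill", "~/.clawdbot/skills/instagram", "--target", "railway"])
print(args.target, args.port)  # → railway 8000
```

Note that --output defaults to None here and would be derived from the skill name at runtime, matching the `./<skill-name>-service` default in the table.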

Targets

  • railway — Generates railway.json, optimized Dockerfile, health checks
  • fly — Generates fly.toml, multi-region ready
  • docker — Generic Dockerfile, docker-compose.yml
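
For the railway target, the generated railway.json plausibly looks something like the following. This is illustrative only: the field names follow Railway's public config-as-code schema, and the health-check path is an assumption, not confirmed output of the exporter.

```json
{
  "$schema": "https://railway.app/railway.schema.json",
  "build": {
    "builder": "DOCKERFILE",
    "dockerfilePath": "Dockerfile"
  },
  "deploy": {
    "healthcheckPath": "/health",
    "restartPolicyType": "ON_FAILURE"
  }
}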

LLM Integration

When --llm is set, the exporter generates llm_client.py with:

  • Caption/prompt generation
  • Decision making helpers
  • Rate limiting and error handling
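
The rate-limiting and retry behavior can be sketched as a provider-agnostic wrapper. This is a hypothetical, stdlib-only illustration of the idea; the real generated llm_client.py calls the anthropic or openai SDKs, and `complete_fn`, `min_interval`, and `generate_caption` are names invented here for the sketch:

```python
import time

class LLMClient:
    """Hypothetical sketch of a generated llm_client.py: wraps a provider
    completion callable with simple rate limiting and retry-with-backoff."""

    def __init__(self, complete_fn, min_interval=1.0, max_retries=3):
        self._complete = complete_fn       # provider SDK call, injected
        self._min_interval = min_interval  # seconds between requests
        self._max_retries = max_retries
        self._last_call = 0.0

    def generate_caption(self, prompt: str) -> str:
        # Rate limit: wait until min_interval has passed since the last call.
        wait = self._min_interval - (time.monotonic() - self._last_call)
        if wait > 0:
            time.sleep(wait)
        # Retry with exponential backoff on provider errors.
        for attempt in range(self._max_retries):
            try:
                self._last_call = time.monotonic()
                return self._complete(prompt)
            except Exception:
                if attempt == self._max_retries - 1:
                    raise
                time.sleep(2 ** attempt)
```

Injecting the completion callable keeps the rate-limiting logic independent of which provider --llm selected.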

What Gets Generated

<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py
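
The conditional entries in the tree (llm_client.py, railway.json, fly.toml) follow directly from the flags. A small sketch of that file-selection logic, assuming a helper named `files_for` that is not part of the actual tool:

```python
def files_for(target: str, llm: str) -> list:
    # Files every export gets, per the generated tree above.
    files = ["Dockerfile", "docker-compose.yml", "api.py",
             "requirements.txt", ".env.example"]
    if llm != "none":
        files.append("llm_client.py")   # only when --llm is specified
    if target == "railway":
        files.append("railway.json")    # only for --target railway
    elif target == "fly":
        files.append("fly.toml")        # only for --target fly
    return files
```

For example, `files_for("railway", "anthropic")` includes both railway.json and llm_client.py, while `files_for("docker", "none")` yields only the five common files.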

Requirements

The source skill must have:

  • SKILL.md with valid frontmatter
  • At least one script in scripts/
  • Scripts should be callable (functions, not just inline code)
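
The requirements above amount to a pre-export validation pass. A hypothetical sketch of such a check (the function name and messages are invented for illustration; the frontmatter test here only looks for a leading `---` block):

```python
from pathlib import Path

def validate_skill(skill_dir) -> list:
    """Return a list of problems; an empty list means the skill is exportable."""
    skill = Path(skill_dir)
    problems = []
    manifest = skill / "SKILL.md"
    if not manifest.is_file():
        problems.append("missing SKILL.md")
    elif not manifest.read_text().lstrip().startswith("---"):
        problems.append("SKILL.md has no frontmatter block")
    # At least one script must exist in scripts/.
    if not list((skill / "scripts").glob("*.py")):
        problems.append("no scripts in scripts/")
    return problems
```

The "scripts should be callable" requirement is harder to check statically and is left out of this sketch.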

Post-Export

  1. Copy .env.example to .env and fill in secrets
  2. Test locally: docker-compose up
  3. Deploy: railway up or fly deploy