Commit 2915828adf by Branko May Trinkwald: Add complete Crumbforest mission system
- Interactive mission selector with metadata-driven design
- 5 educational missions (basics + advanced)
- AI assistant roles (Deepbit, Bugsy, Schnippsi, Tobi)
- SnakeCam gesture recognition system
- Token tracking utilities
- CLAUDE.md documentation
- .gitignore for logs and secrets
2025-12-21 01:16:48 +01:00

# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## CF_Zero_V1 Project Overview
CF_Zero_V1 is an educational Bash learning platform called "Crumbforest" that uses character-based AI assistants to teach command-line concepts through interactive missions. The project combines shell scripting, Python-based computer vision, and OpenRouter API integration to create an immersive learning experience.
## Key Commands
### Mission System
```bash
# Launch the interactive mission selector
./crumb-mission-selector.sh
# Run individual missions directly
bash missions/basics/fridolin.sh # Navigation basics
bash missions/basics/balu.sh # File creation
bash missions/basics/noko.sh # File reading
bash missions/advanced/dns_mission.sh # DNS tools
bash missions/advanced/ssh_security.sh # SSH basics
```
### AI Character Assistants (Crumbforest Roles)
All roles require `OPENROUTER_API_KEY` environment variable:
```bash
export OPENROUTER_API_KEY="your-key-here"
# Character-based CLI assistants
./crumbforest_roles/deepbit_zero.sh "How do I use grep?"
./crumbforest_roles/bugsy_zero.sh "Explain loops in bash"
./crumbforest_roles/schnippsi_zero.sh "What is curl?"
./crumbforest_roles/tobi_zero.sh "How to use jq?"
```
### Camera Vision System (SnakeCam)
```bash
# Start the Flask-based gesture recognition app
cd snake_camera_vision_v2
python app.py
# Access at http://localhost:5000
# View gesture detection module
python gestures/gestures_v4.py
```
### Token Usage Monitoring
```bash
# View token usage across all AI assistants
./log_tokens_viewer_v4.sh
# Clean up malformed token logs
./fix_token_logs.sh
# Check individual agent logs
cat ~/.deepbit_logs/token_log.json
cat ~/.bugsy_logs/token_log.json
```
## Architecture Overview
### Mission System Architecture
**Metadata-Driven Design:**
- Each mission consists of two files: `mission_name.sh` (executable) and `mission_name.meta.json` (metadata)
- The mission selector (`crumb-mission-selector.sh`) dynamically loads missions by scanning directories
- Metadata structure:
```json
{
  "icon": "🦊",
  "title": "Mission Title",
  "description": "What this teaches",
  "category": "basics|advanced|challenges",
  "enabled": true
}
```
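As an illustrative sketch of how such a metadata-driven scan can work (this is not the actual selector code; the file names and `jq` filters are assumptions), a loop over `*.meta.json` files is enough to build a menu:

```shell
# Illustrative sketch (not the real selector): create one sample mission
# metadata file, then scan all category folders and print enabled missions.
mkdir -p missions/basics
cat > missions/basics/demo.meta.json << 'EOF'
{"icon": "🦊", "title": "Demo Mission", "description": "Sample entry", "category": "basics", "enabled": true}
EOF

for meta in missions/*/*.meta.json; do
  [ -f "$meta" ] || continue
  # Skip missions that are present but disabled in their metadata
  [ "$(jq -r '.enabled' "$meta")" = "true" ] || continue
  icon=$(jq -r '.icon' "$meta")
  title=$(jq -r '.title' "$meta")
  # Derive the executable name from the metadata file name
  printf '%s %s (%s)\n' "$icon" "$title" "${meta%.meta.json}.sh"
done
```

Because the selector only reads metadata, dropping a new `.sh`/`.meta.json` pair into a category folder is enough to surface it in the menu.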
**Mission Categories:**
- `missions/basics/` - Beginner missions (navigation, file operations)
- `missions/advanced/` - Advanced topics (DNS, SSH, networking)
- `missions/challenges/` - Interactive challenges (future)
### AI Assistant Architecture
**Character-Based Learning Roles:**
Each role is a specialized bash script that wraps OpenRouter API calls with distinct personalities:
- **Deepbit** (octopus) - Explains Bash concepts poetically to children
- **Bugsy** - Debugging and troubleshooting assistant
- **Schnippsi** - General shell command helper
- **Tobi** - JSON/data processing expert
- **Templatus** - HTML architecture assistant
**Common Pattern:**
1. Accept question as command-line argument
2. Store conversation history in `~/.{role}_logs/{role}_history.json`
3. Track token usage in `~/.{role}_logs/token_log.json`
4. Use OpenRouter API with `openai/gpt-3.5-turbo` model
5. Respond in the same language as the input question
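Condensed into a sketch, the pattern looks roughly like this. The system prompt, payload shape, and file names below are illustrative assumptions; see the actual scripts in `crumbforest_roles/` for the real wording:

```shell
# Hypothetical condensed version of the common role-script pattern.
ROLE="deepbit"
LOGDIR="$HOME/.${ROLE}_logs"
mkdir -p "$LOGDIR"
QUESTION="How do I use grep?"

# 1) Build the request payload with jq (system prompt is a placeholder)
jq -n --arg q "$QUESTION" '{
  model: "openai/gpt-3.5-turbo",
  messages: [
    {role: "system", content: "You are Deepbit, a poetic octopus who explains Bash."},
    {role: "user", content: $q}
  ]
}' > "$LOGDIR/${ROLE}_request.json"

# 2) Send it to OpenRouter (only attempted when the key is set)
if [ -n "$OPENROUTER_API_KEY" ]; then
  curl -s https://openrouter.ai/api/v1/chat/completions \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" \
    -H "Content-Type: application/json" \
    -d @"$LOGDIR/${ROLE}_request.json" > "$LOGDIR/${ROLE}_response.json"
fi
```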
**API Flow:**
```text
Question → JSON Payload → OpenRouter API → Response → Log History → Display
```
**Log Structure:**
- Request: `~/.{role}_logs/{role}_request.json`
- Response: `~/.{role}_logs/{role}_response.json`
- History: `~/.{role}_logs/{role}_history.json`
- Token usage: `~/.{role}_logs/token_log.json`
### Camera Vision System (SnakeCam)
**Technology Stack:**
- Flask web server with MJPEG video streaming
- OpenCV for camera capture and image processing
- Custom gesture detection using HSV color space and contour analysis
**Key Components:**
- `app.py` - Flask application with video streaming endpoints
- `gestures/gestures_v4.py` - Hand gesture detection algorithm
- Detection method: Skin color detection → Contour analysis → Convexity defects → Gesture classification
- Current gestures: "wave" detection based on defect count, area, and aspect ratio
**Endpoints:**
- `/` - Main camera interface
- `/video_feed` - MJPEG stream
- `/log_answer` - Log user responses with mood and gesture
- `/shutdown` - Clean camera release
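Assuming the Flask app is running on its default port, these endpoints can be exercised from the shell. The JSON fields for `/log_answer` (`mood`, `gesture`) are a guess based on the description above, not a verified schema:

```shell
# Hypothetical smoke test of the SnakeCam endpoints. Each call is guarded
# with "|| true" so the script continues if the server is not running.
BASE=http://localhost:5000
curl -s "$BASE/" > /dev/null || true                            # main camera interface
curl -s --max-time 2 -o frame.mjpeg "$BASE/video_feed" || true  # sample the MJPEG stream
curl -s -X POST "$BASE/log_answer" \
  -H "Content-Type: application/json" \
  -d '{"mood": "happy", "gesture": "wave"}' || true
```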
## Project Structure
```
CF_Zero_V1/
├── crumb-mission-selector.sh         # Main mission launcher (metadata-driven)
├── missions/                         # Educational missions
│   ├── basics/                       # Beginner tutorials
│   │   ├── fridolin.sh/meta.json     # Navigation (pwd, ls, cd)
│   │   ├── balu.sh/meta.json         # File creation (mkdir, touch, echo)
│   │   └── noko.sh/meta.json         # File reading (cat, grep)
│   └── advanced/                     # Advanced topics
│       ├── dns_mission.sh/meta.json  # DNS tools (dig, nslookup, host)
│       └── ssh_security.sh/meta.json # SSH basics
├── crumbforest_roles/                # AI character assistants
│   ├── deepbit_zero.sh               # Poetic Bash explainer
│   ├── bugsy_zero.sh                 # Debugging helper
│   ├── schnippsi_zero.sh             # Shell command assistant
│   └── tobi_zero.sh                  # JSON/data expert
├── snake_camera_vision_v2/           # Flask gesture recognition app
│   ├── app.py                        # Main Flask server
│   └── gestures/                     # Gesture detection modules
│       ├── gestures_v4.py            # Hand detection algorithm
│       └── gestures_debug_test.py    # Debug version with visualization
├── log_tokens_viewer_v4.sh           # Token usage viewer
└── fix_token_logs.sh                 # Clean malformed logs
```
## Important Implementation Notes
### Adding New Missions
1. Create two files in the appropriate category folder:
```bash
touch missions/basics/new_mission.sh
touch missions/basics/new_mission.meta.json
chmod +x missions/basics/new_mission.sh
```
2. Metadata must include:
- `icon`: Emoji for menu display
- `title`: Human-readable name
- `description`: What the mission teaches
- `category`: basics/advanced/challenges
- `enabled`: true/false
3. Mission script should:
- Use `cat << 'EOF'` for multi-line instructions
- Include interactive prompts with `read -p`
- Provide examples before asking user to try
- End with success message
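A minimal skeleton following these conventions might look like this; the mission content itself is placeholder text, not a shipped mission:

```shell
# Write a hypothetical mission skeleton that follows the conventions above:
# heredoc instructions, an interactive prompt, and a closing success message.
mkdir -p missions/basics
cat > missions/basics/new_mission.sh << 'SCRIPT'
#!/usr/bin/env bash
cat << 'EOF'
🦊 Welcome, explorer! Today we practice listing files.

Example:
  ls -l
EOF
read -rp "Now try it yourself, then press Enter to continue... " _
echo "✅ Mission complete! You listed your first directory."
SCRIPT
chmod +x missions/basics/new_mission.sh
```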
### Creating New AI Assistants
1. Copy template from existing role (e.g., `deepbit_zero.sh`)
2. Update these variables:
- `LOGDIR` - Log directory path
- System prompt in `jq -n` command
- Character name in echo statements
3. Create log directory structure automatically via `mkdir -p`
4. Initialize empty JSON arrays for history and token logs
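Steps 3 and 4 can be sketched as follows; `templatus` is used here as a placeholder role name:

```shell
# Initialize the log directory and empty JSON structures for a new role.
# "templatus" is a placeholder; substitute the actual role name.
ROLE="templatus"
LOGDIR="$HOME/.${ROLE}_logs"
mkdir -p "$LOGDIR"
# Seed empty arrays only if the files do not already exist
[ -f "$LOGDIR/${ROLE}_history.json" ] || echo '[]' > "$LOGDIR/${ROLE}_history.json"
[ -f "$LOGDIR/token_log.json" ] || echo '[]' > "$LOGDIR/token_log.json"
```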
### Token Log Format
Each role logs API usage in this format (the field names `zeit` and `rolle` are German for "time" and "role"):
```json
{
  "zeit": "2025-06-18 19:05:33",
  "rolle": "deepbit",
  "usage": {
    "prompt_tokens": 45,
    "completion_tokens": 123,
    "total_tokens": 168
  }
}
```
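Assuming `token_log.json` holds an array of such entries, `jq` can total the usage across requests. The sample log below is fabricated for illustration:

```shell
# Write a fabricated two-entry token log, then sum total_tokens with jq.
cat > token_log.json << 'EOF'
[
  {"zeit": "2025-06-18 19:05:33", "rolle": "deepbit",
   "usage": {"prompt_tokens": 45, "completion_tokens": 123, "total_tokens": 168}},
  {"zeit": "2025-06-18 19:10:02", "rolle": "deepbit",
   "usage": {"prompt_tokens": 30, "completion_tokens": 70, "total_tokens": 100}}
]
EOF
jq '[.[].usage.total_tokens] | add' token_log.json   # → 268
```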
### Gesture Detection Tuning
Located in `snake_camera_vision_v2/gestures/gestures_v4.py`:
- **HSV Range**: `lower_skin=[0,30,60]`, `upper_skin=[20,150,255]`
- **ROI**: Fixed at (100,100) with 150x150 size
- **Area Threshold**: 2500-15000 pixels for valid hand detection
- **Defect Count**: 3-10 convexity defects for "wave" gesture
- **Aspect Ratio**: 1.3-2.3 for hand shape validation
## Environment Requirements
### AI Assistants
- `OPENROUTER_API_KEY` must be exported
- `jq` for JSON processing
- `curl` for API calls
### Camera System
- Python 3.x
- Flask (`pip install flask`)
- OpenCV (`pip install opencv-python`)
- Webcam at `/dev/video0` or default camera index 0
### Mission System
- Bash 4.0+
- `jq` for metadata parsing
- Standard Unix tools (ls, cat, grep, etc.)
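A quick preflight check for these requirements might look like this (a sketch, not a script shipped with the repo):

```shell
# Hypothetical preflight check: report missing tools and an outdated Bash.
for tool in jq curl cat grep; do
  command -v "$tool" > /dev/null 2>&1 || echo "missing: $tool"
done
# Query bash's own major version rather than assuming the current shell is bash
ver=$(bash -c 'echo "${BASH_VERSINFO[0]}"')
if [ "${ver:-0}" -lt 4 ]; then
  echo "Bash 4.0+ required, found major version: ${ver:-unknown}"
fi
```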
## Philosophy and Design Principles
**"Waldwächter" (Forest Guardian) Philosophy:**
- Transparency over magic
- Metadata-driven extensibility
- Educational and interactive
- No code changes required to add new content
- Multilingual support (responds in input language)
**Character-Based Learning:**
- Each AI assistant has distinct personality and teaching style
- Poetic and metaphorical explanations for complex concepts
- Designed for children and beginners
- Conversational history maintained across sessions