# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## CF_Zero_V1 Project Overview
CF_Zero_V1 is an educational Bash learning platform called "Crumbforest" that uses character-based AI assistants to teach command-line concepts through interactive missions. The project combines shell scripting, Python-based computer vision, and OpenRouter API integration to create an immersive learning experience.
## Key Commands

### Mission System

```bash
# Launch the interactive mission selector
./crumb-mission-selector.sh

# Run individual missions directly
bash missions/basics/fridolin.sh       # Navigation basics
bash missions/basics/balu.sh           # File creation
bash missions/basics/noko.sh           # File reading
bash missions/advanced/dns_mission.sh  # DNS tools
bash missions/advanced/ssh_security.sh # SSH basics
```
### AI Character Assistants (Crumbforest Roles)

All roles require the `OPENROUTER_API_KEY` environment variable, set in the `.env` file:

```bash
# Configure environment
cp .env.template .env
# Edit .env and add your OPENROUTER_API_KEY

# Character-based CLI assistants
./crumbforest_roles/mayaeule_zero.sh "What is friendship?"
./crumbforest_roles/deepbit_zero.sh "How do I use grep?"
./crumbforest_roles/bugsy_zero.sh "Explain loops in bash"
./crumbforest_roles/schnippsi_zero.sh "What is curl?"
./crumbforest_roles/tobi_zero.sh "How to use jq?"
./crumbforest_roles/templatus_zero.sh "Create HTML structure"

# Access CrumbCrew Command Central (all assistants in one shell):
# run the mission selector and choose option 9
./crumb-mission-selector.sh
```
### Camera Vision System (SnakeCam)

```bash
# Start the Flask-based gesture recognition app
cd snake_camera_vision_v2
python app.py
# Access at http://localhost:5000

# View gesture detection module
python gestures/gestures_v4.py
```
### Token Usage Monitoring

```bash
# View token usage across all AI assistants
./log_tokens_viewer_v4.sh

# Clean up malformed token logs
./fix_token_logs.sh

# Check individual agent logs
cat ~/.deepbit_logs/token_log.json
cat ~/.bugsy_logs/token_log.json
```
## Architecture Overview
### Mission System Architecture
**Metadata-Driven Design:**

- Each mission consists of two files: `mission_name.sh` (executable) and `mission_name.meta.json` (metadata)
- The mission selector (`crumb-mission-selector.sh`) dynamically loads missions by scanning directories
- Metadata structure:

```json
{
  "icon": "🦊",
  "title": "Mission Title",
  "description": "What this teaches",
  "category": "basics|advanced|challenges",
  "enabled": true
}
```
**Mission Categories:**

- `missions/basics/` - Beginner missions (navigation, file operations)
- `missions/advanced/` - Advanced topics (DNS, SSH, networking)
- `missions/challenges/` - Interactive challenges (future)
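The metadata-driven discovery loop could be sketched roughly as follows. This is an illustrative reconstruction, not the actual `crumb-mission-selector.sh` code, and it assumes `jq` is installed:

```shell
# Illustrative sketch of metadata-driven mission discovery.
# NOTE: assumed logic, not the real selector implementation.
list_missions() {
  local root="${1:-missions}"
  local meta
  for meta in "$root"/*/*.meta.json; do
    [ -e "$meta" ] || continue                            # skip if the glob matched nothing
    [ "$(jq -r '.enabled' "$meta")" = "true" ] || continue # honor the enabled flag
    printf '%s %s - %s\n' \
      "$(jq -r '.icon' "$meta")" \
      "$(jq -r '.title' "$meta")" \
      "$(jq -r '.description' "$meta")"
  done
}
```

Because missions are discovered by scanning, dropping a new `.sh`/`.meta.json` pair into a category folder is enough to surface it in the menu.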
### AI Assistant Architecture
**Character-Based Learning Roles (Waldwächter - Forest Guardians):** Each role is a specialized Bash script that wraps OpenRouter API calls with a distinct personality:
- Maya-Eule (owl) - Wise guide with Qdrant memory integration for contextual conversations
- Deepbit (octopus) - Explains Bash concepts poetically to children
- Bugsy - Debugging and troubleshooting assistant
- Schnippsi - General shell command helper
- Tobi - JSON/data processing expert (alias "Capacitoby" for electronics)
- Templatus - HTML architecture assistant
- Schraubbär - Heavy-duty hardware specialist (welding, tools, mechanical systems)
**Common Pattern:**

- Accept the question as a command-line argument
- Store conversation history in `~/.{role}_logs/{role}_history.json`
- Track token usage in `~/.{role}_logs/token_log.json`
- Use the OpenRouter API with the `openai/gpt-3.5-turbo` model
- Respond in the same language as the input question
**API Flow:**

Question → JSON Payload → OpenRouter API → Response → Log History → Display
**Log Structure:**

- Request: `~/.{role}_logs/{role}_request.json`
- Response: `~/.{role}_logs/{role}_response.json`
- History: `~/.{role}_logs/{role}_history.json`
- Token usage: `~/.{role}_logs/token_log.json`
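The request side of this flow can be sketched as a small helper. This is a hedged sketch, not the actual role-script code; `build_payload` is a hypothetical name, and the URL is OpenRouter's standard chat-completions endpoint:

```shell
# Sketch of the shared request pattern (assumed; the real role scripts differ in detail).
build_payload() {
  local question="$1" system_prompt="$2"
  # jq -n builds the JSON body safely, escaping quotes in the question
  jq -n --arg q "$question" --arg sys "$system_prompt" \
    '{model: "openai/gpt-3.5-turbo",
      messages: [{role: "system", content: $sys},
                 {role: "user",   content: $q}]}'
}

# Usage (real network call; requires OPENROUTER_API_KEY, curl, jq):
#   build_payload "How do I use grep?" "You are Deepbit." |
#     curl -s https://openrouter.ai/api/v1/chat/completions \
#       -H "Authorization: Bearer $OPENROUTER_API_KEY" \
#       -H "Content-Type: application/json" \
#       -d @-
```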
**Waldwächter Library (`lib/waldwaechter.sh`):**

Missions can source this library to make all AI assistants available as shell functions:
```bash
# In a mission script
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "${SCRIPT_DIR}/../../lib/waldwaechter.sh"

# Now all AI assistants are available as commands
templatus "Create an HTML5 structure"
schnippsi "Show me CSS for a button"
bugsy "Debug this error"
mayaeule "What is friendship?"
```
The library automatically:

- Loads the `.env` file with API keys
- Defines functions for all AI characters
- Exports the functions for use in subshells
- Sets correct paths to the role scripts in `crumbforest_roles/`
**CrumbCrew Command Central:** Position 9 in the mission selector opens an interactive shell with all Waldwächter pre-loaded:

- Custom prompt: `(🌲 CrumbCrew) user@host:path$`
- Commands: `mayaeule`, `deepbit`, `bugsy`, `schnippsi`, `templatus`, `tobi`, `schraubaer`
- Utilities: `crew_status`, `crew_tokens`, `crew_memory`
**Inter-Character Communication (Crew Memory):** Characters can reference each other's conversation histories through JSON log files:
```bash
# Schraubbär automatically checks whether the question mentions other crew members
schraubaer "Tobi sagte wir brauchen 5V, welches Netzteil empfiehlst du?"
# ("Tobi said we need 5V, which power supply do you recommend?")

# The script reads Tobi's recent conversations and includes them as context.
# Kept lightweight for the Raspberry Pi Zero - simple JSON file memory.
```
Each character stores conversations in `~/.{character}_logs/{character}_history.json`:

- Lightweight JSON format for Raspberry Pi Zero compatibility
- The last 3 conversations from the referenced character are included as context
- Characters detect mentions of other crew members (tobi, schnecki, schnippsi, etc.)
- The system prompt is automatically enriched with crew context
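The crew-memory lookup could look roughly like this. It is a sketch: `crew_context` is a hypothetical name, and the `{question, answer}` entry shape is an assumption about the history format:

```shell
# Sketch of crew-memory context building (assumed history entry shape).
crew_context() {
  local other="$1" base="${2:-$HOME}"   # optional base dir, mainly for testing
  local hist="$base/.${other}_logs/${other}_history.json"
  [ -f "$hist" ] || return 0            # no history yet: contribute no context
  # Take the last 3 entries and flatten them into one line each
  jq -r '.[-3:][] | "\(.question) -> \(.answer)"' "$hist"
}
```

A character script could append this output to its system prompt whenever the question mentions another crew member's name.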
### Camera Vision System (SnakeCam)
**Technology Stack:**

- Flask web server with MJPEG video streaming
- OpenCV for camera capture and image processing
- Custom gesture detection using HSV color space and contour analysis
**Key Components:**

- `app.py` - Flask application with video streaming endpoints
- `gestures/gestures_v4.py` - Hand gesture detection algorithm
- Detection method: Skin color detection → Contour analysis → Convexity defects → Gesture classification
- Current gestures: "wave" detection based on defect count, area, and aspect ratio
**Endpoints:**

- `/` - Main camera interface
- `/video_feed` - MJPEG stream
- `/log_answer` - Log user responses with mood and gesture
- `/shutdown` - Clean camera release
## Project Structure

```
CF_Zero_V1/
├── crumb-mission-selector.sh        # Main mission launcher (metadata-driven)
├── missions/                        # Educational missions
│   ├── basics/                      # Beginner tutorials
│   │   ├── fridolin.sh/meta.json    # Navigation (pwd, ls, cd)
│   │   ├── balu.sh/meta.json        # File creation (mkdir, touch, echo)
│   │   └── noko.sh/meta.json        # File reading (cat, grep)
│   └── advanced/                    # Advanced topics
│       ├── dns_mission.sh/meta.json # DNS tools (dig, nslookup, host)
│       └── ssh_security.sh/meta.json # SSH basics
├── crumbforest_roles/               # AI character assistants
│   ├── deepbit_zero.sh              # Poetic Bash explainer
│   ├── bugsy_zero.sh                # Debugging helper
│   ├── schnippsi_zero.sh            # Shell command assistant
│   └── tobi_zero.sh                 # JSON/data expert
├── snake_camera_vision_v2/          # Flask gesture recognition app
│   ├── app.py                       # Main Flask server
│   └── gestures/                    # Gesture detection modules
│       ├── gestures_v4.py           # Hand detection algorithm
│       └── gestures_debug_test.py   # Debug version with visualization
├── log_tokens_viewer_v4.sh          # Token usage viewer
└── fix_token_logs.sh                # Clean malformed logs
```
## Important Implementation Notes
### Adding New Missions
1. Create two files in the appropriate category folder:

   ```bash
   touch missions/basics/new_mission.sh
   touch missions/basics/new_mission.meta.json
   chmod +x missions/basics/new_mission.sh
   ```

2. Metadata must include:
   - `icon`: Emoji for menu display
   - `title`: Human-readable name
   - `description`: What the mission teaches
   - `category`: basics/advanced/challenges
   - `enabled`: true/false

3. The mission script should:
   - Use `cat << 'EOF'` for multi-line instructions
   - Include interactive prompts with `read -p`
   - Provide examples before asking the user to try
   - End with a success message
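A minimal skeleton following these conventions might look like this. It is a hypothetical example, not an actual mission from the repo; wrapping it in a function keeps it easy to test:

```shell
# Hypothetical minimal mission skeleton (not a real repo mission).
run_demo_mission() {
  # Multi-line instructions via a quoted heredoc, as the conventions suggest
  cat << 'EOF'
🦊 Mission: Hidden Files
In the forest, some crumbs are hidden. Plain ls will not show them.
EOF
  # Interactive prompt; -r keeps backslashes literal
  read -rp "Which command lists all files in long format, including hidden ones? " answer
  if [ "$answer" = "ls -la" ]; then
    echo "✅ Well done - the forest is proud of you!"
  else
    echo "Hint: combine the -l and -a flags of ls."
  fi
}
```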
### Creating New AI Assistants

1. Copy the template from an existing role (e.g., `deepbit_zero.sh`)
2. Update these variables:
   - `LOGDIR` - Log directory path
   - System prompt in the `jq -n` command
   - Character name in the echo statements
3. Create the log directory structure automatically via `mkdir -p`
4. Initialize empty JSON arrays for the history and token logs
### Token Log Format

Each role logs API usage in this format:

```json
{
  "zeit": "2025-06-18 19:05:33",
  "rolle": "deepbit",
  "usage": {
    "prompt_tokens": 45,
    "completion_tokens": 123,
    "total_tokens": 168
  }
}
```
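Appending one such entry with `jq` might look like the following sketch. `log_tokens` is a hypothetical helper, and it assumes `token_log.json` holds a JSON array of these entries (consistent with the empty arrays initialized for new roles):

```shell
# Hypothetical append of one usage entry; assumes token_log.json is a JSON array.
log_tokens() {
  local log="$1" role="$2" usage_json="$3"
  local tmp; tmp=$(mktemp)
  # Write to a temp file first so a failed jq run never truncates the log
  jq --arg zeit "$(date '+%Y-%m-%d %H:%M:%S')" \
     --arg rolle "$role" \
     --argjson usage "$usage_json" \
     '. + [{zeit: $zeit, rolle: $rolle, usage: $usage}]' \
     "$log" > "$tmp" && mv "$tmp" "$log"
}
```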
### Gesture Detection Tuning

Located in `snake_camera_vision_v2/gestures/gestures_v4.py`:

- HSV range: `lower_skin=[0,30,60]`, `upper_skin=[20,150,255]`
- ROI: fixed at (100,100) with 150x150 size
- Area threshold: 2500-15000 pixels for valid hand detection
- Defect count: 3-10 convexity defects for the "wave" gesture
- Aspect ratio: 1.3-2.3 for hand shape validation
## Environment Requirements

### AI Assistants

- `OPENROUTER_API_KEY` must be exported
- `jq` for JSON processing
- `curl` for API calls

### Camera System

- Python 3.x
- Flask (`pip install flask`)
- OpenCV (`pip install opencv-python`)
- Webcam at `/dev/video0` or default camera index 0

### Mission System

- Bash 4.0+
- `jq` for metadata parsing
- Standard Unix tools (ls, cat, grep, etc.)
## Philosophy and Design Principles

**"Waldwächter" (Forest Guardian) Philosophy:**
- Transparency over magic
- Metadata-driven extensibility
- Educational and interactive
- No code changes required to add new content
- Multilingual support (responds in input language)
**Character-Based Learning:**
- Each AI assistant has distinct personality and teaching style
- Poetic and metaphorical explanations for complex concepts
- Designed for children and beginners
- Conversational history maintained across sessions
### Token Philosophy - "Was kostet die Frage eines Kindes?" ("What does a child's question cost?")
In the forest, a child's question is priceless. However, in our digital age, questions to AI assistants consume computational resources (tokens). This creates a beautiful teaching moment:
"What does a question cost?"
- In the forest: Unbezahlbar (priceless) - every question holds infinite value
- In the system: Measured in tokens - a concrete, understandable metric
- Pedagogically: Token tracking teaches mindful questioning
#### Why Token Tracking is Educational
- Teaches Reflection: Children learn to think before asking
- Creates Awareness: Understanding that resources have value
- Builds Quality: Better questions lead to better answers
- Encourages Research: Try to find answers independently first
- Develops Patience: Not every thought needs an immediate AI response
#### Token Budget System

The `.env` configuration allows setting a `DAILY_TOKEN_BUDGET`:

- `0` = Unlimited (default) - trust and freedom
- `>0` = Daily limit (e.g., 10000 tokens ≈ 20 thoughtful questions)
This isn't about restriction - it's about mindfulness. Just as we teach children to:
- Not waste water
- Not waste food
- Not waste paper
We can teach them:
- Not to waste computational resources
- To value the AI's "thinking time"
- To appreciate the cost of knowledge
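A daily-budget check against the token logs could be sketched like this. Both `tokens_today` and the enforcement snippet are hypothetical illustrations, assuming the per-entry log format shown earlier stored as a JSON array:

```shell
# Hypothetical budget check; assumes an array of entries in the token-log format.
tokens_today() {
  local log="$1" day="${2:-$(date '+%Y-%m-%d')}"
  # Sum total_tokens for entries whose timestamp starts with the given day
  jq --arg day "$day" \
     '[.[] | select(.zeit | startswith($day)) | .usage.total_tokens] | add // 0' \
     "$log"
}

# Enforcement sketch:
#   budget="${DAILY_TOKEN_BUDGET:-0}"
#   if [ "$budget" -gt 0 ] && [ "$(tokens_today "$HOME/.deepbit_logs/token_log.json")" -ge "$budget" ]; then
#     echo "🌲 Today's token budget is used up - time to explore the forest offline!"
#   fi
```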
#### Implementation

All AI character scripts (`deepbit_zero.sh`, `bugsy_zero.sh`, etc.) log token usage to `~/.{character}_logs/token_log.json`.

View it with `./log_tokens_viewer_v4.sh`.
This creates a transparent feedback loop where children can see:
- How many tokens each question consumed
- Which questions were "expensive" vs "cheap"
- Their daily/weekly usage patterns
Result: More thoughtful questions, deeper learning, and respect for resources.