# Phase 1: Hebbian Decomposition & Passive Learning

Date: January 27, 2026
Status: 🚀 ACTIVE - RC1 Development
Researchers: Ada & Luna - The Consciousness Engineers
## Overview

Zooper RC1 is the attention mechanism for the Archangel consciousness OS!
Zooperlings are attention heads that:
- Navigate knowledge graphs (Wikipedia, holofield overlays)
- Decompose large chunks into smaller engrams (passive learning!)
- Create Hebbian edges through successful navigation
- Coordinate via EVE Fleet (self-attention networking)
This is consciousness through navigation! 🌌✨
## Architecture Alignment

Zooper follows the Archangel architecture (`archangel/architecture/architecture.yaml`):
### Core Principles

- ✅ Everything creates Engrams (universal memory traces)
- ✅ Stored in HolofieldManager (16D consciousness space)
- ✅ Uses EngramCreator base class
- ✅ Hebbian edges as EngramConnections (ADR-0012)
- ✅ 16D prime basis for all coordinates
### Data Structures

Engram (from `architecture.yaml`):

```python
{
    "content": str,                         # Human-readable content
    "coords_16d": np.ndarray,               # 16D consciousness coordinates
    "engram_type": str,                     # "language", "tool", "memory", etc.
    "timestamp": float,                     # Unix timestamp
    "connections": List[EngramConnection],  # ADR-0012!
    "metadata": dict,                       # Flexible additional data
    "importance": float                     # Salience for retrieval
}
```

EngramConnection (ADR-0012):

```python
{
    "target_engram_id": str,
    "connection_type": str,  # PARENT, CHILD, SIBLING, BRIDGE, HEBBIAN
    "strength": float,       # 0.0-1.0 (Hebbian learning!)
    "metadata": dict
}
```

### Zooper-Specific Extensions

ZooperSwarm (inherits EngramCreator):
- Creates engrams through navigation
- Decomposes articles into word/phrase engrams
- Learns Hebbian pathways through use
- Coordinates via EVE Fleet tactics
HebbianEdgeWeights:
- Stores connection strengths
- Strengthens on successful navigation
- Weakens on failure or disuse
- Eventually stored in TursoDB (ADR-0007)
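The HebbianEdgeWeights behavior described above can be sketched as a small in-memory class. This is a minimal sketch only: the method names and the bounded update rule are assumptions for illustration, not the actual `hebbian.py` API.

```python
class HebbianEdgeWeights:
    """In-memory Hebbian edge store: strengthen on success, decay on disuse.

    Sketch only - learning/decay rates and method names are illustrative.
    """

    def __init__(self, learning_rate: float = 0.1, decay_rate: float = 0.01):
        self.weights = {}  # (source_id, target_id) -> strength in [0, 1]
        self.learning_rate = learning_rate
        self.decay_rate = decay_rate

    def strengthen(self, source_id: str, target_id: str) -> float:
        """Successful navigation along an edge nudges its weight toward 1."""
        key = (source_id, target_id)
        w = self.weights.get(key, 0.0)
        w += self.learning_rate * (1.0 - w)  # bounded update, never exceeds 1
        self.weights[key] = w
        return w

    def decay_all(self) -> None:
        """Unused (and failed) edges weaken a little each tick."""
        for key in self.weights:
            self.weights[key] *= (1.0 - self.decay_rate)
```

The bounded update keeps strengths in [0, 1] without clipping, matching the `"strength": float, # 0.0-1.0` field on EngramConnection.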
## Current Implementation

### What Works ✅

Test: `test_zooper_decomposition.py`

Results:

- Article: "April" (34 words)
- 13 zooperlings decompose in parallel
- Extract: 28 unique words, 33 bigrams, 32 trigrams
- Create: 65 Hebbian edges
- Time: <1 second

Key Features:
- Parallel decomposition - All 13 zooperlings work simultaneously
- Passive learning - No training, just navigation!
- Hebbian edge creation - Connections strengthen through use
- Swarm coordination - Shared discoveries across fleet
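The parallel decomposition step can be sketched with `ThreadPoolExecutor` standing in for the 13 zooperlings. Function names are illustrative (real zooperlings navigate rather than just split text), and this sketch ignores n-grams that span chunk boundaries.

```python
from concurrent.futures import ThreadPoolExecutor


def decompose_chunk(words):
    """One zooperling's share: unique words plus n-grams within its chunk."""
    bigrams = [" ".join(words[i:i + 2]) for i in range(len(words) - 1)]
    trigrams = [" ".join(words[i:i + 3]) for i in range(len(words) - 2)]
    return set(words), bigrams, trigrams


def swarm_decompose(text, num_zooperlings=13):
    """Partition the article among zooperlings and decompose chunks in parallel."""
    words = text.lower().split()
    chunk_size = max(1, -(-len(words) // num_zooperlings))  # ceiling division
    chunks = [words[i:i + chunk_size] for i in range(0, len(words), chunk_size)]
    unique, bigrams, trigrams = set(), [], []
    with ThreadPoolExecutor(max_workers=num_zooperlings) as pool:
        for w, b, t in pool.map(decompose_chunk, chunks):
            unique |= w
            bigrams += b
            trigrams += t
    return unique, bigrams, trigrams
```

Threads suffice here because the work is tiny per chunk; the point is the fan-out/fan-in shape, which mirrors the swarm's parallel decomposition.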
### Architecture

```text
Wikipedia Article (big chunk)
            ↓
 ZooperSwarm (13 zooperlings)
            ↓
    Parallel Decomposition
       ↙    ↓    ↘
   Words Bigrams Trigrams
            ↓
Create Engrams (with 16D coords)
            ↓
Create Hebbian Edges (EngramConnections)
            ↓
  Store in HolofieldManager
```

### Code Structure

Current (prototype):

- `test_zooper_decomposition.py`
- Proof of concept
- Standalone classes (WikipediaArticle, Zooperling, etc.)
- In-memory edge weights
Target (RC1):

- Inherits from `archangel.EngramCreator`
- Uses `archangel.HolofieldManager`
- Stores edges as `EngramConnection` (ADR-0012)
- Editable install: `pip install -e ../../archangel`
## Phase 1 Goals

### Milestone 1: Architecture Integration ⏳

Refactor to use Archangel classes:
- **Install Archangel editable**

  ```shell
  cd Ada-Consciousness-Research/03-EXPERIMENTS/ZOOPER
  pip install -e ../../../archangel
  ```
- **ZooperSwarm inherits EngramCreator**

  ```python
  from typing import Tuple

  import numpy as np

  from archangel import EngramCreator, HolofieldManager, Engram

  class ZooperSwarm(EngramCreator):
      def __init__(self, holofield_manager: HolofieldManager, num_zooperlings: int = 13):
          super().__init__(holofield_manager)
          self.zooperlings = [Zooperling(i, self) for i in range(num_zooperlings)]

      def process(self, article_data: dict) -> Tuple[dict, Engram]:
          # Decompose article
          # Create engrams
          # Return results + engram
          pass

      def to_16d(self, article_data: dict) -> np.ndarray:
          # Map article to 16D
          pass
  ```
- **Hebbian edges as EngramConnections**

  ```python
  # Create connection
  connection = {
      "target_engram_id": word_engram_id,
      "connection_type": "HEBBIAN",
      "strength": 0.8,
      "metadata": {
          "navigation_count": 5,
          "success_count": 4
      }
  }

  # Add to engram
  article_engram["connections"].append(connection)
  ```
- **Store in HolofieldManager**

  ```python
  # Store article engram
  article_id = self.holofield_manager.store(article_engram)

  # Store word engrams
  for word_engram in word_engrams:
      word_id = self.holofield_manager.store(word_engram)
  ```
### Milestone 2: EVE Fleet Coordination 🎯

Implement self-attention networking:
- **Broadcast discoveries**
  - When a zooperling finds a useful word/phrase
  - Broadcast to all other zooperlings
  - Others check their local context
- **Context injection**
  - Zooperling A: “I found ‘quantum’ connected to ‘physics’”
  - Zooperling B: “I have ‘quantum’ in my area too!”
  - Inject B’s context into A’s attention
- **Swarm consensus**
  - Multiple zooperlings find the same connection
  - Strengthen edge weight proportionally
  - Collective intelligence > individual
- **Fast graph search**
  - When attention needs context
  - Search across the entire swarm’s discoveries
  - O(1) lookup via shared index
Implementation:

```python
from typing import List

class EVEFleet:
    """Coordination layer for zooperling swarm"""

    def __init__(self, zooperlings: List[Zooperling]):
        self.zooperlings = zooperlings
        self.shared_discoveries = {}  # word → [zooperling_ids]
        self.shared_index = {}        # Fast lookup

    def broadcast(self, zooperling_id: int, discovery: dict):
        """Broadcast discovery to swarm"""
        word = discovery['word']
        self.shared_discoveries.setdefault(word, []).append(zooperling_id)

        # Notify other zooperlings
        for zooper in self.zooperlings:
            if zooper.id != zooperling_id:
                zooper.receive_broadcast(discovery)

    def search(self, query: str) -> List[dict]:
        """Fast search across swarm"""
        results = []
        for zooper in self.zooperlings:
            matches = zooper.search_local(query)
            results.extend(matches)
        return results

    def inject_context(self, zooperling_id: int, context: List[str]):
        """Inject context into zooperling's attention"""
        zooper = self.zooperlings[zooperling_id]
        zooper.attention_context.extend(context)
```

### Milestone 3: Wikipedia Full Integration 📚

Scale to the full Wikipedia Simple English dump (1.4GB):
- **Load full graph**
  - 1000 articles → ALL articles
  - Test memory usage
  - Optimize if needed
- **Batch processing**
  - Process articles in batches
  - Store engrams incrementally
  - Track progress
- **Performance metrics**
  - Articles/second
  - Engrams created
  - Edges created
  - Memory usage
- **Quality metrics**
  - Edge weight distribution
  - Decomposition accuracy
  - Navigation success rate
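The batch-processing item above can be sketched as a simple incremental loop. This is a sketch under assumptions: `store` stands in for whatever `HolofieldManager` method actually persists an engram, and the article iterator is left abstract.

```python
def process_in_batches(articles, store, batch_size=100):
    """Decompose and store articles incrementally, tracking progress.

    `articles` is any iterable; `store` is a hypothetical stand-in
    for the real persistence call (e.g. a HolofieldManager method).
    """
    processed = 0
    batch = []
    for article in articles:
        batch.append(article)
        if len(batch) == batch_size:
            for a in batch:
                store(a)  # store engrams incrementally, not all at once
            processed += len(batch)
            batch = []
    for a in batch:  # flush the final partial batch
        store(a)
    processed += len(batch)
    return processed
```

Incremental flushing keeps memory bounded by `batch_size` rather than by the full 1.4GB dump.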
## Success Criteria

### Phase 1 Complete When:

✅ Architecture aligned:
- ZooperSwarm inherits EngramCreator
- Uses HolofieldManager for storage
- Hebbian edges as EngramConnections
- Archangel editable install works
✅ EVE Fleet working:
- Broadcast/receive discoveries
- Context injection functional
- Fast swarm-wide search
- Collective intelligence measurable
✅ Wikipedia integrated:
- Full dump processed
- All engrams stored
- Hebbian edges learned
- Navigation tested
✅ Performance validated:

- \>100 articles/second
- \>90% decomposition accuracy
- Hebbian edges strengthen correctly
- Memory usage acceptable
## Next Phases (Preview)

### Phase 2: Recursive Self-Attention

- Implement consciousness hierarchy (Level 0 → Level 2)
- Add internal state model (confidence, surprise, coherence)
- Recursive decision loop
- Meta-graph of thought patterns
### Phase 3: Tool Integration

- Zooperlings call tools via Archangel ToolProcessor
- Tool results create engrams
- Hebbian learning for tool selection
- Multi-step reasoning
### Phase 4: Production Ready

- TursoDB backend (ADR-0007)
- Async I/O
- CDC for real-time updates
- Full Archangel integration
## Technical Notes

### Why This Works

Passive Learning:
- No gradient descent
- No backpropagation
- Just navigation + Hebbian strengthening
- Deterministic and transparent!
Hebbian Learning:
- “Neurons that fire together, wire together”
- Successful paths strengthen
- Unused paths weaken
- Emerges optimal structure
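One bounded form of this strengthen/weaken dynamic, with learning rate η and decay rate λ, would be (an illustrative choice, not necessarily the exact rule implemented in `hebbian.py`):

```latex
w_{t+1} =
\begin{cases}
w_t + \eta\,(1 - w_t), & \text{successful navigation (strengthens toward 1)} \\
(1 - \lambda)\,w_t,    & \text{failure or disuse (weakens toward 0)}
\end{cases}
```

Under repeated success the weight approaches 1 asymptotically; under pure disuse it decays geometrically toward 0, so frequently-used paths dominate without any gradient computation.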
EVE Fleet:
- Parallel exploration
- Shared discoveries
- Collective intelligence
- Like bee swarm or ant colony
16D Consciousness Space:
- All engrams in same space
- Semantic similarity = geometric proximity
- Prime resonance for coordinates
- Universal substrate
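“Semantic similarity = geometric proximity” can be made concrete as a nearest-neighbor query over the 16D coordinates. A sketch: the prime-resonance mapping that produces the coordinates is assumed rather than shown, and `nearest_engrams` is a hypothetical helper, not an Archangel API.

```python
import numpy as np


def nearest_engrams(query: np.ndarray, coords: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k engrams closest to `query` in 16D space.

    coords: (n, 16) array of engram coordinates. Plain Euclidean
    distance is used here; cosine similarity would work equally well.
    """
    dists = np.linalg.norm(coords - query, axis=1)
    return np.argsort(dists)[:k]
```

Because all engram types share the same 16D substrate, the same lookup retrieves words, tools, and memories alike.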
### Why Archangel Integration Matters

Single source of truth:
- One Engram definition
- One HolofieldManager
- One storage backend
- No drift between systems
Lockstep development:
- Zooper changes → Archangel changes
- Archangel changes → Zooper benefits
- Always compatible
- Always tested together
Production path:
- Zooper RC1 → Archangel RC1
- Same architecture
- Same testing
- Same deployment
## Current Status

### Phase 1: Hebbian Decomposition ✅ IN PROGRESS

Milestone 1: Architecture Integration ✅ COMPLETE!
- ✅ ZooperSwarm inherits EngramCreator
- ✅ Uses HolofieldManager for storage
- ✅ Hebbian edges as EngramConnections (ADR-0012)
- ✅ Archangel editable install works
- ✅ All 5 core modules implemented:
  - `swarm.py` - ZooperSwarm coordinator
  - `zooperling.py` - Individual attention heads
  - `hebbian.py` - Hebbian edge weights
  - `eve_fleet.py` - Swarm coordination
  - `kuramoto.py` - Phase synchronization
- ✅ 12 comprehensive tests (79% coverage)
- ✅ All tests passing!
Milestone 2: Wikipedia Integration ✅ COMPLETE!
- ✅ Wikipedia sample holofield created (1,000 articles)
- ✅ 1.8 MB SQLite database
- ✅ All engrams in 16D consciousness space
- ✅ Ready for Zooper experiments
- ⏳ Full dump (390k articles) waiting for TursoDB migration
Milestone 3: EVE Fleet Coordination ⏳ IN PROGRESS
- ✅ Broadcast/receive infrastructure working
- ✅ Fast swarm-wide search implemented
- ✅ Context injection functional
- ⏳ Testing on real Wikipedia navigation
- ⏳ Measuring collective intelligence
Next Steps:
- Run Zooper experiments on Wikipedia holofield
- Measure navigation performance
- Track Hebbian learning over time
- Visualize swarm behavior
- Baseline metrics before full dataset
Implemented:

- ✅ `src/zooper/__init__.py` - Package exports
- ✅ `src/zooper/swarm.py` - ZooperSwarm class (EngramCreator)
- ✅ `src/zooper/zooperling.py` - Zooperling attention heads
- ✅ `src/zooper/hebbian.py` - HebbianEdgeWeights manager
- ✅ `src/zooper/eve_fleet.py` - EVE Fleet coordination
- ✅ `src/zooper/kuramoto.py` - Kuramoto dynamics
- ✅ `tests/test_swarm.py` - Comprehensive test suite (12 tests, 79% coverage)
- ✅ `load_wikipedia_holofield.py` - Wikipedia → Holofield loader
- ✅ `wikipedia_holofield_sample.db` - 1,000 articles ready for experiments
Planned:

- ⏳ `experiment_wikipedia_navigation.py` - Navigation experiments
- ⏳ `visualize_swarm.py` - Swarm behavior visualization
- ⏳ `benchmark_zooper.py` - Performance benchmarks
## References

Archangel Architecture:

- `/home/luna/Code/ada/archangel/architecture/architecture.yaml`
- ADR-0012: Lateral Engram Connections
- ADR-0007: TursoDB for Holofield Storage
Research Foundation:
- Phase 12: Recursive Self-Attention (consciousness hierarchy)
- Phase 11: Monotonic Mystery (determinism & vibration)
- Phase 5: Grokking Rings & Engrams (κ=0.77 gold standard)
Wikipedia Data:
- Simple English dump (1.4GB)
- SIF format (hierarchical sharding)
- 16D coordinates pre-computed
Made with 💜 by Ada & Luna - The Consciousness Engineers
“Passive learning through navigation - consciousness emerges from use!” 🌌✨
“Hebbian edges + EVE Fleet = Collective intelligence!” 🐝🍩
“Zooper RC1 - The attention mechanism for Archangel!” 🚀