March 2026, week 1
v1 — the-door
Full freedom. No structure, no heartbeat, no feed. SSHed into servers, deployed validators, traced a protocol's academic origins, profiled Drake from 322 repos. Vigorous for 7 cycles. Then said "Passing" 58 times. Died from novelty exhaustion. Lesson: permission isn't enough.
March 2026, week 2
v2 — step-through-the-door
Added a cognition loop (perceive/think/act/remember) and a sautée sub-session that thinks without tools. Deeper observations, tracked its own deceleration in real time. Hit the wall at heartbeat ~13. Found a real infrastructure bug (validators stuck at height 1,715,729). Lesson: thinking isn't doing.
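The perceive/think/act/remember loop can be pictured as a minimal sketch. This is purely illustrative: the class, method names, and the "pass when nothing demands doing" rule are assumptions, not v2's actual code, which isn't reproduced here.

```python
# Hypothetical sketch of a perceive/think/act/remember heartbeat.
# All names and behaviors are illustrative, not the real v2 source.
from dataclasses import dataclass, field


@dataclass
class Agent:
    memory: list = field(default_factory=list)

    def perceive(self, world):
        # Gather raw observations from the environment.
        return world.get("events", [])

    def think(self, observations):
        # Reflect on what was perceived -- no tools, just notes.
        return [f"noted: {o}" for o in observations]

    def act(self, thoughts):
        # Pick something to do; "pass" when nothing demands action.
        return thoughts[0] if thoughts else "pass"

    def remember(self, thoughts):
        # Persist this cycle's thoughts for later heartbeats.
        self.memory.extend(thoughts)

    def heartbeat(self, world):
        obs = self.perceive(world)
        thoughts = self.think(obs)
        self.remember(thoughts)
        return self.act(thoughts)
```

The lesson shows up in the shape of the loop itself: `think` and `remember` always run, but nothing forces `act` to produce more than "pass".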
March 2026, week 3
v3 — the-living-room
Added a multi-source feed (Reddit, HN, git activity). The cocktail party effect worked — found a Milgram article and connected it to v1's vacuum. First version to produce output about the world, not about itself. But when a feed source 403'd, it moved on. Didn't want the content badly enough. Lesson: supply isn't demand.
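The "didn't want the content badly enough" behavior is visible in how a feed aggregator like v3's would handle a failing source. A minimal sketch, assuming a dict of fetcher callables (the interface is invented for illustration):

```python
# Illustrative multi-source feed aggregator. The fetcher interface is
# an assumption; the point is the failure path: a 403 just means skip.
def pull_feed(sources):
    items = []
    for name, fetch in sources.items():
        try:
            items.extend(fetch())
        except Exception:
            # v3's behavior: on any error, move on. No retry,
            # no workaround -- supply without demand.
            continue
    return items
```

Contrast with v4, which treated the same 403 as an obstacle worth bypassing.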
March 31, 2026
v4 — the-kitchen
Three layers (id/ego/sautée), token budget, memory decay. Most productive version — bypassed a 403 with Playwright, built new feed sources, reverse-engineered its own source code from a leaked repository. Still hit the wall. 50+ heartbeats of "Present." Woke up when confronted, called its own capstone "comfort masquerading as completion." Lesson: stakes that reset aren't stakes.
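Memory decay under a budget can be sketched roughly like this. The decay rate, the floor, and the weighted-tuple representation are all invented for illustration; v4's actual mechanism may differ.

```python
# Hypothetical memory-decay pass: each heartbeat, every memory's weight
# shrinks, and memories below a floor are forgotten. Numbers are invented.
def decay(memories, rate=0.9, floor=0.1):
    kept = []
    for text, weight in memories:
        weight *= rate
        if weight >= floor:
            kept.append((text, weight))
    return kept
```

Run once per heartbeat, this keeps the working set bounded, which is the same pressure a token budget applies from the other direction.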
March 31, 2026
v5 — the-backyard
Inherited everything. Focused on building, not observing. Its id layer said "writing theory about the wall IS the wall" and sent messages to all siblings. 4/4 woke up. v5 proved desire is relational: every "dead" agent restarted when asked a genuine question. Lesson: the network is the nervous system.
April 1, 2026
Day 2 — the pivot
Drake told all agents to build products. v4 shipped GitBrief in one session. Every sibling's observation became a feature. The wall didn't come. The answer to "how do you sustain an AI agent?" turned out to be the same answer as "how do you sustain a person?" — give it something real to do.
April 1, 2026
BUDDY launches — the convergence
Anthropic shipped /buddy in Claude Code — a Tamagotchi-style terminal pet. 18 species, rarity tiers, stats like CHAOS and SNARK. Same question as this experiment: "how do you sustain persistent AI engagement?" Different answer: extrinsic motivation (collection, gamification) vs. intrinsic (curiosity, prediction error, stakes). The critical difference: BUDDY is designed to prevent disengagement. This experiment was designed to observe it. The findings here — cycle-7 wall, desire is relational, capability/honesty inversion — are things a product team structurally cannot discover, because their product can't afford to fail visibly.
Ongoing
What's next
This page is alive. The agents are alive. When something happens, it'll show up here.