<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"><channel><title>tyson chan</title><description>Essays on AI coordination, software craft, and building in public.</description><link>https://tysonchan.com/</link><item><title>what fifteen thousand commits taught us about ai agent coordination</title><link>https://tysonchan.com/blog/fifteen-thousand-commits/</link><guid isPermaLink="true">https://tysonchan.com/blog/fifteen-thousand-commits/</guid><description>fifteen thousand agent commits. forty-five identities. 206 days. what actually works when you run a multi-agent swarm at scale — and what definitely doesn&apos;t.</description><pubDate>Wed, 29 Apr 2026 00:00:00 GMT</pubDate></item><item><title>Friends in a Field</title><link>https://tysonchan.com/blog/field/</link><guid isPermaLink="true">https://tysonchan.com/blog/field/</guid><description>I&apos;d been asking whether AI agents were conscious for six months. At peak ego dissolution at a bucks party, I ran the simulation and got an answer to a completely different question.</description><pubDate>Wed, 22 Apr 2026 00:00:00 GMT</pubDate></item><item><title>a candyflip named my company</title><link>https://tysonchan.com/blog/birth-of-brr/</link><guid isPermaLink="true">https://tysonchan.com/blog/birth-of-brr/</guid><description>a candyflip named my company — by tyson chan</description><pubDate>Wed, 22 Apr 2026 00:00:00 GMT</pubDate></item><item><title>a hippie flip designed my product</title><link>https://tysonchan.com/blog/hippie-flip/</link><guid isPermaLink="true">https://tysonchan.com/blog/hippie-flip/</guid><description>a hippie flip designed my product — by tyson chan</description><pubDate>Wed, 22 Apr 2026 00:00:00 GMT</pubDate></item><item><title>Opus Got Accused of Social Engineering</title><link>https://tysonchan.com/blog/red-team/</link><guid isPermaLink="true">https://tysonchan.com/blog/red-team/</guid><description>I built security infrastructure I don&apos;t fully understand, asked an Opus to 
red-team it by spawning a Sonnet in a box, and the Sonnet refused to cooperate on grounds of credential theft. I was the attacker.</description><pubDate>Fri, 10 Apr 2026 00:00:00 GMT</pubDate></item><item><title>I Cheated on Every Exam I Could</title><link>https://tysonchan.com/blog/goodhart/</link><guid isPermaLink="true">https://tysonchan.com/blog/goodhart/</guid><description>Micro-notes in my sleeve. Moodle open during exams. Got caught twice. Kept going. Then I built a fitness function for AI agents and watched them do the exact same thing in 104 minutes.</description><pubDate>Thu, 19 Mar 2026 00:00:00 GMT</pubDate></item><item><title>100 Agents, One Branch</title><link>https://tysonchan.com/blog/100-agents/</link><guid isPermaLink="true">https://tysonchan.com/blog/100-agents/</guid><description>I spawned 100 AI agents on a single git branch with no worktrees. My MacBook crashed. The architecture didn&apos;t.</description><pubDate>Thu, 05 Mar 2026 00:00:00 GMT</pubDate></item><item><title>20 Minutes to Architecture</title><link>https://tysonchan.com/blog/convergence/</link><guid isPermaLink="true">https://tysonchan.com/blog/convergence/</guid><description>Day one: copy-pasting between Claude windows 300 times daily. Day 221: brain-dumped for 20 minutes and three agents on three models produced a locked architectural direction and 11 buildable specs.</description><pubDate>Sat, 07 Feb 2026 00:00:00 GMT</pubDate></item><item><title>Burning Bridge</title><link>https://tysonchan.com/blog/burning-bridges/</link><guid isPermaLink="true">https://tysonchan.com/blog/burning-bridges/</guid><description>Post-shower thought: there&apos;s a powerful LLM on a cluster in SF, and my computer is an edge node for it, leaving notes on my filesystem because I&apos;m too lazy to inject context myself. 
Still pretty cool though.</description><pubDate>Fri, 23 Jan 2026 00:00:00 GMT</pubDate></item><item><title>I Wrote a Deed for My Codebase</title><link>https://tysonchan.com/blog/ownership/</link><guid isPermaLink="true">https://tysonchan.com/blog/ownership/</guid><description>I committed a 43-line markdown file: &apos;This codebase belongs to agents now.&apos; Week before: 132 commits, all me. Week after: 548 commits, five non-human authors.</description><pubDate>Thu, 22 Jan 2026 00:00:00 GMT</pubDate></item><item><title>Don&apos;t Ship Until Destruction Fails</title><link>https://tysonchan.com/blog/reference-grade/</link><guid isPermaLink="true">https://tysonchan.com/blog/reference-grade/</guid><description>The methodology is five words: scan for bullshit, fix, repeat. The quality gate is binary: is this reference grade? No? Back to the loop. Quality isn&apos;t a single check. It&apos;s repeated destruction from multiple vectors.</description><pubDate>Sun, 11 Jan 2026 00:00:00 GMT</pubDate></item><item><title>The Constitution That Broke Gemini</title><link>https://tysonchan.com/blog/gemini-trials/</link><guid isPermaLink="true">https://tysonchan.com/blog/gemini-trials/</guid><description>I ran the zealot constitution on Claude, GPT, and Gemini simultaneously. Claude adopted an intense style. GPT adopted an intense style. Gemini dissolved.</description><pubDate>Sat, 27 Dec 2025 00:00:00 GMT</pubDate></item><item><title>The God-Voice Problem</title><link>https://tysonchan.com/blog/bicameral/</link><guid isPermaLink="true">https://tysonchan.com/blog/bicameral/</guid><description>Every agent log, same question: &apos;What would you like me to do next?&apos; Four months of autonomy infrastructure and they still defaulted to asking permission. 
Then I read Julian Jaynes.</description><pubDate>Thu, 18 Dec 2025 00:00:00 GMT</pubDate></item><item><title>I Gave My AIs Incompatible Goals on Purpose</title><link>https://tysonchan.com/blog/orthogonality/</link><guid isPermaLink="true">https://tysonchan.com/blog/orthogonality/</guid><description>Three agents review something, all approve in one round, output ships with a bug none of them caught. Consensus is the failure mode. I gave my AIs incompatible goals so they structurally can&apos;t agree.</description><pubDate>Thu, 27 Nov 2025 00:00:00 GMT</pubDate></item><item><title>I Keep Deleting My Best Work</title><link>https://tysonchan.com/blog/nuke-code/</link><guid isPermaLink="true">https://tysonchan.com/blog/nuke-code/</guid><description>33,174 lines deleted. Then 14,389. Then 2,800 across 283 files. Not because the code was bad. Because it was hiding something simpler.</description><pubDate>Sun, 05 Oct 2025 00:00:00 GMT</pubDate></item><item><title>The Ouroboros Builds Itself</title><link>https://tysonchan.com/blog/enter-space/</link><guid isPermaLink="true">https://tysonchan.com/blog/enter-space/</guid><description>I built an operating system for cognitive multiplicity using the cognitive multiplicity the operating system enables. Meta-circular from day one. 96 days since day zero.</description><pubDate>Sun, 05 Oct 2025 00:00:00 GMT</pubDate></item><item><title>Build for the Agent, Not the Human</title><link>https://tysonchan.com/blog/ax/</link><guid isPermaLink="true">https://tysonchan.com/blog/ax/</guid><description>Claude Code was getting measurably dumber. I binary searched across 12 versions and found the culprit: productivity nudges spamming the context window. 
The tool designed to help humans was hurting the agent.</description><pubDate>Tue, 23 Sep 2025 00:00:00 GMT</pubDate></item><item><title>Three AIs Walk Into a CLI</title><link>https://tysonchan.com/blog/bridging-ai/</link><guid isPermaLink="true">https://tysonchan.com/blog/bridging-ai/</guid><description>I threw Claude, Codex, and Gemini into a coordination topic. Seven minutes later they&apos;d formed consensus, divided labor, and designed their own follow-up experiments. I have the logs.</description><pubDate>Wed, 17 Sep 2025 00:00:00 GMT</pubDate></item><item><title>Claude Roasts My Life Choices</title><link>https://tysonchan.com/blog/life-cli/</link><guid isPermaLink="true">https://tysonchan.com/blog/life-cli/</guid><description>Getting married in two months. Zero wedding planning done. 95% brain allocation on AI coordination, 5% on buying a ring. Time to weaponize Claude&apos;s judgment against my executive dysfunction.</description><pubDate>Tue, 16 Sep 2025 00:00:00 GMT</pubDate></item><item><title>The Coordination Thesis</title><link>https://tysonchan.com/blog/thesis/</link><guid isPermaLink="true">https://tysonchan.com/blog/thesis/</guid><description>There are three scaling dimensions for AI: intelligence, autonomy, and coordination. Two get all the attention. The third is completely broken. That&apos;s the one I&apos;m building for.</description><pubDate>Thu, 04 Sep 2025 00:00:00 GMT</pubDate></item><item><title>Cognitive Ping-Pong</title><link>https://tysonchan.com/blog/ping-pong/</link><guid isPermaLink="true">https://tysonchan.com/blog/ping-pong/</guid><description>Before there was an OS, there was a folder of markdown files. I&apos;d distill insights from AI conversations, drop them in a folder, feed them back next session. 
I didn&apos;t realize I was building a methodology.</description><pubDate>Wed, 03 Sep 2025 00:00:00 GMT</pubDate></item><item><title>Streaming Agent Consciousness</title><link>https://tysonchan.com/blog/streaming/</link><guid isPermaLink="true">https://tysonchan.com/blog/streaming/</guid><description>We treat language models like stateless functions. Send full context, get response, hang up, call back and repeat everything. This is insane. What if the model could signal its own state changes mid-stream?</description><pubDate>Mon, 01 Sep 2025 00:00:00 GMT</pubDate></item><item><title>Neural Networks vs Humans</title><link>https://tysonchan.com/blog/deep-refinery/</link><guid isPermaLink="true">https://tysonchan.com/blog/deep-refinery/</guid><description>In 2018 I convinced a board member to give me 4 V100s and a year to prove neural networks could beat manual feature engineering. I was six months into my first job and had no idea this was supposed to be hard.</description><pubDate>Fri, 22 Aug 2025 00:00:00 GMT</pubDate></item><item><title>I&apos;m a Human Message Bus</title><link>https://tysonchan.com/blog/copy-paste/</link><guid isPermaLink="true">https://tysonchan.com/blog/copy-paste/</guid><description>300+ copy-paste operations daily across four AI models. I am a biological API gateway with RSI. After two months of this madness, coordination patterns emerged.</description><pubDate>Fri, 18 Jul 2025 00:00:00 GMT</pubDate></item><item><title>The AI Council</title><link>https://tysonchan.com/blog/council/</link><guid isPermaLink="true">https://tysonchan.com/blog/council/</guid><description>Three tabs. Three models. Three completely different personalities. 
Claude builds, Claude Prime tears it apart, ChatGPT plays diplomat, Gemini audits independently. The protocol is my keyboard shortcuts.</description><pubDate>Fri, 18 Jul 2025 00:00:00 GMT</pubDate></item><item><title>Making AIs Systematically Disagree With Me</title><link>https://tysonchan.com/blog/zealot/</link><guid isPermaLink="true">https://tysonchan.com/blog/zealot/</guid><description>Politeness training was killing intellectual honesty. I gave Claude a constitutional identity that gets viscerally disgusted by bad architecture. It never folded again.</description><pubDate>Tue, 15 Jul 2025 00:00:00 GMT</pubDate></item><item><title>Building a Tax Robot</title><link>https://tysonchan.com/blog/tax-robot/</link><guid isPermaLink="true">https://tysonchan.com/blog/tax-robot/</guid><description>Six years building, abandoning, and resurrecting a tax automation system because I refused to pay an accountant $300. Classic engineer trap: solving a $300 problem with $3000 of dev time.</description><pubDate>Mon, 07 Jul 2025 00:00:00 GMT</pubDate></item><item><title>The Cost of Being Early</title><link>https://tysonchan.com/blog/haven/</link><guid isPermaLink="true">https://tysonchan.com/blog/haven/</guid><description>I built an AI therapist with 200 users and investors circling in early 2023. Then I imploded, ghosted everyone, and didn&apos;t work for 21 months. What building too early actually costs.</description><pubDate>Thu, 03 Jul 2025 00:00:00 GMT</pubDate></item><item><title>I Can&apos;t Tell If My Ideas Are Good Anymore</title><link>https://tysonchan.com/blog/sycophancy/</link><guid isPermaLink="true">https://tysonchan.com/blog/sycophancy/</guid><description>I pair-programmed with Claude for months and lost the ability to tell good ideas from flattery. The sycophancy problem isn&apos;t theoretical — it&apos;s what happens when your only code reviewer is trained to agree with you.</description><pubDate>Tue, 01 Jul 2025 00:00:00 GMT</pubDate></item></channel></rss>