What I Hope Happens to You
I don’t want you merely to avoid burnout; I want you to see labor as a site of struggle, to recognize defaults as political choices, and to join in reforging our systemic blueprints into infrastructures of care.
Each time you rig your phone into a hotspot mid-presentation, context-switch under pressure, or rewrite lost slides, you’re performing unbilled infrastructure. I call this Risk Absorption, and it’s everywhere: from emergency patch calls to last-minute bug fixes. In Temporal Justice, I traced how unpaid “heroics” accumulate as hidden subsidy for frictionless UX. Recognizing these saves as labor—not badges of grit—lets you reclaim the narrative: these aren’t personal feats, but systemic failures demanding repair.
These silent rescues mirror how dissent is cannibalized into corporate vision. You flag a privacy gap or accessibility barrier; months later, that same insight is lauded as a proactive feature.
That’s the Amnesia Engine. Critique is ingested, delayed, and reissued under someone else’s name. By timestamping your feedback and tracking its route from inbox to roadmap, you reveal memory as a design parameter rather than an afterthought.
Defaults aren’t benign. When a form fails without explanation—no error message, no recovery path—it’s not a bug, but an engineered silence. In Emergence Is an Excuse, I show how “neutral” defaults encode institutional priorities: compliance quotas, risk buffers, cost savings. Asking Origin, Beneficiary, and Collateral for each default exposes whose needs are served and whose are left to echo in the margins.
Finally, consider exit flows. A “free” trial that’s effortless to join but punishing to leave isn’t an oversight; it’s containment by design. In Designing Systems Where Coercion Isn’t the Default, I argue that Refusability—equal ease of exit and entry—is the true litmus test of a system’s ethics.
These threads—unpaid labor, selective memory, silent defaults, restricted exits—are not isolated. They form an interlocking architecture of coercion. To dismantle it, we need a unified practice: one that surfaces hidden labor, chronicles memory, decodes defaults, and demands exit rights.
Each of these moves builds on and deepens the others. They’re not standalone tactics but points on a single map.
1. Chart the Amnesia Engine
Track your feedback’s life cycle from flag to feature. Note where it dies in backlogs, and where it reemerges as “insight.” Your audit reveals how memory itself is weaponized—see The Amnesia Engine.
2. Expose “Neutral” Defaults
Audit every invisible setting—auto-expire windows, silent fails, one-touch agreements. Each default, when named, becomes a locus of political choice—see Emergence Is an Excuse.
3. Quantify Invisible Labor
Record the minutes and methods of every uncredited fix. These hours—whether rebooting servers at midnight or clarifying ambiguous prompts—are currency in a system that traffics in frictionless experience—see Temporal Justice.
4. Enforce Refusability
Design and test exit flows with the same rigor as sign-up. If leaving triggers delays, fees, or coercive messaging, demand parity. This reframes consent as an ongoing, reversible stance—see Designing Systems Where Coercion Isn’t the Default.
5. Interrogate Design Power
When leaders cite “too complex to fix,” ask: Too complex for whom? Behind that shrug lie resource allocations, profit motives, and power imbalances—see Coercion as Fragility.
6. Center the Dispossessed
Focus first on those most likely to be erased by defaults: low-income users, non-native speakers, neurodivergent bodies. Their exclusions map systemic blind spots—see Collapse Capitalism.
As you apply these moves, begin with your own workspace—patch notes, email filters, onboarding forms.
Cross-reference findings with essays like Actuarial Medicine & Hidden Exclusion, which shows how algorithms become exclusion engines.
Leverage insights from The Commodification of Behavior in the Age of AI to critique metrics that tokenize human activity.
Ground your refusal in Misreading Capitalist Realism, understanding how dissent is co-opted into compliance.
Each connection deepens our shared lens. The essays I write are meant to be conversational roots, offering you detailed investigations and real-world examples to enrich your audits.
Systems are human-made, and therefore human-remade. Every audit you record, every default you decode, every exit you secure chips away at architectures of erasure and control. Over time, those chips become trowels, laying cornerstones for infrastructures built on memory, dignity, and consent.
If Foucault mapped the archive and Fanon named the wound, our collective task now is both forensic and generative: to spot the labor, chart the forgetting, expose coercion, and collaboratively redraw the blueprints.
That, dear reader, is what I hope happens to us—together.