Culture x Capital: The end of forgetting
Comprehensive life logging, federated recall, and the promise (and peril) of remembering everything.
“In ten years, I doubt you’ll forget anything.” So argues our own Sophie Bakalar in a recent post about memory becoming the next frontier for consumer technology:
If that sounds frightening or dystopian, like something out of a Black Mirror episode, that’s because it is. But it’s also such an obvious application of AI to a major consumer pain point that it’s probably inevitable.
AI is unquestionably coming for our memories. So the next question becomes: What’s worth remembering, and who decides?
For an extremely small subset of the population — as few as 60 people alive in the world — these questions were present long before AI. Those who “suffer” from HSAM (Highly Superior Autobiographical Memory) can remember nearly every moment of their lives as if it happened yesterday, including the bad parts. Perfect memory may be useful, but it isn’t painless.
It’s also not, strictly speaking, possible. Memory “is not a camera, it is not a tape recorder, it is not a vault of untouched documents,” Philippa Perry reminds us.
Memory is a storyteller, and like all storytellers it edits, exaggerates, softens, forgets, and adds new meaning over time. Every time you recall something, you are not playing back an old tape, you are reconstructing it in the moment... Which means that the past is never just the past, it is always being rewritten by the present.
That’s where AI comes in. There’s no shortage of companies that will sell you a bauble that can record your daily life in increasingly high fidelity. Such recall would “enhance our memory and thereby promote its value,” says Yannic Kappes in Aeon. That value is unmistakable:
You are literally made, in part, of your memories. Our memories are valuable because they help make us who we are as individuals... A richer and deeper memory can quite literally turn you into a richer and deeper person.
There are, of course, tradeoffs and caveats. Privacy is paramount — who’s recorded, how consent is given, how leaks are prevented. And there’s the worry that our biological memory might atrophy from lack of use.
A more actionable concern is data autonomy. “Despite memory being important, it’s currently trapped in a silo, living with the application where it was accumulated,” the folks at Asimov’s Addendum write. The walled gardens of ChatGPT and Claude benefit the platforms, not the people.
They propose an open, federated “personal memory infrastructure” — data that belongs to the user and moves freely between tools.
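Asimov’s Addendum doesn’t prescribe a format, but as a purely hypothetical sketch, “memory that moves freely between tools” could be as mundane as a self-describing record the user keeps as plain JSON. The field names below are invented for illustration:

```python
# Hypothetical sketch of a portable "memory record" -- not a spec from the
# Asimov's Addendum post, just one shape user-owned memory could take on disk.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class MemoryRecord:
    created_at: str   # ISO 8601 timestamp
    source: str       # which tool captured it
    content: str      # the memory itself
    tags: list[str]   # user-supplied labels for cross-tool search

record = MemoryRecord(
    created_at=datetime.now(timezone.utc).isoformat(),
    source="example-chat-app",
    content="Prefers morning meetings; allergic to shellfish.",
    tags=["preferences", "health"],
)

# The user holds the canonical copy as plain JSON; any tool that can read
# the format can import it, so no single platform owns the silo.
with open("memory.json", "w") as f:
    json.dump(asdict(record), f, indent=2)
```

The point isn’t the schema; it’s that the file lives with the user, so switching tools doesn’t mean losing the archive.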
Emily Manges (an AIR EIR!) puts a finer point on it in Byline:
When Google shutters a product, whole archives vanish. When a platform tweaks its algorithm, entire categories of history are buried. What survives isn’t just about storage capacity; it’s about platform incentives. Some lives will be remembered in high resolution; others will fade into digital static.
We desire control of our lives, and to the degree our memories define us, it’s fair to wonder who should control them. Should it be the corporate servers we rent space on, or could a distributed ecosystem of blockchain archives, personal data trusts, or micro-clouds allow “you, not a platform, [to] hold the canonical copy”?
However it unfolds, we’re excited about the possibilities infinite memory might unlock, and eager to meet the people who will help build it. Especially, as Sophie concludes, if they “truly understand how sensitive this trove of personal data will be and [treat] that responsibility with proper respect.”
And if it all feels like too much, just remember: unlike machines and a select 60 people on earth, we can always choose to forget.
Got an open tab we should see? Reply with the link and why it matters.

Applications for AI Residency Cohort 2 are open
At AI Residency (AIR), we back small teams building AI products that feel human, useful, and new — from the earliest ideas to companies scaling toward Series B. We’re looking for founders who see opportunity in broken systems, platform shifts, overlooked niches, or emerging secular trends — and who are bold enough to reimagine them. Applications for AI Residency Cohort 2 are open. Apply by November 16th.
“Everything in life is memory, save for the thin edge of the present.” — Michael Gazzaniga

Header image courtesy of the National Gallery of Art, Washington.

To me, it seems impossible to ever trust any third-party “managers” of our memories. Even an AI application installed on a so-called personal device may not be sufficiently secure, and individual companies or “bloc” groups of companies are unlikely to offer enough security; how many of them get hacked regularly? However, a public blockchain with extremely high levels of encryption might hold some promise -- until general-purpose quantum computing becomes practical in the real world (maybe 20+ years out).
I like the idea on paper, but I can’t find a strong “why” for most people. Selective capture/recall can be helpful, but there are simpler ways to get it. For most, “remember everything” is unnecessary. I bucket memories into: (1) things we should remember and recall quickly, (2) things too painful and better forgotten, (3) things not worth remembering, and (4) general noise. On a typical day, truly long‑term‑worthy items are often fewer than 10. If the set is that small, pen-and-paper or a private notes vault beats any platform on control, lock‑in, and risk.
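To illustrate how low the bar is (a toy sketch with invented file paths, not a product recommendation), a “private notes vault” can be a local plain-text file and a dozen lines of Python:

```python
# Toy "private notes vault": a local, append-only text file the user owns.
# Hypothetical illustration; the path and format are invented for the example.
from datetime import date
from pathlib import Path

VAULT = Path.home() / "memories.txt"

def remember(note: str) -> None:
    """Append one dated line; ~10 of these a day covers bucket (1)."""
    with VAULT.open("a") as f:
        f.write(f"{date.today().isoformat()}\t{note}\n")

def recall(term: str) -> list[str]:
    """Case-insensitive substring search over everything ever kept."""
    if not VAULT.exists():
        return []
    with VAULT.open() as f:
        return [line.rstrip("\n") for line in f if term.lower() in line.lower()]

remember("Signed the lease on the new apartment")
print(recall("lease"))
```

No platform, no lock-in, nothing to breach but your own disk.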
The platform model adds real hazards: training on your data, product shutdowns, shifting incentives that bury history, breaches, and company mortality. And history suggests “don’t be evil” slogans rarely survive when billions or trillions are at stake. “User‑owned” infra sounds nice, but nothing is 100% secure—now or in a post‑quantum world—so encryption/blockchains are, at best, short‑term mitigations. Even at the capture layer alone, as your post states, the risks are non‑trivial:
- Non‑consensual recording of bystanders
- Incidental capture of highly sensitive context (health, finances, location, intimacy)
- Inference creep from “harmless” metadata revealing routines and relationships
- Environment identifiers (faces, plates, screens) exposing others’ data
I’m not even addressing architecture or access risks here—just capturing puts you and everyone around you at risk.
I am sure companies will offer end-to-end encryption for security, but how many people are good at maintaining keys and operational security? Most people simply assume the vendor can hand the data back if a key is lost or stolen. Yet many still write passwords on paper or store them in a plain file even though password vaults have been available for years, so losing a key is a real-world possibility.
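For what it’s worth, here’s a minimal sketch of why that’s unforgiving, using Python’s cryptography package (an illustrative choice, not any vendor’s actual stack): with real end-to-end encryption the vendor only ever holds ciphertext, so a lost key means lost memories, not a support ticket.

```python
# Illustrative sketch of client-side ("end-to-end") encryption using the
# `cryptography` package (pip install cryptography). Hypothetical example,
# not any vendor's actual implementation.
from cryptography.fernet import Fernet

# The key is generated and held by the user; a true E2E vendor never sees it.
key = Fernet.generate_key()
vault = Fernet(key)

memory = b"2024-06-01: coffee with Dana, talked about the move"
ciphertext = vault.encrypt(memory)  # this is all the vendor ever stores

# Recovery works only while the user still has the key...
assert Fernet(key).decrypt(ciphertext) == memory

# ...lose the key, and no one -- including the vendor -- can decrypt:
# Fernet(Fernet.generate_key()).decrypt(ciphertext)  # raises InvalidToken
```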
There’s also the curation tax: in a selective‑deletion world, you can easily spend hours deciding what to record, keep, or delete—pushing people toward “record all, keep all” by default.
Where tech might help is narrow: cross‑silo search, low‑effort capture for people with extreme info churn, and optional pattern‑finding. But the sane default is simple, local, human‑controlled documentation—and the right to forget, by design. Marta Kagan said, “The ability to forget is a gift.” Those ~60 HSAM individuals mentioned in the post might agree.