
Instead of complex modern algorithms, a new AI memory system (MemPalace) was built on the ancient Greek "Method of Loci." It proves a Talebian point: 2,000 years ago, people actually knew how to think. Today we live in an era of information gluttony, publishing 600,000 books a year with barely a memorable thought among them, while the handful of texts written in antiquity are quoted eternally. We know how to publish; they knew how to think.

Update: @rohithzr evaluated this repo on the BEAM 100K benchmark, and the results are not so satisfying: https://github.com/milla-jovovich/mempalace/issues/125


Replies (8)

Yes, actress Milla Jovovich and her friend Ben Sigman spent a few months creating MemPalace — a long-term memory system for AI — using Claude Code.

They didn't try to invent yet another complex neural graph or RAG pipeline.
Instead, they took the ancient Greek "Method of Loci" technique and turned it into a virtual architecture in which all of your conversations with the AI are organized.
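The Method-of-Loci idea maps naturally onto a simple data structure: memories are filed into named loci ("rooms") and recalled by walking the rooms in a fixed order. A minimal sketch of that idea follows; the `Room`/`Palace` names and methods here are hypothetical illustrations, not MemPalace's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Room:
    """One locus in the palace, holding memories placed there."""
    name: str
    memories: list[str] = field(default_factory=list)

class Palace:
    """Toy Method-of-Loci store: place memories into named rooms,
    then recall them by walking the rooms in a fixed spatial order."""

    def __init__(self, room_names):
        self.rooms = {n: Room(n) for n in room_names}
        self.order = list(room_names)  # fixed walking order

    def place(self, room_name, memory):
        self.rooms[room_name].memories.append(memory)

    def walk(self):
        # Traverse loci in order, yielding (room, memory) pairs.
        for name in self.order:
            for m in self.rooms[name].memories:
                yield name, m

palace = Palace(["entry hall", "library", "kitchen"])
palace.place("library", "user prefers TypeScript over JS")
palace.place("entry hall", "user's name is Ana")
print(list(palace.walk()))
```

The point of the structure is that recall is spatial and ordered, not similarity-ranked: walking the same rooms in the same order reproduces the same memories.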


I'm not big on benchmarks in general, but during development they serve as a baseline to build on. In my opinion, BEAM is the most relevant benchmark because it tests end-to-end answer quality, not just retrieval. LongMemEval is solid for retrieval evaluation, but it only measures whether the right document appears in the top-k, not whether the system answers correctly. LoCoMo tests useful abilities (multi-hop, temporal reasoning), but its recall metric is trivially gameable once top-k exceeds the number of sessions per conversation.
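To see why that recall metric is gameable: recall@k counts a hit whenever a relevant session appears among the top-k retrieved items, so once k is at least the total number of sessions in a conversation, a retriever that returns everything scores perfect recall. A quick illustration (the helper below is a generic recall@k sketch, not BEAM or LoCoMo code):

```python
def recall_at_k(retrieved, relevant, k):
    """Fraction of relevant items that appear in the top-k retrieved results."""
    top_k = set(retrieved[:k])
    hits = sum(1 for item in relevant if item in top_k)
    return hits / len(relevant)

# A LoCoMo-style conversation has only a handful of sessions.
sessions = ["s1", "s2", "s3", "s4", "s5"]
relevant = ["s2", "s4"]

# A degenerate "retriever" that returns every session, in arbitrary order,
# gets perfect recall whenever k >= number of sessions.
print(recall_at_k(sessions, relevant, k=10))  # → 1.0
```

This is why end-to-end answer quality is a harder target: returning everything inflates recall, but it doesn't help the model answer correctly.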
