
By Markos Symeonides, Founder & CEO of MyVault AI
The memory wars have begun
Jan 28, 2026
There is a moment, maybe six weeks into using a new AI assistant, when something changes.
Daily conversations create a connection. You share details about your life: your allergies, your children, or the specific project that keeps you awake. The assistant becomes helpful because it knows you.
Then you try a different model.
The new system knows nothing. It has no record of your history. This persistent memory became a central challenge in 2025, driving a competition worth hundreds of billions of dollars.
The stakes
ChatGPT attracts 800 to 900 million weekly active users, roughly one in ten people on Earth. Sam Altman, chief executive of OpenAI, has been plain about where the technology is heading. "People want memory," he said in an August interview with CNBC. "People want product features that require us to be able to understand them."
Altman describes the business model plainly. "It's extremely addictive," he told Big Technology. "People love that the model gets to know them over time." If the plan works, the cost to leave ChatGPT could be high.
The lock-in is not a side effect. It is the strategy.
The AI personalization market was valued at $484 billion in 2024 and is projected to reach $738 billion by 2033, according to SkyQuest Technology. The largest reward lies in capturing the context that makes personalization possible and making it difficult to leave.
Two philosophies
This year, both ChatGPT and Claude launched memory features. From a distance, they look similar. Look closer and you find opposing worldviews.
ChatGPT loads a profile of you into every conversation by default. Since April 2025, it references your entire chat history to deliver responses OpenAI describes as more relevant to you. This happens automatically.
Simon Willison, a developer and writer, reviewed the information ChatGPT collected about him. He described it as "extremely detailed", a "dossier" built from his conversations. "The entire game when it comes to prompting LLMs is to carefully control their context," he wrote. "This feature eliminates that control entirely."
Claude uses a manual system. When Anthropic launched memory in August 2025, the company chose to keep memory off by default. Users choose to enable it. When Claude remembers something, it shows the specific action. You can see the tool calls, review what it saved, and edit or delete any entry.
"Instead of a global persistent profile, Claude uses project-scoped memory with strict isolation," Anthropic explained. The design focuses on privacy, prevents data leakage between projects, and offers specific control.
ChatGPT wants to know you. Claude waits for your instruction. This distinction shows who is in control.
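The distinction between a global profile and project-scoped memory is concrete enough to sketch. The class names below are hypothetical, invented purely for illustration; the point is where the isolation boundary sits:

```python
# Illustrative contrast: a global profile leaks every fact into every
# conversation; project-scoped memory keeps facts inside one project.

class GlobalProfile:
    def __init__(self):
        self.facts = []

    def remember(self, fact):
        self.facts.append(fact)

    def recall(self, project):
        # The project is ignored: everything learned anywhere surfaces everywhere.
        return list(self.facts)

class ProjectScopedMemory:
    def __init__(self):
        self.by_project = {}

    def remember(self, project, fact):
        self.by_project.setdefault(project, []).append(fact)

    def recall(self, project):
        # Strict isolation: only this project's facts come back.
        return list(self.by_project.get(project, []))

scoped = ProjectScopedMemory()
scoped.remember("work", "prefers terse status updates")
scoped.remember("health", "allergic to peanuts")
print(scoped.recall("work"))  # the health fact stays out of work chats
```

In the global design, a medical detail mentioned once can resurface in an unrelated work conversation; in the scoped design, it cannot.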
Context windows are not memory
On their own, AI assistants remember nothing.
A "context window" functions like a large sticky note. It represents the total amount of text a model holds at once. GPT-4 holds about 128,000 tokens. Claude handles 200,000. Some newer models claim millions. But when you close the chat, the sticky note gets thrown away.
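The mechanics can be sketched in a few lines. This is an illustrative model only: real systems count subword tokens with a tokenizer, not words, and the budget here is arbitrarily small to make the eviction visible.

```python
# Illustrative sketch: a fixed context window that drops the oldest
# messages once the token budget is exceeded. Word count stands in
# for a real tokenizer.

def fit_to_window(messages, max_tokens=8):
    """Keep only the most recent messages that fit in the budget."""
    kept, used = [], 0
    for msg in reversed(messages):      # walk newest-first
        cost = len(msg.split())         # crude token estimate
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))         # restore chronological order

history = [
    "I am allergic to peanuts",
    "My daughter starts school in September",
    "Draft the project update email",
]
window = fit_to_window(history, max_tokens=10)
print(window)  # the earliest facts have already fallen out
```

Run it and only the latest request survives; the allergy, shared earliest, is gone. That is the sticky note being thrown away.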
This limitation leaves these systems simultaneously brilliant and amnesiac. Within a single conversation, they perform miracles. Across conversations, they have the memory of a goldfish. And a 2025 study on cognitive memory confirms the problem. It noted that despite LLMs' ability to retrieve information, these models "lack stable and structured long-term memory."
The memory features launched this year attempt to solve the problem. They store facts and summaries outside the context window and inject them back in later. Altman views this as the beginning. "We are in the GPT-2 era of memory," he said, "but the time will come when AI remembers every detail of your life and personalizes itself based on all of that."
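The pattern those features share can be sketched as a store-and-inject loop. Everything below, from the `MemoryStore` name to the hand-written facts, is invented for illustration; production systems use the model itself to extract facts and persist them in databases, but the loop is the same.

```python
# Illustrative sketch of platform memory: facts are saved outside the
# context window, then injected back into future prompts so a
# stateless model appears to remember.

class MemoryStore:
    def __init__(self):
        self.facts = []

    def remember(self, fact):
        if fact not in self.facts:      # avoid duplicate entries
            self.facts.append(fact)

    def inject(self, user_message):
        """Prepend stored facts so the model sees them again."""
        context = "\n".join(f"- {f}" for f in self.facts)
        return f"Known about this user:\n{context}\n\nUser: {user_message}"

store = MemoryStore()
store.remember("allergic to peanuts")            # saved in session one
# ...the session ends; the context window is discarded...
prompt = store.inject("Suggest a dinner recipe")  # session two
print(prompt)
```

The model never actually remembers anything; the platform re-tells it who you are before every conversation. Whoever holds that store holds the relationship.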
But that memory stays platform-specific. Your ChatGPT memories don’t transfer to Claude. Your Claude context doesn’t work in Gemini. You aren’t building a relationship with "AI", but a separate relationship with each company's AI.
The lock-in economy
Research firm FourWeekMBA captured the situation precisely: "AI advantage will not be determined by model quality alone but by personalization accumulated over time. Over time, the cost of switching approaches infinity because users would have to rebuild themselves inside a new system."
The numbers support this. According to the State of Consumer AI report from Andreessen Horowitz, only 9% of consumers pay for more than one AI subscription. Most users stick with one assistant. They ignore benchmarks and invest in a relationship.
This follows the old social media playbook. Your social graph was the lock-in for Facebook. Your context will be the lock-in for AI.
Except the stakes are higher. Your AI's understanding of you is not valuable only to the platform. It is valuable to you. That understanding makes the assistant useful. It anticipates your needs, works in your style, and remembers what you have already said.
Context is infrastructure. And right now, you don’t own it.
The case for portability
There are signs the situation might change.
In October 2025, Anthropic expanded Claude's memory to all paid users and added a feature most competitors lack: import and export. You can take your context with you from ChatGPT, Gemini, or other services.
The Data Transfer Initiative, a nonprofit that helped establish data portability standards for social media, published principles this year specifically for AI. "Providers of AI services can and should make personal data available to users to transfer and to delete," wrote Chris Riley, the organization's executive director, in the framework.
The framework is straightforward. Users should have the ability to download personal data from AI services and request direct transfer between services. They should keep it in structured, machine-readable formats. This portability focuses on personal data. It excludes model weights or training data. It prioritizes the context that makes the AI useful to you.
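What "structured, machine-readable" could look like in practice: a sketch of a portable context export. The schema below is entirely hypothetical, no such standard exists yet, which is part of the point.

```python
# Hypothetical portable-context export: the facts an assistant has
# learned about a user, serialized as JSON the user can download,
# edit, delete from, or re-import elsewhere.
import json

export = {
    "format": "personal-ai-context/0.1",   # invented version tag
    "exported_from": "example-assistant",
    "memories": [
        {"fact": "allergic to peanuts", "scope": "health",
         "learned": "2025-03-14", "user_editable": True},
        {"fact": "prefers terse status updates", "scope": "work",
         "learned": "2025-06-02", "user_editable": True},
    ],
}

blob = json.dumps(export, indent=2)   # what the user downloads
restored = json.loads(blob)           # what a rival service imports
print(len(restored["memories"]), "memories transferred")
```

Note what the file contains: the user's context, not model weights or training data, exactly the scope the framework describes.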
"It is important that the market for AI services remains open," Riley wrote. "Should any services begin to lock this data up, regulators should and will get involved."
What this means
We are at the beginning of a conversation about something that did not exist five years ago: personal AI context.
The numbers show the market is active. ChatGPT holds 68% market share, down from 87% a year ago. Gemini has tripled its share. Claude grew 190% year-over-year. The benchmarks leapfrog each other every few months. Intelligence is commoditizing fast.
The question is: Who owns your AI's understanding of you?
If the platform owns it, we build a future where switching costs are measured in years of lost context. Your relationship with AI becomes a relationship with whichever company captured you first. As one analysis puts it, the platform that knows you best wins because leaving is too expensive.
If you own it, we build something else. A future where context is portable. Memory stays with you to keep, share, or delete. AI assistants compete on how well they use your context, and they avoid holding it hostage.
That future does not exist yet. But the memory era has started. Which side wins will determine how all of us live with AI for the next decade.
If you could move your entire AI memory to a new platform tomorrow, would you switch assistants or stay with the one that knows you best?
If you have thoughts on any of this, tell us. We read and respond to everything.
Private by Design explores the intersection of AI, memory, and control. Subscribe for monthly analysis of the ideas shaping personal technology.
About
MyVault AI was founded by Markos Symeonides. Markos is a seasoned software entrepreneur, investor, and executive leader with a track record of founding, scaling, and exiting high-growth B2B technology companies.

