
DeepSeek, the Chinese company behind some of the world’s most powerful open-weight AI models, has introduced a new technology that significantly reduces hardware requirements for large language models.
The research, initiated by company founder Liang Wenfeng, proposes a new approach to memory usage in AI systems. The key idea, called conditional memory, separates a model's logic from its knowledge, allowing the bulk of that knowledge to be stored on cheaper, more widely available hardware rather than in GPU memory.
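The paper's exact mechanism is not detailed here, but the general idea can be illustrated with a minimal sketch: a large knowledge table sits in ordinary CPU RAM, and only the few rows a given batch actually needs are copied to the GPU, where the small "logic" layers run. All class, parameter, and variable names below are illustrative assumptions for this sketch, not Engram's actual API or DeepSeek's implementation.

```python
import torch
import torch.nn as nn


class ConditionalMemoryBlock(nn.Module):
    """Illustrative sketch only: a large 'knowledge' table lives in host (CPU)
    memory, while small 'logic' layers live on the accelerator. Per batch,
    only the rows actually requested are transferred to the GPU."""

    def __init__(self, num_entries: int, dim: int, device: str = "cpu"):
        super().__init__()
        # Large knowledge table kept in cheap, plentiful host RAM.
        self.knowledge = nn.Embedding(num_entries, dim)  # stays on CPU
        # Small compute ("logic") layers resident on the accelerator.
        self.logic = nn.Sequential(
            nn.Linear(2 * dim, dim),
            nn.GELU(),
            nn.Linear(dim, dim),
        ).to(device)

    def forward(self, hidden: torch.Tensor, knowledge_ids: torch.Tensor) -> torch.Tensor:
        # Gather only the rows this batch needs on the CPU ...
        fetched = self.knowledge(knowledge_ids.to("cpu"))
        # ... then move that small slice (not the whole table) to the GPU.
        fetched = fetched.to(hidden.device)
        return self.logic(torch.cat([hidden, fetched], dim=-1))


if __name__ == "__main__":
    dev = "cuda" if torch.cuda.is_available() else "cpu"
    block = ConditionalMemoryBlock(num_entries=1_000_000, dim=64, device=dev)
    hidden = torch.randn(8, 64, device=dev)   # activations from the "logic" path
    ids = torch.randint(0, 1_000_000, (8,))   # which knowledge rows to fetch
    print(block(hidden, ids).shape)            # torch.Size([8, 64])
```

The design choice this sketch highlights is the one described in the article: GPU memory holds only the compute layers and the handful of knowledge rows in flight, so the overall knowledge store can grow without growing GPU memory requirements.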
DeepSeek has released official code for the technique under the name Engram. According to the paper, Engram enables knowledge to be scaled efficiently while maintaining high performance in both training and inference.
This development suggests a future where advanced AI models become faster, cheaper, and more capable of long-term contextual memory.