Humans process enormous amounts of information from our surroundings, yet we store only a fraction of it in our tiny little heads. So how are we so good at solving real-world problems?
It’s because a lot of what we learn gets folded into the ways we shape the physical world. Over time, knowledge accumulates in the structures, machines, and configurations of the objects around us.
You’ve heard of “world models”? The world is the model.
For businesses, this means the valuable knowledge isn’t in documents that can be scraped and trained on. It lives in infrastructure, supply chains, factories, and networks. It’s in the layout of aisles, the size of doors, the shape of containers. Knowledge lives in the design of everyday objects.
Building effective AI for this reality isn’t just “robots plus AI.” It requires agents that become part of this embodied intelligence, integrating with processes that exist beyond documents. When knowledge is stored in material structures, unlocking value requires AI that can continuously perceive, experiment, and update those structures.
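To make that loop concrete, here is a minimal sketch of a perceive / experiment / update cycle. Everything in it is illustrative: the `sense`, `try_change`, and `commit` functions are hypothetical stand-ins for real sensors, actuators, and downstream systems, and the objective is a toy one. The point is where the result ends up: in the structure itself, not in a model’s weights or a document.

```python
# Illustrative sketch only: a tiny perceive / experiment / update loop.
# `sense`, `try_change`, and `commit` are hypothetical placeholders for
# real sensors, actuators, and the systems that would apply a change.

import random

def sense(world: dict) -> dict:
    """Observe the current state of some physical structure (stubbed)."""
    return dict(world)

def try_change(observation: dict) -> dict:
    """Propose a small, reversible experiment on the structure (stubbed)."""
    proposal = dict(observation)
    proposal["shelf_spacing_cm"] = observation["shelf_spacing_cm"] + random.choice([-1, 1])
    return proposal

def score(state: dict) -> float:
    """Toy objective: how well the layout serves whoever uses it next."""
    return -abs(state["shelf_spacing_cm"] - 30)  # pretend 30 cm is ideal

def commit(world: dict, proposal: dict, improved: bool) -> dict:
    """Write the change back into the world only if it helped."""
    return proposal if improved else world

world = {"shelf_spacing_cm": 25}
for _ in range(10):
    observation = sense(world)                  # perceive
    proposal = try_change(observation)          # experiment
    improved = score(proposal) > score(observation)
    world = commit(world, proposal, improved)   # update the structure itself

print(world)  # the "learning" now lives in the layout, not in a database
```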
This is a very different vision of AI from one that simply acts as a talking database. In physical AI, agents are active participants in the ecosystems where knowledge lives. They can change things now to make things easier for someone else later.
The world is both the interface and the storage medium.
None of this is science fiction. Humans have been doing this since the dawn of time. The world has always been the model. Now, it’s also the machine.