What is Ontology?

When people in AI talk about “ontology,” they usually mean something technical: an organized chart of entities and categories, like a knowledge graph that specifies how “doctor” relates to “hospital” or “patient.” In computer science, an ontology is essentially a formal taxonomy: a structured vocabulary of entities, properties, and relations that lets machines reason across data. Useful, yes, but this is a very thin sense of the term. It treats meaning as if it could be fully captured in tidy hierarchies of objects and properties.
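To make the contrast concrete, here is a minimal sketch of that thin, engineering sense of ontology: a toy knowledge graph of subject-relation-object triples, with a small “is_a” taxonomy layered on top. Every entity and relation name below is illustrative, not drawn from any real ontology or library.

```python
# A minimal sketch of an ontology in the thin, computer-science sense:
# a knowledge graph of (subject, relation, object) triples that a machine
# can traverse. All names here are illustrative.

triples = {
    ("doctor", "works_at", "hospital"),
    ("doctor", "treats", "patient"),
    ("patient", "admitted_to", "hospital"),
    ("doctor", "is_a", "medical_professional"),
    ("medical_professional", "is_a", "person"),
}

def related(subject):
    """Return every (relation, object) pair directly linked to a subject."""
    return [(rel, obj) for (subj, rel, obj) in triples if subj == subject]

def is_a_ancestors(entity):
    """Walk the 'is_a' links upward: the purely taxonomic part of the graph."""
    ancestors, frontier = set(), {entity}
    while frontier:
        current = frontier.pop()
        for subj, rel, obj in triples:
            if subj == current and rel == "is_a" and obj not in ancestors:
                ancestors.add(obj)
                frontier.add(obj)
    return ancestors

print(sorted(related("doctor")))
# [('is_a', 'medical_professional'), ('treats', 'patient'), ('works_at', 'hospital')]
print(sorted(is_a_ancestors("doctor")))
# ['medical_professional', 'person']
```

Note that everything such a graph “knows” about a doctor is exhausted by these labeled edges, which is precisely the thinness at issue.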

Philosophy asks something deeper. Ontology, at its core, is the study of being. It does not merely catalog what exists; it asks the more fundamental question: what does it mean for something to be at all? Why do things show up as meaningful in the first place? Ontology is not a map of categories but an inquiry into the conditions that make categories, and our understanding of them, possible.

Martin Heidegger revolutionized this inquiry in the 20th century. He argued that ontology cannot begin with abstract definitions or classifications, because we are never detached observers standing over against a pile of objects. Instead, we are always already in a world, living through projects, tools, moods, and possibilities. A hammer is not first grasped as an object with a wooden handle and a metal head; it is encountered through building, carpentry, and its place within a workshop. This background of significance, what Heidegger called worldhood, is what allows anything to appear as what it is.

This is the sense of ontology we need if we want to approach artificial general intelligence. Current AI systems, especially transformers, achieve fluency because they capture traces of relationality in language. But they lack the existential structures, such as temporality, care, and mood, that make those relations meaningful for us. They operate in a flat semantic space where words connect to other words but never to beings in a world. To simulate human intelligence, we need more than bigger models or clever scaffolding. We need an ontology that begins with being human, because only within that grounding does understanding become possible.

Ontology, then, is not a distraction from the business of building AGI. It is the foundation the project needs in order to get started. Without it, AI will continue to generate words without world. With the right foundation, we may finally find the path toward machines that do more than echo us; they may begin, however minimally, to understand.
