The word "ontology" seems to generate much controversy in discussions about AI. It has a long history in philosophy, in which it refers to the subject of existence. It is also often confused with epistemology, which is about knowledge and knowing.
In the context of knowledge sharing, I use the term ontology to mean a specification of a conceptualization. That is, an ontology is a description (like a formal specification of a program) of the concepts and relationships that can exist for an agent or a community of agents. This definition is consistent with the usage of ontology as a set of concept definitions, but more general. And it is certainly a different sense of the word than its use in philosophy.
What matters is what an ontology is for. My colleagues and I have been designing ontologies for the purpose of enabling knowledge sharing and reuse. In that context, an ontology is a specification used for making ontological commitments. A formal definition of ontological commitment is given below. For pragmatic reasons, we choose to write an ontology as a set of definitions of formal vocabulary. Although this is not the only way to specify a conceptualization, it has some nice properties for knowledge sharing among AI software (e.g., semantics independent of reader and context). Practically, an ontological commitment is an agreement to use a vocabulary (i.e., ask queries and make assertions) in a way that is consistent (but not complete) with respect to the theory specified by an ontology. We build agents that commit to ontologies. We design ontologies so we can share knowledge with and among these agents.
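The idea of a vocabulary-based ontology and an agent's commitment to it can be illustrated with a minimal sketch. This is not any particular ontology language or the authors' actual formalism; the class names, relation, and checking logic below are hypothetical, chosen only to show an agent accepting assertions consistent with a shared vocabulary and rejecting those that are not.

```python
# Hypothetical shared ontology: a set of vocabulary definitions.
# Classes map terms to informal definitions; relations declare typing
# constraints (domain, range) that assertions must respect.
ONTOLOGY_CLASSES = {
    "Person": "a human agent",
    "Document": "an information artifact",
}
ONTOLOGY_RELATIONS = {
    "author-of": ("Person", "Document"),  # (domain, range)
}

class CommittedAgent:
    """An agent that commits to the ontology: it only declares individuals
    of known classes and only asserts facts whose relation and argument
    types match the shared vocabulary."""

    def __init__(self):
        self.types = {}   # individual name -> class name
        self.facts = []   # list of (relation, subject, object) triples

    def declare(self, individual, cls):
        if cls not in ONTOLOGY_CLASSES:
            raise ValueError(f"unknown class: {cls}")
        self.types[individual] = cls

    def assert_fact(self, relation, subject, obj):
        # Consistency check against the ontological commitment:
        # the relation must exist, and the arguments must have the
        # declared domain and range types.
        if relation not in ONTOLOGY_RELATIONS:
            raise ValueError(f"unknown relation: {relation}")
        domain, rng = ONTOLOGY_RELATIONS[relation]
        if self.types.get(subject) != domain or self.types.get(obj) != rng:
            raise ValueError("assertion violates the ontological commitment")
        self.facts.append((relation, subject, obj))

agent = CommittedAgent()
agent.declare("alice", "Person")
agent.declare("paper-1", "Document")
agent.assert_fact("author-of", "alice", "paper-1")  # consistent: accepted
```

The commitment is deliberately "consistent but not complete": the agent is constrained to use the vocabulary correctly, but nothing obliges it to assert everything the theory entails.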