Syllogism

Aristotle invented logic as a discipline, and in Prior Analytics he developed a detailed theory of so-called syllogisms to codify deductive reasoning, a development that also marks the beginning of formalization in logic. Although there were in fact interesting developments in the European Middle Ages, notably the theory of so-called supposition as a kind of semi-formal semantics, Kant famously said that Aristotle had said all there was to say about logic, and this view went largely undisputed until the time of Boole and De Morgan in the mid-19th century. Boole himself said he was only extending Aristotle’s theory.

The fundamental principle of syllogistic reasoning is best understood as a kind of function composition. Aristotle himself did not have the concept of a mathematical function, which we owe mainly to Leibniz, but he clearly worked with the composition of things we can recognize as function-like. In the late 19th century, Frege pointed out that the logical meaning of grammatical predication in ordinary language can be understood as a kind of function application.
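
To make Frege’s point concrete, here is a minimal sketch in Haskell; the type Entity and the predicate human are illustrative toy names, not anything in Frege or Aristotle. The predicate is modeled, roughly in Frege’s spirit, as a function from objects to truth values, so that a sentence of the form “x is human” is just the application of that function to its subject.

-- Illustrative toy model: a predicate as a Boolean-valued function.
data Entity = Socrates | Plato | Bucephalus deriving (Eq, Show)

human :: Entity -> Bool
human Socrates   = True
human Plato      = True
human Bucephalus = False

-- Grammatical predication read as function application:
-- "Socrates is human" becomes  human Socrates,  which evaluates to True.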

Aristotle’s syllogisms were expressed in natural language, but in order to focus attention on their form, he often substituted letters for concrete terms. The fundamental pattern is

(quantifier A) op B
(quantifier B) op C
Therefore, (quantifier A) op C

where each instance of “quantifier” is either “some” or “all”; each instance of “op” is either what Aristotle called “combination” or “separation”, conventionally represented in natural language by “is” or “is not”; and each letter stands for a type, also known as a “universal” or higher-order term. (In the Middle Ages and later, individuals were treated as a kind of singleton type with implicit universal quantification, so it is common to see examples like “Socrates is a human”, but Aristotle’s own concrete examples never included references to individuals.) Not all combinations of substitutions correspond to valid inferences, but Prior Analytics systematically described all the valid ones.
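
As a minimal sketch of why the pattern composes, assuming nothing beyond Haskell and a reading of the universal affirmative premise as something function-like (whatever makes a thing an A also makes it a B), one can write the following; the wrapper AllAre and the name barbara (the traditional medieval nickname for this valid first-figure form) are illustrative, not Aristotle’s own apparatus.

-- "All A are B" read as function-like: evidence of A-hood yields evidence of B-hood.
newtype AllAre a b = AllAre (a -> b)

-- Barbara: all A are B, all B are C, therefore all A are C.
barbara :: AllAre a b -> AllAre b c -> AllAre a c
barbara (AllAre f) (AllAre g) = AllAre (g . f)

The composition only goes through because the B produced by the first premise is exactly the B consumed by the second; that shared term is the “middle term” discussed below.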

In traditional interpretations, Aristotle’s use of conventionalized natural-language representations sometimes led to analyses of the “op” that emphasized grammatical relations between subjects and predicates. However, Aristotle was concerned not with grammar but with the more substantive meaning of (possibly negated) “said of” relations, which actually codify normative material inferences. His logic is thus a fascinating hybrid, in which each canonical proposition represents a normative judgment of a material-inferential relation between types, and the representations are then formally composed together.

The conclusion B of the first material inference, which is also the premise of the second, was traditionally called the “middle term”; its role in reasoning, through its licensing of composition, lies behind all of Hegel’s talk about mediation. The 20th century saw the development of category theory, which explains all mathematical reasoning and formal logic in terms of the composition of “morphisms” or “arrows” corresponding to primitive function- or inference-like things. Aside from many applications in computer science and physics, category theory has also been used to analyze grammar. The historical relation of Aristotle to the Greek grammarians runs in the same direction: Aristotle influenced the grammarians, not the other way around. (See also Searching for a Middle Term; Aristotelian Demonstration; Demonstrative “Science”?)
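
For readers who want the category-theoretic reading spelled out, here is a minimal sketch using Haskell’s standard Control.Category class; the Morphism wrapper is an illustrative stand-in for whatever the arrows of a given category happen to be, and the only structure that matters is an identity arrow and an associative composition.

import Control.Category (Category (..))
import Prelude hiding (id, (.))

-- An inference-like arrow from a to b.
newtype Morphism a b = Morphism (a -> b)

instance Category Morphism where
  id                      = Morphism (\x -> x)         -- the identity arrow, "all A are A"
  Morphism g . Morphism f = Morphism (\x -> g (f x))   -- composition through the shared middle

-- The category laws are exactly the facts appealed to above:
--   f . id == f,   id . f == f,   (h . g) . f == h . (g . f)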