Experience

Aristotle, Kant, Hegel, and Brandom all work with thick, nonprimitive, structured notions of human experience that do not involve treating consciousness as a transparent medium in which ready-made contents are immediately presented. Aristotle emphasized experience as a product of accumulation over time, as when we say someone is “experienced”. Kant emphasized that all experience is a product of preconscious synthesis that involves complex applications of concepts. Hegel developed a radical critique of the supposed positive role of immediacy. Whereas many previous readings tended to water down the impact of Kant and Hegel by explicitly or implicitly assimilating their work to empiricist or existential-phenomenological views that treat experience as something primitive, Brandom has emphasized how Kant and Hegel anticipated Wilfrid Sellars’ critique of the “Myth of the Given”, and developed an innovative “negative” account of the role of immediacy within experience (see Error; Negativity in Experience.)

The bottom line of all of this is that experience cannot be used as an unproblematic beginning point, as if all the difficult issues were separate from it, out there in the world somewhere. There is no such separation; we find ourselves only in and through a process of understanding life and the world. It is the forms brought to light through this process that matter.

Experience can still be a beginning point of sorts, but in the Aristotelian pragmatic sense that gives no privilege to beginnings. (See also Empirical-Transcendental Doublet.)

Kant and Foundationalism

According to Kant, all human experience minimally involves the use of empirical concepts. We don’t have access to anything like the raw sense data posited by many early 20th century logical empiricists, and it would not be of much use if we did. In Kantian terms, this would be a form of intuition without concepts, which he famously characterized as necessarily blind, and unable to function on its own.

Foundationalism is the notion that there is certain knowledge that does not depend on any inference. This implies that it somehow comes to us ready-made. But for Kant, all use of empirical concepts involves a kind of synthesis that could not work without low-level inference, so this is impossible.

The idea that any knowledge could come to us ready-made involves what Kant called dogmatism. According to Kant, this should have no place in philosophy. Actual knowledge necessarily is a product of actual work, though some of that work is normally implicit or preconscious. (See also Kantian Discipline; Interpretation; Inferentialism vs Mentalism.)

It also seems to me that foundationalism is incompatible with the Kantian autonomy of reason.

Aristotle, Empiricist?

In contrast with Plato, Aristotle made major contributions to early natural science, and was concerned mainly to interpret human experience of the world. I previously noted with some sympathy John Herman Randall Jr.’s argument that Italian Renaissance Aristotelianism played a much greater role in the development of early modern science than is commonly recognized. I cannot, however, follow Leibniz and Kant’s superficial association of British Empiricist philosophy with Aristotle.

Locke, Berkeley, and Hume were all much closer to Descartes than to Aristotle on key questions related to subjectivity. For all of them, immediate presence to the mind played a foundational role. (See also Empiricism; Aristotelian Subjectivity; Mind Without Mentalism.)

(Locke, Berkeley, and Hume all argued with rather more subtlety and sophistication than Descartes. Unlike Descartes, Locke and Hume did not treat the human soul as a substance, and all three of the great British Empiricists produced detailed accounts of aspects of human cognition that are of lasting value, potentially somewhat independent of the mentalist framework in which they were originally developed.)

Locke and Hume did extensively and systematically develop the notion — commonly attributed to Aristotle in the Middle Ages — that everything in intellect originates in sense perception. As far as Aristotle is concerned, this seems an overstatement.

Aristotle characteristically looked for multiple “causes”, or reasons why, for a given state of affairs. What I think he really meant to assert, in the brief passage in his treatise on the soul that is taken to support this typically empiricist position, was the more modest thesis that broadly speaking, sense perception provides the event-based occasions that drive occasions of thought. That does not mean that all the content (or form, as Aristotle would call it) of thought has its most direct source in sensation, although significant parts of it clearly do.

Consider something like language. Most concrete instances of language clearly have a sensible component, and those that don’t (such as when we silently talk to ourselves) arguably could not occur if they were not preceded by other instances that did have a sensible component. Without sensation, there could be no language. But that hardly means that linguistic meaning has its primary source in sensation. One could argue that sensation is always somehow presupposed, even in considerations of meaning, but it does not seem to be the primary concern there. Sensation by itself is a necessary — but not sufficient — basis for an adequate account of thought.

Historiography, Inferentialism

Having laid out some preliminaries, I’ve begun to circle back to questions of historical detail related to the development here, and it seems fitting to summarize the motivations driving these more historical notes. History is all about the details, but in any inquiry, higher-order questions about methodology ought to inform primary investigations. We never just have data; data always have to be interpreted, and interpretation raises questions of methodology. With history, this often involves critical examination of the applicability of categories that tend to be taken for granted. Thus, I am adding notes about the application of various categories or concepts in particular historical settings, and about historical details that seem to have larger methodological significance.

I’m looking back at the history of philosophy (and, to some extent, broader cultural developments) from a point of view inspired by the “inferentialism” of Brandom (taking this as a general name for his point of view), as well as by my own ideas for a revitalized Aristotelianism. In Tales of the Mighty Dead and elsewhere, Brandom himself has effectively placed the historical roots of his development in the broad tradition of early modern philosophical rationalism, including the work of Descartes, Spinoza, and Leibniz. I find standard connotations of the term “rationalism” rather problematic, and want to separate Descartes — of whom I am much more sharply critical than Brandom seems to be — from Spinoza and Leibniz, with whom I find additional reasons to sympathize. Brandom has contributed to a new understanding of Kant, and has developed a landmark reading of Hegel. I want to help support the broad thrust of these with historical considerations, while reconnecting them with fresh readings of Aristotle, Plato, and other historical philosophers. With some caveats and in spite of Brandom’s own brief comments, I also want to suggest a possible rapprochement with key insights of 20th century French “structuralism”.

A key point common to most of the tendencies mentioned above is an emphasis on the role of difference in making things intelligible. In the context of philosophical arguments, this means that critical distinctions are as important as positive assertions. Contrasts not only greatly facilitate but largely shape understanding. Brandom himself has developed the contrast between inferentialism and the representationalism of Descartes and Locke. He has made extensive use of Wilfrid Sellars’ critique of a “Myth of the Given” associated with most varieties of empiricism, and has also referenced the critique of psychologism developed by Frege and others in a logical context.

I have been using the term “mentalism” for a privileging of contents that are supposed to be immediately present to a personal “mind” that is itself conceived mainly in terms of immediate awareness. It seems to me that Descartes and Locke’s version of this was a historically specific combination of all the above notions from which an inferentialism would seek to distinguish itself — representationalism, the Myth of the Given, and psychologism. I have been concerned to point out not only that Cartesian-Lockean mentalism has historically specific antecedents that long predate modernity (going back to Augustine, with some foreshadowing in Plotinus), but also that a proto-inferentialist countertrend is actually even older, going back to Plato and Aristotle’s emphasis on the primacy of reason and reasoned development.

In A Spirit of Trust, Brandom has among many other things expanded on Hegel’s critique of Mastery. I find this to be of tremendous importance for ethics, and consonant with my structuralist sympathies. I have been concerned to point out how extreme claims of mastery are implicit in the various historical kinds of voluntarism, which all want to put some notion of arbitrary will — or authority attributed one-sidedly to such a will — ahead of consideration of what is reasonable and good.

Usual generalization caveats apply to statements about “isms”. In any particular case where the terms seem to apply, we need to look at the relevant details, and be alert to the possibility that not all aspects of a generalized argument apply straightforwardly. (See also Historiography; History of Philosophy.)

Empiricism

Already in the 1950s, analytic philosophers began to seriously question empiricism. Quine’s “Two Dogmas of Empiricism” (1951), Wittgenstein’s Philosophical Investigations (1953), and Sellars’ “Empiricism and the Philosophy of Mind” (1956) all contributed to this.

Brandom explicates Sellars’ pivotal critique of the empiricist “Myth of the Given” as belief in a kind of awareness that counts as a kind of knowledge but does not involve any concepts. (If knowledge is distinguished by the ability to explain, as Aristotle suggested, then any claim to knowledge without concepts is incoherent out of the starting gate.) Building on Sellars’ work, Brandom’s Making It Explicit (1994) finally offered a full-fledged inferentialist alternative. He has rounded this out with a magisterial new reading of Hegel.

The terms “empiricism” and “rationalism” originally referred to schools of Greek medicine, not philosophy. The original empirical school denied the relevance of theory altogether, arguing that medical practice should be based exclusively on observation and experience.

Locke famously began his Essay Concerning Human Understanding (1689) with an argument that there are no innate ideas. I take him to have successfully established this. Unfortunately, he goes on to argue that what are in effect already contentful “ideas” become immediately present to us in sensible intuition. This founding move of British empiricism seems naive compared to what I take Aristotle to have meant. At any rate, I take it to have been decisively refuted by Kant in the Critique of Pure Reason (1781; 2nd ed. 1787). Experience in Kant is highly mediated. “Intuitions without concepts are blind.” (See also Ricoeur on Locke on Personal Identity; Psyche, Subjectivity.)

In the early 20th century, however, there was a great flourishing of phenomenalism, or the view that all knowledge is strictly reducible to sensation understood as immediate awareness. Kant himself was often read as an inconsistent phenomenalist who should be corrected in the direction of consistent phenomenalism. Logical empiricism was a diverse movement with many interesting developments, but sense data theories were widely accepted. Broadly speaking, sense data were supposed to be mind-dependent things of which we are directly aware in perception, and that have the properties they appear to have in perception. They were a recognizable descendant of Cartesian incorrigible appearances and Lockean sensible intuition. (Brandom points out that sense data theory is only one of many varieties of the Myth of the Given; it seems to me that Husserlian phenomenology and its derivatives form another family of examples.)

Quine, Wittgenstein, and Sellars each pointed out serious issues with this sort of empiricism or phenomenalism. Brandom’s colleague John McDowell in Mind and World (1994) defended a very different sort of empiricism that seems to be a kind of conceptually articulated realism. In fact, there is nothing about the practice of empirical science that demands a thin, phenomenalist theory of knowledge. A thicker, more genuinely Kantian notion of experience as always-already conceptual and thus inseparable from thought actually works better anyway.

Thought and intuition are as hylomorphically inseparable in real instances of Kantian experience as form and matter are in Aristotle. A positive role for Kantian intuition as providing neither knowledge nor understanding, but crucial instances for the recognition of error leading to the improvement of understanding, is preserved in Brandom’s A Spirit of Trust. (See also Radical Empiricism?; Primacy of Perception?; Aristotle, Empiricist?)