Independent Things

Having just posted notes on Aristotle’s Metaphysics book Zeta (VII), I wanted to pause for some personal reflections. The hands-on engagement of putting together a textual commentary like that, with extensive quotes, always gives me a quality of insight into the material that I don’t get from just reading or re-reading a text.

One of the ways Aristotle stands out as a philosopher — to speak a bit figuratively — is his philosophically generous attitude toward not only living, “independent” beings, but ordinary “things” of all sorts. This carries over into his ethics.

Engagement in the world, approached the right way, need be no distraction from our essential concerns. Rather, for Aristotle it is a fulfillment of the “purpose” of the kind of beings that we are. He encourages us to cultivate a feeling of being fundamentally at home in life in the world, a feeling strong enough to remain ultimately unshaken by our emotional responses to events and circumstances. By contrast, Plotinus, for instance, though appreciative of beauty in all its forms, ultimately directs our attention both spiritually and philosophically away from the world and toward the One. Modern philosophers tend to view the world as inert matter for us to manipulate, not something with which we would feel kinship and a sense of belonging.

Hegel criticizes Kant for being “too tender” toward objects, but I feel that this and some other remarks are a bit lacking in interpretive charity, even though Hegel is deeply Kantian in many ways. In particular, I have a lot more sympathy for Kant’s notion of “things in themselves” than Hegel did.

Kantian things in themselves don’t exactly align with either Aristotle’s notion of independent things or the what-it-is of those things, but they have relations to both, which may suggest an alternate way out of the Kantian “impasse” that troubles Hegel. What Hegel regards as an unresolved impasse in Kant in this area is the irreducible gap Kant sets up between knowledge and things in themselves. But Aristotle also says we do not have knowledge of independent things or their what-it-is.

We may have knowledge of their articulations, but articulations are only expressible in terms of universals (words with posited meanings that are applicable to multiple things), while independent things and their what-it-is are particulars. Therefore, for Aristotle too there will be a sort of Kantian gap between knowledge and independent things. I have praised this as a kind of “epistemic modesty”.

We have only experience and acquaintance with independent things, not knowledge. We may also dialectically inquire, interpret, and make judgments about them, thus reaching relatively well-founded belief, but we cannot know them, because they are particulars independent of us, while all knowledge (episteme) is discursive.

When it comes to the what-it-is of things as distinct from the independent things themselves, we have no experience or acquaintance either, but only the “long detour” of dialectic, interpretation, and judgment. This, it seems to me, is what Hegel’s logic of essence addresses. In the logic of essence, Hegel speaks to Aristotelian considerations, and I would now say more specifically that Hegel’s logic of essence explores more or less the same dialectical level as Metaphysics book Zeta.

Kant’s things in themselves seem utterly remote and mysterious to nearly everyone — I dare say much more so than the Aristotelian what-it-is. A historical reason for this is not far to seek. Kant’s intellectual formation was in the milieu of the Wolffian school, within which the small fraction of the works of Leibniz published in his lifetime played a leading role.

Leibniz developed the highly original notion of the “complete essence” of a thing, corresponding to the way God would know it: every true statement about the thing, including all the empirical facts applicable to its past, present, and future. Leibniz’ God is concerned with the totality of logical truth about a thing.

From the point of view of Aristotle or Hegel, this turn to the totality of logical and factual truth abolishes the distinction between essence and what is not essence. It thus effectively abolishes the more specific concept of essence, and with it any “deeper truth”. An emphasis on complete essence also foregrounds something we could not possibly experience over the sensible independent things with articulable properties that we do experience.

For Leibniz, naturally enough, only God knows complete essences. Humans could not possibly know them. What I want to suggest here is that the reason the Kantian thing-in-itself is inherently unknowable by us is that it basically is a Leibnizian complete essence.

Because a complete essence is no longer a proper what-it-is that can potentially be distinguished from the many incidental facts about a thing, it is far less tractable to Aristotelian or Hegelian dialectic than a what-it-is that at least potentially can be so distinguished. A complete essence poses head-on what Hegel calls the “problem of indifference”, which plagued early modern philosophy. Among all the true statements about a thing, there is no clear way to pick out which would be more relevant to what Aristotle would call the articulation of what-it-is.

While Aristotelian independent things and their what-it-is are unknowable because they are particulars, they remain relatively tractable to dialectical inquiry, and are therefore not radically unknowable to humans in the way a complete essence or thing in itself would be. Certainly Aristotle seems to say more that is meaningful about them than Kant is able to say about things in themselves.

Hegel wants to abolish things “in themselves” — not at all because he wants to abolish Aristotelian independent things or their what-it-is, but because he objects both to the Hermetic isolation of complete essences from one another and to the problem of indifference that complete essences pose. He in effect goes back to Aristotle on this.

It is important to emphasize that the independence of an Aristotelian independent thing means it cannot be just an object of consciousness. It is supposed to be a reality in its own right. While this is not the only point of view we may adopt, the kind of deeper truth that Aristotle, Kant, and Hegel all seek is not to be found by fleeing the world and leaving such realities behind.

If we accept an Aristotelian revision of the Kantian gap between knowledge and what is, the gap no longer brings inquiry to a halt. Then the broadly Kantian view that there is a gap and the broadly Hegelian view that we can go a long way toward overcoming it can both be sustained. (See also Practical Wisdom.)

Next in this series: Toward Potentiality and Actuality

Debate on Prehistory

This is a bit of a tangent from the usual topics here, but recently I’ve been dwelling on the distinctions between knowledge, well-founded belief, and not-so-well-founded belief, and I’m taking that as the point of departure. It should be no insult to science (and I certainly mean none) to suggest that empirical science aims only at what I’ve been calling well-founded belief, though received views are commonly taken for simple knowledge. The difference is that well-founded beliefs can still potentially be invalidated by new arguments or information, whereas real knowledge ought to be unconditionally valid.

I’ve been fascinated with prehistory since childhood, and in recent decades especially with the emergence of rich human cultures. Much has changed in this field during my lifetime, as relatively well-founded beliefs were replaced by better-founded ones. For example, it is now generally accepted that modern birds are surviving members of the theropod group of dinosaurs, the group that included raptors and T. rex, and that the extinction of the (other) dinosaurs was caused by a massive asteroid impact in the Gulf of Mexico ca. 66 million years ago.

Similarly, it is now widely accepted that biologically modern humans emerged in Africa two to three hundred thousand years ago, rather than in Europe only 40,000 years ago. Humans crossed the open sea from Southeast Asia to Australia over 50,000 years ago. If the previously known cave paintings from the late-glacial Magdalenian culture in southwest Europe were not already amazing enough testaments to the human spirit, the Chauvet cave (subject of the wonderful documentary film Cave of Forgotten Dreams by Werner Herzog) was discovered in 1994 to have equally magnificent paintings that turned out to be twice as old (from around 36,000 BP). Gobekli Tepe in Turkey has multi-ton megaliths dating from 9500 BCE, a little before the earliest evidence of agriculture in that region.

Agriculture is now believed to have independently originated in at least 11 different parts of the Old and New Worlds. Wikipedia now mentions small-scale cultivation of edible grasses by the Sea of Galilee from 21,000 BCE. Sickles apparently used for intensive harvesting of wild grains have been found in the Nile valley from at least 18,000 BCE. The Middle Eastern Natufian culture (ca. 15,000-11,500 BP) was previously thought to have had the world’s oldest agriculture, and still boasts the earliest evidence of baked bread (14,400 BP). Some Natufian portable art bears a striking stylistic resemblance to similar artifacts from Magdalenian Europe at roughly the same time. Numerous archaeologists and anthropologists have suggested that agriculture may have had a very long prehistory, beginning with deliberate efforts to promote the growth of particular wild plants that humans valued.

There is a big ongoing controversy over the cause of dramatic climate changes that occurred around 12,850 BP, at the beginning of the 1200-year period known as the Younger Dryas. The most recent ice age had begun to recede by around 20,000 BP, and the world had been getting gradually warmer. But then, suddenly, in perhaps only a single year’s time, temperatures fell by an astonishing 9 to 14 degrees centigrade. Then, in somewhere between a few years and a few decades, temperatures apparently rose again by 5 to 10 degrees centigrade. Several massive glacial lakes seem to have suddenly been emptied into the ocean, cooling it down, and there is evidence of gargantuan flooding. On a larger time scale of several thousand years including the Younger Dryas, worldwide sea levels are generally accepted to have risen around 400 feet (roughly 120 meters). Many submerged archaeological sites have already been found, but this could be the tip of the proverbial iceberg.

Due to human-induced climate change, we potentially face an eventual sea level rise of around 50 feet from melting of the remaining ice caps, which would be catastrophic. Four hundred feet dwarfs that. Today a large share of the world’s population lives in or near coastal areas, and this may well have been true during the ice age too. Around the time of the Younger Dryas, there is evidence of intensive fishing by cultures like the Jomon of Japan — who also produced pottery older than any known from the Middle East — and the Magdalenians in Europe (not to mention many fresh-water fishing villages spread across what is now the Sahara desert).

By this time, humans would have been biologically modern for over 200,000 years, and had been at least occasionally producing magnificent art for at least 20,000 years. Stone and bone tools of amazing elegance and sophistication had been in use equally long. All hunter-gatherer cultures known to modern anthropology have complex culture, language, and spiritual beliefs. But somehow, we still have the prejudice that hunter-gatherers and “cave people” must have been extremely primitive.

The controversy I mentioned concerns evidence that, like the dinosaur extinction, the Younger Dryas was caused by a cataclysm from space. Since 2007 the “Younger Dryas impact hypothesis” has been hotly debated, but it now appears to be gaining ground. I have no particular stake in what really caused the Younger Dryas; I’m really more interested in its effects on humans. But the controversy potentially provides an interesting case study in how highly intelligent, educated people can effectively confuse apparently well-founded belief with “knowledge” that would supposedly be beyond doubt.

It also happens to be the case that Plato in the Critias gives a date for the sinking of the mythical Atlantis at around the time of the Younger Dryas. I don’t assume there is any accuracy in the details of the story — the island with the circular city and so forth — but I think archaeology already provides the basis for an extremely well-founded belief that late-glacial stone age cultures had already reached very high levels of sophistication, and that much more evidence may be hidden at as yet undiscovered underwater sites. This doesn’t mean people back then were flying around in spaceships or anything, or had magical powers, or even that they produced metal. Our standards for what represents “advanced” culture are highly distorted by our own obsessions with technology and money.

Incidentally, Plato in the Laws also casually suggests that animal and plant species come into being and pass away, as well as something like the succession of human material culture from stone to soft metals to iron. The Critias story is attributed to the Athenian lawgiver Solon, who supposedly heard it from an Egyptian priest during his travels in Egypt, but no source is given for the apparently accurate speculations about prehistory in the Laws.

All the modern fringe speculation around the Atlantis myth — and around “historical” readings of mythology in general — has given this stuff a bad name. We ought to suspend belief in things for which the evidence is shaky. But a suspension of belief need not — and should not — necessarily imply active disbelief. Our active disbeliefs ought to be well-founded up to the same standard as our active beliefs, and ought not to fall to the level of prejudice.