Debate on Prehistory

This is a bit of a tangent from the usual topics here, but recently I’ve been dwelling on the distinctions between knowledge, well-founded belief, and not-so-well-founded belief, and I’m taking that as the point of departure. It should be no insult to science (and I certainly mean none) to suggest that empirical science aims only at what I’ve been calling well-founded belief, though received views are commonly taken for simple knowledge. The difference is that well-founded beliefs can still potentially be invalidated by new arguments or information, whereas real knowledge ought to be unconditionally valid.

I’ve been fascinated with prehistory since childhood, and in recent decades especially with the emergence of rich human cultures. Much has changed in this field during my lifetime, as relatively well-founded beliefs were replaced by better-founded ones. For example, it is now generally accepted that modern birds are surviving members of the theropod group of dinosaurs that included the raptors and T. rex, and that the extinction of the (other) dinosaurs was caused by a massive asteroid impact at Chicxulub on the Gulf of Mexico ca. 66 million years ago.

Similarly, it is now widely accepted that biologically modern humans emerged in Africa two to three hundred thousand years ago, rather than in Europe only 40,000 years ago. Humans crossed the open sea from Southeast Asia to Australia over 50,000 years ago. If the previously known cave paintings from the late-glacial Magdalenian culture in southwest Europe were not already amazing enough testaments to the human spirit, the Chauvet cave (subject of the wonderful documentary film Cave of Forgotten Dreams by Werner Herzog) was discovered in 1994 to have equally magnificent paintings that turned out to be twice as old (from around 36,000 BP). Göbekli Tepe in Turkey has multi-ton megaliths dating from around 9500 BCE, a little before the earliest evidence of agriculture in that region.

Agriculture is now believed to have independently originated in at least 11 different parts of the Old and New Worlds. Wikipedia now mentions small-scale cultivation of edible grasses by the Sea of Galilee from 21,000 BCE. Sickles apparently used for intensive harvesting of wild grains have been found in the Nile valley from at least 18,000 BCE. The Middle Eastern Natufian culture (ca. 15,000-11,500 BP) was previously thought to have had the world’s oldest agriculture, and still boasts the earliest evidence of baked bread (14,400 BP). Some Natufian portable art bears a striking stylistic resemblance to similar artifacts from Magdalenian Europe at roughly the same time. Numerous archaeologists and anthropologists have suggested that agriculture may have had a very long prehistory, beginning with deliberate efforts to promote the growth of particular wild plants that humans valued.

Currently, there is a big ongoing controversy over the cause of dramatic climate changes that occurred around 12,850 BP, at the beginning of the roughly 1,200-year cold period known as the Younger Dryas. The most recent ice age had begun to recede by around 20,000 BP, and the world had been getting gradually warmer. But then, suddenly, in perhaps only a single year’s time, temperatures fell by an astonishing 9 to 14 degrees Celsius. Then, in somewhere between a few years and a few decades, temperatures apparently rose again by 5 to 10 degrees Celsius. Several massive glacial lakes seem to have emptied suddenly into the ocean, cooling it down, and there is evidence of gargantuan flooding. On a larger time scale of several thousand years including the Younger Dryas, worldwide sea levels are generally accepted to have risen around 400 feet (about 120 meters). Many submerged archaeological sites have already been found, but these could be the tip of the proverbial iceberg.

Due to human-induced climate change, we are currently facing a sea level rise of around 50 feet from melting of the remaining ice caps, which is expected to be catastrophic. Four hundred feet dwarfs that. Today the great majority of the world’s population lives in or near coastal areas, and this may well have been true during the ice age too. Around the time of the Younger Dryas, there is evidence of intensive fishing by cultures like the Jomon of Japan — who also produced pottery older than any known from the Middle East — and the Magdalenians in Europe (not to mention many freshwater fishing villages spread across what is now the Sahara desert).

By this time, humans would have been biologically modern for over 200,000 years, and had been producing magnificent art, at least occasionally, for 20,000 years or more. Stone and bone tools of amazing elegance and sophistication had been in use equally long. All hunter-gatherer cultures known to modern anthropology have complex culture, language, and spiritual beliefs. But somehow, we still harbor the prejudice that hunter-gatherers and “cave people” must have been extremely primitive.

The controversy I mentioned concerns evidence that, like the dinosaur extinction, the Younger Dryas was caused by a cataclysm from space. Since 2007 the “Younger Dryas impact hypothesis” has been hotly debated, but it now appears to be gaining ground. I have no particular stake in what really caused the Younger Dryas; I’m really more interested in its effects on humans. But the controversy potentially provides an interesting case study in how highly intelligent, educated people can confuse apparently well-founded belief with “knowledge” that would supposedly be beyond doubt.

It also happens to be the case that Plato in the Critias gives a date for the sinking of the mythical Atlantis at around the time of the Younger Dryas. I don’t assume there is any accuracy in the details of the story — the island with the circular city and so forth — but I think archaeology already provides the basis for an extremely well-founded belief that late-glacial stone age cultures had already reached very high levels of sophistication, and that much more evidence may be hidden at as yet undiscovered underwater sites. This doesn’t mean people back then were flying around in spaceships or anything, or had magical powers, or even that they produced metal. Our standards for what represents “advanced” culture are highly distorted by our own obsessions with technology and money.

Incidentally, Plato in the Laws also casually suggests that animal and plant species come into being and pass away, as well as something like the succession of human material culture from stone to soft metals to iron. The Critias story is attributed to the Athenian lawgiver Solon, who supposedly heard it from an Egyptian priest during his travels there, but no source is given for the apparently accurate speculations about prehistory in the Laws.

All the modern fringe speculation around the Atlantis myth — and around “historical” readings of mythology in general — has given this stuff a bad name. We ought to suspend belief in things for which the evidence is shaky. But a suspension of belief need not — and should not — necessarily imply active disbelief. Our active disbeliefs ought to be well-founded up to the same standard as our active beliefs, and ought not to fall to the level of prejudice.

Beauty and Discursivity

Plotinus was a huge inspiration for me in my youth. Revisiting a piece of his Enneads just now, I am again struck by the majesty of his thought and writing. These days I have a much more positive view of discursive reasoning, but I first wanted to let him speak for himself.

I still agree that there is far more to knowledge and understanding than an accumulation of propositions. But as a teenager, I definitely considered step-by-step reasoning to be something inferior to the kind of holistic intellectual intuition Plotinus emphasizes when he talks about Intellect. The latter I considered to be the true source of insight — “silent mind before talking mind”.

Nowadays, I think that kind of unitary vision is achievable only as the crowning result of much patient work. I no longer take it to be the original source that discursive reasoning imitates in an inferior way. Intellect or Reason does form a relational whole, and the whole is more important than the parts. But today I would emphasize that the relational whole is an articulated whole, and it is the articulation — the making of connections — that is the real essence.

From many connections, we get larger unities. Larger unities are still the goal, but the work of making connections is what makes such fused views possible. The contemplation of well-formed wholes by the silent mind of an embodied human depends on prior work that must include open discursive questioning and reasoning, if the result is to be genuine.

Aristotle made a vitally important distinction between what is first in itself and what is first for us. To directly aim for the highest truth in itself while being dismissive of what is “first for us” is to disregard our nature as rational animals. Put another way, to directly aim for the highest truth is simply to miss it. This is the kind of illegitimate shortcut that Plotinus himself criticized the gnostics for.

We rational animals need the “long detour” of dialectic to properly grasp any kind of real truth. Otherwise, our visionary experiences will just be fever dreams of the sort that incite fanatics. The goal is not just immediacy but mediated immediacy, as Hegel would say. I think Plotinus at least partially recognized this.

To no longer regard things in the manner of “a spectator outside gazing on an outside spectacle” is to overcome naive dichotomies of subject and object. To really do this, we have to clear our minds of prejudice, not just do meditative exercises to silence internal dialogue. Clearing our minds of prejudice is what requires the long detour.

Identification as Valuation

It might seem as though the sort of categorial interpretation of experience and general application of concepts as practiced by Kant in the Critique of Pure Reason were a purely cognitive affair. Many older readings took it that way, and the passages I quoted from Longuenesse’s commentary don’t explicitly dispel such a notion. My very compressed comparison with Aristotle suggests a reconciliation of Aristotelian practical judgment or phronesis with Kantian judgment, but this relies on an implicit view of the unity of Kant’s thought, partially developed elsewhere. The thrust of it is to overlay the cognitive judgment of the first Critique with the teleological and aesthetic judgment of the Critique of Judgment, and then to read the ethical works in terms of that combined notion.

As a concrete example of how the kind of identification of objects dealt with in the first Critique takes on a valuational angle, Brandom cites the identification of a German by a French person as a “boche” or thick-head, a derogatory term from World War I. This immediately suggests many similar examples of prejudice about various alleged “kinds” of people. Brandom argues that even just by the criteria of logical analysis in Kant’s first Critique, the ethically objectionable “boche” and similar terms are not valid concepts at all. They are false conceptual “universals” that do not reflect any valid generalization, and are only possible through a sort of poor logical hygiene. This shows that such practices of identification are far from neutral. Identification is, after all, a kind of recognition, and Fichte and especially Hegel developed the ethical consequences of this.

Even claims and classifications that are valuationally neutral in themselves can be made in bad faith for some ulterior motive, but the validity of logical operations applied to the real world implicitly presupposes the ethical criterion that we make our judgments in good faith.