by Gene Callahan
Abstraction can be an entertaining and useful activity. But every abstraction falsifies reality simply because it is an abstraction – it is a one-sided emphasis on certain aspects of the real at the expense of neglecting or even denying others. That is not necessarily harmful as long as we remember what we have done. But the abstraction, being simpler and more manageable than the real world, is a seductive fantasy, and the temptation to ignore messy reality and attempt to replace it with a clean and neat dreamworld is ever-present.
Let me offer a few examples to illustrate what I am on about. For instance, Jared Diamond, in Guns, Germs, and Steel, wants to replace the history of the individual with what he seems to think he has founded, namely, “scientific” history. The end result is that he often winds up botching his history. (I have contributed a chapter discussing the errors in Diamond’s understanding of history to a forthcoming book entitled The Meanings of Michael Oakeshott’s Conservatism.) But he is correct in thinking that history is unscientific, at least if, by “scientific,” one means “concerned with abstract universals” – but that is because, as it deals more directly with concrete reality, it represents an advance over the world of science with its general laws that, as Nancy Cartwright points out, lie. (Sudha Shenoy once expressed this idea to me by saying, “The real world is not theoretical; it is historical.”)
I was recently reading The Last Three Minutes by Paul Davies and found him contending that we are not sure whether the universe is a digital or an analog computer. Well, we can, in fact, be quite sure that the universe is not any sort of computer – nor are our minds. A computer is a device built by abstracting a certain aspect of human thought and constructing a machine to implement that abstraction. To then declare that the mind that created the computer, or, even more absurdly, the entire world, “really is” a computer is as if an artist drew a self-portrait and then began to imagine that the drawing was him.
Mary Midgley, in discussing the work of Barrow and Tipler, notes the motive behind this sort of confusion: “They write, in successive sentences, that ‘an intelligent being… is fundamentally a type of computer’, then that ‘a human being is a program designed to run on particular hardware called a human body’, and – still on the same half-page – that ‘a living human being is a representation of a definite program rather than the program itself’. All they want is some formula by which to bypass any large, awkward questions about what a human being really is, and to justify treating it simply as a memory-store, transferable at will to clouds of stellar dust which will outlast the heat death of the universe.” (From her essay “Artificial Intelligence and Creativity.”) That three-year-olds may deceive themselves into thinking that Bugs Bunny is a real rabbit is understandable, but it boggles the mind that educated adults of some intelligence can convince themselves that running a ‘simulation’ of a person on a computer – whatever in the world that is supposed to mean – makes that person go on living.
I could continue to cite examples for many, many paragraphs – for instance, in economics, perfect competition used as a norm rather than a foil springs immediately to mind, and, in biology, the fantastical notion that human behavior is “really” just genes trying to survive – but I hope I have offered enough examples to make myself clear already. To try to view reality as “really” being an instance of some abstraction is to try to live in a fantasy. It can’t, of course, actually be done, but, as Collingwood wrote, “A person may think he is a poached egg; that will not make him one: but it will affect his conduct, and for the worse.”