Why did the West rise to become the most powerful civilization, the progenitor of modernity, the culture with the most prodigious creators? Answers abound. But it may be that a child psychologist, Jean Piaget, has offered the best theoretical framework to explain the difference between the West and the Rest. Part II of this article continues the examination of George Oesterdiefkhoff’s application and elaboration of Piagetian theory in his ranking of the cognitive development of the peoples of the world. It praises the fundamental insights of this elaboration while arguing that the psychogenetic superiority of European children should be traced back to the appearance of new humans in ancient Greek times who started to realize that their consciousness is the highest point on which all else depends.
Oesterdiefkhoff on the Origins of Western Operational Thinking
Why did Europeans reach the fourth stage of formal operations long before any other peoples in the world? When pressed (in an exchange) about the causes of the emergence of stage four, George Oesterdiefkhoff responded that
schooling and other cultural factors must have been more elaborated in early modern Europe than in Asia, antiquity, and medieval times. The trigger to arouse the evolution of formal operations would have been especially the systems of education (2014b, 376).
He then added:
Admittedly, this begs the question about the causes of this alleged fact and necessitates yet another level of causal explanation…I rather prefer cultural explanations and think about the possible relevance of the advantages of the Greek/Roman alphabet or Aristotelian logic, phenomena fostering the use of abstraction and logic (2014b, 376).
But this is as far as Oesterdiefkhoff goes in explaining why the ancient Greeks reached the fourth stage first. He prefers, rather, to jump right into the modern era, the 17th century, as the century in which formal operational thinking really emerged, from which point he then identifies “the rise of formal operations, the cognitive maturation of people” (in itself) as the “cause” of the rise of modern Europe. He insists that his Piagetian theory “is crucially a causal theory of modernity” (2014b, 375). But no explanation is provided as to the original causes of the rise of formal operational thinking.
If Oesterdiefkhoff’s point is that, without a population in which the children have ontogenetically developed a capacity for formal operations, you can’t have adults engage in formal operational thinking, then I agree that this ontogenetic development is a precondition for a modern society. But we still need an explanation of the rise of “new humans” (to use his own words) capable of formal operational thinking. Does he mean that the Greek/Roman alphabet and Aristotelian logic already contained the seeds of formal reasoning? The alphabet is indeed the most abstract symbolic system of writing in which both consonants and vowels are represented. Can it be denied that Aristotle’s theory of the syllogism is at the level of stage four considering that this theory teaches that we can abstract altogether from the concrete content of an argument and judge its merits solely in terms of how the terms are formally or logically connected to each other?
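Aristotle’s point that validity is a matter of form alone can be made concrete with a small sketch (my own illustration, not drawn from any of the texts discussed here): the syllogistic mood “Barbara” remains valid under every possible assignment of content to its terms.

```python
# A syllogism in the mood "Barbara": All M are P; All S are M; therefore All S are P.
# Formal validity depends only on this pattern, never on what M, P, and S mean.

def barbara_is_valid(universe, S, M, P):
    """Check semantically that whenever both premises hold, the conclusion holds."""
    premise_1 = all((x not in M) or (x in P) for x in universe)   # All M are P
    premise_2 = all((x not in S) or (x in M) for x in universe)   # All S are M
    conclusion = all((x not in S) or (x in P) for x in universe)  # All S are P
    return (not (premise_1 and premise_2)) or conclusion

# The content of the terms is irrelevant: every assignment of sets over a small
# universe satisfies the pattern, which is what "judging by form alone" means.
import itertools
universe = {0, 1, 2}
subsets = [set(c) for r in range(4) for c in itertools.combinations(universe, r)]
assert all(barbara_is_valid(universe, S, M, P)
           for S in subsets for M in subsets for P in subsets)
print("Barbara is valid under every interpretation over a 3-element universe.")
```

The exhaustive check over all 512 assignments of subsets is, of course, only a finite stand-in for the general proof, but it illustrates the stage-four move the paragraph describes: abstracting entirely from content and judging the argument by its logical skeleton.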
Oesterdiefkhoff says that the Ionian philosophers (in the sixth century BC) were the first to establish the concrete operational stage and, in the same vein, implies that Aristotle’s philosophy did not rise above this concrete stage. “Aristotle’s physics strongly resembles the animistic physics of children aged 10 before they establish the mechanistic world view”. “The formal operational stage comes into being predominantly with Descartes in the 17th century” (2016, 304). We can agree that it “comes into being predominantly” in this century, but if we also agree that this stage has “many sub-stages” (as Oesterdiefkhoff points out) why can’t we identify Aristotle’s extensive writings on logic, induction and deduction, affirmations and contradictions, syllogisms and modalities, definitions and essences, species, genus, differentia, the categories, as the beginnings of stage four?
Oesterdiefkhoff knows he needs some origins, and admits he is caught in a chicken-egg dilemma. He writes about “a positive feedback loop” in the interrelationship between “the knowledge taught in schools and universities” in modern Europe and the rise of formal reasoning. But instead of “finding the causes for the emergence of formal reasoning in Europe some centuries ago,” he prefers to say that the “highest stage, the stage of formal operations, directly accounts for the rise of modern sciences” (2014a, 269). “The rise of formal operations in the Western world after 1700 is the single cause of the rise of the sciences, industrialism, enlightenment, humanism, and democracy” (2014a, 287).
This may be understandable since Oesterdiefkhoff is not a historian. He has, in my view, made a fundamental contribution to the “rise of the West” debate by explaining the direct relevance of Piaget’s theory of cognitive development. None of the participants in this debate care to talk about “cognitive development” but assume (along with the academic establishment) that all humans across all cultures and throughout history (since we became homo sapiens in Africa) are equally rational.
Oesterdiefkhoff wants to fit Western history within a stage theory of developmental psychology in which ancient/medieval times are clearly demarcated from modern operational stages. He writes of the “child-like stages” of peoples living in the pre-modern world, including Europeans, and says that the cognitive age of pre-modern adults “typically corresponds to that of children” before the age of 10. “Medieval philosophy, be it Platonic or Aristotelic, regarded nature and reality as living things, ruled by God, and other spiritual forces. It had no concept of physical laws.” “[T]he rise of formal operations became a phenomenon of major importance as late as the 17th century”. “The kernel of Enlightenment philosophy is the surpassing of childlike mental states, of the world of fairy tales, magic, and superstition, as it prevailed in the pre-modern world” (2014a, 292-295).
He qualifies this estimation a bit when he writes that “formal operations…evolved in the intellectual elite of early modern Europe and slowly spread to other milieus”. But his pervading message is that it was only during the 1700s, or even “after the 1700s,” that Europeans came to reach the operational stage. There is no reason to disagree if he means that only the 1700s and after saw sufficient numbers of Europeans maturing into the last stage, making possible a full-scale industrial revolution. But we still need an explanation of the origins of “new humans”, the first humans who matured into the fourth stage.
I understand that many will be tempted to point to social and educational forces as the causes of this initial cognitive transition to operational thinking. They will argue that as literacy was mastered, and as institutes of learning were established, and arithmetic, reading, and other subjects were taught, a major shift occurred in human mental activity. This emphasis on the educational environment is a view often attributed to the Soviet psychologist A.R. Luria (1902-1977). From this claim, it takes only one step to the identification of the “social and economic” mode of production as the “underlying” factor of this cognitive revolution, thus combining Piaget and Marx’s historical materialism. Ancient Greeks developed operational thinking in the new milieu of urban life, growing trade in the Mediterranean, and money exchanges. The flaw in this explanation is that all these new economic ways were present in greater abundance in the older and larger civilizations of Mesopotamia and elsewhere. These commercial and urban activities could be performed effectively with concrete operational habits of thinking.
The view I will propose in a later section, albeit in a suggestive manner, is that Oesterdiefkhoff underplays the importance of self-consciousness, the awareness humans have of their own identity as knowers, in contradistinction to everything that is not-I, in the development of cognition. This view presupposes what I wrote in Uniqueness about the aristocratic culture of Indo-Europeans, and in a number of articles here about the masculine preconditions of individualism, the higher fluidity of the Western mind, the multiple intelligences of Europeans, and the bicameral mind. Europeans were the first to reach the fourth stage, a long time before any other people, because they were a new breed of humans who evolved a uniquely high level of self-awareness: an ability to differentiate clearly between their conscious “I” and the physical world, that is, an awareness of their own minds as distinguished from their appetitive drives, the conventions of the time, and the world of invisible spirits. This introspective awareness of the role of the human mind as the active agent of cognition is what allowed Europeans to reach the fourth stage so early in their history.
It is no accident that the main precursor of the modern concept of mind is the ancient Greek notion of nous. Plato’s identification of three distinct parts of the soul — the rational (nous), the appetitive (epithumia), and the spirited (thymos) — can be classified as the first psychological contribution in the Western tradition. Both the appetitive and the spirited parts of the soul are about desires, but the appetitive part is about the biologically-determined desires humans have for food, sex, and comfort, whereas the spirited part is about “passion”, the emotions associated with the pursuit of honor and glory, feelings of anger and fear. Plato anticipated the Cartesian dualist separation of mind and body when he argued that the mind was immaterial and immortal whereas the body was material and mortal. He also understood that the Indo-Europeans were the most “high spirited” peoples in the world, once observing that “the Thracians and Scythians and northerners generally” were peoples “with a reputation for a high-spirited character” (Francis M. Cornford, trans., The Republic of Plato, 132). Aristotle added to this observation a distinction between the “high-spirited” but barbaric passions of “those who live in Europe” and the “high-spirited” but “intelligent” virtues of the Hellenic peoples. Aristotle further observed that while the peoples of Asia were intelligent, they were “wanting in spirit and therefore they are always in a state of subjection and slavery”.
I trace this high-spirited character to the uniquely aristocratic culture of Indo-Europeans. One may be tempted to think that the intelligent-rational virtues of the Greeks could manifest themselves only when the rational part of their soul was brought to bear on their strong thymotic drives. But Plato was correct in observing that the rational part would be locked in unending combat with the demands of the appetites were it not for the intervention of the spirited part, the strong sense of aristocratic pride and honor of the Greeks. The spirited part helped reason subdue the appetitive part and, in the same vein, helped reason channel the high-strung energies of the spirited part away from barbaric and chaotic actions into a will-to-knowledge, a courage to break through the unknown, and thus bring forth the first sub-stages of the formal operational stage.
Before I say more about this explanation, I want to outline why I think Europeans, under their own initiative, not just in ancient Greek and Roman times, but again in the High Middle Ages, after the decline of the Dark Ages (500 AD to 1100 AD), were the developers of formal operational habits of thinking long before any other people were compelled to adopt these habits under Western pressure.
Ancient Greeks were the First “New Humans”
What about scientific accomplishments during the Hellenistic era (323-31 BC)? Oesterdiefkhoff seems aware of Hellenistic science when he writes that “Roman intellectuals no longer understood the superior contributions of the Hellenistic scholars” (2014a, 281). Can one say the cognitive processes of the Hellenistic elite corresponded to those of children under the age of 10 after reading Lucio Russo’s The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn? Can one really say that the institutionalization of scientific research in the Museum and Library at Alexandria, which contained 500,000+ papyrus rolls and funded 100 scientists and literary scholars, was not an educational establishment promoting formal operational thinking? We learn from Russo’s book about the conics of Apollonius and the invention of trigonometry by Hipparchus, about Archimedes’s work on hydrostatics and the mechanics of pulleys and levers, the first formal science of weight, about Aristarchus’s heliocentric proposal, about Eratosthenes and his calculations to determine the circumference of the earth. The hypothetico-deductive form of Euclid’s Elements is undeniable: circles, right angles, and parallel lines are explicitly defined in terms of a few fundamental abstract entities, such as points, lines, and planes, on the basis of which many other propositions (theorems) are deduced. (Newton, by the way, was still using Euclidean proofs in the Principia.)
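Eratosthenes’ calculation, as traditionally reported, is itself a compact case of formal operational reasoning: from the shadow angle at Alexandria, measured when the sun stood directly overhead at Syene, the earth’s circumference follows by a single proportion. The sketch below uses the conventionally cited figures (about 7.2 degrees and 5,000 stadia); it is my own illustration, not a claim about Russo’s reconstruction.

```python
# Eratosthenes' proportion: the shadow angle at Alexandria is the same fraction
# of a full circle that the Alexandria-Syene distance is of the earth's
# circumference, so the circumference follows from two measured quantities.
shadow_angle_deg = 7.2   # traditionally reported angle, i.e. 1/50 of a circle
distance_stadia = 5000   # traditionally reported Alexandria-Syene distance

circle_fraction = shadow_angle_deg / 360
circumference_stadia = distance_stadia / circle_fraction

# About 250,000 stadia, the figure usually attributed to Eratosthenes.
print(round(circumference_stadia), "stadia")
```

The hypothetical quality of the reasoning, treating the unseen whole (the globe) as a variable fixed by a ratio between two local measurements, is exactly what distinguishes it from concrete operational problem-solving.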
While the Romans did not make major contributions in mathematics and theoretical science, it should be noted that Claudius Ptolemy in the second century AD, while living under Roman rule in Alexandria, wrote highly technical manuals on astronomy and cartography. The Almagest, which postulates a geocentric model, employs pure geometric concepts combined with very accurate observations of the orbits of planets. It postulates epicycles, eccentric circles, and equant points, with the latter being imaginary points in space from which uniform circular motion is measured. Attention should be paid to the “formal-rational” codification and classification of Roman civil law into four main divisions: the law of inheritance, the law of persons, the law of things, and the law of obligations, with each of these subdivided into a variety of kinds of laws, with rational methods specifying how to arrive at the formulation of particular rules. The rules upon which legal decisions were based came to be presented in categories headed by definitions. The most general rules within each of these categories were the principles from which more specific rules were derived. This ordering was in line with a formal operational mode of reasoning, for the rules were presented without reference to the factual settings in which they were developed, and the terminology used in these rules was abstract.
This effort at a rationally consistent system of law was refined and developed through the first centuries AD, culminating in what is known as Justinian’s Code, compiled under Justinian’s reign (527 to 565 AD), which served as the foundation of the “Papal Revolution” of the years 1050-1150, associated with the rise of Canon Law. This Papal Revolution created a modern legal system: by establishing the Church’s corporate autonomy, its right to exercise legal authority within its own domain; by analyzing and synthesizing all authoritative statements concerning the nature of law, the various sources of law, and the definitions of and relationships between different kinds of laws; and by encouraging whole new types of laws.
Oesterdiefkhoff acknowledges in passing that ancient Greece saw “seminal forms of democracy…for a certain period,” a form of state which actually entails, in his view, the fourth stage of cognitive development. If Greek democracy was short lived, what about the republican form of government during ancient Roman times, and the impact this form of government had on the modern Constitution of the United States? We can also mention the representative parliaments and estates of medieval Europe. To be sure, ancient Greece and Rome, and the Middle Ages, were far from the formal operational attainment of modern Europe (even if we draw attention to the persistence of witchcraft and magic in Enlightenment Europe).
It is telling, however, that according to Charles Radding’s book, A World Made by Men: Cognition and Society, 400-1200 (1985), new lines of formal operational reasoning were “well established by 1100” in some European circles. I say “telling” because this book is one of only two that directly employ Piaget’s theory to make sense of Europe’s intellectual history. Oesterdiefkhoff references this work without paying attention to its argument. From a Europe that employed ordeals of boiling water and glowing iron to decide innocence and guilt, that “looked for direction” to divinely inspired pronouncements from superiors, kings, abbots, or the ancients, and that was rarely concerned with “human intention,” we see (after 1100) a growing number of theologians insisting that humans must employ their God-given reasoning powers to determine the truth. Whereas the way theological disputes were settled before 1100 was “by citing authority,” “it was even increasingly the case [after 1100] that the very authority of a text’s author might be denied or disregarded” (p. 204). Using “one’s own judgment” was encouraged, combined with the study of logic as “the science of distinguishing true and false arguments”.
Although Radding is not definitive and barely elaborates key points, he understands that this increase in logical cognition entailed a new awareness of the distinction “between the knower and what is known”, between the I and the not-I. Medieval thinkers actually went beyond the ancient Greeks. For Plato, an idea existed and was correct if its origins were outside the mind, in the world of immaterial and perfect forms, which he differentiated from the untrue world of physical things. Perfect ideas were independent of the human mind, outside space and time, immutable. These ideas were not the products of human cognition. While the only way the human mind could apprehend these ideas was through intense training in geometrical (formal) reasoning, the aim was to reach a world of godlike forms to which the human mind was subservient.
While Aristotle transformed Plato’s forms into the “essences” of individual things, he believed that universal words existed in individual objects, or that abstract concepts could be equated with the essences of things. It is not that Aristotle did not perceive any dividing line between the supernatural, the world of dreams, and the natural; it is that he was a conceptual realist who believed that the contents of consciousness really existed as the essences of particular objects. Medieval nominalists showed a deeper grasp of the relationships between the mind and the external world by abandoning the notion that Forms (or ideas) represent true reality, the source of the mind’s ideas, and arguing instead that general concepts are mere names, neither the essences of things nor forms standing outside the material world. Only particular objects existed, and the role of cognition was to make true statements about the world of particular things even though ideas are not things but mental tools originated by men.
Nominalism represented a higher level of awareness of the role the human mind plays in cognition and of the distinction between the knower and the world outside. While Plato distinguished reason from the world of sensory phenomena, including natural desires, and, in so doing, identified the faculty of reasoning in its own right, he viewed human (intellectual) activity as dependent or subservient to a world of perfect and purely immaterial forms existing independently of the mind. Moreover, among medieval philosophers we find (in Peter Abelard, for example) a greater emphasis on intention, the view that the intention of humans should be considered in determining the moral worth of an action. Human action should not be attributed to supernatural powers or evil forces entering into human bodies and directing it. Humans have a capacity to think through different courses of actions and for this reason human actions cannot be understood without a consideration of human intentions.
Radding brings up the emerging “idea of nature as a system of necessary forces” in opposition to the early medieval idea about miraculous events, as well as the “treatment of velocity itself as a quantity… comparing motion that follows differently shaped paths,” in the work of Gerard of Brussels in the early 1200s (p. 249). A better example of formal operational concepts would be Nicole Oresme’s (1320-1382) depiction of uniformly accelerated motion, which was not about motion in the real world but an effort to explain how motion increases uniformly over time in a totally abstract way. This view anticipated Galileo’s law of falling bodies. Among other examples Radding brings to elucidate this medieval shift to formal operational thinking is the observation that by the reign of Henry II (1133-1189) the idea had taken root that consultation of members of the upper classes should be the norm in the workings of the monarchy, as well as the legal idea that mental competence should be a prerequisite in deciding criminal behavior.
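Oresme’s result, later called the mean speed theorem, can be restated in modern notation (a reconstruction; Oresme argued geometrically, not with these symbols): a body uniformly accelerated from rest to a final velocity covers the same distance as a body moving uniformly at half that velocity for the same time.

```latex
% Mean speed theorem, modern reconstruction (not Oresme's own notation):
% a body uniformly accelerated from rest reaches velocity v = at after time t,
% and covers the distance of a uniform motion at the mean velocity v/2:
s \;=\; \frac{v}{2}\,t \;=\; \frac{1}{2}\,a\,t^{2}, \qquad v = at
```

This is precisely the relation Galileo would later establish for falling bodies, which is why Oresme’s purely abstract treatment of velocity as a quantity counts as a genuine anticipation.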
The Birth of Expectation in the Early Modern Era
Don LePan’s book, The Cognitive Revolution in Western Culture (1989), agrees with Radding that “there is considerable evidence of at least the beginnings of changes in the cognitive processes occurring among the educated elite in the twelfth century” (p. 45). But he believes that new cognitive processes began to spread in the early modern period (or the Renaissance) when Europeans developed the capacity of “expectancy”, which he defines as “the ability to form specific notions as to what is likely to happen in a given situation” (p. 75). It is around this sense of expectancy, LePan says, that most of the cognitive processes Piaget identifies with stage four are clearly evident. This sense of expectancy involves a “rational assessment of probabilities”, evaluating “disparate pieces of information” within a chain of events and circumstances as to whether something is likely to transpire in the future or not, drawing inferences from this information, and projecting “these inferences into the hypothetical realm of the future” (pp. 74-75). Before this capacity developed, the sense of future expectation that humans had was of a predetermined sort, or accidental and beyond reason, in which an outcome was believed to happen “regardless of the intervening chain of events” (p. 79) and without an objective assessment of human intentions and events about how the future event will likely happen.
This sense of expectancy involved the emergence of an ability to think in terms of abstract universal time, as contrasted to the commonly held notion of pre-modern peoples that “time moves at variable speeds, depending on the nature and quality of the events”. Among primitives, the recounting of past events, or history, is merely an aggregation of disconnected anecdotes without any sense of chronology and causal relationship and no grammatical distinction between words referring to past events or to present events. The past is conceived similarly to the present. While early Christian historians did have a sense of chronology, a universal history where all events were framed within a temporal sequence, they did not have a framework of abstract and objective time. They were more interested in detecting the plan of God rather than in how humans with intentions made their own history.
Because pre-modern peoples lack a framework of abstract and objective time, the ‘when’ of an event is merely about before or after other events and not about the length of time elapsed between it and other events. Premodern peoples are also incapable of distinguishing between travelling the same distance and travelling at the same speed. They lack the habit of thinking of velocity as a quantity distinct from those of distance and time. Without a temporal conception wherein one can think of causes as anterior to the effect, it is not possible to consider historical events in terms of causal relations within a sequence of past, present, and future events.
For these reasons, premodern peoples were unable to think in terms of expectations of a hypothetical future, that is, to think about what will happen in the future in terms of multiple chains of causation and the ways in which these causes, sometimes happening simultaneously in different places, may bring about a future effect. LePan is particularly keen to show that William Shakespeare’s originality was a result of his ability to create complex plots which gave the audience “a continual sense of anticipation…by drawing them into [an] unfolding pattern of connections with the past and the future of the story” (p. 175). The curiosity of a premodern audience is restricted to what will happen next within a sequence of episodes in which the reader or audience is confident about what is likely to happen, or what the final outcome will be, and in which there is, therefore, no sense of expectation as to whether it will happen, no concern to envisage the hypothetical possibilities of situations, no weighing of causes and intentions against each other, and no judgment of what the probable outcome of the future will be.
As to what brought this new sense of expectation and the spread of the habits of thought associated with stage four, LePan is inclined to follow A. R. Luria’s argument that the causes of cognitive change are due to social and educational factors. He is rather vague; as society changes, literacy is mastered, the level of education increases, the cognitive processes change. Which came first, new cognitive processes or new ways of educating children or new “underlying economic changes”? They “reinforced each other”. LePan carefully distances himself from any claim that Europeans were genetically wired for higher levels of cognition. Even though he rejects the establishment idea that “all peoples think with exactly the same thought processes”, he believes that all humans are equally capable of reaching this stage. Without realizing that Piaget laid the groundwork for Kohlberg’s moral stages, he insists there is no “direct correlation between degrees of rationality and degrees of moral goodness” (p. 15). The book ends with a strange “postscript” about how he has been living with his wife in rural Zimbabwe for the last two years. He says he wishes the primitive and the modern mind could co-exist with each other, praises the cultural “vitality” of this African country, and then concludes with the expectation that “if something like a new Shakespeare is to emerge, it will be from the valleys of the Niger or the Zambezi” (p. 307). The subtitle of The Cognitive Revolution in Western Culture is Volume I: The Birth of Expectation. He did not write a second volume.
The uniqueness of the West frightens academics. They have concocted every imaginable explanation to avoid coming to terms with the fact that Europeans could not have produced so many transformations, innovations, renaissances, original thinkers, and the entire modern world, without having superior intellectual powers and superior creative impulses. The tendency for some decades now has been to ignore the cultural achievements of Europeans, minimize them, or reduce the “rise of the West” to one transformation, the industrial revolution, currently seen as the only happening that brought about the “great divergence”. The prevailing interpretation paints these achievements as no better than what transpired in any other primitive culture, and, indeed, far worse inasmuch as the West was different only in its imperialistic habits, obsessive impulse for military competition, and genocidal actions against other races.
I agree with Oesterdiefkhoff that the faster cognitive maturation of European peoples “is the decisive phenomenon” in need of an explanation if our aim is to explain the rise of modern-scientific society. I will leave aside the question whether this is the only factor that needs explanation if our aim is to explain other unique attributes of the West, such as the immense cultural creativity of this civilization. Cultural creativity in the arts presupposes a higher level of cognitive development, but it would be one-sided to reduce all forms of creativity to formal operational habits. Once these cognitive habits are established, formal operations can be performed at the highest level of expertise by individuals with a high IQ and a very good education who are not themselves creative. Computers can be programmed to perform formal operations, but it is hard to say that they are self-conscious beings rather than automata unthinkingly executing prescribed actions. Computers do not understand the meaning of the real world they are processing information for; they are not “aware” of what they are thinking about, they have no sense of self, and cannot, therefore, examine their own thoughts, exercise free will, and show a spirited character. Obviously, humans who engage in formal operations are not computers. But if we equate the human intellect with formal operational thinking, and identify this capacity as the defining trait of modern culture and Western uniqueness, we are endorsing a computational model of human consciousness.
Self-Consciousness is Uniquely European
Oesterdiefkhoff and LePan tried to derive the origins of formal operational habits from the prior presence of proto-formal habits: the alphabet, Aristotelian logic, literacy. Knowing this was a self-referential explanation, they also brought in educational institutions, implying thereby that these institutions were created by proto-formal thinkers who taught children formal operations, which is still a self-referential account. We need to step outside the world of formal operations to understand its origins. Oesterdiefkhoff identifies Descartes as the first thinker to offer a systematic methodology for the pursuit of knowledge based strictly on formal operational principles. It is not a coincidence that Descartes is also known as the first modern philosopher in having postulated self-consciousness as the first principle of his formal-deductive philosophy. Descartes showed himself to be very spirited in daring to doubt and repudiate all authority and everything he had been taught, to arrive at the view that the only secure foundation for knowledge was in self-consciousness. The only secure ground for formal operations was his certainty that he was a thinking being despite doubting everything else. Everything could be subjected to doubt except his awareness that his own mind is the one authority capable of deciding what is true knowledge, not the external senses and not any external authority.
The Cartesian idea that self-consciousness on its own can self-ground itself would be superseded by future thinkers who correctly set about connecting self-consciousness to an intersubjective social context (a social setting I would identify as singularly European since no other setting could have generated this Cartesian idea). My point now is that Piaget’s fourth stage, in its modern form, would have been impossible without self-consciousness. Descartes did not invent self-consciousness; ancient Greece saw the beginnings of self-conscious new humans; but he did offer its first modern expression, with more sophisticated expressions to follow. It is worth citing Hegel’s treatment of Descartes in his History of Philosophy:
Actually we now first come to the philosophy of the modern world, and we begin this with Descartes. With him we truly enter upon an independent philosophy, which knows that it emerges independently out of reason…Here, we may say, we are at home, and like the mariner after a long voyage over the tempestuous seas, we can finally call out, “Land!”…In this new period the essential principle is that of thought, which proceeds solely from itself…The universal principle is now to grasp the inner sphere as such, and to set aside the claims of dead externality and authority; the latter is to be viewed as out of place here (Hegel’s Lectures on the History of Philosophy, trans., Haldane and Simpson, vol. III, p. 217).
The key idea is that thought proceeds from itself, out of reason, independently of all external authorities. The biological roots of this declaration of independence by the human thinking subject are to be found in the natural obsession men have shown across all cultures to affirm the male ego in contradistinction to the enveloping womblike environment. This struggle for male identity is only a sexual precondition, though an ever-present one, for the subsequent appearance of self-awareness and the first inklings of human individuality. The first cultural signs of individualism are to be found in prehistoric Indo-European societies uniquely ruled by “high spirited” aristocratic men living in a state of permanent mobility and adversity, for whom the highest value in life was honourable struggle to the death for pure prestige. It was out of this struggle by aristocratic men, seeking excellence in warfare worthy of recognition from their aristocratic peers, that the separation and freedom of humans from the undifferentiated world of nature and the undifferentiated world of collectivist-despotic societies was fostered.
Cognitive and evolutionary psychologists, and philosophers of mind, take it for granted that humans as humans are self-conscious beings, aware of themselves as living. “Consciousness is the greatest invention in the history of life; it has allowed life to become aware of itself,” said Stephen Jay Gould. This is true if by self-consciousness we mean the awareness humans have of their first-person inner experiences: pain, feelings, memories. Human beings are constantly trying “to understand, respond to and manipulate the behavior of other human beings,” and in so doing they learn to read other people’s behavior, feelings, and interests by examining their own thoughts and feelings, imagining what it is like to be in the other person’s shoes. This capacity to reflect on one’s own states of mind and emotions in order to understand the behavior of others is a biologically ingrained trait found in all humans, selected by nature. Nicholas Humphrey, in a very insightful short book, The Inner Eye, identifies this capacity as a form of “social intelligence” that evolved with gorillas and chimps. Consciousness was selected by nature because it enhanced the ability of these primates to survive within social settings characterized by “endless small disputes about social dominance, about who grooms who, about who should have first access to a favourite food, or sleep in the best site” (p. 37). In dealing with these issues, primates “have to think, remember, calculate, and weigh things up inside their heads” (p. 39). They have to learn to read the brains of other gorillas by looking inside their own brains and imagining what it is like to be in another gorilla’s situation.
This social intelligence is very different from, but just as important as, the technical and natural intelligence required for the acquisition of food and for protection in a hostile environment. I am not going to rehearse Steven Mithen’s claim, which supplements Nicholas Humphrey’s argument, that consciousness emerged not when primates learned to predict the social behaviour of other members of the group, but when Homo sapiens during the Upper Paleolithic era achieved enough “cognitive fluidity” between the different intelligences: social, linguistic, technical, and natural. Neither will I rehearse Julian Jaynes’s argument that such advanced peoples as the Mesopotamians and Egyptians were still lacking in self-consciousness, without “an interior self,” subservient to powerful gods controlling and arresting the development of their cognitive processes. In Part I of this article I added Piaget’s scientifically based argument that pre-modern peoples had “childlike” minds, which made it very difficult for them to rely on their own reasoning powers and to attain independence from the influence of unknown spirits and age-old mandates accepted without reflection.
I will conclude by asserting that it goes against the entire history of actual cognition and actual intellectual development, the history of science, mathematics, psychology, physics, and chemistry, to be satisfied with the degree of consciousness found in primates, Upper Paleolithic peoples, and all non-Western civilizations, which never reached the stage of formal operations, and which stagnated intellectually after the Bronze Age and, in the case of China and the Islamic world, after about 1300 AD. Europeans reached a higher level of consciousness starting in ancient Greek times with their spirited discovery of the faculty of the mind and their increasing awareness of their own agency as human beings capable of understanding the workings of the world in terms of self-determined or rationally validated regularities. This was coupled with their growing awareness that man was the measure of all things: a subject with a spirited will-to-be-conscious of himself as a free subject, not a mere object of nature and mysterious forces, but a subject who takes himself to be the ‘highest point’ on which all else depends. This self-consciousness, however, was in its infancy in ancient times, and it would take German idealism during the 1800s to give a full account of how the (self-conscious) I can be shown to lie at the very basis of all knowledge, and, beyond this outlook, to develop a philosophical-historical account demonstrating full awareness that this self-conscious I was self-generated only within the particular cultural setting of Western Civilization.
Dasen, P. (1994). “Culture and cognitive development from a Piagetian perspective.” In W. J. Lonner & R. S. Malpass (Eds.), Psychology and Culture. Boston: Allyn and Bacon.
Humphrey, Nicholas (2002). The Inner Eye: Social Intelligence in Evolution. Oxford University Press.
Radding, Charles M. (1985). A World Made by Men: Cognition and Society, 400-1200. University of North Carolina Press.