In the last two weeks, I profiled Francis Bacon and Galileo Galilei, the so-called fathers of the scientific method and modern science. They marked a massive turning point in the way humans thought and discovered facts about the world, but it still took us a few hundred years to accumulate the facts that mattered most. In one simple chart, here is what I mean by that statement:
Yes, this chart is highly simplified (see this infographic for a more detailed story of the variability of life expectancy since the time of Neanderthals), but the general slopes and durations of these lines are essentially correct and they portray quite starkly a species that was locked into its role on Earth until it was suddenly liberated in the 1900s by the aggregation of knowledge about microbiology, sanitation, political organisation, economic models of production and distribution, medicine, and advances in other sciences and technologies. The characterisation of this long stretch of stagnant life expectancy as "nasty, brutish, and short" was given to us by the next thinker in my examination of the survival of the fittest philosophers--Thomas Hobbes. Though his thought experiment of what life must have been like in the state of nature before societies were formed was essentially correct, he didn't have enough knowledge of history, archeology, and anthropology to know that he was still living in the (only slightly less) nasty, brutish, and short age himself. It's important to remember this as we enter the era of the modern philosophers. Although their arguments and reasoning are often quite sophisticated, there is an explanation for why many of their conclusions appear off the wall—they just didn't have all the facts that the accumulation of our scientific methods has now unearthed. Let's hear some of the good words of Hobbes then before his philosophy is picked apart in the light of our current perch in history.

In the state of nature, profit is the measure of right. (Note, free marketers, that this is not a good thing!)

Leisure is the mother of philosophy, and common-wealth the mother of peace and leisure; where first were great and flourishing cities, there was first the study of philosophy.

If this superstitious fear of spirits were taken away, and with it, prognostications from dreams, false prophecies, and many other things depending thereon, by which, crafty ambitious persons abuse the simple people, men would be much more fitted then they are for civill obedience.

For such is the nature of men, that howsoever they may acknowledge many others to be more witty, or more eloquent, or more learned; yet they will hardly believe there be many so wise as themselves: for they see their own wisdom at hand, and other men's at a distance.

Where men build on false grounds, the more they build, the greater is the ruin.

The source of every crime, is some defect of the understanding; or some error in reasoning; or some sudden force of the passions.

The office of the sovereign, be it a monarch or an assembly, consists in the end for that which he was trusted with the sovereign power, namely the procuration of the safety of the people, to which he is obliged by the law of nature.

(Particularly the laws of nature that govern evolution!)

Thomas Hobbes (1588-1679 CE) was an English philosopher, best known today for his work on political philosophy. His 1651 book Leviathan established the foundation for most of Western political philosophy from the perspective of social contract theory.

Hobbes is famous for describing the natural state of mankind (the state pertaining before a central government is formed) as a war of every man against every man in which life is “nasty, brutish, and short.” This is an apt description of a species characterized primarily by competition. Unchecked competition narrows time horizons and does lead to a shortened existence.

Hobbes’ account of human nature as self-interested cooperation has proved to be an enduring theory in the field of philosophical anthropology. Yes. We cooperate in order to better compete. Over all time horizons, self-interested cooperation explains the survival of genes, the survival of the self, and the survival of the species.

Needs to Adapt
Hobbes was one of the main philosophers who founded materialism (matter is the only substance, and reality is identical with the actually occurring states of energy and matter). He argued repeatedly that there are no incorporeal substances, and that all things, including human thoughts, and even God, heaven, and hell are corporeal matter in motion. Materialism remains our best description of reality, and science has discovered no evidence of god, heaven, or hell. Human thoughts are brain states.

The starting point for most social contract theories is an examination of the human condition absent any structured social order, usually termed the “state of nature.” In this condition, an individual’s actions are bound only by his or her personal power and conscience. From this common starting point, the various proponents of social contract theory attempt to explain why it is in an individual’s rational self-interest to voluntarily give up the freedom one has in the state of nature in order to obtain the benefits of political order. Hobbes advocated absolute monarchy but he also developed some of the fundamentals of European liberal thought: the right of the individual; the natural equality of all men; the view that all legitimate political power must be "representative" and based on the consent of the people; and a liberal interpretation of law which leaves people free to do whatever the law does not explicitly forbid. Cooperative groups outcompete individualistic groups. This is why society develops from “states of nature.” Cooperation is maintained by recognizing the right of the individual, the equality of all men and women, and the consent of the governed. Absolute monarchies are incompatible with consent. Economic theory demonstrates the need to have monopoly providers for public goods such as justice, but representative government with checks and balances in the system is a better solution to the need to enforce and engender cooperation.

Gone Extinct
Leviathan was written during the English Civil War; much of the book is occupied with demonstrating the necessity of a strong central authority to avoid the evil of discord and civil war. In particular, the doctrine of separation of powers is rejected: the sovereign must control civil, military, judicial and ecclesiastical powers. Right diagnosis - wrong solution. A better one was yet to come.

In coming to terms with some of Hobbes' poor political philosophy conclusions, it's important to keep in mind just what a time of upheaval the English Civil War was. To give you an idea of the horrors that Hobbes was witnessing, estimates indicate that "England suffered a 3.7% loss of population, Scotland a loss of 6%, while Ireland suffered a loss of 41% of its population. Putting these numbers into the context of other catastrophes helps to understand the devastation. The great potato famine of 1845–1852 resulted in a loss of 16% of the Irish population, while during the Second World War the population of the Soviet Union also fell by 16%." A man of sympathy can therefore be forgiven for overreacting to these levels of destruction by arguing for the stability of strong monarchies. However, political reform was, and continues to be, necessary to avoid a repeat of the devastation that war and conflict bring to our species. The key to combatting our nasty, brutish, and short lives? It's cooperation among informed citizens in a progressive society organised by those with an understanding of philosophy. This is what will lead us to the pleasant, gentle, and long lives we all wish for ourselves.

The Piazza dei Miracoli, or the Square of Miracles, in Pisa, Italy. It's a place named for supernatural unexplainable phenomena, but paradoxically it's much more famous for engineering (both failures and fixes) and science, because it was the location of Galileo's (apocryphal?) experiment about falling objects, which overturned Aristotle's 2,000-year-old theory of mass that was so horribly wrong. I suppose it was miraculous that mankind held on that long to the idea that heavy objects fall faster than lighter ones in direct proportion to their weight. What else could explain such an unexamined belief?

Last week I wrote about Francis Bacon—the man credited with inventing the scientific method. While he did famously die of pneumonia after experimenting with the use of snow to refrigerate chicken meat, it's one thing to devote your life to science; it's another to do so in the face of great hostility and manage to change the world's worldview. That's what Galileo Galilei managed to achieve from his humble origins in Pisa. He wasn't exactly a philosopher in the way that we use the term today; he was chiefly concerned with the cosmology aspect of metaphysics rather than any of the moral, epistemological, political, or logical aspects of the field. But Galileo "has always played a key role in any history of science and, in many histories of philosophy, he is a, if not the, central figure of the scientific revolution of the 17th Century. When he was born there was no such thing as ‘science’, yet by the time he died science was well on its way to becoming a discipline and its concepts and method a whole philosophical system." As an evolutionary philosopher myself, one whose whole philosophical system is indeed driven by science, I thought it was very important to mark this transition in philosophy when I was making my list concerning the survival of the fittest philosophers. Here is what I briefly said about Galileo at that time:

Galileo Galilei (1564-1642 CE) was an Italian physicist, mathematician, astronomer, and philosopher who played a major role in the Scientific Revolution. Galileo has been called the father of modern observational astronomy, the father of modern physics, the father of science, and the father of modern science.

Stephen Hawking says, "Galileo, perhaps more than any other single person, was responsible for the birth of modern science." He aided the separation of science from both philosophy and religion - a major development in human thought. For his views on heliocentrism, he was tried by the Inquisition, found "vehemently suspect of heresy," forced to recant, and spent the rest of his life under house arrest. Just a brief note to honor the debt we owe this man for his imprisonment and determination in the face of the church.

Needs to Adapt

Gone Extinct


Just what does it mean to say Galileo was at the centre of a scientific revolution? Well, we're drawing the period of medieval philosophy to a close because of this revolution, so let's look at some of the key ideas and people that were a part of it. As you read through this list, try to imagine a world that had not discovered these ideas yet (though to be honest, it's probably impossible for us to do so since so many of our concepts of the world and ourselves are bound up in understanding these things).

  • Nicolaus Copernicus published On the Revolutions of the Heavenly Spheres in 1543, which advanced the heliocentric theory of cosmology.
  • Andreas Vesalius published De Humani Corporis Fabrica (On the Structure of the Human Body) in 1543, which overturned many of Galen's long-accepted anatomical claims through the direct dissection of human cadavers. He also assembled one of the first articulated human skeletons.
  • William Gilbert published On the Magnet and Magnetic Bodies, and on the Great Magnet the Earth in 1600, which laid the foundations of a theory of magnetism and electricity.
  • Tycho Brahe made extensive and more accurate naked eye observations of the planets in the late 16th century, which became the basic data for Kepler's astronomical studies.
  • Sir Francis Bacon published Novum Organum in 1620, which outlined a new system of logic based on inductive reasoning, which he offered as an improvement over Aristotle's philosophical process of syllogism. This contributed to the development of what became known as the scientific method.
  • Galileo Galilei improved the telescope, with which he made several important astronomical discoveries, including the four largest moons of Jupiter, the phases of Venus, and the strange appendages of Saturn (later shown to be its rings), and made detailed observations of sunspots. He also developed the laws for falling bodies based on pioneering quantitative experiments, which he analyzed mathematically.
  • Johannes Kepler published the first two of his three laws of planetary motion in 1609.
  • René Descartes published his Discourse on the Method in 1637, which helped to extend the definition of the scientific method.
  • Antonie van Leeuwenhoek constructed powerful single-lens microscopes and made extensive observations, published beginning in the 1670s, that opened up the micro-world of biology.
  • Isaac Newton (1643–1727) built upon the work of Kepler and Galileo. He showed that an inverse square law for gravity explained the elliptical orbits of the planets, and advanced the law of universal gravitation. His development of infinitesimal calculus opened up new applications of the methods of mathematics to science.

In just over 100 years (lightning speed without modern transportation and communication methods), the entire world changed with the introduction of science. Where did I come from? Where am I? What am I? All of the answers to these age-old philosophical questions were changed forever. We'll discuss Descartes and Newton from this list a little later (as well as many other descendants of the scientific revolution), but for now, let's just finally say goodbye to medieval philosophy with some wise words from the man who marks its end.

The modern observations deprive all former writers of any authority, since if they had seen what we see, they would have judged as we judge.

Philosophy is written in that great book which ever lies before our eyes — I mean the universe — but we cannot understand it if we do not first learn the language and grasp the symbols in which it is written.

All truths are easy to understand once they are discovered; the point is to discover them.

To apply oneself to great inventions, starting from the smallest beginnings, is no task for ordinary minds; to divine that wonderful arts lie hid behind trivial and childish things is a conception for superhuman talents.

In the sciences, the authority of thousands of opinions is not worth as much as one tiny spark of reason in an individual man.

It's time to take stock of where we've been, because we are about to cross a momentous threshold. So far in this series of essays about the survival of the fittest philosophers, I've looked at 21 of history's most influential thinkers. Starting with the foundations of the major religions and their moral philosophies for the world, I looked at Moses and the 10 Commandments, the writings of the Upanishads, Taoism, Buddhism, and Confucianism. Together, these form the basis of 77% of the religious population of the world (with unaffiliated, folk, and "other" making up the rest). Next up in my essays was the birth of proper philosophy in Ancient Greece that brought us logic, metaphysics, epistemology, and political philosophy. These were given to us by the pre-Socratics, Socrates, Plato, Aristotle, Epicurus, the Stoics, and the Skeptics. After these Greeks took our thoughts as far as they could go with the use of our senses alone—without the benefit of any great advancements in science and technology—humanity then turned inward for 1,000 years, battling over doctrines of revelation and their interpretation by religious thinkers such as Jesus, Augustine, Muhammad, Avicenna, Anselm, Averroes, Aquinas, Erasmus, and Luther. This collected canon of ancient wisdom, philosophy, and religion sometimes seems like a mountain of thought upon which we might rely, but as Francis Bacon said:

The age of antiquity is the youth of the world.

It's important to remember that the 2,800 years during which these thoughts were developed is still just a speck in the vast sweep of evolutionary history—a speck that was overwhelmed once we discovered the tools to uncover the rest of time that our universe has existed. The update of Cosmos with Neil deGrasse Tyson is on TV right now and I hope you are all watching it. In one of the recent episodes, they updated the cosmic calendar—a concept first popularised by Carl Sagan in his book The Dragons of Eden: Speculations on the Evolution of Human Intelligence, and then widely expanded when Sagan hosted the original televised version of Cosmos that first aired in 1980. In the cosmic calendar, the entire history of the universe is compressed into a 365-day scale to help our human brains make sense of the vast numbers we get into when discussing the 13.8-billion-year history of the universe. Here's the original 5-minute clip explaining this concept:
So each month is 1.25 billion years. Each day is 40 million years. Each second, almost 500 years. The first humans arrived at 10:30 pm on December 31st. At 11:46, humans tamed fire. At 11:59:20, we finally domesticated plants and animals. The first cities took hold at 11:59:35. All the philosophical works I've profiled were compiled from 11:59:52 to 11:59:59. It's only in the last second of the last minute of the last hour of the last day of the last month of the cosmic calendar that all of our scientific advances have occurred. It's only during that last second that we learned anything about the previous 31,535,993 seconds that were in the cosmic calendar before recorded history. It's only during that last second that we learned anything about 99.999978% of the history of the universe. And yet we lend credence to those ancients who relied upon their 0.000022% of experience?
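The conversions in that paragraph are simple arithmetic, and a minimal sketch makes them easy to check. (This assumes the updated 13.8-billion-year age mapped onto a 365-day calendar; Sagan's original 1980 clip used roughly 15 billion years, which is why his per-month and per-day figures run slightly higher than these.)

```python
# Cosmic calendar arithmetic: map the age of the universe onto one 365-day year.
UNIVERSE_AGE_YEARS = 13.8e9
SECONDS_PER_CALENDAR_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds

years_per_month = UNIVERSE_AGE_YEARS / 12                          # ~1.15 billion real years
years_per_day = UNIVERSE_AGE_YEARS / 365                           # ~37.8 million real years
years_per_second = UNIVERSE_AGE_YEARS / SECONDS_PER_CALENDAR_YEAR  # ~438 real years

# Recorded history occupies roughly the final 7 seconds of the calendar
# (11:59:52 pm to midnight on December 31st).
recorded_history_fraction = 7 / SECONDS_PER_CALENDAR_YEAR

print(f"Years per calendar month:  {years_per_month:,.0f}")
print(f"Years per calendar day:    {years_per_day:,.0f}")
print(f"Years per calendar second: {years_per_second:,.0f}")
print(f"Recorded history as a share of the calendar: {recorded_history_fraction:.6%}")
```

Running this reproduces the 0.000022% figure for recorded history, and therefore its complement: the 99.999978% of cosmic time we only learned about in that final second.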

What happened that changed all this? What happened that unlocked the overwhelming majority of history to our inquisitive minds? The scientific method happened. And it was ushered in by Francis Bacon in 1620 when he published Novum Organum (New Instrument in English). Although Aristotle had "provided specific axioms for every scientific discipline, what Bacon found lacking in the Greek philosopher's work was a master principle or general theory of science, which could be applied to all branches of natural history and philosophy." Novum Organum filled that gap when it "outlined a new system of logic based on the process of reduction, which he offered as an improvement over Aristotle's philosophical process of syllogism. This contributed to the development of what became known as the scientific method" during the scientific revolution.

Under King James I of England, Bacon had risen to the highest political office of Lord Chancellor, but "his international fame and influence spread during his last years when he was able to focus his energies exclusively on his philosophical work, and even more so after his death, when English scientists took up his idea of a cooperative research institution in establishing the Royal Society." As if this all weren't enough, Bacon was also a beautiful writer, contributing many strong quotes to the history of philosophy.

The monuments of wit survive the monuments of power. Knowledge itself is power.

Human knowledge and human power meet in one; for where the cause is not known the effect cannot be produced. 

We cannot command nature except by obeying her.

No pleasure is comparable to the standing upon the vantage-ground of truth.

If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts he shall end in certainties.

Those who have taken upon them to lay down the law of nature as a thing already searched out and understood, whether they have spoken in simple assurance or professional affectation, have therein done philosophy and the sciences great injury.

But by far the greatest obstacle to the progress of science and to the undertaking of new tasks and provinces therein is found in this—that men despair and think things impossible.

Prosperity doth best discover vice, but adversity doth best discover virtue.

If a man be gracious and courteous to strangers, it shows he is a citizen of the world, and that his heart is no island cut off from other lands, but a continent that joins to them.

Nothing doth more hurt in a state than that cunning men pass for wise.

Read not to contradict and confute; nor to believe and take for granted; nor to find talk and discourse; but to weigh and consider.

Some books are to be tasted, others to be swallowed, and some few to be chewed and digested.

Reading maketh a full man; conference a ready man; and writing, an exact man.

Let's see exactly what I wrote about Bacon when I considered him in my original Evolutionary Philosophy book.

Francis Bacon (1561-1626 CE) was an English philosopher, statesman, scientist, lawyer, jurist, author, and pioneer of the scientific method. Bacon has been called the creator of empiricism, and remains extremely influential through his works, especially as a philosophical advocate and practitioner of the scientific method during the scientific revolution. The third US president Thomas Jefferson wrote: "Bacon, Locke, and Newton. I consider them as the three greatest men that have ever lived, without any exception, and as having laid the foundation of those superstructures which have been raised in the Physical and Moral sciences."

Scientific Method - a method or procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses. It bears repeating just how important this is to the discovery of knowledge we need to survive.

Bacon did not propose an actual philosophy, but rather a method of developing philosophy. He argued that although philosophy at the time used the deductive syllogism to interpret nature, the philosopher should instead proceed through inductive reasoning from fact to axiom to law. This is the spirit by which Evolutionary Philosophy hopes to develop its beliefs.
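Purely as an illustration (the scenario and numbers here are my own invention, not Bacon's), that inductive procedure, moving from observed facts to a candidate axiom that must survive testing, can be sketched in a few lines of code, using Galileo's falling bodies as the facts:

```python
# A toy sketch of Bacon's inductive loop: facts -> candidate axiom -> test.
# The "observations" are idealised drop times for balls of different masses;
# ignoring air resistance, fall time is independent of mass, so induction
# should reject Aristotle's "heavier falls faster" axiom.
import math

def drop_time(mass_kg, height_m=10.0, g=9.81):
    """Time for an object to fall height_m under gravity, ignoring air resistance."""
    return math.sqrt(2 * height_m / g)  # note: mass does not appear

observations = [(mass, drop_time(mass)) for mass in (1, 2, 5, 10, 100)]

# Candidate axiom (Aristotle): heavier objects fall in less time.
aristotle_holds = all(
    t_heavy < t_light
    for (m_light, t_light), (m_heavy, t_heavy) in zip(observations, observations[1:])
)

# Candidate axiom (Galileo): fall time is the same for every mass.
galileo_holds = all(abs(t - observations[0][1]) < 1e-9 for _, t in observations)

print("Aristotle's axiom survives the facts:", aristotle_holds)  # False
print("Galileo's axiom survives the facts:", galileo_holds)      # True
```

The point is Bacon's: an axiom earns its keep only by surviving the facts, and Aristotle's does not.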

Needs to Adapt
The end of induction is the discovery of forms, the ways in which natural phenomena occur, the causes from which they proceed. The continuation of science has uncovered that perfect forms do not lie behind existence. Diversity of form, adaptability of existence - these are what allow natural phenomena to survive.

Bacon said that men should confine the sense within the limits of duty in respect to things divine, while not falling in the opposite error, which would be to think that inquisition of nature is forbidden by divine law. Another admonition was concerning the ends of science: that mankind should seek knowledge not for pleasure, contention, superiority over others, profit, fame, or power, but for the benefit and use of life, and that they perfect and govern it in charity. Life is the end goal. Senses must be confined within the limits of what is good for life. This is not divine. It is not from a god. It is from reality and the world we live in. It is profane and it is good.

In 1623, Bacon expressed his aspirations and ideals in New Atlantis. Published posthumously in 1627, it was his creation of an ideal land where "generosity and enlightenment, dignity and splendor, piety and public spirit" were the commonly held qualities of the inhabitants of Bensalem. Taking the definition of piety as being dutiful to oneself and to society (and not to religion), this utopian vision does indeed describe a cooperative species built to survive for the long term.

Gone Extinct

Our first thinker with none of his major contributions having gone extinct. That is the power of the scientific method. Will we continue to use it? Will we use it wisely? As Carl Sagan said at the close of the clip above, "what happens in the first second of the next cosmic year, depends on what we do." In the Anthropocene we are certainly now in, our choices do dominate the path that life on Earth will take. Will we even make it another 500 years? I hope so. It would be a shame to throw away all this work from the last cosmic second.
Sometimes I dream about nailing copies of my book to church doors around the world a la Luther...

In last week's essay, I noted that when Erasmus was charged with “laying the egg that Luther hatched,” he half admitted the charge but said he had expected quite another bird. Erasmus may have opened the door to formal critique of the Catholic church when he published In Praise of Folly in 1511, but what kind of bird was it that charged through that door?

In 1517, when Martin Luther posted a large sheet on the All Saints' Church in Wittenberg, Saxony, his Ninety-Five Theses laid out "a devastating critique of the church’s sale of indulgences and explained the fundamentals of justification by grace alone." This was no mean feat considering that the Spanish Inquisition had recently been established in 1478. Heretics were being burned at the stake (or worse) at the hands of Catholic zealots all over Europe who were defending their dogma. Yet Erasmus kept some of those hounds at bay through the vast weight of his intellect. By the 1530s, for example, his writings accounted for 10 to 20 percent of all book sales. During the early years of the reformation, Erasmus "did much, mostly quietly and through private conferences and correspondence, to ensure that Luther was not abruptly silenced and put to death," even though he didn't fully approve of Luther's ideas and eventually broke publicly with him in 1524. Let's look more at Luther to understand why these enemies of a common enemy could not remain friends. It's a strange tale not often appreciated by all the sects of Christianity that owe their existence to this man.

Luther was born into a peasant family, but one that invested heavily in his education. He was just on the verge of becoming a lawyer in 1505 at the age of 21 when he was caught in a very heavy thunderstorm. Afraid that he was going to die, he screamed out a vow: “Save me, St. Anna, and I shall become a monk!” He survived, of course, as most of us do from thunderstorms, but he did keep his promise to that loud rend in the air to enter the monastery, even though it was a difficult decision that Martin knew would greatly disappoint his parents. Beyond the vow, however, Luther had strong internal reasons to join the monastery, haunted by insecurity about his salvation, which he described as an overwhelming terror, calling these his Anfechtungen, or afflictions. Luther was not alone in this experience. The "late medieval piety that Luther was a part of, which stressed Christ primarily as the avenging Judge, made spiritual terror, guilt, and despair the ordeal of many." Fleeing the fear inflicted by the church, Luther sought comfort in the only place that offered it…that very same church. Who could have guessed? Assurance evaded Luther though, as he grew disenchanted with all he saw on the inside of the institution that had offered him condemnation as well as salvation. He threw himself into the life of a monk, but it did not seem to help. Finally, a mentor told Luther to "focus on Christ and him alone in his quest for assurance," and this led to Luther's break with the need for interventions from a religious hierarchy that he found to be full of abuses—especially nepotism, simony, usury, pluralism, and the sale of indulgences. After a dozen years in service, a mentally tortured Luther nailed his mounting objections to the wall.

Within two weeks, copies of the Ninety-Five Theses had "spread throughout Germany; within two months throughout Europe. In January 1518, friends of Luther translated the Ninety-Five Theses from Latin into German, printed, and widely copied them, making the controversy one of the first in history to be aided by the printing press." Once again, technology enabled progress. It took the Church almost three years to formulate a response, but in June of 1520, Pope Leo X issued a Papal bull outlining forty-one purported errors in Luther's theses. Luther was then summoned by the Holy Roman Emperor Charles V to either renounce or reaffirm his views in a grand assembly—known as a Diet—that was to take place in the town of Worms, Germany in what later became awkwardly known to English speakers as the Diet of Worms.

At the Diet, an imperial prosecutor asked Luther if a collection of his writings were indeed his and if he was ready to revoke their heresies. Luther requested time to prepare his answer and was given until the following day when he came back with an impassioned speech that encouraged others to take up his fight. He said:

Unless I am convinced by the testimony of the Scriptures or by clear reason (for I do not trust either in the pope or in councils alone, since it is well known that they have often erred and contradicted themselves), I am bound by the Scriptures I have quoted, and my conscience is captive to the Word of God. I cannot and will not recant anything, since it is neither safe nor right to go against conscience. May God help me. Amen.

Private conferences were held to determine Luther's fate, but before a decision was reached, Luther fled. During his return home, he disappeared and went into hiding, for one month later the Edict of Worms was issued by Emperor Charles V, which declared:

We forbid anyone from this time forward to dare, either by words or by deeds, to receive, defend, sustain, or favour the said Martin Luther. On the contrary, we want him to be apprehended and punished as a notorious heretic, as he deserves, to be brought personally before us, or to be securely guarded until those who have captured him inform us, whereupon we will order the appropriate manner of proceeding against the said Luther. Those who will help in his capture will be rewarded generously for their good work.

Gimme some of that old time religion!

In January 1521, the pope excommunicated Luther, but this only emboldened him. In the summer of 1521, he "widened his target from individual pieties like indulgences and pilgrimages to doctrines at the heart of Church practices." Revolutionary theologians took his lead and embarked on a radical programme of reform that "provoked disturbances, including a revolt by the Augustinian monks against their prior, the smashing of statues and images in churches, and denunciations of the magistracy." Bands of visionary zealots preached "revolutionary doctrines such as the equality of man, adult baptism, and Christ's imminent return." Luther set about reversing or modifying some of these new church practices, but he was unable to stifle the radicalism that had been unleashed by his passion. Preachers helped instigate the German Peasants' War of 1524–25, during which many atrocities were committed, often in Luther's name, but this was only the beginning of many religious wars in Europe that raged from 1524 to 1648.

Luther spent the rest of his days marrying a nun, raising six children, organising the new church he had helped give birth to, and writing numerous volumes. Unfortunately, among Luther's other major works was his 60,000-word treatise On the Jews and Their Lies, published in 1543, three years before his death. Luther argued that the Jews were "no longer the chosen people but 'the devil's people', and referred to them with violent, vile language. Luther advocated setting synagogues on fire, destroying Jewish prayerbooks, forbidding rabbis from preaching, seizing Jews' property and money, and smashing up their homes, so that these 'envenomed worms' would be forced into labour or expelled 'for all time'. Luther's words 'We are at fault in not slaying them' amounted to a sanction for murder." According to the consensus view of historians, this anti-Jewish rhetoric from a man considered by Germans to be a major prophet contributed significantly to the development of antisemitism in Germany, and provided an "ideal underpinning" for the Nazis' attacks on Jews in the 1930s and 1940s almost 400 years later.

And yet there are Lutherans!

He does not fare well, though, in my brief analysis of him in the survival of the fittest philosophers.

Martin Luther (1483-1546 CE) was a German priest, professor of theology, and iconic figure of the Protestant Reformation.


Needs to Adapt
Luther kicked off the Protestant Reformation with the publication of his “Ninety-Five Theses” on October 31, 1517, which attacked the church’s sale of indulgences. He was plagued by uncertainty and doubt about his own salvation until he found solace in Paul’s epistle to the Romans discussing God’s graciousness to the individual – this allowed him to rebel against the Catholic Church and its indulgences. He felt indulgences placed dependence on traveling salesmen instead of God. Luther was not a proponent of reason and science, but his forceful revolt enabled Christianity to be opened up to the evolutionary force of competition, which ultimately diminished its powers due to its weak intellectual basis.

Gone Extinct
Luther famously wrote that “reason is the devil’s whore.” He held, though, that philosophy and reason are a great aid to society when used properly and a threat only when used improperly. The proper role of philosophy is organizational and as an aid in governance. Reason can be an aid to faith in that it helps to clarify and organize, but it is always second-order discourse; it is faith seeking understanding and never the reverse. Reason is the devil’s whore precisely because it asks the wrong questions and looks in the wrong direction for answers. Revelation is the only proper place for theology to begin; reason must always take a back seat. But reason discovers the truth. Truth must never take a back seat to revelation, faith, or theology. Truth is required for survival. By calling for an abandonment of reason and risking the survival of the species for the sake of its beliefs and its power, religion is the true evil and the true whore selling its soul for its own survival. No more - go extinct already!

Unfortunately, religion is far from extinct at the moment, though its influence is clearly waning. Fortunately for me, this marks the last of the influential philosophers of history who were primarily concerned with religion. From here on out, we can confine our discussions to more rational debates. Thank goodness!

A constant element of enjoyment must be mingled with our studies, so that we think of learning as a game rather than a form of drudgery, for no activity can be continued for long if it does not to some extent afford pleasure to the participant.
     ---- Desiderius Erasmus in a Letter to Christian Northoff (1497)

Oh mankind. Is it any wonder you cannot help but look upon me and go mad with desire. Just outside your grasp, I keep always a bit ahead. Reach out for a touch, but never can you capture. You call me a diversion, an interruption, a disturbance. You do what you can do to try to spurn me with bad labels, with blaming weaknesses of character for your short span of attention. You even test out pills to give you strength against my lures, while extolling the great uses of grit, focus, and resolve. But where would you all be without me in front of your nose. What joy would be in life without some thing to chase and chase. Spend too much, too much, time in deep and quiet contemplation, and one is always led toward a difficult old question. One whose likely answer we would rather not consider. What's that over there? Oh I see, you've caught me anew. I am your own Distraction. Now let me go and hide again.
     ----  In Praise of Distraction (my own attempt at mingling enjoyment with these studies) 

The Erasmus bridge in Rotterdam with its single, strong, bent tower anchored to one side of a river, enabling dozens of smaller strands to support a path to the other side. Is there a better metaphor for the man himself whose landmark work In Praise of Folly buttressed the revolt in Europe against Catholic control of the spirit—a revolt that led eventually to the humanism that lessened the role of the supernatural in daily life? I think not.

In Praise of Folly, published in 1511, starts off with "a satirical passage, in which Folly praises herself; it then takes a darker tone in a series of orations as Folly praises self-deception and madness and moves to a satirical examination of pious but superstitious abuses of Catholic doctrine and corrupt practices in parts of the Roman Catholic Church and the folly of pedants. The essay ends with a straightforward statement of Christian ideals." This short book (less than 100 pages), supposedly written in one week as a gift to Erasmus' friend Thomas More (he of Utopia), laid the foundations of the Protestant Reformation with an unsparing critique of the practices of the Church and its political allies. Over the past four weeks, I've been stuck in the medieval religious musings of Islamic and Christian philosophers--Avicenna, Anselm, Averroes, and Aquinas--who represented the stagnation of European thought from AD 1000-1250 with something akin to a stutterer trying to begin a recitation of the alphabet. A, a, a, a… But then, a further 250 years later, after that stagnation left society rotting in corrupt practices among the powerful, someone finally became more concerned with human affairs than with finding proofs for the existence of beings that offer no such evidence of being.

Erasmus was the most famous and influential humanist of the Northern Renaissance. He was, to quote the Stanford Encyclopaedia of Philosophy at length, "a phenomenally productive writer (the most complete edition of his collected works fills ten large folio volumes) and was the first European intellectual to exploit fully the power of the printed word, making the true center of his career not a university or the court of a secular prince or high prelate but the greatest publishing houses of the Netherlands, Paris, Venice, and—above all—Basel. He was a prolific and influential author in many genres. He was a leading writer on education, author of five influential treatises on humanist educational theory, and an even greater number of widely used and often reprinted textbooks taught in humanistic schools throughout Europe. The guides to theological method and exegesis of the Bible that he wrote as prefaces to the 1516 and 1518 editions of the New Testament mark a major turn in theology and the interpretation of Scripture and posed a serious challenge to the scholastic theology that had dominated university faculties of theology since the thirteenth century. The one genre in which Erasmus wrote no works at all was philosophy, though he often cited ancient philosophers and dealt (normally in a non-philosophical way) with several intellectual problems of interest to philosophers."

In those days, humanism was not so much a philosophy as a method of learning that stood in contrast to the medieval scholastic mode. While scholastics focused on resolving contradictions between authors, humanists studied ancient texts in the original and appraised them through a combination of reasoning and empirical evidence. This is exactly what I am trying to do with my analysis of the survival of the fittest philosophers. So even though he's less of a philosopher and more of a social critic, Erasmus nevertheless marks an important shift in thinking, and so he is worth noting in this series of essays on the evolution of human philosophy. With that in mind, let's see how Erasmus stacks up.

Desiderius Erasmus (1466-1536 CE) was a Dutch Renaissance humanist, Catholic priest, social critic, teacher, early proponent of religious toleration, and theologian. He has been called the Prince of the Humanists, and the crowning glory of the Christian humanists.

Erasmus is most famous for “In Praise of Folly” in which the personification of Folly praises foolish activities of the day, including superstitious religious practices, uncritical theories held by traditional scientists, the vanity of Church leaders, folk beliefs in ghosts and goblins, Christian rituals involving prayers to the saints, and the sale of “indulgence certificates” by the Catholic church to raise money for lavish building projects in return for less time in purgatory. It’s always good to call for an end to wasteful practices that do harm to the species.

Needs to Adapt
Erasmus marks the point where the “new learning” had arrived at the parting of the ways. He tried to free the methods of scholarship from the rigidity and formalism of medieval traditions. His life seems full of fatal contradictions, but it was his conviction that what was needed to regenerate Europe was sound learning, applied frankly and fearlessly to the administration of public affairs in Church and State. All great, except the applications of this learning would eventually undermine the very existence of the church itself.

When Erasmus was charged with “laying the egg that Luther hatched,” he half admitted the charge but said he had expected quite another bird. Unfortunately, he showed cowardice or a lack of purpose by writing that a man may properly have two opinions on religious subjects - one for himself and his intimate friends, and another for the public. Truth should never be hidden from the public. Erasmus probably would not have felt the need to hide his beliefs in later ages, but at least he was another thin wedge cracking the hegemony of the church.

Gone Extinct
After his death his writings were placed on the Index of Prohibited Books by the Roman Catholic Church. It’s not his writings that have gone extinct, but the idea that a book should ever be banned by a religion. What a shame this Catholic practice occurred for over 400 years from 1559 to 1966.

Not bad. Erasmus is also the man who coined the following famous phrase:

In the country of the blind, the one-eyed man is king.

And while he did, unfortunately, live in very blind times, Erasmus managed to open one eye and also had this sage advice to offer:

You must acquire the best knowledge first, and without delay; it is the height of madness to learn what you will later have to unlearn.

Would that we all could take such a path and avoid the madness many of our childhoods inflicted…

Phew. Got a satire you'd like to write for fun? Share it in the comments below. I think we could all use a giggle.

This is a picture of a lecture hall where the very first public dissection of a human body was performed in the oldest university in the world. Do you know where it is? Shouldn't this be a place of pilgrimage for anyone wanting to cherish the role that learning has played in our history? I think it should.

University. The word is derived from the Latin universitas magistrorum et scholarium, which roughly means "community of teachers and scholars." Prior to their creation, apprentices were taught in separate guilds; there was no "one place" you could go to for any and all learning. And most scholarly work (if you can call it that) took place in monasteries where Christian dogma was passed down unchallenged during the 500-600 years after the fall of Rome. Finally, though, around 1080 AD, "scholasticism" was introduced into religious studies by Anselm, who wanted to use reason to (ontologically) prove the existence of his god and thereby justify all the monastic work that had gone toward that belief. These new scholastics became focused on applying logic and facts about natural processes to biblical passages in an attempt to prove their viability. This became the primary mission of lecturers, and the expectation of students in monasteries. As more and more of the products of these monasteries interacted with the world, though, reason finally leaked into the general public. All over Europe "rulers and city governments began to create universities to satisfy a European thirst for knowledge and the belief that society would benefit from the scholarly expertise generated from these institutions. Princes and leaders of city governments perceived the potential benefits of having a scholarly expertise develop with the ability to address difficult problems and achieve desired ends." The first universities in Europe were thus formed in Bologna (1088), Paris (1150), Oxford (1167), Modena (1175), Palencia (1208), Cambridge (1209), Salamanca (1218), Montpellier (1220), Padua (1222), Naples (1224), and Toulouse (1229). The rediscovery of Aristotle's works during this time (which we saw in the story of Averroes last week) also fuelled this general spirit of rational inquiry that had now re-emerged into the world.

Also emerging into this world, just after the 10th ever university was founded, was Thomas Aquinas. Born in 1225 in Roccasecca, a small village midway between Rome and Naples, Aquinas "lived at a critical juncture of western culture when the arrival of the Aristotelian corpus in Latin translation reopened the question of the relation between faith and reason that had remained intact for centuries." The fact that this crisis flared up just as universities were being founded meant that Aquinas (who came from a wealthy family intent on educating him) was well positioned to study these questions of faith and reason, and he ended up being the one to find a new way for these two poles to coexist. The result was one that survived in secular society until the rise of physics tore the religious universe apart. Even today though, in the religious world, Aquinas is "honored as a saint by the Catholic Church and is held to be the model teacher for those studying for the priesthood. In modern times, under papal directives, the study of his works have long been used as a core of the required program of study for those seeking ordination as priests or deacons, as well as for those in religious formation and for other students of the sacred disciplines (Catholic philosophy, theology, history, liturgy, and canon law)."

This sounds crazy, relying on 13th-century writings for modern education, until you realise that religion does not use evidence to progress and Aquinas led the way for that stagnation with quotes such as:

To one who has faith, no explanation is necessary. To one without faith, no explanation is possible.

This perfectly illustrates the unbridgeable divide between the faithful and the secular that has kept religion mired in simplistic thinking for thousands of years. I sometimes grow weary of entering the religious debate over and over, but it is worth remembering this quote from Aquinas to find the strength to continue on:

Better to illuminate than merely to shine, to deliver to others contemplated truths than merely to contemplate.

Isn't that a good motto for philosophers to follow? And like the unsettled times that Aquinas lived in, when Aristotelian reason butted heads with Christian faith, our current information age is a new time of diverse ideas coming together. Like the universities that brought communities of scholars together for the first time, the internet is now bringing together vast new universes of knowledge that are mixing and being distilled for truth. Those who take the time to open-mindedly contemplate the diverse beliefs that existed in their own niches for hundreds or thousands of years can eventually discover the truths that survive the competition of combination and appraisal. Once that task is done, those truths must, as Aquinas said, be shared to illuminate others who remain stuck in their secluded mindsets. I'll continue my own efforts now to illuminate--and to find others who can illuminate me--by sharing my analysis of how Aquinas fared in my survival of the fittest philosophers. Please point out to me where I am wrong so that I may cease being wrong as soon as possible.

Thomas Aquinas (1225-1274 CE) was an Italian Dominican priest of the Catholic Church, and an immensely influential philosopher and theologian in the tradition of scholasticism. The works for which he is best known are the Summa Theologica and the Summa Contra Gentiles. He is considered the Church's greatest theologian and philosopher.


Needs to Adapt
Aquinas was the foremost classical proponent of natural theology. Natural theology is a branch of theology based on reason and ordinary experience. Thus it is distinguished from revealed theology (or revealed religion), which is based on scripture and religious experiences of various kinds. At least this continued the crack in religious leadership that allowed the light of the scientific method to eventually shine through.

Aquinas defined the four cardinal virtues as prudence, temperance, justice, and fortitude. There are, however, three theological virtues: faith, hope, and charity. These are supernatural and are distinct from other virtues in their object, namely, God. The four cardinal virtues do fold into the six categories of virtue enumerated by positive psychology. Hope and charity are contained in two other categories. Faith, if defined as belief in positivity, is a virtue. The religious definition of faith though - belief without proof - is a detriment to life and therefore a vice. None of these virtues are supernatural. All are evolved behaviors that aid in the continued life of the species.

Gone Extinct
Thomas believed that the existence of God is neither obvious nor unprovable. In the Summa Theologica, he considered in great detail five reasons for the existence of God, which he termed the Quinque Viae or Five Ways. (1) The argument of the unmoved mover. Infinite regression questions leave us with the same question, not god as an answer. We still don’t know how the universe began. (2) The argument of the first cause. This is the same infinite regression that leads us back to the question of what happened before the Big Bang. (3) The argument from contingency. Even if something has always existed, there is nothing to say that it is a god who designed the universe and watches over us. (4) The argument from degree. Actually, there is no evidence of varying degrees of perfection. There is only change and adaptation to the environment in order to remain alive. (5) The teleological argument. This precursor of intelligent design ignores the ignorance of life and the blindness of evolution. For those that cannot adapt, the universe is a changing place with no mercy. The search for a proof for the existence of god continues without success.

In Thomas's thought, the goal of human existence is union and eternal fellowship with God. Specifically, this goal is achieved through the beatific vision, an event in which a person experiences perfect, unending happiness by seeing the very essence of God. This vision, which occurs after death, is a gift from God given to those who have experienced salvation and redemption through Christ while living on earth. How sad that the purpose of life was thought to be death. The meaning of life is to live! With further definitions, religion can allow for long-term thinking and living a good life, but the false promises lead too often to wasted sacrifice, missed opportunities, and enabling self-destructive behavior.

Aquinas believed that truth is known through reason (natural revelation) and faith (supernatural revelation). Supernatural revelation (faith) and natural revelation (reason) are complementary rather than contradictory in nature, for they pertain to the same unity: truth. When supernatural revelations contradict each other, they cannot be said to contain any elements of truth. By definition, no supernatural revelation can ever be proven to be better than any other supernatural revelation. Reason is still the only path to truth.

Aquinas never considered himself a philosopher, and criticized philosophers, whom he saw as pagans, for always "falling short of the true and proper wisdom to be found in Christian revelation.” This philosopher is happy to exclude Aquinas from our ranks. Christian revelation falls well short of the wisdom and truth discovered by philosophy and science.

So the world went to university and Christianity immediately and forever found its highest thinker. But he wasn't very bright. Meanwhile, universities went on and on with their own discoveries, and so shall we...

Remember this detail of Plato (left) and Aristotle (right) in The School of Athens fresco by Raphael? Aristotle is gesturing to the earth, representing his belief in knowledge through empirical observation and experience, while Plato is gesturing to the heavens, representing his belief in The Forms. Rinse, wait 1000 years, and repeat. (picture credit Wikipedia)
What is it about philosophers that makes them think there is a world of perfection somewhere out there hanging free from this one? Is it in the nature of those who think hard about the world to lose themselves in their reveries and drift loose to a place where they can dream that they have no ties to the material realm? Or do their difficulties and frustrations with things as they are somehow nurture them to develop these idealist longings? The answer is a bit of both of course, with the "nature x nurture" model explaining both the origin of personalities distributed along a spectrum of being biased towards thought or action, as well as explaining how random environments help exaggerate or blunt those tendencies toward a successful adaptive fit. Explained thusly, it's no wonder we keep seeing these patterns repeated—of thinkers drifting off, only to be tugged back to reality by a clear-eyed empiricist. We first saw this in the perfect forms of Plato, which he thought existed out there somewhere in the ether and were the prior generators of all things in the world. But those forms were dismissed by Aristotle, perhaps the first great scientist, who preferred to start with what he saw and simply explain the world from there.

Two weeks ago, I took a look at Avicenna--the first great philosopher of the Islamic Golden Age. He lived in the far eastern edge of the caliphate in modern-day Uzbekistan and used arguments about infinite regressions and floating men to infer that there must be an essence that precedes the existence of the world. But just as Aristotle rebuffed Plato in Ancient Greece, an Islamic scholar came along to rebuff Avicenna with a more natural, existentialist explanation of what we see. Unfortunately, it took 200 years for this second bright light of the Islamic period to arise, and he did so on the opposite end of the empire, some 4500 miles away in Cordoba, Spain. This Islamic Aristotelian was Averroes.

If you remember from my profile of Aristotle, we only have 31 of his approximately 200 treatises, and the writing that survives "makes heavy use of unexplained technical terminology, and his sentence structure can at times prove frustrating... haphazardly organized, if organized at all... (which) helps explain why students who turn to Aristotle after first being introduced to the prose in Plato's dialogues often find the experience frustrating." One of the reasons Aristotle survives at all is because of the translations and summaries that Averroes undertook for these works. Reporting how he was inspired to write his famous commentaries, Averroes said, "Abu Bakr ibn Tufayl summoned me one day and told me that he had heard the Commander of the Faithful complaining about the disjointedness of Aristotle's mode of expression and the resultant obscurity of his intentions. He said that if someone took on these books who could summarize them and clarify their aims after first thoroughly understanding them himself, people would have an easier time comprehending them. 'If you have the energy,' Ibn Tufayl told me, 'you do it. I'm confident you can, because I know what a good mind and devoted character you have, and how dedicated you are to the art.'"

Important work, these summaries of philosophers too lost in their obtuse thoughts for their own good… Speaking of which, here's how I viewed the contributions of Averroes in my own analysis of the survival of the fittest philosophers.

Averroes (1126-1198 CE) was a Muslim polymath, a master of Aristotelian philosophy, Islamic philosophy, Islamic theology, Maliki law and jurisprudence, logic, psychology, politics, Arabic music theory, and the sciences of medicine, astronomy, geography, mathematics, physics and celestial mechanics.

In ontology, Averroes rejects the view advanced by Avicenna that existence is merely accidental. Avicenna held that “essence is ontologically prior to existence.” The accidental, i.e. attributes that are not essential, are additional contingent characteristics. A hat may be red, it may be old, and (for Avicenna) it may exist. Averroes, following Aristotle, holds that individual existing substances are primary. One may separate them mentally; however, ontologically speaking, existence and essence are one. Yes. More existentialism in history.

Averroes’ most important original philosophical work was The Incoherence of the Incoherence, in which he defended Aristotelian philosophy. Another work was the Fasl al-Maqal, which argued for the legality of philosophical investigation under Islamic law. Averroes, following Plato, accepted the principle of women’s equality. He thought they should be educated and allowed to serve in the military; the best among them might be tomorrow’s philosophers or rulers. Averroes had no discernible influence on Islamic philosophic thought until modern times, though. What a shame for such a large swath of humanity.

Needs to Adapt
Arab philosophers did not have access to Aristotle's Politics. Averroes commented on Plato's Republic, arguing that the state there described was the same as the original constitution of the Arabs. Averroes, following Plato’s paternalistic model, advances an authoritarian ideal. Absolute monarchy led by a philosopher-king creates a virtuous society. This requires extensive use of coercion, although persuasion is preferred and possible if the young are properly raised. Kings, even philosopher-kings, are an untenable inconsistency in a cooperative society. Representative government is required to strengthen social bonds since that is philosophically consistent with the ideal society’s principles. Force may be required to ensure that cheaters do not win, and cooperation increases when punishment from the group is allowed, but no one should be coerced to do the right thing. Raising the young properly would go a long way toward creating this kind of society.

Gone Extinct
According to Averroes, there is no conflict between religion and philosophy; they are different ways of reaching the same truth. He believed in the eternity of the universe. He also held that the soul is divided into two parts, one individual and one divine; while the individual soul is not eternal, all humans at the basic level share one and the same divine soul. Averroes posited two kinds of knowledge of truth. The first is the knowledge of religious truth, which is based in faith and thus could not be tested, nor did it require training to understand. The second is philosophy, which was reserved for an elite few who had the intellectual capacity to undertake its study. The beliefs he held show just how incompatible religion is with philosophy. The universe is not eternal - we can now roughly date it. There are no souls separate from existence. And no one should take religious views blindly. Philosophy, evolutionary philosophy anyway, finds justification for laws and morality that are useful for everyone, not just an intellectual elite.

Influenced by the empirical worldview of Aristotle, Averroes' thoughts could have been a great influence on the direction of the Islamic world. Unfortunately, that empire was about to crumble and have little time for well-thought-out progress. There is "little agreement on the precise causes" of the decline of the golden age of Islam, but in addition to invasions by Mongols and crusaders that brought the destruction of libraries and madrasahs, it has also been suggested that political mismanagement and the stifling of ijtihad (independent reasoning) in the 12th century in favor of institutionalised taqleed (imitation) thinking played a part. The destruction of Baghdad in 1258 by Hulagu Khan (Genghis Khan's grandson and Kublai Khan's brother) is traditionally seen as the approximate end of the Golden Age--a mere 60 years after the death of Averroes. Just as the fall of the Roman Empire stalled scientific explorations and progress by the Aristotelian disciples of that age, any further explorations the Islamic world may have made were halted by yet another disastrous political upheaval.

Fortunately, Averroes' thoughts took hold in a Europe that was ready to listen to Aristotle and the wisdom of ancient Greece again after several centuries of stagnation. Averroes was "the founding father of secular thought in Western Europe" and his detailed commentaries on Aristotle earned him the title of "The Commentator" in Europe. Latin translations of Averroes' work led the way to the popularisation of Aristotle and were responsible for the development of scholasticism in medieval Europe, which we saw the beginnings of last week with Anselm and will continue a bit further next week with our penultimate religious scholar. Stay tuned!

The Canterbury cathedral didn't quite look like this when Anselm was its archbishop from 1093 to 1109. The gothic update visible today didn't happen until after Thomas Becket's murder there in 1170 made the place wealthy with pilgrims who travelled there to honour the martyr. Still, Canterbury was the seat of English Christendom, which, fresh after the Norman conquest, meant that it had broad influence over the British Isles and northwestern France. Anselm had risen to this prestigious position from the unlikely source of a small village in the Italian Alps halfway between Lyon and Milan. He did so on the back of the philosophical writings he undertook while studying and working in a Benedictine abbey in Le Bec-Hellouin near Rouen. I wish I could land such a plum gig based on my philosophical writings. What could he have written that would bring him such fame and power presiding over an institution like this?

Chiefly, Anselm was famous for the Proslogion, written in 1077-1078. As he tells us in the preface of that work, Anselm wanted to find "a single argument that needed nothing but itself alone for proof, that would by itself be enough to show that God really exists; that he is the supreme good, who depends on nothing else, but on whom all things depend for their being and for their well-being." That “single argument” is the one that appears in chapter 2 of the Proslogion, and the one that today we call the ontological argument (so named by Kant--the medievals simply called it Anselm's Argument). Versions of this argument have been defended and criticized by a succession of philosophers from Anselm's time straight through to the present day. Heck, even Stephen Colbert mentioned it in his Google talk in 2012. Correctly understood, Anselm's argument can be summarized as follows:

1. That than which nothing greater can be thought, can be thought.
2. If that than which nothing greater can be thought can be thought, it exists in reality.
3. Therefore, that than which nothing greater can be thought exists in reality.

That's it. Anselm, according to the Stanford Encyclopaedia of Philosophy, thought that "once we have formed this idea of that than which nothing greater can be thought, then we can see that such a being has features that cannot belong to a possible but non-existent object. In other words, hypothesis (2) is true. For example, a being that is capable of non-existence is less great than a being that exists necessarily. If that than which nothing greater can be thought does not exist, it is obviously capable of non-existence; and if it is capable of non-existence, then even if it were to exist, it would not be that than which nothing greater can be thought after all. So if that than which nothing greater can be thought can be thought — that is, if it is a possible being — it actually exists."

Philosophers have of course poked many different holes in this argument for a millennium. Try it for yourself. It's fun! But the one I want to focus on here is the leap Anselm makes between the ideas of the mind and the reality of the universe. A new reader sent me this comment this week: "One of the fundamental questions I ask and have been asked by my philosophy-minded colleagues is the relationship between matter and non-matter and how we (as matter) would ever be able to learn about non-matter?" I'm still working on understanding what this particular reader has in mind when he talks about non-matter, but generally the realm of thoughts and qualities is considered non-matter by idealists or dualists. The redness of a sunset, the taste of a wine, the "bat-ness" of being a bat. But recent neurological investigations into our use of metaphors help explain how we make the leap from facts and observations about the concrete material world to the realm of our ideas about that material world. Here is an excerpt from a fantastic article entitled Metaphors Are Us by Robert Sapolsky, professor of biology and neurology at Stanford University, to help explain what I'm talking about.

"We’re not so special after all. But there are still ways that humans appear to stand alone. One of those is hugely important: the human capacity to think symbolically. Metaphors, similes, parables, figures of speech—they exert enormous power over us. We kill for symbols, die for them. In recent years scientists from leading universities, including UCLA, University College London, and Yale, have made remarkable insights into the neurobiology of symbols. A major finding from their work is that the brain is not very good at distinguishing between the metaphorical and literal. In fact, as scientists have shown us, symbols and metaphors, and the morality they engender, are the product of clunky processes in our brains. Symbols serve as a simplifying stand-in for something complex. (A rectangle of cloth with stars and stripes represents all of American history and values.) And this is very useful. Symbolic language brought huge evolutionary advantages. This can be seen even in the baby steps of symbolism of other species. When vervet monkeys, for instance, spot a predator, they don’t just generically scream. They use distinct vocalizations, different “proto-words,” where one means, “Aiiiiii!, predator on the ground, run up the tree,” and the other means, “Aiiiiii!, predator in the air, run down the tree.” Language pries apart a message from its meaning, and as our hominid ancestors kept getting better at this separation, great individual and social advantages accrued. We became capable of representing emotions in the past and possible emotions in the future, as well as things that have nothing to do with emotion. How did our brains evolve to mediate this complexity? In an awkward way. The best way to shine a light on this unwieldy process is through metaphors for two feelings critical to survival: pain and disgust.

Consider the following: you stub your toe. Pain receptors there send messages to the spine and on up to the brain, where various regions kick into action. This is the meat-and-potatoes of pain processing, found in every mammal. But there are fancier, more recently evolved parts of the brain in the frontal cortex that assess the meaning of the pain. Maybe it’s bad news: your stubbed toe signals the start of some unlikely disease. Or maybe it’s good news: you’re going to get your firewalker diploma because the hot coals made your toes throb. Much of this assessing occurs in a frontal cortical region called the anterior cingulate. This structure is heavily involved in “error detection,” noting discrepancies between what is anticipated and what occurs. And pain from out of nowhere surely represents a discrepancy between the pain-free setting that you anticipate versus the painful reality. Now let’s go a little deeper, based on work by Naomi Eisenberger at UCLA. While lying in a brain scanner, you play a game of virtual catch, where you and two people in another room toss a cyberball around on a computer screen. (In reality, there aren’t two other people, only a computer program.) In the control condition, you’re informed mid-play that there’s a computer glitch and you’re temporarily off-line. You watch the virtual ball get tossed between those two people. Now in the experimental setting, you’re playing with the other two and suddenly they start ignoring you and only toss the ball between them. Hey, how come they don’t want to play with me anymore? Junior high all over again. And the brain scanner shows that the neurons in your anterior cingulate activate. In other words, rejection hurts. “Well, yeah,” you might say. “But that’s not like stubbing your toe.” It is to your anterior cingulate. Both abstract social and literal pain impact the same cingulate neurones. We take things a step further with work by Tania Singer and Chris Frith at University College London. 
While in a brain scanner, you’re administered a mild shock, delivered through electrodes on your fingers. All the usual brain regions activate, including the anterior cingulate. Now you watch your beloved get shocked in the same way. The brain regions that ask, “Is it my finger or toe that hurts?” remain silent. It’s not their problem. But your anterior cingulate activates, and as far as it’s concerned, “feeling someone’s pain” isn’t just a figure of speech. You seem to feel the pain too. As evolution continued to tinker, it did something remarkable with humans. It duct-taped (metaphorically, of course) the anterior cingulate’s role in giving context to pain into a profound capacity for empathy. We’re not the only species with an anterior cingulate, but studies show the human anterior cingulate is more complex than in other species, with more connections to abstract, associational parts of the cortex, regions that can call your attention to the pains of the world, rather than the pain in your big toe. And we feel someone else’s pain like no other species. We extend it over distance to help a refugee child on another continent. We extend it over time, feeling the terror of what are now mere human remains at Pompeii.

Consider another domain where our brains’ shaky management of symbols adds tremendous power to a unique human quality: morality. You’re in a brain scanner and because of the scientist’s weirdly persuasive request, you bite into some rotten food. Something rancid and fetid and skanky. This activates another part of the frontal cortex, the insula, which, among other functions, processes gustatory and olfactory disgust. It sends neuronal signals to face muscles that reflexively spit out that bite, and to your stomach muscles that make you puke. All mammals have an insula that processes gustatory disgust. After all, no animal wants to consume poison. But we are the only animal where that process serves something more abstract. Think about eating something disgusting. Think about a mouthful of centipedes, chewing and swallowing them as they struggle, wiping off the little legs that you’ve drooled onto your lips. Whammo goes the insula, leaping into action, sending out its usual messages of disgust. Now think about something awful you once did, something deeply shameful. The insula activates. It has been co-opted into processing that human invention: moral disgust. Remarkably, the way our brains use symbols to discern disgust and morality also contributes to political ideology. Work by scientists such as Kevin Smith of the University of Nebraska reveals that on the average conservatives have a lower threshold for visceral disgust than do liberals. Look at pictures of excrement or open sores undulating with maggots, and if your insula goes atypically berserk, chances are that you’re a conservative—but only about social issues, say, gay marriage, if you’re heterosexual. And if your insula just takes those maggots in stride, chances are you’re a liberal. Our wobbly, symbol-dependent brains are molded by personal ideology and culture, shaping our perceptions, emotions, and convictions.
Many cultures inculcate their members into acquiring symbols that repel, doing so by strengthening specific neural pathways from the cortex to the insula, pathways that you’d never find in another species. Depending on who you are, those pathways could be activated by the sight of a swastika or of two men kissing. Or perhaps by the thoughts of an abortion, or of a 10-year-old Yemeni girl forced to marry an old man. Our stomachs lurch, and we feel the visceral certainty of what is wrong. And we belong. The same brain apparatus is behind symbols that move us to our most empathic, inclusive, and embracing.

The article goes a little off track from the point I'm making, as Sapolsky explains the physical locations of metaphor for two emotional phenomena, but the point still stands: our physical brain (specifically the anterior cingulate and the insula in Sapolsky's examples) is able to lump attributes together into groups and then name those groups. This use of symbols and metaphors, and their physical residence in the brain, helps to explain the bridge that dualists think spans a chasm between two discrete realms, when in fact the bridge is just another metaphor our physical brains have constructed to help make sense of the material world we are trying to live, survive, and thrive in. This is the explanation from evolutionary philosophy of why the ontological argument has no merit. The symbolic and metaphorical pictures in our minds are just physical brain states. They have no bearing on what actually exists in the cold, uncaring universe we have so recently evolved within. Anselm's insistence that, just because we can think of "that than which nothing greater can be thought," it must follow that such a thing exists, makes no sense. Why would it? We cannot will our other figments of imagination into existence, either.

Let's cut this short now and look at what I had to say about Anselm in my analysis of the survival of the fittest philosophers.

Anselm of Canterbury (1033-1109 CE) was the founder of scholasticism and is famous in the West as the originator of the ontological argument for the existence of God.

Not so much a philosophy or a theology as a method of learning, scholasticism places a strong emphasis on dialectical reasoning to extend knowledge by inference, and to resolve contradictions. Scholastic thought is also known for rigorous conceptual analysis and the careful drawing of distinctions. It originated as an outgrowth of, and a departure from, Christian monastic schools. Anselm may not have used it properly or well, but this rebirth of logic eventually led to the Reformation, scientific method, and the downfall of mystic revelation.

Needs to Adapt

Gone Extinct
Anselm reasoned that if "that than which nothing greater can be conceived" existed only in the intellect, it would not be "that than which nothing greater can be conceived," since it can be thought to exist in reality, which is greater. It follows, according to Anselm, that "that than which nothing greater can be conceived" must exist in reality. This was criticized on the grounds that humans cannot pass from intellect to reality. There are many things that humans can conceive of that do not exist in reality. A perfect circle, for example.

Anselm also stated, “Nor do I seek to understand that I may believe, but I believe that I may understand. For this, too, I believe, that, unless I first believe, I shall not understand.” He held that faith precedes reason, but that reason can expand upon faith. He thus formally accepted false premises and confirmation bias into the institution of religion.

Almost home now. I can hardly wait to continue this use of scholastic reasoning to plow through two more mediocre medieval philosophers before we get to the good stuff. Onwards!

What would you say if you had the knowledge of this entire empire at your disposal in 1000 CE?

Last week, I looked at the soundness of Muhammad's teachings and found them severely lacking. What cannot be argued, however, is that they were effective. After his death in 632, succeeding leaders of the Islamic faith spread the word of their prophet and were unafraid to use the method of jihad to "struggle in the way of Allah" and build larger and larger caliphates, expanding first throughout the Arabian Peninsula, then along the Persian Gulf as well as the Mediterranean, Red, and Caspian Seas, before stretching out all the way from Portugal to India. While the collapse of Rome left Europe groping in the dark ages, this new empire to the south enjoyed several hundred years known as Islam's Golden Age, during which time the extensive texts of the Greco-Roman, Persian, and Indian civilisations were encountered, translated into Arabic, and studied extensively. The spread of this knowledge was aided tremendously by the concurrent spread of the invention of paper. While the Chinese had been using paper since they invented it in 105 CE, it did not move to the West until the defeat of the Chinese in the Battle of Talas in 751 CE, in present-day Kyrgyzstan, at the border between these two major empires. In the Muslim world, with a "new, easier writing system and the introduction of paper, information was democratized to the extent that, probably for the first time in history, it became possible to make a living from simply writing and selling books." (I wish that were easier today!) The whole of the Middle East along the Silk Road came to provide a thriving atmosphere for scholarly and cultural development. Into this world, in 980 CE, near the centre of present-day Uzbekistan, in the city of Bukhara (which rivalled Baghdad as a cultural capital of the Islamic world), the great Avicenna was born—the most famous philosopher of the Islamic Golden Age.

The great city of Bukhara, Uzbekistan. Birthplace of Avicenna.

Avicenna's father was a respected Islamic scholar and he had his son very carefully educated at Bukhara, where "his independent thought was served by an extraordinary intelligence and memory, which allowed him to overtake his teachers at the age of fourteen. As he said in his autobiography, there was nothing that he had not learned when he reached eighteen" (which is a bit modern in its teenage know-it-all-ness, isn't it). Fortunately, Avicenna didn't stop learning at 18 and went on to create an extensive body of work. He wrote almost 450 works on a wide range of subjects, of which around 240 have survived. Of particular note, 150 of his surviving works concentrate on philosophy and 40 of them concentrate on medicine. From these, the best quote I found from him was undoubtedly this one:

An ignorant doctor is the aide-de-camp of death.

Unfortunately, so is an ignorant philosopher, as we have seen repeatedly throughout this series in my examination of the Survival of the Fittest Philosophers. Avicenna was less destructive than most medieval philosophers, and although that's not saying much, let's see exactly what he had to say.

Avicenna (980-1037 CE) was a Persian polymath who wrote almost 450 treatises on a wide range of subjects. His corpus includes writing on philosophy, astronomy, alchemy, geology, psychology, Islamic theology, logic, mathematics, physics, as well as poetry. He is regarded as the most famous and influential polymath of the Islamic Golden Age in which the translations of Greco-Roman, Persian, and Indian texts were studied extensively.

His 14-volume Canon of Medicine was a standard medical text in Europe and the Islamic world until the 18th century. The book is known for its description of contagious diseases and sexually transmitted diseases, quarantine to limit the spread of infectious diseases, and testing of medicines. Some nice contributions to the long lineage of medical science.

Needs to Adapt
Avicenna inquired into the question of being (metaphysics), in which he distinguished between essence and existence. He argued that the fact of existence cannot be inferred from or accounted for by the essence of existing things, and that form and matter by themselves cannot interact and originate the movement of the universe or the progressive actualization of existing things. Existence must, therefore, be due to an agent-cause that necessitates, imparts, gives, or adds existence to an essence. It is still not known what caused the origin of the universe or why matter exists at all. However, it is well known how form and matter interacted to create the progressive actualization of existing things—this is evolution. The infinite regression of the agent-cause argument (who created the first agent?) leads only to the same questions. It does not lead to an all-seeing god.

Gone Extinct
Avicenna wrote his famous "Floating Man" thought experiment to demonstrate human self-awareness and the substantiality and immateriality of the soul. He told readers to imagine themselves created all at once while suspended in the air, isolated from all sensations, which includes no sensory contact with even their own bodies. He argued that, in this scenario, one would still have self-consciousness. The first knowledge of the flying person would be “I am,” affirming his or her essence. That essence could not be the body, obviously, as the flying person has no sensation. Avicenna thus concluded that the idea of the self is not logically dependent on any physical thing, and that the soul should not be seen in relative terms. The body is unnecessary; the soul is an immaterial substance. But bodies cannot just appear all at once suspended in air and isolated from all sensations. The argument is false right from the start. Our bodies are grounded in reality and there are no souls.

Not a particularly big contribution to the progress of philosophy, but we are indebted to Avicenna for his role in keeping inquiry alive during the dark ages. It won't be surprising in a few weeks to see the Islamic Golden Age come to an end though, moving on through one more bright light before the torch is passed to a revived Western Europe. Unfortunately, we haven't reached the light at the end of this dark tunnel just yet.
An ancient intricately carved doorway, falling apart at the edges, leading only into darkness. An apt metaphor for the poor nations housed under the crushing worldview of Islam.

In 2004, Dutch film director Theo Van Gogh worked with the Somali-born writer Ayaan Hirsi Ali (one of the original New Atheists meant to be part of the horsemen of the non-apocalypse), and together they produced the film Submission, which criticized the treatment of women in Islam. The title of the film is a literal translation of the word Islam (although in a religious context it means "voluntary submission to God") and on 2 November 2004, that submission was brutally enforced when Van Gogh was assassinated by a Dutch-Moroccan Muslim for the views expressed in the film. The murderer "initially fired several bullets at Van Gogh as he bicycled to work. Wounded, Van Gogh ran to the other side of the road and fell to the ground. According to eyewitnesses, Van Gogh's last words were: 'Mercy, mercy! We can talk about it, can't we?' The murderer then walked up to Van Gogh, who was still lying down, and calmly shot him several more times at close range. He cut Van Gogh’s throat, and tried to decapitate him with a large knife, after which he stabbed the knife deep into Van Gogh's chest. He then attached a note to the body with a smaller knife" that contained more death threats and polemics against Jews and the West. Terrorist acts such as these make me reticent to discuss the philosophy of Muhammad—the founder of Islam—but much like my analysis of Jesus of Nazareth, it must be done because of the huge influence he has had over the moral beliefs of billions of people over many centuries. Unfortunately, direct quotes such as these:

Even as the fingers of the two hands are equal, so are human beings equal to one another. No one has any right, nor any preference to claim over another. You are brothers.

All those who listen to me shall pass on my words to others and those to others again; and may the last ones understand my words better than those who listen to me directly.

have been widely ignored by extremist elements of the religion who have continued to distort and misunderstand much of Muhammad's more benign teachings. But what, after all, were the actual words of Muhammad? And how do we understand them in the light of modern knowledge? How does he stack up in an analysis of the survival of the fittest philosophers?

Muhammad (570-632 CE) was the founder of the religion of Islam. Discontented with life in Mecca, he retreated to a cave in the surrounding mountains for meditation and reflection. According to Islamic beliefs it was here, at age 40, in the month of Ramadan, where he received his first revelation from God. The revelations, which Muhammad reported receiving until his death, form the verses of the Quran, regarded by Muslims as the “Word of God” and around which the religion is based. Muslims consider him the restorer of an uncorrupted original monotheistic faith of Adam, Noah, Abraham, Moses, Jesus, and other prophets.


Needs to Adapt

Gone Extinct
The Quran presents five pillars as a framework for worship and a sign of commitment to the faith. They are (1) the shahada (creed professing monotheism and accepting Muhammad as God’s messenger), (2) daily prayers, (3) fasting during Ramadan, (4) almsgiving, and (5) the pilgrimage to Mecca at least once in a lifetime. Almsgiving is of course useful for a cooperative species trying to maintain diversity and coherence. In a universe without a god though, forcing the acceptance of one man’s unproven beliefs is harmful to society. Plus, once divine revelation is accepted, who is to say any one revelation is better than another? This creates the opportunity for perpetual uncompromising conflict. Prayers are a drag on efficiency and encourage faith where effort would be better. Intentionally weakening the body through fasting helps one to learn to deal with bodily pain, but spending one month a year in this weakened state is taking it too far. Requiring your followers to visit your birthplace is extremely vain and clearly intended just to boost your religion and the livelihood of your local followers (though they will gladly encourage the practice, giving a self-reinforcing circularity to the rule).

In Shia Islam, there are ten practices that Shia Muslims must perform, called the Ancillaries of the Faith. (1) Salat (ritual prayer five times a day); (2) fasting during Ramadan; (3) almsgiving; (4) an annual taxation of one-fifth of all gain paid to Imams or poor descendants of Muhammad’s Ahl al-Bayt family; (5) pilgrimage to Mecca; (6) Jihad - a religious war with those who are unbelievers in the mission of Muhammad; (7) do the necessary good in life; (8) forbid what is evil; (9) expressing love towards Muhammad's family, Ahl al-Bayt; (10) disassociation with those who oppose God and those who caused harm to Muhammad or his family. Allowing for the usefulness of almsgiving, doing good, and forbidding evil, the rest of the ancillaries are solely focused on the perpetuation of the religion but are in fact very damaging to the human species. Spending hours every day in prayer is monumentally wasteful. Fasting one month a year is too much time spent in a weakened state. Giving hard-earned money to the charlatans who created and run this organization is perpetuating fraud. Declaring war on unbelievers creates an unbridgeable rift in humanity that is a direct threat to the survival of the species.

In line with the prohibition against creating images of sentient living beings, which is particularly strictly observed with respect to God and the Prophet, Islamic religious art is focused on the word. Images are an important way to record and transmit knowledge. No form of learning must ever be banned. Ignorant species go extinct.

The Sharia (literally "the path leading to the watering place") is Islamic law formed by traditional Islamic scholarship, which most Muslim groups adhere to. In Islam, Sharia is the expression of the divine will, and constitutes a system of duties that are incumbent upon a Muslim by virtue of his religious belief. No laws or governments should ever be based on divine will or people who purport to know a divine will. No such thing has ever been proven to exist and acceptance of even one divine will by any small group opens humanity up to competing divine wills and unbridgeable gaps.

Let's move on quietly. Sometimes the best way to convince others is to simply survive and thrive by following your own philosophy.