
The Failure of Philosophy on the Big Questions

Philosophy is commonly associated with the big questions of life. For example, a Google search leads to a number of books, articles, and other materials linking philosophers with such questions. The question here is, does philosophy deserve that association?

What Are the Big Questions?

Granted, people may differ on what they consider most important at any moment. If your boat is sinking in the middle of the ocean, your big questions may include “Can we plug the hole?” and “Is there a life raft?” But under ordinary circumstances, lists of really grand questions in life tend to be short and similar, from one source to another. Here, for example, are the topics listed in the contents of a book by Solomon and Higgins (2013):

  • The meaning of life
  • God
  • The nature of reality
  • The search for the truth
  • Self
  • Freedom
  • Morality and the good life

Similarly, the table of contents from a book by Sample, Mills, and Sterba (2004) lists these as “the big questions”:

  • What can we know?
  • What can we know about the nature and existence of God?
  • Are we ever free?
  • Does our existence have a meaning or purpose?
  • How should we live?

Blackburn (2013) phrases similar concerns in somewhat different terms (and adds some that may be better answered by scientists than by philosophers):

  • Am I a ghost in a machine?
  • What is human nature?
  • Am I free?
  • What do we know?
  • Are we rational animals?
  • How can I lie to myself?
  • Is there such a thing as society?
  • Can we understand each other?
  • Can machines think?
  • Why be good?
  • Is it all relative?
  • Does time go by?
  • Why do things keep on keeping on?
  • Why is there something and not nothing?
  • What fills up space?
  • What is beauty?
  • Do we need God?
  • What is it all for?
  • What are my rights?
  • Is death to be feared?

There is not terribly much difference among those lists. A student, assigned to boil them down into the Top Ten Issues, might mention something like existence and nonexistence, reality and knowledge, consciousness and beauty, goodness and freedom, and God and the universe.

How Is Philosophy Doing on the Big Questions?

Imagine a world in which contemporary philosophers had arrived at answers to the big questions, and were effectively communicating those answers to the college students sitting in their classes. In such a world, the self-help sections in bookstores (and the self-help websites online) would probably be much fewer, smaller, and less popular. Religious nuts, spouting nonsense, would get nowhere with a public familiar with philosophy’s answers to the big questions. Politicians would be philosopher-kings, succeeding only to the extent that they could engage educated listeners with reasoned defenses of their preferred views on those questions.

Sad to say, the train went off the tracks somewhere. Self-help has long been a booming business. Religion and politics are the jokes that rule us. Hardly anybody thinks that philosophy, of the type taught in universities, has much relevance to the real world. Yes, a few times per century, some philosopher exerts far-reaching albeit gradual influence upon society; and yes, within other fields of knowledge, there is the occasional intellectual who understands philosophers’ insights, and applies them to his/her own work. But those are exceptions that prove the rule. There is an enormous contrast between what could be happening, as illustrated in those exceptions, and what is actually happening in the overwhelming bulk of philosophical study and writing.

As a practical matter, philosophers have long been pulling a bait-and-switch — holding out the promise of useful education, so as to get people to take their classes and buy their books, but then disappointing generation after generation of students with extremely complex texts that, very often, degenerate into hairsplitting trivia. Students can certainly pick up some ideas, and some familiarity with forms of intellectual debate, that may be useful in their future careers in other fields — although there are no guarantees, as philosophical discussion and reasoning can be very alien to the working world.

The point here is not that philosophy is a complete waste of time. It is that philosophy is a failure for purposes of providing answers to the big questions.

It is not that philosophers have not tried to answer the big questions. It is that, as we learn in philosophy class, every answer has its assumptions, its limits, its weaknesses. The real bait-and-switch is that, with few exceptions, those complex and trivial texts build to a single conclusion: there are not really any answers to the big questions. There are only unsatisfactory ways of attempting to provide such answers.

I do believe that that conclusion is correct — that the philosophers have not been lying to us, that for the most part there truly are no completely satisfactory answers to the big questions. Then again, that is precisely what someone like me would believe — someone who has followed the occasional philosophical debate far enough to arrive at the conclusions expressed in the preceding paragraph. With the aid of a bit of background reading, I, or someone like me, could probably poke holes in just about any big answer that someone might suggest. Persons with this kind of education tend to function as skeptics toward the very notion that there might actually be a useful answer to a big question.

Here’s an example. Take the first topic on the first of those three lists (above): the meaning of life. The Stanford Encyclopedia of Philosophy entry on “the meaning of life” says that that topic has interested philosophers since the time of Aristotle. But that entry also says that, somehow, “it is only in the last 30 years that debate with real depth has appeared.” How is that possible? Nor has that deeper contemporary debate led anywhere in particular. The encyclopedia entry suggests that — consistent with philosophy’s established track record — it has yielded, not answers, but rather more questions:

When the topic of the meaning of life comes up, people often pose one of two questions: “So, what is the meaning of life?” and “What are you talking about?”

The entry goes on to state that some people have debated the meaning of life’s “meaning” — but this, too, has not yielded definitive insight:

If talk about meaning in life is not by definition talk about happiness or rightness, then what is it about? There is as yet no consensus in the field.

In further discussion, the entry indicates that some philosophers ascribe meaning to life as it relates to God, while others prefer a sense of life’s meaning that relates in some way to one’s eternal soul. Still others focus on life’s meaning in non-supernatural terms, having to do with either the subjective individual perspective or something else, external to us, that confers meaning upon life regardless of subjective mental state. Finally, there are nihilist or pessimistic perspectives, in which “what would make a life meaningful either cannot obtain or as a matter of fact simply never does.”

So there you are. There, in a nutshell, is philosophy’s answer to the question of the meaning of life. The answer is, it depends on which philosopher you agree with. Very helpful. That and five dollars will get you a cup of coffee.

The true state of affairs is not that philosophy grapples with the big questions in a serious and responsible way. The true state of affairs is that, in the words of a New York Times article, philosophy suffers from “an embarrassing failure, after over 2000 years, to settle any of its central issues.”

Certainly there are people who enjoy philosophizing for its own sake, sitting around and batting ideas back and forth. For that sort of person, big questions can actually be unrewarding, as they tend to involve messy combinations of fact and feeling. Indeed, most important questions in life are like that. When you have a real-life problem, you might entertain various abstract notions, but at the end of the day you need a practical answer.

Suppose, as a relatively simple example, that you’re trying to decide whether to adopt a child. That’s not one of the big questions. But it illustrates a kind of situation in which someone does have a burning need for an answer. It’s not something that you can futz around with for years, and in the end just shrug and say, “Well, I guess there are no absolutely right or wrong answers.” People who bring personal interest and immediate need to the big questions do not want someone to diddle them for a while. They want workable conclusions to inform their lives. And the need can be urgent — in the case of someone who is losing his/her religion, for example; in the case of someone considering suicide, or struggling with deep personal loss.

Philosophy tends to provide everything except that sort of working conclusion. In that sense, the bait-and-switch description may not be quite right; perhaps the better characterization is that philosophy is a subterfuge, a means of identifying the people who are most likely to seek out and live by specific answers to big questions, and persuading them that it is silly or at least unrealistic to seek such answers. Philosophy is, indeed, a debilitating subterfuge, insofar as its study tends not even to equip the student with a sophisticated alternative. Most students will not clearly and permanently digest and remember what the philosophers have actually said on a specific question. Instead, what the students tend to retain is a general belief that there is probably some good reason why any attempted answer to such a question is flawed and should be ignored.

If the student ever does arrive at a point in life where s/he needs real answers to big questions, s/he is likely to be found in the self-help aisle, or looking into the words of various physical and social scientists or religious leaders — more or less as s/he would have done if s/he had never read a word of philosophy. In the works of those self-help, scientific, and religious writers, the student may encounter references to various philosophers, and may once again be reminded that philosophy claims to be at the root of the big questions; but for the most part such references will be historical in nature. They will be reminders that, if you want to pretend to wrestle with big questions, you should consider wasting a few years in philosophy classes.

Philosophy vs. Metaphilosophy

Philosophy used to be done by people like Plato and Aristotle, who would try to articulate relatively straightforward solutions to big questions. But then readers noticed problems with the way that Plato et al. formulated or answered such questions. Over time, it developed that reasoned approaches to grand philosophical questions were invariably problematic. There was always some devil lurking in the details. Thus philosophy became more of a historical affair, like the history of the Roman Empire or of ancient Christianity, in which the early deeds of great leaders gradually devolved into the baffled and increasingly ineffectual scrabblings of minor devotees. At a certain point, attempting to get an overview of all that material, you grasp that it is essentially a history lesson — and perhaps an unnecessarily complicated one at that — and you move on, in search of better alternatives.

We see, in other words, that philosophy as currently taught in college courses, and as conveyed in books about philosophy, is a largely bloodless affair, conducted by people with no skin in the game. Is there a God? Maybe, maybe not — but it’s not something that this sort of philosopher will lose any sleep over. It is an activity in which the dominant voice is that of the spectator, sitting back and watching what other people have tried to do, in their variously brilliant or foolish struggles with the big questions.

One could characterize such armchair philosophizing as “metaphilosophy.” Officially speaking, “meta” implies self-reference (i.e., about oneself). So — according to Wikipedia and the Internet Encyclopedia of Philosophy — metaphilosophy is philosophizing about philosophy.

But the concept of metaphilosophy has drawn a lukewarm reception. Most philosophers seem to feel that meta questions (e.g., “what is the purpose of philosophy?”) are just a part of philosophy itself. And of course philosophers consider themselves qualified to decide what lies within the proper scope of their professional activities, as do other kinds of professionals (e.g., police officers, generals, prostitutes, politicians). Ironically, though, the claim to possess an accurate overall understanding of philosophy, sufficient to reject the label of metaphilosophy, is just what one would expect from a metaphilosopher.

It does not appear, in fact, that philosophers have a very good grasp of the proper scope of their profession. They have positioned themselves as experts in their field, but not as experts on public need. As experts within their own concept of expertise, they have presumed to dictate what the general public should find interesting, or what the general public should be able to understand. Such positioning amounts to elitism: we will speak to the more intelligent people (i.e., those who are more like us), and leave the others to fend for themselves. Certainly some concepts are difficult to understand. But leaving those unlike us to come up with their own beliefs is, in effect, leaving the door open to liars and quacks — and that, we have discovered, is a great way to undermine public support for philosophical inquiry.

While metaphilosophy is certainly not the ordinary word to describe philosophy professors’ everyday teaching and writing about philosophy, it does seem to be the appropriate word. There are real philosophers, who are motivated to resolve big questions with practicable answers that can make a difference in real lives; and then there are various historians, analysts, and teachers who are content to talk about what the real philosophers are trying to do. Traditionally, both groups are called “philosophers.” But that seems lame, for a profession so oriented toward detecting distinctions. We do not confuse football players with those who merely talk about football, or who record the history of its games. We do not confuse the people who study sex with the people who actually participate in it. Let us likewise cease to confuse philosophers with metaphilosophical teachers and historians.

This is not to deny that the garden-variety teacher of philosophy may consider him/herself — perhaps with good reason — to be a philosopher of the first rank, prevented by circumstance rather than lack of brilliance from changing the world with the things that s/he would publish, given time and funding. The line between direct philosophical practice and indirect metaphilosophizing may be vague, contested, and in flux. Nonetheless, there does seem to be the possibility of a useful distinction between the people, ideas, works, situations, or statements that seem to count as solution-focused engagements with the big questions, and those that do not.

In that light, one might look more carefully at the definition of philosophy. The Merriam-Webster online dictionary offers a contrast between, on one hand, “the study of ideas about knowledge, truth, the nature and meaning of life, etc.” and, on the other hand, “a particular set of ideas about knowledge, truth, the nature and meaning of life, etc.” or “a set of ideas about how to do something or how to live.” That contrast amounts to a difference between the general study of ideas offered by various philosophers down through the centuries, suitable for metaphilosophy, and the particular study of an identified set of ideas about a specific issue (e.g., a big question). The former is philosophizing about philosophy — adding the teacher’s or historian’s interpretation on top of what famous philosophers have said — while the latter is the actual practice thereof.

Reconceiving Philosophy as (Especially)
the Pursuit of Answers to Big Questions

It is possible to define teaching to include every instructive activity undertaken by every crow, dog, and human on the planet. But for purposes of people who are trying to educate small children, the definition of teaching quickly becomes much more narrowly conceived and closely monitored. The same is true of history: there is a difference between logging every random factoid (with or without commentary) and an attempt to provide a concise and readable explanation of what happened in, say, America’s war in Afghanistan. It is neither helpful nor appropriate to indulge the freedoms implied in the broad definition, when circumstances call for an outcome consistent with narrow application.

Likewise in the case of philosophy. The key question (above) is whether the putative philosopher is engaged in the particular study of an identified set of ideas about a big issue. As one moves away from that sort of thing, one appears increasingly likely to be engaged in metaphilosophy — in, that is, classical philosophy’s interminably indecisive dabbling in ideas about ideas, lacking commitment to delivery of working solutions within an appropriate timeframe.

One can belong to various groups; one can share interests with a wide variety of people. It will not be surprising, though, if a philosopher, vitally engaged in the study of a big question, has less in common with metaphilosophers in his/her university department, and more in common with poets, sociologists, and lawyers who have become engaged in some aspect of that same big question. In other words, “philosopher” will no doubt continue to be a term applied carelessly to anyone with a PhD in the field; but, again, for purposes of people seeking useful answers to big questions, there may be a world of difference between real philosophers and abstruse metaphilosophers.

If philosophy is reconceived as the focused pursuit of useful answers to big questions — spinning metaphilosophy off into, perhaps, a subgroup within the university’s departments of history or literature — then it immediately becomes somewhat less appropriate to adjudge philosophy, as a whole, to be a failure with respect to such questions. It also becomes clearer that it is OK if you have not mastered the classic philosophers. Instead, the question may be, how well is this or that contemporary philosopher doing, in his/her up-to-date struggles with the particular big question on which s/he is focused?

Assuming this reconceptualization of philosophy — along with a determined effort to present philosophical findings intelligibly — it could develop that, at some point in the future, philosophy will cease to be a failure with respect to the big questions. That is not to anticipate that philosophers will have all the answers, or that they will have magically ceased to reach conclusions rife with contradiction, error, and impracticality. It is just that, at such a time, their reconceptualized and more tightly focused discipline may at least have bridged part of the gap between what they do and what the world needs from them. Success in this regard may have arrived when the average person seeks guidance from a philosopher — rather than from a minister, astrologer, or self-appointed expert — because the philosopher’s guidance is more palpably based in a superior combination of science, experience, and reasoning, and less dependent upon random opinion.

Next Steps

This article has proposed a distinction between metaphilosophy (understood as the bloodless recounting or analysis of what various philosophers have said) and philosophy (understood as the immediate pursuit of conclusions on big questions within a realistic timeframe). That distinction does not imply that metaphilosophy is worthless. No doubt there are many purposes for which it is well suited. Among other things, the Internet offers tons of material on the history of philosophy, and of course there have been many books as well. Well-known examples include Bertrand Russell’s History of Western Philosophy and Copleston’s History of Philosophy series.

Under the rubric of applied (a/k/a practical or popular) philosophy, one finds many (and potentially engaging) philosophical investigations of specific issues arising in the daily news. Such investigations span subjects ranging from health care to hate crimes. Here, again, such subjects can readily entail exploration of topics outside philosophy (e.g., law, in the case of hate crimes). One source distinguishes applied philosophy from accessible philosophy, where the latter consists of efforts to present the ideas and/or works of mainstream philosophers in more readily digested form. My own plain-English restatement of Plato’s Republic would be an example. Daniel Fincke and Brendan Myers offer related thoughts and materials. Philosophy Bites appears to be a recognized source of both applied and accessible philosophy.

Yet applied and accessible philosophy seem to be beside the point — the former, because it appears oriented toward small questions, not big ones; and the latter, because it appears to offer only a simplified route to understanding the ways in which philosophy has failed to reach useful conclusions on the big questions. In other words, the situation seems to be that (with or without accessible treatment) either we accept the rationality-based approach of western philosophy and its lack of convincing solutions, or we reject that approach and go with something else instead.

One rejectionist route is that of religion. Religious organizations and thinkers offer answers to big questions. These are not traditionally considered part of philosophy because they draw upon sources of alleged knowledge that are not open to rational analysis. For example, in Christianity, which has been the primary focus of debates on religion and philosophy in Western culture, key beliefs tend to require uncritical acceptance of unverifiable stories, presented in a scriptural book of mixed reliability.

Before turning to religion, the person seeking workable answers to big questions might consider adopting a single school of philosophy and making a go of it, warts and all — concluding (as one must also do in a religion) that the chosen philosophy has its difficulties and its quandaries, but is nonetheless time-tested and worthy for practical purposes. As a start in this direction, one might look at Wikipedia’s lists of Western and Eastern philosophical movements, along with Listverse’s list. Several of the items on those lists (e.g., existentialism, pragmatism, utilitarianism) appear capable of providing guiding principles sufficient to chart a course through many of the big questions. For instance, Koshal (2010, p. 105) construes Rorty’s pragmatism in these words: “[Pragmatism] maintains that unless we take something for granted we shall never settle any question . . . . The [propositions] we should rely on are those for which we have the most evidence for and little or none against.”

Where the chosen philosophy falls short, one might supplement it with eclectic selections from one or more other philosophies. A reasonable objective, in such an approach, might be, not to arrive at a single quasi-religious God’s-eye answer to all questions, but rather to develop conceptualizations that work and make sense for one’s own purposes. Unlike a religious approach, this objective would appear compatible with, and potentially open to, discussion with and learning from people who have adopted other philosophies.

As these suggestions imply, giving up on philosophy as a source of big answers does not necessarily entail giving up on philosophers as sources of good clues. Perhaps one’s personal philosophy is best developed inductively, starting with applied philosophical discussions of specific topics and allowing one’s reading and thinking to grow toward larger hunches and speculations.

It may turn out that there is not, and for the indefinite future there will not be, a single Bible-like compendium of definitive words, straightforwardly answering the big questions in terms satisfactory to a given reader. In that case, the point of this article might be that one need not therefore lurch to the opposite extreme. There may be strategies, oriented toward development of a working personal philosophy responsive to the big questions, that do not necessitate the undergraduate philosophy major’s bewildered stagger through a thicket of bickering eggheads. Ultimately, it is possible that a carefully reconceived profession of philosophy can succeed where today’s multifarious profession has failed.


Drives Within the Life Force: Striving vs. Resting

Previous posts in this blog have portrayed life as a restless, aggressive force that is forever striving to grow and become more powerful. Alongside that orientation, however, there is the reality that nothing has as much strength or control over its environment as it might desire. There are times when living things are growing, and there are also times when they choose, or find themselves compelled, to rest and regroup.

Indeed, that understates the case. If the drive to grow is a clear-eyed refusal to rest until the job is done, the drive to relax is an intoxicating compulsion to take a break and enjoy life. Admittedly, some forms of entertainment function as extensions of the workplace. Even there, however, there tends to be a real difference between genuine enjoyment and the merely disciplined use of recreational activities for preconceived ends.

Hence, when previous posts evoke the relentless striving of the life force, they might be refined to acknowledge that the striving actually does relent sometimes. Beyond the essentials of sleep, food, and other rest and refreshment to which virtually everyone must submit periodically, there is an endless smorgasbord of diversions. People vary greatly in their choices and indulgences among those offerings, ranging from solo reflections and private physical pleasures to highly social engagements. But everyone indulges at least some of them, at least some of the time, and their indulgence can take place even while they are working.

This refinement clarifies the role of death as life’s antagonist. Death terminates life, not only by confronting and defeating the drive to grow, but also by capitalizing on openings provided by the drive to rest. The strong can be killed while they sleep; the healthy can be undone by indulgence in unhealthy entertainment; the hardworking can be defeated by losing sight of the big picture. Vigilance may be key to survival; but as with many other virtues, vigilance is easier to preach than to practice consistently.

Previous posts have talked about life and death, and also about the social force. Like death, the social force interacts with both of life’s core drives. This interaction is complex. Different social groups (e.g., religions, corporations, families) variously favor and oppose different kinds of striving and resting behaviors, disagreeing with and contradicting one another and sometimes themselves. Acts of working or avoiding work, eating or not eating, enjoying oneself or not, and so forth are approved or rejected in assorted ways, according to rules that can be very convoluted, precise, and even petty.

Unlike life, death is not something we know much about. From life’s perspective, death is simply the absence of life, and life’s perspective is essentially all we have. It is thus not feasible to speak meaningfully about divergent drives within death. Things are different with the social force: to the extent that it reflects the life force, it incorporates its own combinations of striving and resting.

The gist of these remarks is simply that, as noted in prior posts, life can be summarized as a striving and restless force, and in the aggregate it is indeed that: there is always someone or something, somewhere, that is interested in eating your lunch. At the same time, life’s striving has the potential to undermine itself, insofar as the most aggressive striving often comes closest to triggering potentially self-destructive physical and social errors and countermeasures. Experience with life often teaches people to leaven the striving compulsion with personal and social relaxation, achieving some measure of reconciliation between one’s personal objectives and one’s personal and social limits.

Social Existence in Life’s Territory

Earlier posts in this series had sketched out a sporting metaphor, where life and death are like two teams trying to gain control of parts of the playing field, or like two sides in a tug-of-war contest. But then, in something of a departure, the immediately preceding post identified three key forces in human experience, not two: in addition to life and death, there was also a social propensity in various living things, notably humans.

The sporting metaphors are useful for visualizing interactions between life and death, but they don’t leave much room for the social force. Accordingly, the present post develops a different metaphor, that of the battlefield.

As noted in the preceding post, life and death are the consummate forces in human experience, but our encounters with those forces tend to be socially mediated. In other words, the attempt to enjoy life’s opportunities for growth and strength tends to be facilitated in some ways, and restricted in other ways, by our group memberships (e.g., by one’s family, job, community, and local government). Likewise for the risk of death, and of derivative conditions threatening death: a given group can provide safety from mental and physical deterioration and injury, but that same group can also ridicule, ostracize, expel, or otherwise attack, punish, and harm a disfavored member.

On the battlefield of life and death, we observe the front lines, where opposing forces struggle back and forth, variously seizing and surrendering bits of desperately defended turf. Medical treatments and other ways of improving and prolonging life seize territory formerly held by death; then again, plagues and other catastrophes occasionally send the force of death raging across spaces previously considered safe.

We do not see beyond the frontier: we have no firm knowledge of what, if anything, might be happening on death’s side. But on life’s side, within our years of existence, we find that civilians — humans and other living things — tend to take up residence, settle down, and develop their various cultures. Or instead of the battlefield, one could think of the process of building a house: once you get the walls up and the roof on, you can begin to bring in the furniture and decorate the place; but you know that cold or hot air, mold, earthquakes, and thieves will forever exploit ways, bold or subtle, to intrude.

Not to overdo the impression of a barrier: in fact, houses have windows and doors, intended to admit some of what lies outside, and battlefields have their moments of confusion and fraternization, their traitors, their generals who unwittingly help the enemy by clinging to mistaken impressions. What serves us can also serve our adversary, be it death or the merest mold spore.

And so it is with the social force. Just as a belligerent nation’s merchants may secretly aspire to do business on both sides of the battlefield, the social force that serves life simultaneously maintains communications with death. We experience the social force as a life-based phenomenon; and yet, as noted above, the social force is also prepared to indulge hostility, treachery, and other threats to growth and strength. Indeed, the social force is useful, for purposes of steering life’s energies, precisely because it brings death into life – because, that is, it threatens and, if necessary, delivers death and its derivative aspects (e.g., privation, exclusion, injury) to those who violate a group’s expectations.

Basically, if you want to steer and shape the life force, you use the death force; and since the people whose life force you want to control tend to be persons other than yourself, this sort of thing typically takes place, not in solitary existence, but in the social space that accounts for most of human life experience. The social force unfolds within life, and yet it reaches out beyond life. Like a parasite, it thrives within its host, and yet pursues its own agenda: the social force would not benefit from an end to life, but it is certainly willing to countenance an end to some lives, for the benefit of others.  Like a merchant or emissary that has carved out latitude to deal on both sides in wartime, the social force is tolerated for the benefits it brings, which is not the same as saying that its results are always lovely.

The force of life, like a great army, is as dangerous as an earlier post suggested: it will do anything to secure survival and growth. War is too important to be left to the generals, and life too is a raw drive that we generally prefer to see channeled in socially determined directions.  Hence the social force, despite its downsides, has become a refinement without which human existence would be virtually unbearable.

The social force exists, not as a fundamental reality like life and death, but as an experience-based invention that has come to permeate our thoughts, words, and actions.  By this point, when people talk about life, what they invariably mean is life as shaped by the social force.  This terminological confusion can lead them to look askance at, for instance, that earlier post that was so critical of life:  what people typically mean by “life” is a highly social matter, polished with the judicious use of death and destruction, to convert the bare-bones life force into a kind of social existence.

Life, Death, and the Social Propensity

Previous posts in this blog have characterized life as a sort of bully, one that compels people to follow its lead and do things in its preferred way. Life’s preferred way involves striving to grow and become stronger, usually at the expense of other living things, often including members of one’s own species. As described in a previous post, the struggle between life and death can thus be viewed as a kind of tug-of-war, where everyone starts out on life’s side but ends up on death’s side, as life dismisses those who do not continue to meet the elite standards necessary for a place on its team.

As also noted in that previous post, most people find themselves somewhere between life’s most powerful winners and those who, from life’s perspective, have wound up in death’s collection of losers. For most of us, life is a mix of comfort and struggle. The constant risk of injury or other harm, potentially putting us on a downward slide toward death, motivates us to protect ourselves and seek ways of becoming stronger and more comfortable.

That previous post also observed that those who become most powerful in life often do so by enlisting the support of others. People commonly join and seek advancement in the service of various individuals and organizations, so as to protect themselves against threats and take advantage of opportunities unavailable to the lone wolf.

This grouping tendency is quite powerful. When nearby populations permit, people tend to become affiliated with multiple groups, organizations, and institutions. Some such affiliations are voluntary (e.g., a parent-teacher organization; a political party); others are involuntary (e.g., one’s childhood family; society as a whole). Some are perhaps theoretically voluntary but practically involuntary (e.g., the market). Of course, people vary in how much they put into such memberships, and how much they get out of them. The point here is just that, for the enjoyment of life and/or for protection against death, groupings tend to form, with various costs and benefits for their members.

The tendency to join or belong to organizations and other groups – the propensity to be social – necessitates a significant refinement of the picture sketched out in the previous posts. Those posts have emphasized the binary opposition of life and death. Life and death do continue to vie for control, in myriad ways large and small. Life and death remain the consummate forces of human experience. But most of us are neither at life’s pinnacle nor at immediate risk of dying, whereas we are almost always interacting, thinking about interacting, or preparing for interaction with others. Hence, for the human beings with whom these posts are primarily concerned (and also, no doubt, for other species), the social propensity often appears more immediately compelling than either life or death.

It seems, in other words, that the social propensity would be optional or dispensable, without the struggle between life and death outlined above; but because we are torn by that struggle, the social propensity tends to play an important role. It has the capacity to make an enormous difference in how much we will grow and become stronger, and also in how vulnerable we are to become weaker and die. In both regards, the social force plays both sides of the table – sometimes assisting in our growth, and at other times retarding it; sometimes sheltering us against a fall, and sometimes pushing us over the edge.

The social force may not be the peer of life and death, in the starkest times of our lives; but for practical purposes, on the day-to-day level, the social force has everything to do with what life and death mean to us. For the most part, in human existence, the social force operates as a peer of life and death. The well-known statement that something is a matter of life and death underlines the consummate importance of those two forces, but it also implies that, normally, we tend to be preoccupied with things other than life and death.

From the perspective of life and death, the social force is a tremendous modifier. Within outer parameters contested by life and death, questions such as who will be born, who will thrive, and who will die, and how quickly, tend to be decided by arrangements among groups of humans.

It certainly is possible that the nature and functioning of life, death, and the social force are due to the workings of some kind of supernatural source. This discussion leaves out the possible interventions of a deity, not because such a being would be incompatible with the scenario developing here, but because it is not necessary to add that layer of speculation. Attributing these matters to a divine being does carry the risk of blaming him/her/it for things that are ultimately caused by humans or perhaps by other beings. In that sense, divine attribution is worth avoiding, not only because it potentially invents an additional, unnecessary complication, but also because it risks blasphemy.

It may be wiser, and it certainly seems more reasonable, to credit or blame a divine being for our human conditions when (a) the divinity has made a clear and credible claim of responsibility or (b) to the extent that divine intervention is necessary to explain aspects of human existence. In short, at this point, it tentatively appears that the elemental forces of life and death, and the derivative social propensity, account reasonably well for basic realities of the human situation — regardless of whether any divinity has set those forces in play.

The Struggle Between Life and Death

An earlier post in this blog suggests that we can talk about life and death as large abstractions; but for personal purposes, one tends to experience life and death, not as gigantic monoliths staring at each other across a no-man’s-land, but rather as wrestlers or sporting teams, forever grappling and struggling for a slightly stronger hold or another inch of turf.  (Of course, the intention here is not that life and death are gods or other beings, nor do they necessarily reflect the workings of any divinity or supernatural force.  It is simply convenient to speak as though they had personalities and objectives of their own.)

Our experience of this conflict between life and death comes down to countless day-to-day confrontations among people, animals, other living things, and inanimate forces and objects.  This post offers a few observations about those confrontations.

There is a saying:  “That which does not kill me makes me stronger.”  The idea seems to be that losing a fight tends to make a person faster, more powerful, more motivated, or otherwise better prepared to win next time.  There is some truth to this notion.  But often it is simply false:  you may come away intimidated or crippled.  In other words, losses in the struggle between life and death can have real and permanent consequences.

There is another saying:  “Pick your battles.”  The concept here is that you conserve your strength, plan your strategy, and then strike when you are positioned to win.  It is a nice idea in theory, but it tends not to work out in practice.  For one thing, the people who adopt this approach tend to be cautious.  The battles they pick are not the ones where they have a 51% chance, or even a 60% chance, of winning.  They tend to hold out for the ones where they have a 95%-plus likelihood of coming out on top.  This means they flock together on the bandwagon whenever they find a cause that has been officially approved as a matter of the Good Guys vs. the Bad Guys – each wanting a turn to land a few blows on the target of their collective ire, without much actual personal risk.  The rest of the time, unfortunately, they tend to facilitate evil rather than confront it.

Among those who say, “Pick your battles,” what we usually see is not a matter of courageous individuals carefully building their strength in preparation for a masterful strike that will effect significant change.  The people with courage tend to be taking action when the time is ripe.  What we see, among those who claim to pick their battles, tends to be individuals who are simply hiding behind the excuse that they are waiting for the right time to make their move.  For the most part, the only move they will be making is a promotion, once they have convinced the big shots that they are reliably similar to the person they are replacing.

Often, conflict operates as follows:  You are walking down the road.  Here comes the bully, walking toward you.  The bully says, Join me or I’ll thrash you.  You were not particularly interested in fighting at that moment.  You do not stand to gain much from fighting the bully.  It is easier to join him and look for a better moment to free yourself.  So you turn around and follow the bully.  What the bully knows – what you do not realize – is that, once you surrender to him, you will probably continue to defer to him, especially if he continues to offer a tolerable situation.  So the bully continues down the road, and confronts another individual, and another, and another.  Altogether, the bully encounters a hundred people, one at a time.  Each of them is just like you.  Each time the person looks at the bully – and, as his followers grow, at the collection of people behind him – and concludes that fighting makes no sense.  So each person follows the bully.  And now he is the leader of a force, and anyone who dares to oppose him gets mauled by a pack.  It may seem that many of these individuals, following him, would be prepared to flake off and turn against him in the event of misfortune.  Sometimes that does happen.  But what often happens is that some particularly weak individuals take shelter in his strength.  These people are motivated to root out dissension among his followers, so as to preserve the bully’s organization and to enhance their own standing within it.  Others, who initially would have been able to stand against him, or at least to flee when the opportunity arises, instead become preoccupied with their position within his operation.  He has a private army, or something like it, and it develops an internal structure and logic.

That is how conflict operates in general.  There is a version of it to be found within the behavior of life itself.  Life is a kind of bully.  As the saying goes, nothing succeeds like success.  In pursuit of its own project, life rewards those who prove most adept at converting opportunities and resources into personal advantage.  The path of least resistance tends to entail going along with life, focusing on growing and becoming stronger.  As you progress from youth to maturity, you are steadily less likely to question this.  It becomes second nature.

Life in general (and the organizations of various bullies and tyrants) tends toward hierarchical arrangements.  Like the mountain of success, the list of successful climbers is pyramid-shaped because those who reach its higher levels tend to offer other climbers a stark choice:  support my climb, or I will shove you over a cliff.  At the higher levels, not many climbers are left, and those who do survive tend to be the most ruthless in garnering ever more power to themselves – for their own enjoyment, and also to prevent any competitors from getting it.  This is very different from death, where there is room for everyone.

The analogy of sports teams might thus be narrowed to the tug of war, where one team grabs one end of a rope, and another team grabs the other end, and they try to pull each other across a dividing line.  But in the case of life vs. death, it is an odd kind of tug-of-war, where everyone starts out on life’s side, but ultimately everyone ends up on death’s side.  There is only so much space on life’s end of the rope, so life is forever throwing people off the team.  Life runs an elite organization, ruthlessly picking and choosing, discarding the weak ones as soon as a stronger one comes along.  Even if you made the cut this year, perhaps next year you won’t.

The struggle between life and death tends to leave people in a difficult position.  There are the strong and, on the opposite extreme, there are the dead.  Between those extremes lies the vast bulk of humanity, neither supreme nor expired.  For most people, the conflict between life and death entails an ongoing struggle – sometimes strenuous, sometimes not – to stay alive and to try to make things a bit better for oneself, without attracting hostile attention from dominant individuals.  This state of affairs leads people to engage in certain protective behaviors, discussed in a separate post.

Characterizing Life and Death

Everyone knows what life and death are.  Life is when you are alive, and death is when you stop living.

Several posts in this blog have offered a different perspective.  In the vocabulary used here, life and death are actors or forces.  Life is the force that impels living things to strive for their own increased size, power, and comfort.  Death is the agent that seeks to damage, impair, and ultimately kill living things.  In place of medicine’s black-and-white dividing line between the living organism and the dead one, this blog favors shades of gray.

It may seem odd to talk about shading, when death is so stark and final.  Doctors struggle to keep people from dying precisely because, once they are gone, there is no getting them back.  Or so it has always seemed.  The problem is that, in recent years, the time-honored dividing line between life and death has become interesting, for several reasons.

There is, first, the fact that people who would have been dead in the old days can now sometimes be brought back to life.  An example:  cold-water drowning, in which the person’s body is not only as cold as death but has drowned to boot; and yet the combination can mean survival and full recovery.  Another example:  asystolic (i.e., “flatlined”) cardiac arrest – which, again, has often been taken as a sign of death but can sometimes be reversed.  Even brain death, which would seemingly be a convincing cessation of life, turns out to be much more complicated and uncertain than one might expect.

In addition to the issue of determining when brain death is final, we have definitional (and sometimes legal) problems arising from brain-dead people who are kept “alive” by machines.  We also have glimmers of the opposite:  the brain in a vat, continuing to function after the body has gone.  Although it is still in the realm of science fiction, one can at least imagine, now, a database that would store the contents of one’s brain, and a set of bioengineering procedures that could recreate one’s body, such that even a person blown to bits might someday be reconstructed.

Along with complexities on the physical level, there are strange reports on a more spiritual level, involving near-death experiences and the like.  As noted in a previous post, not all of these reports are obviously wishful thinking or religious invention; some appear to come from credible sources and seem to be backed by varying levels of third-party verification.  While the whole line of thought is a long way from being established, it is possible that temporary physical death could quiet the noise and distractions, enabling some remnant of a person to recognize an alternate or subsequent world, or a substratum of existence, in which that human remnant might experience a form of post-death individual awareness lasting for seconds, minutes, or longer.  If there is anything at all to such reports, the underlying mechanisms may someday become as familiar as today’s “magic” of using defibrillator paddles in the emergency room to shock an arrested heart back to life.

In short, the reliable old black-and-white line between life and death is still with us, in the large majority of cases, but it is showing signs of vulnerability.  As in other areas, straight lines are hard to draw.  There always seem to be exceptions that complicate any simple division of situations.

For some purposes, it can be more useful to think in terms of dynamic rather than static definitions – in terms of processes or agents rather than fixed entities.  Rather than treat life and death as vast and unchanging monoliths, it appears that sometimes we might characterize them as grabby kids with their fingers in each other’s pies.  On our side of the line, in the world of the living, we see physicians digging into death’s territory – striving, that is, to preserve, prolong, and even resurrect life in the ways just described.  We can’t see what, if anything, may be happening behind the curtain, in the realm of death.  But we do see that death is likewise constantly probing into our sphere of life.  On physical and mental levels, death fights every medical advance.  Combat veterans do survive wounds that would formerly have been fatal, but many of them also struggle with bodily and psychological legacies of their brush with eternity.  Old and ill people live on, now, long after they would have been extinguished, back in olden times; but often they, too, remain with us at a price:  sometimes their minds and/or bodies are halfway in the grave, long before doctors announce the final lightening of their burdens.

The interactions of these dynamic forces are often not well captured in large, fixed abstractions of life and death that attempt to incorporate everything from the amoeba to Jesus Christ.  Especially when speaking of one’s own lived experience, sometimes the contested frontier between these kingdoms is better captured in private terms.  On that level, the individual may perceive that life and death adopt unexpected strategies, variously appearing as personal benefactor or foe with changes of scene.

To adopt another metaphor characterizing the large notions of life and death, it may be interesting to observe the land and the sea from far above; but it tends to be more compelling to experience their interface at surf’s edge, where the waves meet the sand.  Then one may be positioned to contemplate the personal reality of existence and departure, as the water reaches one’s toes, or is closing about one’s nostrils.  Then, perhaps, the preconceived notions and canned assumptions can give way to firsthand learning about the interplay of life and death, in forms custom-tailored to one’s own existence.

The Goodness of Death

It is generally assumed that life is good and death is bad.  This is a sensible assumption, from the perspective of living things.  As with most simplistic beliefs, however, to some extent it distorts the realities.

Previous posts have explored this distortion on life’s side – noting, for example, that life depends upon fundamentally ugly behavior and that it resembles a prison.  The present post develops the other side of the story, noting some positive things about death.

Consider, first, the extreme cases.  There are situations in which death is a blessing.  It brings an end to pain, for those suffering from torture and terminal disease.  When administered to others, it can bring liberty from oppressors.  War, revolution, and other violence have often occurred in response to exploitation and abuse.

The mere possibility of death can have positive effects.  Tyrants, for example, must bear in mind their own vulnerability.  While they may be able to hire bodyguards and otherwise secure themselves to a considerable extent, the risk of assassination can nonetheless discourage them from outrageous acts.

The risk of death can also influence others to mind their behavior.  People fear death intensely enough to avoid situations that carry even a hint of serious risk.  The few who take such risks tend to become object lessons for the many who do not.  Fear of becoming like them is constantly reinforced.

Over time, people tend to learn approved ways of behaving.  We can thank the threat of death for the amazing phenomenon of billions of people coexisting on this planet, often in very close quarters, and yet enjoying much peace and happiness.  Death is, among other things, an enforcer of good behavior.

It may seem odd to think of the Grim Reaper as an aide in the minor project of deterring a schoolyard bully.  But that is really no stranger than seeing the massive life force at work in the tiny strivings of bacteria and insects, as they go about their business of killing and growing.  Life and death alike use accumulated petty means to advance their larger agendas.  Life basks in the happy endorsement of a science show on TV, describing the patient work of ants; and then death speaks in the voice of the commercial advertiser, whose financial support for that science show comes from the sale of insecticide.

Death achieves good outcomes because of its interesting partnership with life.  While a previous post characterized life and death as opposing teams on a playing field, one might point out that sports teams strive against one another on one level, while collaborating on another level to put on a performance.  Both sides are essential.  Life uses killing (of, e.g., prey and adversaries) to achieve growth and strength; death uses life’s pursuit of growth and strength to achieve killing.

It would be easy to say that death is the bad cop in this duo but, again, it is not that simple.  The pursuit of life has yielded many despicable acts at the expense of others, while the acceptance of death has facilitated many heroic endings on behalf of those who survive.  The better analogy is not that life is a good cop or otherwise a savior; it is that life is the guide who knows the particular path we want to take, so we accept his guidance and try to overlook the many things we really dislike about him.

These observations are consistent with the common finding that nothing is good in excess.  Truthfulness is good, but perhaps you would lie to the Gestapo when they knock on the door and ask whether you are sheltering Anne Frank.  Sunshine is good, except when it causes skin cancer.  Life is generally good – sometimes it is fantastic – but there are situations when not even life can properly be one’s top priority.

People tend to recognize such realities.  They see that life, on one hand, tends to concentrate its gains in a pyramid of success stories and superachievers, where many struggle but few come out on top; and they see that death, on the other hand, is completely open and accepting.  The pinnacle of life is the thing that everyone wants but most can’t have; the pit of death is the thing that nobody wants but is free to all.

Given such choices, people do the sensible thing:  they temporize.  They typically want neither extreme of the scale:  neither untrammeled life power for the single tyrant nor final death for the tyrant’s many victims.  Under the circumstances, people tend to seek a middling ground where the game can continue for as long as possible – where life and death can fight it out, inch by inch, year after year – where, ideally, individual humans like themselves are able, within their personal lives, to tip the balance one way or the other, now and then, as they see fit, in small but meaningful confrontations between life and death.

In those countless little day-to-day situations, people make different combinations of decisions.  There are differences in how strongly people follow the call of the life force.  Some heed that call in a very high percentage of cases.  Even the interests of their own children take second place to their personal needs and desires.  Other people, at the opposite extreme, tend to subordinate their needs to those of others.  These people resist life’s priorities, deliberately and repeatedly putting themselves at risk of damage and, if necessary, death.

There is a saying:  if you go looking for trouble, you will probably find it.  Life is for the living, and that tends to mean that those who don’t mind their own business and just pursue their own best life prospects risk premature death.  As another saying advises, only the good die young.  Or as Primo Levi put it, writing of the Holocaust, the best died first; those who survived tended to be the selfish, the violent, the collaborators, the spies.  In short, life’s ugliness tends to predict the character traits that life favors.

Talk of selflessness can seem wrongheaded, to those who have little exposure to the downside of life.  When everything is going well, it can be easy to believe that everyone can succeed, that what the world needs is better training for success – not an unselfishness that might confuse a tidy system of rewards for merit.  Those who believe such things may feel that the situation is simple indeed:  life is good, death is bad, and one’s duty is to become stronger and more capable.  It may take direct or vicarious experience to pop that bubble, to acquaint the naïve with the seamy ingredients of success, with how quickly success can fade, with what life is like for those who lose success or who never had it.  Failure, loss, and rejection are much easier to understand when you experience them firsthand.

What emerges from such observations is not exactly that death is good.  It is that death is the soil from which much goodness grows.  Death, life’s bête noire, cannot contribute directly to human existence per se; but death certainly can influence the form of the conflict.  An ethic of sacrificing oneself for the sake of others is a thumb in life’s eye; it is an invasive rewriting of life’s code, a retelling of how life itself is best lived.  Death cannot offer the moments of pleasure engendered by success in life.  It can, however, shape the interpretation of such moments, contextualizing them to raise the question of whether personal growth and success should be seen as the best or only indicators of human achievement.  Death is a grab for the body, but an ethic accepting of death is a play for the soul, and not necessarily an ignoble one.

There are, in short, several ways in which death promotes goodness.  First, as noted at the outset, death provides blessed relief from pain.  It is, directly and derivatively, a tool with which to intimidate aggressors and eliminate tyrants.  The risk of injury and damage, carrying the least hint of possible death, is often sufficient to discourage undesirable behaviors.  Death is often where the best people go first.  It is something that the bravest accept for themselves, in order to make things better for those who remain behind.  Personal encounters with death’s precursors teach empathy and humility.

In such remarks, it becomes evident that death offers benefits that are not consistent with life’s growth project.  Life, left to its own devices, can become insufferable.  It seems that, if death did not exist, we might have to invent it.

The Moment of Awareness of Death

The bomb explodes over Hiroshima.  People are incinerated without warning.  For these poor souls, the moment of death is simply 8:15 AM, August 6, 1945.  At Ground Zero, there is no time for people to become aware of what has happened, or to consciously register its impact upon their bodies:  they are instantly and completely gone.

At the other extreme, a child is born.  It is perhaps always aware of the risk of death on some subconscious level, but years will pass before it gains a meaningful sense of what death can entail.  And then, much later, that child matures into old age, winding into a gradual senescence that terminates with a long-anticipated and perhaps unfeared eclipse.

There are, in short, multiple possibilities for a person’s death.  Everyone gets a chronological moment of death, a point on the clock when they cease to be alive.  Everyone also gets a physical experience of dying, starting when brain and body begin to expire, and ending when that process is complete – sometimes within an instant, sometimes unfolding over hours or even weeks (e.g., Pitorak, 2003, p. 44; Sigrist, 1996).  But not everyone gets a period of advance warning, during which they can get their thoughts and affairs in order; nor does everyone experience a conscious awareness that death has arrived.

Among those who do become aware that their death is now underway, for some that awareness proves to be a false alarm:  they don’t die after all, or they come back to life after being more or less officially dead.  Among those who return from such close confrontations with death, some report unusual experiences.  These include “panoramic” memory or life review experiences – widely known as the phenomenon in which “my whole life passed before my eyes” – although these may be better understood as part of the larger phenomenon of the near-death experience (NDE).  Particularly because of their supernatural claims (involving, e.g., movement through a tunnel toward a bright light, encounters with people who have died, visions of otherworldly places), NDEs remain controversial.  A number of interesting pieces of recent research (e.g., Agrillo, 2011; Waxman, 2012; Thonnard et al., 2013; Greyson, 2010) suggest, nonetheless, that readers who are not rigidly precommitted to purely materialist explanations may find much to think about in NDE accounts.

The experiences and observations reported in NDE accounts do not seem typical for the majority of people who suddenly become aware that death is near.  For a sense of what people usually feel in situations of fatal injury (as distinct from the more drawn-out dying scenarios mentioned above), a review of various accounts yields quotes like these:

  • Gosline:  “Death comes in many guises, but one way or another it is usually a lack of oxygen to the brain that delivers the coup de grâce. . . . If the flow of freshly oxygenated blood to the brain is stopped, through whatever mechanism, people tend to have about 10 seconds before losing consciousness.”  But it takes longer if there continues to be some oxygen.  In drowning, for example, the minute or two of struggling to stay afloat ends with “a feeling of tearing and a burning sensation in the chest as water goes down into the airway. Then that sort of slips into a feeling of calmness and tranquility.”  Bleeding to death:  “Anyone losing 1.5 litres . . . feels weak, thirsty and anxious, and would be breathing fast.  By 2 litres, people experience dizziness, confusion and then eventual unconsciousness.”  Emotions may range from fear to calm, depending on extent of injuries.  Decapitation:  “Reports from post-revolutionary France cited movements of the eyes and mouth for 15 to 30 seconds after the blade struck, although these may have been post-mortem twitches and reflexes.”  Falling:  “Survivors of great falls often report the sensation of time slowing down. . . . Some experienced climbers or skydivers who have survived a fall report feeling focused, alert and driven to ensure they landed in the best way possible.”
  • Aron:  “As your brain begins to shut down you feel what I can only discribe [sic] as the essence of your existance [sic], the core concepts of your psyche, bleeding out of you.  Concepts like honor, honesty, bravery, integrity, humility . . . become harder and harder to find.  Simultaneously, you are struck with the coldest, harshest reality imaginable; you are done.  Everything you’ve worked for in life, everything you ever dreamed about, all the pain, the struggle, the hardship, the reward, the victory, everything, is now worthless.  Your past means nothing, your wishes for the future are pointless, and even you in the present are pathetic and tiny compared to the behemoth that approaches you.”
  • Begley:  “As the brain runs out of oxygen, it closes down noncritical functions first. Sight, hearing and consciousness fade out, as though by the gradual twist of a dimmer switch. Pain vanishes. . . . People who bleed to death first hyperventilate . . . . [but] the heart and then brain slow. A flood of natural opiates called endorphins washes over the brain, bringing on both tranquillity and hallucinations . . . . Half of all patients who die conscious and in a hospital, a 1995 study found, suffered moderate to severe pain” though hopefully that has improved with better use of painkillers.
  • Scoville:  “Anyone who has been shot will know it’s just the absolute coldness that goes through your body. . . . [feelings like] hot gravel . . . getting hit with a baseball bat . . . a red hot poker going through you . . . . The human pupil may dilate during a shooting, leaving the viewer with the impression of seeing things through a tube as everything else blends into the white periphery. . . . Additional sensory deprivation may take the form of auditory exclusion. . . . [The body produces epinephrine;] the effects of pain may be diminished, and you may view your surroundings in muted colors or find it difficult to carry out simple tasks. . . . [T]he mind may lock on a variety of visual cues in . . . adrenaline-enhanced acuity . . . a sense of disorientation . . . . [Victims] may also find they need to empty their bladders or evacuate their bowels. . . . [B]reathing may become difficult, making you feel as though you are going to hyperventilate.”
  • Orwell:  getting shot:  “Roughly speaking it was the sensation of being at the center of an explosion. There seemed to be a loud bang and a blinding flash of light all around me, and I felt a tremendous shock [and] . . . with it a sense of utter weakness, a feeling of being stricken and shriveled up to nothing. . . . [M]y knees crumpled up and I was falling, my head hitting the ground with a violent bang which, to my relief, did not hurt. I had a numb, dazed feeling, a consciousness of being very badly hurt, but no pain in the ordinary sense. . . . As soon as I knew that the bullet had gone clean through my neck I took it for granted I was done for. . . . Everything was very blurry.”  See also Meredith and Kevlar Kid.
  • Rose offers a number of accounts, including these:  (1) “I was ejected from my motorcycle headfirst. . . . It felt as though I was sinking into a deep dark pool of water.  Everything around me was black and the world we live in kept getting smaller and smaller. . . . It was like I was sinking slowly into a world of unknown.  Sound began to act as though it was farther and farther away.  In a strange way, I felt in peace.  My pain was gone and the weight of the world passed me by.  I recall having memories of my friends and family.”  (2) “To me it felt a bit like slipping into a dream. . . . [I]t felt peaceful, almost uplifting. . . . Then my vision came back. . . . dim at first, very fuzzy, then everything got brighter and more defined.”  (3) “You feel like you’re going to the deepest sleep (in fact you are) and when waking you’re confused as hell and don’t really understand what happened . . . . Extremely unnerving and scary in a detached way. . . . No memories of the other side, just that feeling of being so unbelievably tired and that if I just slept everything would be OK.”

Only a minority of such accounts describe a specific awareness (discussed in another post) that time has stopped or slowed down.  But perhaps that sensation pervades many experiences of dying.  It certainly seems that people at the brink of death often experience numerous unusual thoughts and sensations, telescoped into a very brief timeframe.

What emerges from these assorted accounts is a sense of negation – of the future, of oneself, of the things that people worry about, of feeling and thought, of the body.  Death seems to operate as a negation of life, not only in the general sense that the two are commonly recognized as opposites, but also at this granular level, as life’s specific attributes are rolled up until nothing is left.  While accounts of NDEs and these other accounts of death differ in important ways, in one sense they are parallel:  death means leaving this life behind.  In life, possibilities proliferate; in death, life’s possibilities all vanish.

There is a certain irony in the prospect that, for all its ambitions and enterprises, all its noise and tumult, life inevitably fails after a period of mere years — whereas death, a singular moment that collapses it all back to zero, ushers in some kind of eternity.  Life’s continuous Now, running for generations, offers one expression of what a moment can be, within human experience; death’s full stop offers an alternative.  The foregoing accounts of experience at the moment of death tend to be preoccupied with the ending of life, as distinct from the beginning of death.  There is perhaps not much to say about a simple termination; or if there is, evidently it would be out of character for death to say it.

The Fecundity of Life vs. the Parsimony of Science

Scientists and philosophers who discuss theory often say that a good theory has certain traits.  Among other things, a theory is especially likely to provide a useful scientific explanation if it is predictive (i.e., it explains the phenomena well enough to predict what will happen in certain conditions) and unique (i.e., when it uses new terms or concepts, it explains how they differ from terms and concepts already in use).

This post focuses on another trait of good theory:  parsimony.  This is often expressed as the view that the simplest explanation is usually the right one.  In a well-known phrasing, the principle of parsimony says, “Do not multiply entities beyond necessity.”  This principle is often used in support of atheism:  basically, if you can explain the world and everything in it without using the concept of God, then it seems unnecessary to invoke or invent that concept, and doing so will probably lead you to incorrect conclusions.

The principle of parsimony has some things in common with the principle of noncontradiction.  According to the latter, something cannot be both true and false.  More precisely, says Wikipedia, “contradictory statements cannot both be true in the same sense at the same time.”  Either the cat is white or it is not white, but not both simultaneously.  Seems obvious.

The problem is that things in the real world rarely occupy “the same sense at the same time.”  Here, life and philosophy part ways.  In philosophy, you don’t need redundant entities; but in life, you have redundant entities all over the place.  Life has no problem whatsoever with repeating itself in endless, idiotic duplication and near-duplication.  You already have a cockroach?  Here, have another!  and another!  Nor does life hesitate to contradict itself, at least for human purposes.  Yes, if you were able to pry apart every peccary, peccadillo, and pecuniary propensity, you would perhaps be able to characterize each in black-and-white terms:  fat, not thin; significant, not trivial.  But in real life, as soon as you look away for a minute, another version of the damned thing turns out to be just the opposite of what you expected.

There is a reason for it, of course; there is always some explanation for why something like a bumblebee can’t fly but, in fact, the bumblebee can fly but, in this case, the bumblebee cannot fly, but you were right when you said it could, because it could then.  Just drag everything back to the laboratory, dissect it, and you can explain it all.  Except that, of course, you can’t.  There aren’t enough of you to do it, and besides, you won’t live nearly long enough.  In theory, there are no violations of the principle of noncontradiction.  In reality, much of life is devoted to discovery of and reaction to things that weren’t supposed to be contradictory, but in effect are exactly that – and so they will remain, for the most part, because there aren’t nearly enough of us to sort them all out.

We have, in the principles of noncontradiction and parsimony, a set of abstractions from life.  Within the abstracted spaces of the philosophical armchair and the scientific laboratory, they carry weight.  Within the grand sweep of life, they don’t necessarily.  Mere redundancy, by itself, does not violate parsimony; but as just noted, it’s often not a case of mere redundancy, but rather of endless refinements on the original proposition.  The cat is white, except when it stands in a colored light, unless by “white” you mean . . . and so forth.  The words we use to explain things aren’t so much the neat stack of parsimonious constructs imagined by antique philosophers as they are a continuing dialogue in which people have to keep trying to distinguish cases in order to mimic verbally (in extremely limited form) the endless subtle permutations of life.

As an agnostic polytheistic fundamentalist, I’m not deeply invested in the question of whether the abstracted nature of parsimony shoots holes in atheists’ dismissal of God – for purposes of real life, that is, as distinct from the armchair, where we can sort out a small and artificially structured interpretation of reality in our heads.  People may differ in their willingness to accept that the universe confounds expectations.  I’m less interested in God than in life – in its eagerness to spew forth copy after copy, many mutating slightly from the original.  However useful parsimony may be for the formulation of abstract theories, it is not very helpful in characterizing existence.  Parsimony yields the conclusion but ignores the steps to it – giving us, say, the survival of a species with superior reproductive capability, but without the drama and the adventure, the struggles spanning months or eons that such victory required.  Parsimony is a reviewer who tells us how the book ends but not how it got there:  by definition, parsimony trims out everything it considers extraneous.  You can force it to come to the microphone and recite the progress of the marathon, but it will do so in plodding fashion, without imagination; it really just wants to state the outcome and be done.

In this post, I have parsed parsimony because it gets misused.  There are people who think they have provided an especially cool explanation when they make it as cryptic as possible, as though one dare not risk the wastage of a word.  This mentality, conducive perhaps to certain kinds of activities (e.g., the drafting of statutes), is in considerable conflict with the nature of life.  Life demands and practices constant, florid elaboration (in e.g., the interpretation of statutes).  We are closer to life’s spirit when we postulate not zero, not one, but a thousand gods, a dozen theories, a score of scenarios.  Closing down the options is a concession to our limits and/or our preferences, not to what may be.

As part of life, people deliberately and indifferently indulge arbitrary and superfluous thoughts, actions, and characteristics.  There is always much more to know, even within a rigorously parsimonious scientific mindset; but in fact there is vastly, overwhelmingly much more to suspect.  The principles of parsimony and noncontradiction can help us to think about things, but they can be overindulged to the point of blinding us to possibilities that life has probably already recognized.

The Life Imperative: Life as Prison or Addiction

A previous post points out that life depends upon constant, unavoidable killing.  There is a question of whether a person should continue to participate in that sort of thing, or should instead voluntarily leave this life.  It seems that an answer to that question may depend, in part, on how one understands the situation.

One possibility is that life is an addiction, or at least that it has important and thought-provoking features that resemble the experience of addiction.  If one thinks of everyone as being addicted to staying alive, then it is no surprise that vast numbers of people, including parents, police, and other authority figures, would be engaged in a sort of brainwashing that would overstate life’s glories and would downplay its bad parts.  Such behavior would be consistent with addiction experience: there is always an excuse for partaking, always a reason that drags the addict back to indulgence.

The excuses for staying alive vary, but the outcome is the same.  People may contend, for example, that the thought of ending one’s own life is sick or confused, or that the person who kills him/herself will go to Hell.  But all of those propositions are open to question.  It turns out that the Bible does not support the claim about Hell; the person who sacrifices him/herself to save others is regarded as a hero; the individual who is on the edge of suicide may not feel confused at all, but may actually possess sudden clarity.  Hence the arguments against suicide could look like the sorts of things that any addict would say, or would listen to, in defense of the addiction.

But can life really qualify as an addiction?  Definitions vary, but it seems that addictions tend to entail recurrent indulgence in immediately gratifying behavior that the addict cannot stop despite its adverse longer-term consequences.  Taking that definition apart:

  • “Immediate gratification.”  It seems that many people strive to stay alive even when life is not very gratifying.  But then, drug addicts often continue to take their drug even when they have built up a tolerance and are no longer getting high; they may just need to avoid the agony of withdrawal.  Maybe that would still count as a form of short-term gratification, when people cling grimly to life despite the absence of any joy in it – despite, indeed, a growing dislike or fear of it.  They still need the experience, they are still making excuses for it, they are still not ready to make a sharp break from it.
  • “Adverse long-term consequences.”  There is, indeed, the perpetual killing that makes each person’s life possible.  But as discussed in an earlier post, there does not seem to be a better long-term alternative.  For one thing, we don’t know whether an afterlife (if any) would be better; we also don’t know what unexpected things might happen if we stay around.  Also, our disappearance will not change the basic problem:  elimination of all killing would probably mean the elimination of all life.  The point of that exercise would not be clear:  we would have killed everything in order to stop killing.  Meanwhile, there are living things (e.g., trillions of bacteria in our guts) that depend on us, or whose lives may be made longer or better through our existence and our efforts.

These thoughts suggest that the addiction metaphor may not be the best way of characterizing the decision to continue living.  There is the problem that life does not always provide short-term gratification; and even if one waives that on the grounds that gratification may consist of merely avoiding the discomfort of withdrawal, there is the problem that the decision to stay alive does not obviously constitute a disregard of undesirable long-term consequences.

Rather than characterize life as an addiction, it seems that it might be better described as an imperative.  That is, the decision to continue to live seems to be a response to a command that we must fulfill.  This feeling of obligation may be especially strong in those who are responsible for taking care of family members or other people, but it tends to be quite strong even without that.  To be sure, there is the occasional suicide epidemic; there are young daredevils who willingly place themselves into harm’s way.  But these tend to be exceptions that prove the rule:  you are supposed to preserve your life. It is a strong rule, less frequently violated than any of the Ten Commandments, and yet not included among them or, usually, among other laws.

How can so many people agree on this imperative to preserve one’s life, when people are inclined to disagree on virtually everything else?  Perhaps the answer is that, actually, people are not in agreement about it.  At any moment, there are likely to be people in your vicinity who are silently struggling with physical or mental pain that they would shed in a heartbeat, if they saw a realistic alternative.  It seems that people might abandon this life in droves if they could look beyond the grave and see the specifics of a viable life after death.

Under the circumstances of real life, with no practical alternative, the life imperative appears to foster a kind of confinement.  We are more or less obliged to stay in this life, even in the absence of commandments, because there does not appear to be any real choice.  Life is not merely an imperative; it is, in effect, a prison.

That will hardly seem obvious to people who are living it up.  But consider prison life.  Depending on the prison, there may be enough to keep you occupied:  classes, reading, writing, researching law, working out, sitting and watching.  There are some perks:  the health care is said to be better than many Americans can afford.  Life is simple.  Some people, especially those who have become accustomed to it, actually prefer it to life outside, where they don’t know the rules.  It’s not heaven; but then, neither is this life.  The point is that it can be tolerable.  Many incarcerated individuals, offered a chance to join in on a highly risky escape attempt, may just not bother.

Life, like prison, tries to be a place where people can stand to stay put, and people reciprocate by trying to make it bearable.  In effect, you and I are in cahoots with life:  we don’t remotely want any more of its harsh discipline; but since we have no realistic alternative, we try to dress it up as a worthy affair.  Like Stalin in the U.S.S.R. during World War II, life may be a son of a bitch, but at least it’s our son of a bitch.  We would rather have something better, but we don’t get to make that choice.  Life, like Stalin, may torture and kill tens of millions, sometimes merely because they were too trusting or just didn’t fit in the plan.  But for you and me, the choice is pretty simple:  become part of the steamroller, or else become part of the road.

As a person who is alive and intends to remain so, I can hardly help being one of those facilitators of an oppressive regime.  Specifically, I don’t encourage anyone to kill themselves.  I do believe the things that I have written in that separate post about preparing for suicide; I think there is much to do, and much to think about, before ever taking such a step.  Nonetheless, in the interests of fleshing out the prison metaphor, an honest appraisal of the situation does seem to call for acknowledgement that it takes courage and, probably, desperation to reach the point of committing oneself to an escape attempt.  When the moment for decision arrives, and you see an opportunity to get out, it is not completely boneheaded or illegitimate if you flatly declare that you cannot stand your present circumstances anymore (if that really is how you feel about it) and that you are willing to take any risk to try something else.  It still may not be the smartest move, but at least it may be understandable.

I am not suicidal because I just don’t see the advantage of a prison break at this time.  I am a participant in life’s murderous regime.  I participate in the killing of plants and animals because everything must die anyway, with or without me.  Is it better never to be born, than to be born and then die?  Never to be born, perhaps, if being born somehow scars an eternal soul; the problem is just that I have no idea whether there is such a thing as an eternal soul, nor what exactly this life might do to it.  Based on the information I have, it appears that I am supposed to just play along, for as long as I can, or at least until I have a clear reason to do something else.  I am invariably able to tolerate life; my life does have its pluses; and if I am imprisoned in this life, I have grown accustomed to it.  It is acceptable.
