A diary of the self-absorbed...

Thursday, April 22, 2010

The Science of Flipping Birds

I recently read a quote from physicist Richard Feynman: "Philosophy of science is about as useful to scientists as ornithology is to birds." Feynman was a brilliant thinker; of this there can be little doubt. But with this particular quote, Feynman exposes an all-too-familiar naivety emerging from some of our brightest minds. Ornithology may not be of much use to birds, but birds don't drop atomic bombs. There isn't a middle finger buried in the science of ornithology capable of wiping out the planet, or conversely, a helpful set of thumbs developing MRI technology.

Of course it goes without saying, or at least it should go without saying, that a philosophy of science is what kept Feynman's research funded at all. It was that same philosophy of science that took what he and others like him did and 'applied' (pun intended) it to the so-called real world. In other words, a fully funded philosophy of science was the teat from which he was 'granted' to drink, as well as the containers into which he spilled his milk. (Yes, granted was another intentional pun.) I say that it should go without saying, but obviously it doesn't, or I wouldn't be saying it.

In one sense, Feynman's quote is spot on. You might recall from an earlier blog post that the last mandatory sterilization in America occurred when I was in the seventh grade, and not a single scientific rule was violated in the process. It should come as no great surprise that a philosophy of science was of no use to the actual mechanics of the procedure. Of course the same thing is true of the discovery of penicillin, or the manipulation of the stem cell. At a most rudimentary level of thinking, one could argue that the science itself happened in a proverbial vacuum.

The only problem with such base and archaic reasoning comes from the fact that human beings did the science. And we don't touch anything in this world without a working metaphysic, even if such a metaphysic is an act of primitive nominalism. Maybe I'll come back to that later in another post, but for now, it is sufficient to say that our ideological fingerprints emerge all over the science we do. Feynman is right in that these fingerprints have no bearing on the science itself, but they certainly were a force in both funding the laboratory and applying the results. Pharmaceutical science is beautiful for the entire time it's in the test tube, and both the economy and ethics of it are of no account. But to get it under the microscope in the first place required a series of steps that in no way correspond to the scientific method, and of course once it comes out of the test tube, we begin in 'earnest' to attach a price tag to it. ("earnest" – three puns, going for a record).

Feynman's resistance to the philosophy of science displayed in this quote is a bit ironic, given his statements dredged up by Ian Hacking in the book "The Social Construction of What?"

"Mathematically each of the three different formulations, Newton's law, the local field theory and the minimum principle, gives exactly the same consequences. What do we do then?

You will read in all the books that we cannot decide scientifically on one or the other. That is true. They are equivalent scientifically. It is impossible to make a decision, because there is no experimental way to distinguish between them if all the consequences are the same. But psychologically they are very different in two ways. First, philosophically you like them or do not like them; and training is the only way to beat that disease. Second, psychologically they are very different because they are completely un-equivalent when you are trying to guess new laws." (Feynman, 1967; emphasis mine)

Here we see another potential problem embedded in the science itself. The way we choose to begin often determines the path we take to reach the end. Selling out to one formulation could potentially determine the way you go after the next. This concept is explored a bit in my blog below on marginalization and the booger-ditch of psychological research. Without a philosophy of science, even under the microscope, we might potentially be choosing between two sets of methods, both of which work. With enough funding and application, we might be able to determine which works best, at least in any given moment, but that's unlikely, since once a working way is found, the funding gets amplified in that direction. Still, there's no denying that reverse engineering on the stem cell (a mostly ideologically mandated move) did in fact lead to discoveries down a path that tinkering with the stem cell alone would most likely never have forged.

In this particular case, the scientist suckled from the teat of least resistance until he was all but forced to work the other breast by the notorious G.W. Bush executive order, steeped in ideological metaphysics. Now research continues on both nipples, and if you read enough, you'll see the scientist take his lips back and forth between them, trying not to accidentally spill anything on his metaphysical chin. And of course, we shouldn't forget the baby bottle waiting to catch a few drops of this genetic milk -- with a big WiCell sticker wrapped around it. After all, they hold the patent to the bulk of it anyway. They are the ones doling out the samples and stand to reap the first fruits of applied science.

But we, the idle birds of reason, need not think on these things. There is no good reason to flip the birds, or turn the science. Ornithology doesn't matter to birds, even if, through such probing, our wings will be mended, or sometimes clipped, as we fly across these lawns and man-made nests.

Monday, April 19, 2010

Strife

One of the oft-quoted and oft-misrepresented verses in the Bible comes from Psalm 46. The way you probably memorized it is – “Be still and know that I am God.” But a more accurate translation probably reads, “Cease striving, and know that I am God.”

People who know me from way back, meaning seminary and before, know that I’ve always been a big fan of that mystical hooligan, Meister Eckhart. At a time when the Church needed something out of the box to challenge papal authority, Eckhart carefully dismantled it and ripped again the veil of religion held up between man and God. Eckhart was a big-time fan of silence and of Eastern concepts like ‘emptying’ the soul in order that the goodness of God might refill it. I love the guy, and all the reasons I love him are the same reasons he was put on trial for heresy. So I figure I’m likely in good company, because the same psychology nailed Jesus to a cross and burned the best minds of many generations at the stake.

As good and as wonderful as silence and “being still” may be, to interpret Psalm 46:10 as some kind of Eastern spiritual thing just doesn’t come close to the absolutely stunning and painful context of the saying. Let’s look at Psalm 46 in the context in which it was written:

8 Come and see the works of the LORD,
the desolations he has brought on the earth.
9 He makes wars cease to the ends of the earth;
he breaks the bow and shatters the spear,
he burns the shields with fire.
10 "Be still, and know that I am God;

Or more accurately put, cease striving and know that I am God.

Like so many of the Psalms, this one is about theodicy, or the problem of evil in the world. Here’s the kicker (and if you’ve read much of my apologetics, getting kicked in the groin is a metaphor I use often, so please forgive if you’re the easily offended type, and don’t get miffed if, in learning you are easily offended, I intentionally place a few kicks there… it’s a character flaw that I’m working on); so here’s the kicker: this “be still” stuff is written into a context of divine depravity dispensed upon mankind, coupled simultaneously with its removal.

You might remember from Genesis, and it matters not what you take literally and what you don’t, that God asks a parenthetical question, “How long am I to strive with man?” The answer that gets churned out is “about 120 years.” (Genesis 6)

Super. So let’s strive. You vs. Me… It’s on, baby. And what I can’t finish, I will leave to my kids, and grand-kids, and great-grandkids, and as history seems to indicate, to generation after generation. Strife. Religious strife, theological strife, scientific strife, political strife… pick it and strive.

I can’t say I loved the movie, but I certainly loved the closing lines of Legends of the Fall:

“I was wrong about many things. It was those who loved him most who died young. He was a rock they broke themselves against, however much he tried to protect them.”

So it is with strife when you choose an Eternal opponent who refuses your definitions. He becomes the rock you break yourself against.

The thing I like best in Psalm 46 is verse 8. David, the likely poet who wrote these words, doesn’t even bother with an argument against God’s goodness. He does that in other places. Instead, he acknowledges “the desolations God has brought upon the Earth.” Buber once stated that nothing can doom a man but the belief in doom. Our faith pays homage to the medley of the gods... biology, chemistry, physics: gods which offer liberation from magic and superstition in exchange for the laws of entropy and decay. I guess I fail to see the difference; both are indicative of strife.

Stop striving, the Psalmist says. The bows will break, and the shields will burn.

I think the hardest lesson for me to learn – one I still struggle with daily – is the fact that I clothe my strife under the banner of “justice,” and that at the end of the day, one truth remains: I love justice more than I love God. He knows this about me. And we strive.

“Being still,” at least for me, is about much more than just the silence.

Tuesday, April 13, 2010

Booger Ditch Psychology, Part Two; or ... The Three Christs of Ypsilanti

Given the considerations below, how exactly is the “self” placed in the context of normative expressions and experiences? Given the fact that verbal and auditory hallucinations are descriptively classified in Western society as “disorders,” they immediately become marginalized as systems of meaning. Not all cultures respond in this way to hallucinations. Many cultures consider these activities as rites of passage, normal expressions of the unconscious, the spiritual world, perhaps even “normal” events within the language faculty of human beings. Speaking with imaginary figures, hearing voices, or dissociating through an “out of body” experience are descriptively acceptable forms of expression, originating as meaning constructed by the individual. Rather than denying a legitimate placement of the self within this context (as is the approach of most Western psychologists and psychiatrists), these other cultures employ the spiritual resources of the community to help plant the “self” firmly within a (perhaps constructed) universe, which is bigger and more influential than the empirical realities. In other words, the individual suffering from hallucination X may in fact have a completely separate reality in which to construct his experience and interpret his data, and thereby, with enough tolerance from the community, lead a relatively productive life.

The disorder of self occurs in the Western paradigm when the value of these illusory experiences becomes marginalized: a self-fulfilling prophecy. Rather than incorporating the individual’s reality and working therein, the therapist (or worse, the pharmacist) develops an “elimination” plan, through which the aberrant behaviors are relentlessly questioned, deconstructed, and hopefully replaced by structured expressions of a more socially acceptable nature.

I don't want to be misunderstood. We should assume that disorders do exist, at least incorporeally, and we need to see them as such for the sake of those suffering from their manifestations, and of those suffering on account of them.

Certainly there are cases in which it is in the interest of the collective community, or the immediate family, to adopt this Western approach to problems as they manifest themselves. But what if these interventions involved less deconstructing of the psychosis and more of a reconstructing approach, using the internal world view of the mentally ill to build an illusory reality in which he or she can operate?

This was precisely the approach of Milton Rokeach in dealing with his Three Christs of Ypsilanti. As more of a social worker than a psychologist (he was technically a social psychologist), Rokeach worked for several years with three men who each believed they were Jesus Christ. He brought the three men together in a group in hopes of un-establishing the deific world in which they had placed themselves. After one year of the project, Rokeach reported his findings:

"The three Christs had each adjusted to their new way of life; each in his own way had learned to cope with the others and with us…. The novelty and shock of confrontation had worn off. Each one had formulated and stabilized a set of rationalized beliefs to account for the claims of the others, and these rationalizations were bolstered by a silent bargain and repertoire of rituals designed to avoid the tension-producing subject of identity."

He decided to try a different approach after the first year. Entering into their delusions, Rokeach immersed himself in the worlds of the three men and wrote confrontational letters to each of them, signed in the name of their respective delusional creations. The men never knew it was Rokeach who penned the letters. In the correspondence, Rokeach challenged the men’s delusions by becoming a figure of authority as defined by the illusory “selves” each man had constructed. He found that the delusions were best countered by advice from a perceived authority figure emerging from within the world of the deluded self. Drawing on Bruno Bettelheim’s study of concentration camps and of anti-Semitic Jews who changed their concepts of self by adopting a new referent, he found that the three Christs were much more likely to identify with the ideology of either an aggressor or a perceived authority. This approach has been used numerous times, and is currently bedrock to the theory that abductees will often grow to care for and relate to their kidnappers. But Rokeach argues that in the case of a psychotic, the external referents have become altogether untrustworthy, and constructing a respectable authority was necessary in his treatment of the three Christs.

He advances the following hypothesis:

"A normal person will change his beliefs or behavior whenever suggestions for such change are seen by him to emanate from some figure or institution he accepts as a positive authority. Either he will change his beliefs and behavior so that they conform with what he believes positive authority expects of him, or if he cannot or will not change, he will alter his beliefs about the positive authority itself; he will become more negative or more disaffected with the authority and, in the extreme, he will even formulate new beliefs about new authorities to rely on."

Given the nature of current psychological theories in dealing with disorders, it’s hard to imagine the therapist as a “positive authority,” especially given today’s reliance on prescriptive medication to “cure” patients. This “pharmacological hedonism,” as it is aptly named by historian Edward Shorter, can sometimes only serve the individual by the alleviation of symptoms. I can attest to the real, unavoidable power of such drugs, having experienced them firsthand via anti-depressants, and I must truthfully confess that having such symptoms relieved wasn’t inherently a bad thing.

Up to this point, I have considered the limits of both statistics and language in collectively classifying and defining disorder (and I have omitted two sections, one on Noam Chomsky and the other on Jung), but what opportunities does language offer the field of psychology — particularly considering that language is one of our only windows into the unconscious? Rokeach’s approach seems beneficial, albeit with a certain degree of risk. Entering linguistically into the world of the psychotic seems like dangerous business, especially given the potential lack of a perceived positive authority. To the primitive, it was the shaman or spiritual magician who guided individuals through their various psychological states, much as the scientific magician, the psychologist / psychiatrist, does in the West today. By constructing a mythological tale, even a mythological world, in which to place the individual’s fears and hallucinations, this primitive therapist created a space for the unconscious to structurally express itself through prescriptive speech acts. Unfortunately, there is little research today to suggest whether this approach is valid.

What we do have are modern religious experiences, which border on the cusp of this primitivism and its latent perception of authority. Several empirical studies have been done on a variety of religious states and their effects. Some studies have shown that mystical spiritual beliefs and practices relieve stress, and that believers are more likely to define “mental pathology” in a positive light when it is experienced in a spiritual context. Modern psychology would have diagnosed Joan of Arc as schizophrenic, paranoid, hysteric, or epileptic—maybe all of these—but in the context of her own mythological understandings, she was just a woman of devout faith; and in that context, she was able to become a powerful historical figure. How many other Joans of Arc are locked away in padded cells, or drooling quietly under the haze of medication? I suppose they at least aren't wielding firebrands... so we have that going for us.

William James defines “healthy-mindedness” as one positive aspect of the religious experience, lending itself to numerous positive outcomes. Persons of this type of religious experience are intrinsically motivated to live out their beliefs in a highly personal, altruistic way. He contrasts this type of experience with those religious folk who are extrinsically motivated by creed and dogma. These sorts are called the “sick soul” by James and are continually surrounded by guilt, anxiety, and other neuroses.

Admittedly, I am no expert on the current trends of narrative psychology. But if we are going to rely on magical classifications to diagnose and hopefully “cure” people, I wonder what harm could possibly come from a spiritually-minded therapist, individually focused and willing to help the struggling patient solidify and construct an inner dialogue and mythology to deal with psychological stressors? If our language faculty is truly a part of our unconscious make-up, then it seems reasonable to suggest that part of that faculty is mythological expression. History and anthropology would suggest this is also the case, in that every culture has at least some form of belief structure—be it the sun, an anthropomorphic god, or monotheism. These seemingly imagined authorities are no more incorporeal than current psychological theories, and building an empirical vocabulary around them may be beneficial to treating a suffering individual and a society with a steadily increasing influx of descriptively “disordered” selves. Is it a coincidence that the advent of modern Western psychology coincided with rationalism and the decline of religious sentiment? Perhaps, though Thomas Szasz says it is not. It may be that the unconscious of many still needs a structured expression of spiritual dialogue in which to place their “selves” and their stressors. Structured speech, given from a perceived spiritual authority and focused on the individual’s needs, may hold at least one key to unlocking the problems of today’s mentally ill. The approach seems no less reasonable than modern psychology and its constructions.

Jung reminds us that in a time when people are disregarding spiritual ideologies, it might be good to remember why they were constructed in the first place: to deal with the brutality, vulnerability, and anxiety of the ancient world. The more they are discarded, the more these brutalities and stresses are unleashed upon the psyche. I doubt I would disagree as to the great psychological “crutch” religion has been to the world since the dawn of consciousness. However, re-establishing the spiritual props of religious dialogue and authority may be every bit as helpful as the magic of mental health. We need therapists as brave as Rokeach to find out.

But then again, we could just give them pills.

I admit that after my four-month stay as chaplain in an insane asylum, I do find myself wondering.




The Three Christs of Ypsilanti, Milton Rokeach: Alfred A. Knopf, Inc. (1964), pp. 190–194.

A History of Psychiatry, Edward Shorter: John Wiley and Sons, Inc. (1997), pp. 325–326.

Sunday, April 11, 2010

Booger-Ditch Psychology

“A rose by any other name is still a rose.” At least that’s the way the saying goes. But is this necessarily the case? Words are representations of objects (referents) that are defined by other words, and thereby incorporate, often by accident, what is and is not important about the object. In the Western paradigm, or meaning system, what is dominantly important is that which can be seen empirically, or proven logically. Ironically, it is this very empirical existence of the individual that the psychologist / psychiatrist tends to back nervously away from, choosing instead to work in categories and classifications according to the dominant scientific paradigm. Creating “kinds” is a primary task of scientists—classifications of reality, which make communication easier and reality more accessible to logic. This “kind-making” is not a bad practice for the scientist, and is arguably helpful to many different fields. The problem for psychology in adopting a “kind-making” approach to human behavior is that humans don’t classify out as nicely as the periodic table. So a great deal of “picking and choosing” has to occur for psychology to make its classifications. This means some features will be necessarily excluded, and others included, in the considerations of normalcy and disorder.

Linguistically, this can pose a few problems from the start. Take for example the word “face.” What comprises the face? We would be quick to define it as the front part of the head containing the eyes, nose, and mouth. Some might even add the lips, chin, cheeks, forehead, or eyebrows to their definition. But what do we make of the small trough beneath the nose and above the upper lip? What exactly do we call this feature of the face? When I was growing up, it was referred to as “the booger ditch.” While there does exist a medical term for this indentation (the philtrum), there is no common word for it in English. It would be about the last thing one would use to describe the face, because no clear word to describe this area exists. Our system of meaning considers this area negligible, even though it is present on nearly every human face. In fact, one of the only times the area is noticed at all is when there is an area-specific deformity there, such as a cleft palate.

The singularity of the “booger ditch” works against itself in the area of definitions: eyes stare, blink, and tear; mouths move and express; noses bleed and run. What does “the booger ditch” really do? It doesn’t seem to have much of a purpose for English speakers; therefore, as a singular feature, this area is conveniently marginalized in our vocabulary. So is a rose by any other name still a rose? The answer is a solid “yes,” but only if our systems of meaning, relevance, and subsequent classifications are identical—or at least similar. It is possible to envision a language in which the words “red,” “petals,” and “stem” are not the primary features a native speaker identifies with a rose. And while conversing with this native speaker across a divergent interpretation, we may be discussing the same empirical referent, albeit with two completely different understandings. And most importantly, even if we come to the same understanding, we have almost no way of realizing what may have been marginalized and framed into our meanings.

This is pertinent to psychological classifications for one primary reason: in generating a label for aberrant behaviors, one chooses to include and to marginalize various features of the disorder. What is marginalized is often what is seemingly unrelated, or without purpose. In the Western paradigm, behaviors are included as referents for disorder, as are speech acts, namely because these activities can be seen and/or heard. Provided we all agree together that the empirical referent we see and hear is fundamentally real and not imagined, we are diagnosed as being relatively ‘whole,’ psychologically speaking. Yet what our psychologists operating under the scientific paradigm may quickly marginalize as purposeless is often what cannot be empirically seen or heard—something “incorporeal,” if you will.

Interestingly, incorporeal carries with it two meanings. First, as defined by both Webster’s dictionary and the Oxford Dictionary of English, “incorporeal” means that which pertains to the immaterial, or the spirit. Second, in the field of law, “incorporeal” is something that has no material existence in itself, but attaches itself to some actual thing. An example of the latter definition would be the concept of rent. The house I am renting is corporeal, and I, the occupier of the house, am also corporeal, as is the money I use to pay the rent; however, the rent I pay is incorporeal. The very concept of rent is incorporeal, yet integral to the ideology of economy and property. Hence, to carry this concept over to the ideology named psychology: the behaviors and speech acts manifested within the meaning system are necessarily empirical, while the concept of “disorder,” or a root cause, remains ideological, perhaps even illusory.

Second, and notably absent in psychological theories, is the definition of incorporeal as “spiritual, or that which pertains to the spirit.” Linguistically, the singularity of the concept of spirit lends itself to being easily marginalized. It is difficult to speak of outside of metaphor. Most things spiritual are, in a sense, the “booger ditch” on the face of scientific psychological theory. This is largely due to the lack of an empirical base, and therefore the spiritual type of incorporeal meaning must be determined retroactively, as in a metaphor, or the creation of a new signifier (word).

Safouan describes this quite particularly by relaying the story of one man’s use of the word “famillionaire.” An apparently well-known fellow described his familiarity and social popularity by creating a new morphological term, “famillionaire.” The signified meaning, which existed semantically in unconscious or incorporeal form, was engendered retroactively through language. Now it is possible to imagine a kind-making exercise in which one picks popular millionaires and locks them into a new classification called “famillionaire.” Famillionaires existed among us for many years before the creation of the class, and it can be argued that we subconsciously recognized these famillionaires as famillionaires when we previously interacted with them, or saw them on television. Now that we have brought these famillionaires to the surface of thought, we can arrange them better, define them, set limits on how much popularity or wealth is needed to become a famillionaire, and perhaps even begin to deal with the anxiety famillionaires face on a daily basis. This new “kind” of anxiety is called “famillionitis.”

Why is any of this significant? Because psychology would have us believe that disorders are transformations of, or deviations from, established behavioral norms, rather than productions of an incorporeal belief structure belonging to an individual. In other words, the sources of some meanings are produced intrinsically, separate from the accepted system of meaning. I may have been suffering from famillionitis years before I had a word for it. This brings us back to Jung, who insists that the “magic” of a classification derived from a statistical mean dangerously grants us the power to negate, or transform, individual meaning. In the same way, language classifications may actually be “magically creating” kinds of aberrant behavior. As soon as we begin to discuss a subject from a theoretical standpoint, the prescriptive productions of the individual are interpreted and defined descriptively.

This is even more dangerous than it would first appear. Systems of meaning are not easily challenged; it in fact takes a certain degree of linguistic effort to do so, because the words afforded to us are limited by the meaning system. These words actually have a profound influence on the way in which we think, as noted by the Sapir-Whorf hypothesis. The Sapir-Whorf theory of linguistic determinism implies that the kinds of words available within a system of language significantly shape perceptions of the world—a type of unconscious self-fulfilling prophecy. If some meanings can only be revealed retroactively through language, we may very well be marginalizing meanings by a necessarily limited definition of “disorder” X. For example, “Jerry” never realized he suffered from “famillionitis,” but after learning the term, he now interprets all his anxiety as derivative of the conditions of his wealth and fame. Jerry understands now why he yells at his wife and kicks his dog. If no other language is advanced to help Jerry define his activity, he is linguistically determined to filter his meanings through this new descriptive language, as opposed to his prescriptive, unconscious tendencies.

In a more recent study (2001), linguists compared emotional expressions in English with those of Russian. The study revealed that the English language objectified emotions to a much greater degree than the Russian language. Whereas the Russian speakers mostly used verbs to describe feelings of anger or frustration, the English speakers employed more adjectives expressing emotional “states.” This is not uncommon for English speakers when compared to many other languages. Consider the morphology of the English statement “He will like me.” In our language, the words appear independent, autonomous, and objectified to a large degree. This is not the case in most other languages. In Swahili, for example, the statement “He will like me” would be represented in a single word, transcribed something like this: “atanipenda.” The morpheme “ni” embedded in the word represents the English word “me.” The objectification of the subject is most noted in younger Western languages, particularly English. This objectification is clearly an influence on the way in which we see the world, indicating that even our language code can limit expression of the subjective.

Given these considerations, it is not too far a leap to see the way in which psychology utilizes language, particularly through classifications, as a type of “incantation,” whose magic is capable of shifting the incorporeal productions of individual-prescriptive meanings toward objectified theories of “self” from which one deviates descriptively.

(To be continued...)



References:

Journal of Literature and Psychology, Volume 46, Issue 1/2 (2000), pp. 29–42.

Cambridge Encyclopedia of Language, 2nd Edition. David Crystal, editor: Cambridge University Press (1997), p. 15.

Expressive emotions in multiple languages. Jean-Marc Dewaele & Aneta Pavlenko: Language Learning 52, 2 (2002).

Friday, April 9, 2010

My End is My Beginning

"My end is my beginning." I love T.S. Eliot, and I particularly appreciate this sentiment. I've found it to be true in so many different areas of life -- from nature, to relationships, even to something as mundane as changing jobs.

Where I've never seen the principle work (or at least work very well) is in writing. I suppose the cliff-hanger has its place in literature, but sometimes I feel like I am getting jerked around. One thing I've always tried to do in writing stories is to offer up a clear beginning and ending. It just seems like a common courtesy to the reader.

Nevertheless, there is a rub in it all for me. Writing beginnings is easy. I have hundreds of stories that I started and never finished. I like getting to know my characters as I write, and exploring the setting of the story as I move through it. Beginnings are easy because beginnings are fun. I would compare writing them to starting a new relationship with someone. The excitement is always there, and the act of discovery doesn't seem to wear thin.

But like all things, newness doesn't last. Familiarity sets in and with it, sometimes, boredom. It's easier to just walk away from the story and start another one. That's what separates those who dabble from those who do. Getting through those middle parts of the story is work. It can be laborious. I've been stonewalled at 50,000 words much more often than I want to admit. And I've walked away during those times for months, even years at a time.

Doers push through and do the hard work. Dabblers start a new story. But the hardest work for the doer comes when it is time to type two simple words: THE END. I have found it so hard to end a story -- by the time I have worked through the characters, lived with them for many months, I just don't want to let them go. Even when I have already scheduled out my plot points and know what is going to happen to them, I just don't want to turn them loose to the devices of literature.

This feeling bothered me so much that, many years ago, it occurred to me that something theological was taking place in my relationship with my characters. I was their 'god,' so to speak. I created them, I plotted out their life course, I was responsible for either allowing bad things to happen to them or intervening in their lives to shield them from harm. That opened up something new in me spiritually, as I contemplated my own existence and relationship with the Divine.

So I started writing a novella entitled Nova. It was the story of a Native American woman named Nova and her cross-country journey with a group of settlers whom she ultimately betrays to her tribe to be murdered. I wrote both the foreword and the conclusion in my own voice, as author-god of this story world and these characters. I had to ask myself some hard questions about why I was writing a tragedy when it could just as easily have been a comedy.

I decided that the answer to that question was moot. "Of all the worlds I could have created with these words, this was the world that had to come first." I went on to use a quote from theologian Martin Buber about releasing one's own sense of 'doom.' As I continued comparing reality and fiction, I slowly changed.

I reached the end of a long, personal struggle with the problem of evil in the world. And at the end of that struggle, and that story, there was a new beginning. Then it sort of struck me that this was truly the intersection between literature and reality -- books give us endings, but in each ending, the good ones will always leave us with something new: a thought or maybe an emotion that urges us to another place, or along the path of another journey. And this reality seems nowhere more evident than in our spirituality and our spiritual stories.

I'll resist the urge to launch into an apologetic defense of that last statement. I'll just say that I believe it to be very true, and walk away with this ending; and as I do, I walk into a new beginning.