A Response Boston Catholics Can Be Proud Of

Catholics and their supporters gather in St. Paul’s Church in Cambridge. Photo by Boston Globe.

Crowds walk in Eucharistic Procession from MIT chapel to St. Paul’s Church. Photo by Boston Globe.

Crowd in St. Paul’s spilling out the door. Photo by Boston Globe.

As many are aware from coverage in the local and national news, a satanic “black mass” was to be performed in Harvard University’s Memorial Hall on the night of May 12, 2014, sponsored by a student “cultural awareness club” from the Harvard Extension School. The black mass was to have been performed by members of The Satanic Temple of New York. As was extensively reported throughout the controversy leading up to the event, the main purpose of a so-called black mass is to parody and denigrate the Catholic Mass, which is the most sacred rite in Catholicism and a liturgy beloved by 1.2 billion Catholics worldwide.

Although the satanists insist they are atheists and do not believe in an actual devil or in any supernatural being, the black mass itself is designed to “invert” and mock the Catholic Mass through the use of mockingly similar language, parody vestments and, apparently, the desecration of a Eucharist (a consecrated host). As many people know, the belief that the consecrated Eucharist contains the Real Presence of Christ — the Body of Christ in reality, not in symbol — is a central pillar of Catholic faith. Therefore, even after The Satanic Temple changed its plans about “obtaining” (stealing?) a consecrated Eucharist to use during the black mass and claimed it would use an unconsecrated host, the impending event remained a source of visceral offense for Catholics of many stripes.

But it was also clearly an offense in the eyes of a great many people of goodwill, who supported their Catholic brothers and sisters. Some pointed out that for an event that purported to be about “cultural awareness,” not much “awareness” was shown by the organizers. The black mass would have been no different from any similar performance denigrating a Jewish Shabbat, or a Muslim or Buddhist prayer service, or a Native American sacred ritual. People should renounce any such denigration as uncivil, disrespectful, and hurtful to our human community. At the last minute, following an enormous yet non-violent outcry that evidently included a petition with 60,000 signatures, the student club canceled the event about an hour before it was due to take place.

Nevertheless, my purpose in writing this post is not to recap what happened. My main point is to express my happiness with, and my admiration for, the community’s response to the black mass. The Archdiocese of Boston, led by Cardinal Seán O’Malley, O.F.M. Cap., did not make any call for angry protest marches or fist-pumping. Instead, the archdiocese arranged a Eucharistic procession starting at the MIT chapel and ending at St. Paul’s Church in Harvard Square, where participants would take part in a quiet prayer service to honor Christ in the Eucharist and to pray for those involved in the black mass. The Catholic Church, being full of human beings, will never be perfect, and it is often painfully not-perfect. But I have to say that Cardinal O’Malley and the Catholic community of Boston hit this one out of the park. The Catholic response was not to start fires, not to throw bottles or Molotov cocktails or even pies; the response was not to condemn any person (only the event itself); the response was not to shout angry slogans or incite riots or damage property. Not a single stone was thrown.

Instead, Catholics and other people of goodwill walked peacefully behind a monstrance containing the Eucharist; they walked from one place to another. When they arrived at their destination, they were over a thousand strong and could not all fit in the church. It was standing room only. And it was quiet and peaceful. These people perfectly exemplified Christ’s call to non-violence and humble dignity, and in doing so they manifested the kingdom of God here in this place. They embodied Christ’s exhortation to turn the other cheek, which does not mean lying down as a doormat, but rather standing in peaceful, redirected resistance. It seems to me that last night’s procession and Eucharistic prayer service was a triumph of the Holy Spirit.

© 2014 Elizabeth Keck



The Wisdom of the Simple

Recently, I watched a PBS documentary on the tiny country of Bhutan, which lies south of Tibet. The people of Bhutan live by the philosophy espoused by their leader, who, incredible as it might sound, seems to be the embodiment of Plato’s “enlightened philosopher-king.” He frequently moves among the poor and is transitioning the country to democracy. The philosophy by which he guides his people is known as “Gross National Happiness,” meaning that policies enacted in Bhutan should always be enacted with the goal of happiness for all the country’s inhabitants — and not just the human ones, but also the animals and the environment. This will in turn lead to greater human happiness.

Until only a couple of decades ago, Bhutan had no real interaction with the outside world. They also had very low crime, practically no drug use, and a population who overwhelmingly categorized themselves as “happy.” They were happy even though they were mostly subsistence farmers with no extra money to speak of. Then, with the opening of Bhutan to the global world, televisions and the Internet arrived — and with them, advertising. While most of the countryside population still does not have televisions or computers, many of the city folk do, and have begun to report a major decline in happiness. Crime has risen, as has drug use. Fast-food joints — though no McDonald’s yet — have cropped up, and there is a higher rate of depression. This seems to be partially the fault of exposure to advertising, and to Western ideals of the “perfect” body and the “perfect” life. Bhutanese women, who previously measured themselves according to the traditional notion of the ideal woman — the strong, capable, wise person who holds her household together — now report feeling ugly as they compare themselves to sleek fashion models with perpetual hunger pangs and thousands of dollars of product in their surreal hair. Ads for all the new “must-have” products can be seen everywhere in the urban areas, urging viewers to evaluate the material quality of their lives and find it lacking.

Now you might say: surely the subsistence farmers would be happier with these extra things and the money to go with them, since their lives are filled with backbreaking work and very little formal education. How could they truly be happy under those conditions? However, when the documentarian went to the countryside to interview these farmers who lived in huts with their families, the response seemed universal. They were happy. And they weren’t just saying it; you could see it on their faces. These were people with very little (if any) extra cash, with no modern gadgets or even running water, who spent entire days in rice paddies with yaks. Surprising, then, was their seemingly universal answer to the question: “Would you want more things if you could have them, and more money?” They answered no, as long as they continued to have their necessities and just a bit more for comfortable leeway. They did not want any excess.

More astonishing was their answer to the follow-up question: “Why do you not want more?” Seemingly as one, these simple Buddhist farmers responded, “Because if you have too many things, you’re not happy anymore. Instead you’re always worried about people coming and stealing your money or your things, and you want more. You think you don’t have enough and you become very attached to these things. So then you are not happy; it causes suffering.”

These were not people who had attended some local Buddhist seminary for graduate training. Yet there they were, espousing the quintessential Buddhist philosophy, which as a way of life frequently eludes scholars of religion, and being quite happy about it. They were espousing the philosophy that so many of their urban compatriots had perhaps unconsciously let slip away. Yet this philosophy is by no means limited to Buddhism. It can be found in most religions of which I am aware, including Christianity — or at least Christianity in its purer form, one not watered down by its affiliation with the majority culture in the West.

The farmers’ statements are also backed up by a major recent study showing that people worldwide report being at their happiest when they have enough to cover their basic needs, plus a little more for comfort. In one of the greatest paradoxes, people from all over the globe report that the more excess they have, the unhappier they are. Some of the wealthiest nations have among the highest suicide rates.

I believe that all this says much less about Buddhist philosophy than it does about the fundamentals of humankind. One does not need to be Buddhist to experience what these Bhutanese farmers are talking about. My Catholic grandparents experienced it, living out their lives in their small, but very solid, household. I experience it when I don’t feel the need to buy the latest gadget and throw out the earlier version that I only got last year, which still works perfectly. I experience it when I know that I really don’t want a bigger house — even if I could afford one — or a bigger widescreen TV that would mount on my wall, or even cable. (With that last one, I’m often met with incredulity.) It is true that in our modern society, one cannot disengage from everything unless one enters a cloistered religious order. I do have my computer, my iPod, my cell phone, and a TV. But we can be content with what we have, and not think we need more because someone we know has a fancy car or an Internet TV. Let’s distinguish between what we want and what we need.

So there is wisdom in simple things. There is also wisdom in a simple approach to life and faith. When we become caught up in ourselves, things go from simple to complicated to a hopeless mess in quite a hurry. Pope Benedict XVI, in Jesus of Nazareth, Part Two, writes of St. Paul’s statement that even though he was an expert in the Law, he was ignorant of how God truly worked:

In view of his earlier self-assurance as a perfect disciple of the Law who knew and lived by the Scriptures, these are strong words; he who had studied under the best masters and who might reasonably have considered himself a real expert on the Scriptures, has to acknowledge, in retrospect, that he was ignorant…This combination of expert knowledge and deep ignorance certainly causes us to ponder….Clearly this mixture of knowledge and ignorance, of material expertise and deep incomprehension, occurs in every period of history….Are we not blind precisely as people with knowledge? Is it not on account of our knowledge that we are incapable of recognizing Truth itself, which tries to reach us through what we know? (Pope Benedict XVI, Jesus of Nazareth, Part Two, pp. 206-207)

How we work all this out in modern society is anyone’s guess. Certainly no one is advocating that we renounce education. But as we educate ourselves, as we learn and as we seem to acquire more and more things — including, perhaps, a deceptive sense of our own self-sufficiency — we need to remember humility, simplicity, and happiness.

Copyright © 2011 Elizabeth Keck

The Meaning of Life (Or, Migrating Geese)

For years I said that mosquitoes had no purpose. The entire goal of their lives, I proclaimed with indignation, was simply to fly around, steal other beings’ blood, cause massive annoyance and sometimes terrible disease, only to lay their eggs so the whole thing could begin again. What did they contribute, besides the occasional meal for frogs and bats? Surely they took far more than they gave. What could be the purpose of such a creature?

When I had time to extrapolate this premise out along its logical paths, I then saw that I could apply what I had said about mosquitoes to quite a few other life forms. True, every life form exists within an ecosystem, and in theory the removal of any of those life forms would impact the ecosystem in some way. Remove those very mosquitoes, for example, and you remove a food source for those frogs and bats I mentioned. Would that eventually lead to inordinate pressure on other insect stocks to make up for the lack of mosquitoes? Quite probably. Or take bacteria. Some bacteria live in our guts and contribute in a vital way to our digestion; without them, we could not process certain foods. Some viruses are retroviruses, and I have heard that the development of our vocal cords for speech might have been the result of a retrovirus rewriting our DNA. And yet, some bacteria do nothing but sit in the middle of deep ocean volcanic vents, spending their days enjoying the heat and sulfuric gases. Some viruses do nothing but cause misery to their hosts and replicate themselves. Do these have a “purpose”?

Thinking along these lines, I eventually worked my way up to human beings. Most of us take it for granted that humans have some greater purpose, even if we do not know what it is. Yet don’t we really mean that we have some greater purpose for ourselves as sentient beings? For example, to learn as much as we can about the universe, or to practice compassion, or to further causes of justice, or to cure diseases, or to grow as much as we can spiritually. Those purposes are marvelous in themselves. But are they not quite human-oriented? Beyond the human species itself, such purposes do not make much of a positive impact. Moreover, if we examine ourselves with perfect honesty, it becomes indisputable that humans, of all the species, are by far the most destructive to themselves, to other species, and even to the planet we call home. So what is our “purpose”? What, in other words, is the meaning of life? Not just human life, but all life?

So I had this question in the back of my mind while riding in our car the other day with my husband. En route, we passed a flock of Canada geese grazing on a lawn; they were stopping for a bite to eat along their migration path. I remarked to my husband what an uncomplicated life those geese led (though I grant that they do need to look out for predators trying to make lunches out of them). They eat. They sleep. They have goslings. They keep each other company. And when the weather changes twice a year, they know they need to begin their migration. So they fly to another place and repeat the cycle. Theirs seemed a deliciously simple life.

And that was when it occurred to me: The meaning of life is, simply, to be. What other purpose do all the species, including our own, have? Each species has developed its own kind of life, its own way of living. Within those schemas, the purpose is to live those lives. For the geese, it is to eat, reproduce, enjoy the sun and the water, and migrate. For the bacteria basking in the sulfuric gases of volcanic vents, it is simply to thrive there. For trees, it is to grow toward the sun and generate seeds to perpetuate more trees. For humans, it is to love, to learn, to create, to work, to eat, to think about whether a God or gods are at hand and what that God or gods want, and a myriad of other things too numerous to list.

The idea that the meaning of all life is simply to be was reinforced for me later in the day, when in the course of genealogical research I happened to come across my grandfather Louis’ birth record and death record juxtaposed online. There it was, his birth record, proclaiming with all the hope and all the possibilities in the world that this baby boy was born on May 11, 1911. All those years were ahead of him, and he was about to live them. And there right underneath it was his death record from 1992, signaling that he had lived those years and gone on to whatever then awaited him. And there are billions more birth records and death records, with more added every day, and they go on, and on, and on. This fact does not in any way marginalize all the events and loves and everyday wonders that occur in every life. On the contrary, it affirms them, as every life of every creature is lived within that meaning of simply to be.

Copyright © 2011 Elizabeth Keck



He Who Lives By The Sword

Of the killing of Osama bin Laden, the Dalai Lama, always the voice of ultimate compassion, said that while bin Laden might have deserved forgiveness as a human being, “forgiveness doesn’t mean forget what happened…. If something is serious and it is necessary to take counter-measures, you have to take counter-measures.” The Vatican, expressing what was likely a similar sentiment, noted that bin Laden was responsible for the destruction of countless lives, spreading division and hatred, and manipulating religions to that end. The statement concluded: “In the face of a person’s death, a Christian never rejoices, but reflects on the serious responsibilities of each person before God and before humanity, and hopes and works so that every event may be the occasion for the further growth of peace and not of hatred.” Notable in this statement is the absence of any condemnation of the manner of bin Laden’s death. These statements from these respective religious leaders are striking, because both the Dalai Lama and the Vatican can usually be expected to make statements affirming life in all forms and disapproving of actions that lead to another’s death.

In the last several days, religious people of different traditions have struggled with conflicting responses within themselves to the news, and have sought to discern whether their responses are in accordance with their faith. On Sunday night, thousands of people rushed into the streets, cheered, waved flags, sang, and chanted “USA!” Reactions against this jubilation have come from some quarters since, declaring it uncivilized or ignorant. Others refrain from judging that jubilation, but express concern that we should not celebrate or feel satisfaction at the killing of even someone as heinous as bin Laden, even though we are all better off with him out of commission, and even though he received every bit the justice he deserved.

But I think the tempered responses of the Dalai Lama and the Vatican speak to the reality that even if we know our noblest selves might refrain from taking satisfaction from this event, this was a man who, together with his associates, murdered innocent people. And not just our people, but many others across the globe, including Muslims. This was a man who spread hate like a bacterium. He needed to be dispatched from this world for a functional reason – that is, so that he could no longer plot murders and motivate new followers with his charismatic presence. But on a more emotional level, we do feel gladness that he was taken down, and taken down by one of our own warriors. Surely his followers do not see this as our victory, but that is irrelevant to the emotional release of a nation whose citizens were murdered merely because they went to work one day 10 years ago, or boarded a plane that one day — a nation whose psyche and daily reality were suddenly and permanently changed by the act of savagery that was 9/11.

From a Christian perspective, the New Testament teaches that we do best when we emulate Christ, who exhorts us to be peaceful, longsuffering, compassionate, forgiving, non-violent, and non-vengeful. The central message of Christ’s journey to the Cross, after all, was his willing endurance of a grave injustice and grave suffering when he was innocent of any wrongdoing, thus revealing God’s solidarity with suffering, and the glorification that ensued. But even the New Testament — even some of the words attributed to Jesus — recognizes that we are still living in a broken world that harbors both chaos and evil, and people who choose evil. We strive toward the divine world, but it is still “not yet.” Sometimes, in this world, we are left with no realistic choice besides what happened Sunday night. Bin Laden chose his fate years ago, and went into it with eyes wide open. As the Vatican spokesman noted, we all bear a responsibility before God, and bin Laden reaped what he sowed.

Jesus did teach and demonstrate love, compassion, and forgiveness. But the New Testament also observes that “he who lives by the sword dies by the sword.” As humans, we should reflect soberly and sadly on the chain of events that led here, and that stretches back more than a decade. There is an Old Testament scripture that states, “Vengeance is mine, says the Lord.” So it is. And we will leave such matters in his hands.

© Elizabeth Keck 2011

My God, Your God, or the Unmoved Mover?

In ancient times, people didn’t have the religious wars that we have today and have had over the centuries in the Common Era. There was no such thing as one religion warring against another. The closest anybody got to that were the frequent wars that petty kingdoms waged against one another, usually over territory, and the wars that a stronger nation waged against weaker nations in the endless pursuit of empire-building. In both types of conflict, the nations’ gods were perceived as essential to the outcome. (We see this in the Hebrew Bible numerous times.) The winning nation would usually proclaim its high god’s superiority over the losing nation’s high god; sometimes, as in the case of Cyrus of Persia in his victory over Babylon, the winner claimed that the loser’s god voluntarily handed over his own nation out of anger at its people. The losing nation would usually conclude something similar — typically not thinking that its god was simply weaker, but more that the people had angered the god in some way and were now facing consequences.

This idea appears a number of times in the Bible. One major example is 2 Kings 17, which offers that explanation for why Samaria (Northern Israel) suffered bitter defeat at the hands of Assyria in the eighth century BCE. Another is 2 Kings 24-25, which describes why Jerusalem and Judah fell to the Babylonians in the sixth century BCE. The prophets, such as Jeremiah and Ezekiel (to name only two!), also teem with the idea that Yahweh will hand over — or has already handed over — his people to foreign nations if they do not clean up their act. The Bible does not countenance any idea that Yahweh was ever defeated by some other nation’s god — for the Israelites, Yahweh was the only one who made the real decisions. Eventually, Yahweh was conceived as the only real god at all; this formation of thoroughgoing monotheism seems to have developed in the sixth century BCE, judging by its strong formulations in Second Isaiah (Is 40-55) and Ezekiel.

In any case, all that was as close as you got to a religiously-based conflict. Not very close at all. This is because people in ancient times typically did not have a problem with the idea that different people had different gods — even amid a proliferation of thousands of gods. Even within one nation, where the people typically all shared a number of high-level national gods, it was quite common for individuals to cultivate special personal relationships with one or two gods, often even with lower-ranking gods. We see this in ancient Egypt, Sumer, Assyria, Babylonia, Ugarit, Greece, and Rome; we know it from the texts these peoples left behind. A person’s devotion to one god on a personal level did not lead to that person’s dismissal of other people’s personal gods; it was more an acceptance of the actions of multiple deities among different spheres. (The brief and infamous reign of Pharaoh Akhenaten was a notable exception to this, but let’s not go there.) You even find it in ancient Israel. Archeologists have uncovered countless female “pillar figurines” from individual homes; these were likely representations of a fertility goddess to whom women would pray about reproductive and maternal concerns. Yet it’s improbable that such practitioners would have denied that Yahweh was the shared national high god.

Socrates, Plato, and Aristotle, however, waved aside traditional ideas that cast the gods as behaving like humans. But they also went further and abandoned the idea of a personal god with whom one had a real relationship, with whom one could communicate. For them, particularly Plato and Aristotle, there was only one real God (although Plato referred not to “God” but to an inscrutable entity he called the Good), and that Being was so high that it was by definition beyond human knowledge or reach. They reasoned that a God so vast would likely exist beyond human capacity to influence through prayer, since such a God would operate on the scale of the entire universe. Aristotle famously dubbed this Being the “Unmoved Mover.” Nothing could act upon or influence the Unmoved Mover; but the Unmoved Mover had set the universe in motion. This is similar to the approach of Thomas Jefferson and others of the Founding Fathers, who practiced Deism — not, contrary to what the Tea Party convinces itself, an especially pious form of evangelical Christianity.

The great monotheistic religions — Judaism, Christianity, and Islam — did something remarkable and combined the above understandings. Similar to Plato and Aristotle, they conceive of one pristine ultimate Being, to the exclusion of others, who operates on a universal scale. But similar to older notions, they also conceive of this Being as a personal God with whom one can communicate, have a relationship, and to whom one can actually pray. Christianity went a step further along these lines, conceiving of this singular, ultimate God of the universe as becoming incarnate as a human within history. All three of these religions understand the one God to act in the lives of created beings in an ongoing way, and to take an active individual interest in them.

Personally, I prefer that model, though at times I think that Aristotle’s seems the more logical one. Then again, even in Aristotle’s model, if the ultimate God is so far beyond our understanding, who is to say what is logical? I once knew someone in graduate school who, as we walked back to the halls of the ivory tower from Taco Bell one lunchtime, informed me that he was perfectly unperturbed by the idea of a vast God-beyond-reach, an Unmoved Mover. I, by contrast, flailed against the possibility of a removed God who was unlikely to talk to me, hear me, or relate to me; for me, this was unacceptable. For him, nothing shook his unflappable calm: God was God, God knew all, and why be flustered over the details? I think he found me vaguely amusing.

Ultimately, on this matter we have no course but to embrace humility and lack of knowledge, accept uncertainty, and follow where our inner self urges us. On that score, I am reminded of that romp of a movie, “The 13th Warrior,” set in the Middle Ages. There is a point when the Arab Muslim protagonist calls out to his close friend, a Viking warrior, that he will pray to the one God for him. His Viking friend responds, “In your country, you may have need of only one God. But in my country, we have need of many! I will pray to all of them for you.”

© Elizabeth Keck 2011

Something New Under the Sun

In my last post, I talked about the differences in how “older things” are viewed in our culture as opposed to ancient cultures. I noted that our culture tends in large part to esteem what is new, while relegating older customs (or, sadly, older persons) to a past that need have no bearing on current preferences. I noted that ancient cultures, including much of what we find in the Bible, did the opposite: they largely valued the worth of inherited traditions, and tended to work within those structures even when innovating.

Yet sometimes we do witness things that represent the reverse. In the Bible, the famous lament of Ecclesiastes (Hebrew, Qoheleth) that “there is nothing new under the sun” echoes across the ages from a man who lived in a time when things rarely changed, and even more rarely changed for the better. But we also witness elements within our own culture doing the reverse of the above pattern. The most prominent example I can think of these days is the negative reaction among some to the stratospheric rise of new social networking — communication in the form of email, Facebook, Twitter, and texting. Over the last couple of years, we saw various talking heads appearing on TV to talk about what we are “losing” to these new forms of communication. Then we were treated (and continue to be treated) to written commentaries, or even whole books, on the virtues of the older forms of communication over against the deficits of the newer ones. Invariably, the argument is something along the lines of how people form better relationships and are more satisfied when their primary forms of communication are in-person visits and phone calls rather than written forms such as email, Facebook, or texting.

Let me say that in-person interaction is, indeed, usually preferable and the most satisfying. I don’t think that even the most ardent email, Facebook, or texting aficionado would dispute that in principle. But, as we all know, in-person interaction is not always possible, especially in a modern world where most people — certainly most people who have not yet reached middle age — have a number of friends and family who do not live in physical proximity. Sometimes, the physical proximity is there, but the open time is not. This may especially be the case with thirty-somethings and younger, who often lead frenetic lives filled with all kinds of disparate occupations and obligations as they try to establish their places in the world. Such people may indeed get together for coffee or a visit with friends, but during the times when they do not see each other face to face, they more and more rely on written-form, instant communication to keep in frequent touch. More and more, it is becoming clear that such forms of communication are surpassing the traditional phone call for the day-to-day comments one might make to someone else outside one’s nuclear family.

Someone I know terms this phenomenon “asynchronous” (or non-synchronous) communication. It is not intrinsically worse than the “synchronous” communication that comes with a phone call. Surely, there are certain interactions that are better over the phone: in-depth conversations, for example, that cover a broad range of topics too large to put in writing efficiently. Or when the parties involved are a couple who delight in nothing more than the sound of the other’s voice. But a phone call does require that both parties cease what they are doing and carve out the time necessary for synchronized communication, during which they can usually do nothing else. This is sometimes not practical for many, in a world with full and conflicting schedules. So people have turned to “asynchronous” communication, which allows them to keep up interaction and conversation with others more frequently — and in a freer fashion — than if they relied on the phone.

There is also the comment of my (then middle-aged!) high school history teacher, who, before anyone had heard of email, insisted that he disliked talking on the phone so much that he would not do it unless absolutely necessary. His problem was the odd and sometimes awkward quality of phone conversation: not seeing the other person’s face and getting that unspoken feedback while talking to them can feel strange, especially if there is a pause (and we all know there is nothing worse than phone pauses). Remembering his comment led me to think of something else: what if people talked on the phone all the time for decades not because they intrinsically would have preferred it, but because they had no choice — other than postal mail, in which case they would have to wait a minimum of two weeks for a reply? Indeed, how could email, Facebook, Twitter, and texting have enjoyed such an unbounded explosion of popularity if they did not fulfill something that many people felt they needed or wanted? Clearly, these technologies do provide something that many people like, and it makes one wonder if it’s just because now there is a choice, and before, there wasn’t.

It is also amusing to remind ourselves that when phones were first invented, many people most assuredly voiced the very same concern: that something would be “lost” with the new-fangled invention. Others, assuredly, were delighted. We see the same thing happening today. For that matter, one could amuse oneself further by considering that written communications such as email, when personal rather than business in nature, bear more similarity to old-style communication by letter. I recently read an opinion piece by a teenager in my regional newspaper, in which the writer observed that when people around his parents’ age rail against texting as a legitimate form of communication, it’s usually just “adults not understanding what they’re looking at.” Another opinion piece (written by a thirty-something) noted that being on Facebook is like being “at a cocktail party, where there are all these different conversations going on.” That’s not a bad thing, even if an in-person cocktail party is ideal.

Perhaps we should take it as a good thing that electronic communication has blossomed so thoroughly — it could not have done so, after all, if people did not want to talk to each other. We do not live in the time of Ecclesiastes. There is, indeed, something new under the sun — even many things. And they’re not all bad.

© Elizabeth Keck 2011

Out with the Old?

First, I’d like to acknowledge that it’s been a long time since I have posted — back in December, to be precise. This is because the last couple of months have been consumed with completing my doctoral degree, which culminated in the defense of my dissertation on Feb 28. Still riding the crest of that tide, I’m looking forward to posting here more regularly.

That said, I was reading my regional newspaper the other day, and came across an article that included advice from a few career counselors in response to disillusioned job seekers. One of these seekers wanted to know why she had been interviewing for seven months without receiving a single job offer. One of the dispensed nuggets of advice was the following: “When writing your thank-you notes, make sure to send them by email. Handwritten ones can make you appear old-fashioned.”

Inherent in this nugget of advice, of course, was the bald and unquestioned implication that being old-fashioned is automatically bad. I am not a career counselor, so I cannot claim that this advice is wrong. I do, however, remember the days when email was only a few years old and had not yet caught on as a ubiquitous form of communication. In those days, one was warned always to send handwritten thank-yous to an interviewer, and never emailed ones, because a handwritten note would show that you were professional enough to make an effort with a time-honored tradition. Nonetheless, the rapid pace of modern changes of convention is not my main point. I am more piqued by the counselor’s unquestioned acceptance that “old-fashioned” equals negative; this is proclaimed as a truism, taken for granted.

What strikes me particularly (and this won’t be surprising, given my newly-minted occupation as a biblical scholar) is how different our culture is from the ancient world in how it considers the worth of old ways and old things. In the culture of the Bible — to use just one example of an ancient culture here — old ways, old things, and old people carried a cargo of deep respect, and were emulated by younger newcomers seeking to make their own meaningful contribution. A prophet or psalmist, for example, could innovate with a creative idea, but expressed such innovation through deference to older convention, and often with reference to older things. There are too many examples of this in the Bible to do more than scratch the surface here, but one of my favorites involves the use of ascending numbers. This was an ancient literary convention. Here are a few examples:

“There are 3 things that will not be satisfied, 4 that will not say ‘Enough’: Sheol, a barren womb, earth that is never satisfied with water, and fire that never says ‘Enough’ ” (Proverbs 30:15-16).

“There are 3 things that are too wonderful for me, 4 that I do not understand: the way of the eagle in the sky, the way of the snake upon a rock, the way of a ship in the heart of the sea, and the way of a man with a young woman” (Proverbs 30:18-19).

“Under 3 things the earth quakes, and under 4 it cannot bear up: under a servant when he becomes king, a fool when he is satisfied with food, an unloved woman when she gets a husband, and a maidservant when she supplants her mistress” (Proverbs 30:21-23).

“Yet gleanings will remain in it like the shaking of an olive tree, 2 or 3 olives on the topmost bough, 4 or 5 on the branches of a fruitful tree, declares Yahweh the God of Israel” (Isaiah 17:6).

“Thus says Yahweh, For 3 transgressions of Damascus, and for 4, I will not revoke it [punishment], because they threshed Gilead with sharp iron” (Amos 1:3).

It’s worth noting that in Amos, the “for 3 transgressions and for 4” continues in a litany of divine charges against various oppressors. To use a different example, the books of Samuel make several references to God’s deliverance of Israel in the Exodus, but these references are made in the new context of the people at war with the Philistines and other groups; reference to “the olden days” is valuable. We see such references to the Exodus again in the context of the Babylonian Exile in the 6th century BCE, in which Second Isaiah (for example), an exilic-era prophet who wrote Isaiah 40-55, reminds the people of how God parted the Red Sea, led them through, and extinguished the pursuing oppressors.

I could go on, but I’m beginning to get tired. The point is clear. There’s a real difference between how our culture perceives “old-fashioned” things, and how the Bible (and other ancient cultures) perceived them. Now this is not to say that “the olden days” represent some golden era where everything was easier and good and everybody was kind and thoughtful, and so on. My recent reading of Mark Twain’s Autobiography was enough to cure me of any such notion, as the great humorist himself goes on at length about what is deficient and distasteful about hypocrisy, politicians, political parties, and the electorate in his day. Excerpt that passage and you could have in front of you an editorial in any newspaper during our modern election cycles. So this is not to say that everything old equals good. But it is to say that by the same token, not everything old equals bad, and not everything new equals good.

And then there are the words of that immortal realist/cynic (depending on your point of view), Ecclesiastes: “A generation goes and a generation comes, but the Earth remains forever. The sun rises and the sun sets, and hastening to its place it rises there again. Going to the South, then turning to the North, the wind goes swirling, swirling, and on its swirling courses the wind returns. All the streams go to the sea, yet the sea is not full; to the place where the streams go, there they keep on going” (Ecclesiastes 1:4-7).

© Elizabeth Keck 2011

Advent, Thor’s Hammer, and Cosmic Mystery

So instead of walking into our church to participate in Advent Lessons and Carols this past Sunday, I found myself standing at that very time in a Scandinavian gift shop buying the Hammer of Thor. All right, not an actual hammer, but a necklace and earrings shaped like the Hammer of Thor. Thor is the pre-Christian Norse storm god, the blow from whose mighty hammer was said to create thunder. Thor was also respected for his strength and for his ability to endure pain without complaint. I purchased the representations of Thor’s Hammer primarily because of my interest in ancient religions, and particularly ones that involve such fantastic mythology. But I also gravitate toward a broad personal theology that I indulge from time to time, although I identify as Christian.

We didn’t skip Lessons and Carols on purpose; it was just one of those things that sneak up on you. But the incongruity of the situation — getting waylaid on one’s way to Advent service by a Scandinavian shop selling the Hammer of Thor — prompted me to think, as I often do, about the confluence of certain religious tenets and how my own “personal theology” fits into both Christianity and the broader world of spirituality. I’m not the sort who will say that all religions are essentially the same — because, really, they are not. It is not even the case that all religions believe in a Creator (Buddhism, for example, does not, but instead holds that the universe has existed eternally). Different religions emphasize different things, and they cannot be easily mashed together without overlooking, or even disrespecting, those differences. However, it does seem clear that each religion is pointing toward something “else,” something more, something greater than what we can see with our immediate eyes in our immediate physical surroundings. It is for that reason that I tend to augment my Christian practice with contributions from other philosophies, which often are not contradictory in any case.

Each Advent, I am compelled to think about the mystery, and the apparent lunacy, of the idea that the Creator God decided at some point in history to enter human flesh and become one of us, in a profound and world-altering act to demonstrate God’s love for his creation and his identification with us. It must be a ridiculous idea that a being whose breadth and depth are so far beyond our own that we can never hope to comprehend it decided to become one of us for a time. It must be ludicrous that in that “becoming,” this cosmic being intended to free his creatures from the shackles of their ongoing misdeeds, to offer redemption from those misdeeds, and in so doing to effect a cosmic demonstration both of love and of the inherent sanctity of our created bodies. Inherently sanctified because, so Christians believe, God saw fit to “become” one of those bodies, and our flesh can receive no higher recognition, no higher gift.

All these things sound preposterous. But they also possess (to borrow the now famous phrase) the “audacity of hope.” The truly bold, outrageous, no-holds-barred kind of hope that might just have a chance at success, by sheer virtue of its audacity. Such is the mystery that Christianity proclaims, and to which it joyously holds on with both hands. This, despite the fact that the biggest mistake Christian churches often make is to forget that what lies at their heart is not a carefully sorted-out array of systematic rules and provisions, but is essentially cosmic mystery.

One can believe in the truth of Christianity as itself; but of course, that is necessarily different from the system that developed around it, since finite humans need finite and inadequate ways to assimilate the infinite divine. Thus, “Christianity” as it is practiced, systematized, and understood by finite creatures is necessarily different from the cosmic truth upon which Christians believe their religion is based, and which it tries to express, and which only God can fully understand.

So that brings me back to Thor’s Hammer. I wear it as a symbol of strength, confidence, and endurance, which are traits that I value and try to emulate (not always successfully, but that is the nature of our imperfect being). Mystery is that wearing it can help me foster those traits within myself and express them outside myself. Mystery is that, according to our best astrophysicists, our entire universe — all the energy and matter that it now contains — existed as a superdense spot much smaller than an ordinary pearl about 14 billion years ago, and in response to some action that we do not understand, instantaneously exploded in an event we call the Big Bang, and has been expanding ever since. Mystery is that nearly all the elements on the Periodic Table are forged within stars, and that our bodies are therefore quite literally made from stars. Mystery is that for its first billion years, Earth was a nightmarish, volatile place, home only to constantly erupting volcanoes, lava oceans, a relentless barrage of meteors, and an atmosphere toxic to life as we know it. Mystery is that, by means we still do not understand, the amino acids and proteins of life found their quickening, and that over millions of years cyanobacteria produced the atmospheric oxygen that would allow larger life forms to come to exist. Mystery is that we know of over 100 billion galaxies, and each galaxy has about 100 billion stars.

Mystery is that we as a species feel an ongoing pull toward and connection with some greater world not entirely visible to our physical eyes, but known so ineffably to our hearts. Mystery is that we could be loved by a cosmic being who could have set all of the above into motion; that we can love other people as wholly, as beautifully, and as inexplicably as we often do; and even that we have come to know what love is.

© Elizabeth Keck 2010

The Age of Mediocrity

I have not been to the movies in quite some time. This fact stems not just from the reality of my life as the parent of a child who is too young to sit through a movie with us (and this will presumably change in a year or so, at least where Disney movies are concerned). It stems from the fact that, quite honestly, the majority of movie previews I see on television provoke a response not much more enthusiastic than “eh.” When did this happen? It cannot be that I am becoming a stick in the mud in my advancing years, since I have also heard this complaint from several different quarters. It is that, over the last few years, most of the movies Hollywood studios try desperately to convince us are worthy have been either pointless altogether, or firmly in the “eh” zone.

Sure, the 3-D landmark Avatar was a visual triumph, and it was fun to watch and it had a few compelling moments; but the only thing the storyline could lay claim to was a large recycle bin of other people’s ideas. Which we had already either seen or read before. Many. Many. Times. The last time I can remember going to a movie and having my socks blown off and my head put into an alternate state for days was when Lord of the Rings: The Return of the King came out and my husband and I went to see it on opening day at 10 am. The situation is not helped by the fact that I am less than excited to spend what is now over $10 per ticket at the theater if the odds of my being underwhelmed are greater than 50/50. Yet I used to adore going to the movies, and my husband and I (before the arrival of our unforgettable progeny) could usually find several films per year to which we flocked with great anticipation. But now the idea of truly enjoying that many movies at the theater in any given year seems draped in nostalgia.

As I write this, I am able to see, near my TV, the cover of the box set of Bogart-and-Bacall films, which features a picture of a famous scene from their first movie, To Have and Have Not. In those days, you could go to a movie for 25 cents and have a pretty good shot at seeing something amazing. Casablanca. Cat on a Hot Tin Roof. Treasure of the Sierra Madre. On the Waterfront. A Streetcar Named Desire. Key Largo. It Happened One Night. Rebel Without a Cause. Sure, they made bad movies back then too, but from where I’m standing, an awful lot of classics came out of that period. Fast-forward just a little in time to the 1970s and you still get All the President’s Men, Three Days of the Condor, Star Wars, and those two movies by whose mindblowing standard others fear to be judged: The Godfather and The Godfather Part II. The latter is surely one of the greatest artistic achievements to emerge from the film industry. But in the last five years or so, we too often get boilerplate action flicks and cookie-cutter romantic “dramedies.”

What does this have to do with anything? It seems to me that the wave of mediocrity in film making is just nestled amidst a much larger wave of even greater mediocrity in our society. Marvelous little one-off shops with delightful inventory are being replaced with mega stores whose inventory is often banal. You can still find treasures in those mega stores, but not as easily. This in turn brings me to the “quality” of manufacturing. In an age when most products for sale are made in China or similar places with the cheapest materials possible, we find ourselves saying “they don’t make ’em like they used to” far more often, as holes form and threads unravel in our clothes before we’ve stopped thinking of them as new. Things that used to be constructed of solid wood are now particle board that splits along seams, bends, and/or collapses. Electronics, which you’d think would come with some durability for the price, often abandon this world for the next with a little too much abandon.

Sure, there are still great books being written and sold, but stores are also full of shelves and shelves of drivel for which “mediocre” would be a word of praise; yet it somehow gets published. Pop music, in my opinion, has hit new lows over the last ten years. Those in the pop industry are no longer even required to possess a decent singing voice, since studio albums are now often doctored with AutoTune and live renditions are often atrocious. Even with the studio versions, mediocrity of content seems accepted fare. Yes, there are some real counterexamples, but the industry seems content with predictable plain potatoes. Small restaurants still exist, thank goodness; but they are being threatened by mega chains that too often churn out not delightful meals for a night’s getaway, but bland, mediocre fare that tastes as if it could have been shipped in from out of state.

There are many truly motivated, intelligent, hardworking college students out there, and they deserve respect for their effort. But it must be acknowledged that many others in the college populace, which long ago represented the shining, motivated core of our society, now do as little as possible, as badly as possible, to receive what should be a C, but is too often an A in an era of undergraduate grade inflation. Mere completion of an assignment, regardless of quality, can be regarded by the student as deserving of a high grade. This is not just the students’ fault; this sorry state of affairs is fostered by the new cultural environment that advocates merely “the college experience.”

I am not in general a negative person, and I dislike complaining. But I do think that a little perfectionism, a little drive, a little striving to make something as good as you can make it or to do something as well as you can do it, a little pride in one’s craft, does our species credit and makes us happy. And makes others happy as well. God has given us more intellectual and creative capabilities, and more potential, than any other species on this beautiful, volatile planet of ours. Let us not squander our gifts. Mediocrity does not become us.

© Elizabeth Keck

Juan Williams and the Realities of Post-9/11 America

The current dust-up over the firing of Juan Williams from NPR speaks to a major cultural matter in contemporary America. I would submit that when we consider such a major cultural matter, we ought to do so under the light of all its complexity — and not just “cry ‘Havoc!’ and let slip the dogs of war.”

Williams, who until recently was a news analyst for NPR, appeared before Bill O’Reilly and noted that it is wrong to paint all Muslims everywhere with one broad brush. Williams’ larger point was that one cannot simply say “Muslims” are the culprit for terrorist attacks, as O’Reilly had provocatively asserted on The View last week, during which co-hosts Whoopi Goldberg and Joy Behar walked off the set in protest. This larger point that Williams espoused (and with which most of us would agree) is inoffensive and accurate. But in the course of making this larger point, Williams honestly admitted to his own personal fear that when he is on a plane and he sees a passenger in “Muslim garb,” he does get “nervous” and “worried.”

As I interpreted the interview, Williams appeared not to be proud of this worry he feels during air travel, and he certainly did not present it as something that should be advocated. On the contrary, he brought it to the conversation in the context of realities that currently exist in the American psyche, whether rational or not, in the post-9/11 world. For this he was fired from NPR, without any opportunity to elucidate his comments further, while NPR’s CEO commented that his “offensive” statement was “between him and his psychiatrist.”

Immediately, while liberals flocked to their own corner and denounced Williams as a bigot who deserved to be fired, conservatives in turn flocked to their corner and hailed him as the voice of ordinary Americans, silenced by the tyrannical elitism of NPR, which should no longer receive any federal funding from Congress. Both corners are too extreme and fail to consider the complexities that are involved. [It should be noted that Whoopi Goldberg, who initiated the walk-out on O’Reilly, came to Williams’ defense and said that to fire him for his statement was outrageous.] The NPR ombudsman, voicing the position of those calling for Williams’ blood, wrote: “What Williams said was deeply offensive to Muslims and inflamed, rather than contributing positively, to an important debate about the role of Muslims in America. Williams was doing the kind of stereotyping in a public platform that is dangerous to a democracy. It puts people in categories, as types – not as individuals with much in common despite their differences.”

I object to a number of these contentions, not because I am an apologist for a conservative perspective — far more often than not, I am solidly in a liberal camp — but because I believe these contentions are unfair and inaccurate, and blind to the reality of the “debate about the role of Muslims in America.” As for unfair and inaccurate, it seemed that Williams was not really advocating any stereotype about Muslims, or lumping all Muslims into one category (as O’Reilly had in fact done on The View). The context of Williams’ statements bears out that he was speaking about a non-rational personal fear, of which he might even have been ashamed — he was not speaking about Muslims. He was speaking about himself. To leap to the conclusion that he was investing in stereotypes and bigoted statements is disingenuous, and ignores the context of his words.

I have already argued with fervor that Muslims should be viewed and treated with the same respect and liberties as anyone else in this country, and that we must not ignore our founding principles of religious tolerance (for this, see my post, “More than Lip-Service for a Legacy”). Muslims in this country are entitled to the same religious respect and tolerance as Christians, Jews, Buddhists, or anyone else. They do not constitute some alien group, and Islam is certainly the most misunderstood religion in our entire nation.

But this does not change the reality that 9/11 and its perpetrators permanently scarred the American psyche. We who lived through it will never be the same. The legacy of 9/11 continues to this day, in the war that we are fighting in Afghanistan and recently in Iraq, in our counter-terrorism measures, in beefed-up security, in our laws, and in our memories. We cannot escape from it. Terrorists claiming some warped version of Islam attacked us and continue to attack around the world. They attacked us while using Islam as an excuse, a flimsy justification for their barbaric actions. They could have picked any religion and done the same. We know this.

But the fact remains: we were attacked by lunatics wielding planes as weapons. NPR’s ombudsman, and those who share this view, insist that we all embrace a fantasy: that the cultural affiliation (however warped) of the 9/11 terrorists does not matter on any level. Either we each manage to forget their affiliation with radical Islam, so that it never affects us in any of our fears ever again, or we are bigots engaging in stereotypes and should lose our jobs. The terrorists’ affiliation with Islam must be excised from our minds. This, in terms of the actual functioning of our psyches, is a fantasy. It is a wonderful fantasy that we should all be able to divorce any association of Islam from the terrorists who attack us. But it is not the reality in this country, where the most tolerant person, who knows right-left-and-sideways that “Muslim” does not equal “terrorist,” might still feel that pull of worry in some dark part of his or her brain while sitting on an airplane. It might only be fleeting, then smacked down for the irrational thing that it is, but it is there. And, whether we like it or not, it needs to be acknowledged because it is reality. It cannot be dealt with if it cannot be acknowledged.

NPR’s ombudsman says that Williams did not “[contribute] positively, to an important debate about the role of Muslims in America.” What, exactly, would have qualified as “contributing positively”? How are we supposed to have a “debate” in the true sense of the word — i.e., not a monologue that assumes we should all think the same thing — if we cannot admit to a perfectly explicable, although not rational, fear that was implanted by 9/11? How is any debate worth anything if honesty cannot be allowed, if all the participants must adhere to a rulebook written by only one of the parties? Some people have compared Williams’ statements to stereotypes about, for example, African Americans or any other group, saying that if he said something similar toward another group, no one would be defending him.

That is true, no one would be defending him, and rightly so. But we cannot be blind to reality, or we can never really have “an important debate” that is worth more than just the term itself. The wave of terrorism aimed against almost the entire world in the modern day does not associate itself with, e.g., African Americans. It associates itself with radical Islam, even though this form of Islam has nothing to do with ordinary Muslims. We were attacked with planes. Thus it seems to me that we are allowed to be a little afraid on a plane, whether or not we think we really should be. We cannot realistically be expected simply to forget, deep in our psyches, the radical ideology that the terrorists espouse.

Psychologically, 9/11 scarred us. It is not just a slogan to say that we will never be the same. It is reality, just as the effects of that scarring are reality. Those effects are what Juan Williams honestly admitted to feeling sometimes on a plane. Instead of blindfolding ourselves and stopping up our ears, claiming to call for “positive contributions” to “an important debate” while Juan Williams is drawn and quartered, we ought to take a step back and ask ourselves if the rulebook for this “debate” includes honesty or not.

© Elizabeth Keck 2010