Thursday, December 17, 2009

End of (School) Days

The close of the semester - and the accompanying grading - always makes me ponder the nature of our current educational evaluative system and all its flaws, variants, and benefits. My husband frequently dismisses "grades" as "meaningless" or "false" indicators of ability. And in some ways, he's right.

First, what any particular grade means is somewhat arbitrary from system to system - an A at one institution might be a B at another, and a third institution might not even have an A. Growing up, my grade school did not use standard grades; we had O (Outstanding), V (Very good), S (Satisfactory), U (Unsatisfactory), and N (Not acceptable). As far as we were concerned, it was the same system, and when we moved on to high school it translated directly into "normal" letters. The difference between an O and an A was simply the form of the mark on the paper. My high school used the standard system, but an A in an AP class was a 5.0 instead of a 4.0. My undergraduate university didn't use pluses or minuses, but slash grades: A, A/B, B, and so on. An A/B counted as a 3.5, while an A- at a plus/minus school typically counts as a 3.7 - which meant that I might have a lower GPA with an A/B than a student who earned an A- at another institution, even though I might have been doing better work, objectively speaking. These are problems with non-standardized grades.

And then there are the objections to an evaluative system altogether - the idea that we should simply "appreciate" the efforts of our students without giving them a grade. On one hand, I understand and, yes, "appreciate" this impulse. But on the other hand, shouldn't there be a distinction between outstanding quality of work and simply adequate quality? Shouldn't we recognize that a complete assignment can be done with more or less effort, and that those levels of effort should produce different results?

This leads to another "problem" with grading: sometimes you have a student who puts in a lot of effort and does "worse" than another student who puts in less effort. Some people are naturally talented. Does this mean they should do "better" with less work? Does this mean we are "punishing" people who put in much more work but can't produce the same quality end-product? I tend to say "yes, we can reward the better product, regardless of the amount of work put into it." Because one of these two things is objectively better. If given a choice of the final product without knowledge of the effort put into it, most people will choose the better product - in business, in art, in any field. And it is a fact - an often frustrating fact - that some people just aren't capable of producing work at the same level as others (at least in a given area).

And, finally, there is the current hot-button issue of grade inflation. The idea that students have come to expect As in all their classes for doing the bare minimum of work. An A is the new C. This bothers me, most particularly because it means that the people who DO produce better work (whether through effort or talent) are told that their superior product is the same as a much inferior product. It just isn't fair.

And that, I believe, is what grading all boils down to: one camp that wants people to feel good about their work and one that wants an objective system of evaluation. Personally, I want people to feel good about work they've done well. I want grades to mean something, whether a reward for hard work or an acknowledgment of innate talent. But we also need to recognize that there are average students, and that those students should not feel bad for getting an "average" grade.

I want my students to earn their grades. I want them to produce work worthy of the mark I give it. But I also want them not to feel inadequate for not being "the best" at something. I want them to be happy that they can improve their skills without needing to be "the best." Yes, striving for improvement is good and competition can sometimes foster that, but accepting the grade you've earned is a mark of maturity and understanding, and graciously doing so is a sign that you have learned an even more profound lesson: that you have done your best and accepted your strengths... and your limitations.

Wednesday, December 16, 2009

Things that go Bump in the Night

Recently I seem to have been unintentionally preoccupied with the supernatural - ghosts, ghouls, dead things, and so on. Not that this is entirely surprising, of course. My research involves plays - like, say, Macbeth or The Changeling - that contain not only gruesome deaths, but the returning (and revenging) spirits of the dead. In my other job, I work in a 323-year-old building with a crypt containing around 100 bodies, some of which are visible, and with fairly serious noise and electrical "issues" that we like to blame on our resident (dead) Frenchman. I am fascinated by questions of belief, religion, and myth.

Over the last several days, I've been watching old episodes of Ghost Hunters, which seems right in line with this history. But it is something completely different. Instead of approaching the supernatural as a literary device, a mythological belief system, or an amusing explanation for antique wiring, these people treat ghosts as very real phenomena. Most of the time, anyway - for people who make a living hunting ghosts, they are remarkably practical and do a good deal of "debunking" of claims.

But it makes me wonder a few things about people. First, why we want to believe in ghosts. Certainly, there is the desire to have proof that death is not the end of our existence. That there is something beyond the physical connection of synapses and cells that makes us us. I buy it. In fact, if someone could definitively prove to me that ghosts are a continuation of human existence, I'd be thrilled.

There is another part of me, however, that finds some of the possible explanations for "ghosting" profoundly interesting in a scientific way. Emotion- or energy-impressions, for instance. The idea that we might feel so deeply that we somehow impress some part of our consciousness or emotion into the world around us is deeply strange. It seems to imply that there are physical laws we don't understand - laws that could explain how the energy of emotions can affect the physical world without what we understand as physical contact. Intriguing, certainly. But it also reminds us how truly ignorant we are about our own world; we don't even understand the basic rules for how we interact with our surroundings, or they with us. It also raises the question of what makes us what we are: are emotional impressions a part of us? Are they capable of feelings and intelligence of their own? Do we really leave behind semi-sentient beings when we feel that intensely?

Finally, the concept of a ghost - so long as I do not know what it actually is - is also profoundly disturbing. Is a ghost a person trapped in a cyclic pattern of emotion and behavior? Restricted to a specific path or area by virtue of some in-life connection or site of death? The idea of being stuck for what amounts to eternity is not a pleasant one.

And if ghosts aren't human at all, then what are they? What else inhabits our world that we cannot see or interact with on a regular basis? What else is out there that we fail to notice at all?

So what do I think they are? I don't. I'm open, in the true sense of an atheist. No one has managed to actually prove anything to me. I know people who believe in them, who claim to have seen and felt things. I believe them. I believe that the things they say happened are true. That doesn't mean I know what they are. Maybe they are the dead. Maybe they're "energy-impressions." Maybe they're something else altogether. I don't know. I choose not to try to make a choice.

Now if Pierre walks up to me and introduces himself, you can bet that just as soon as I've had a psych evaluation I'll be more than happy to believe that ghosts are people. But until then, I'm happy to wait and wonder. Because maybe ghosts are quarks. Or quarks are ghosts. Or something equally strange but fully explainable by a science we just haven't discovered yet.

Remember, just because we don't have an answer doesn't mean there isn't one.

Sunday, November 15, 2009

GODDOG - Beast, man, and king

Although I find Derrida much more accessible now than when I was barely old enough to legally imbibe alcohol, I still hold the opinion that he was a good deal more interested in hearing himself talk than in being clear. Clarity is a good thing – and deliberate lexical obfuscation (though I'm good at it myself) is a personal pet peeve.

The other day I picked up The Beast and the Sovereign: Volume 1 off the shelf at the Harvard Bookstore (which, in an entirely unrelated note, is now the home of Paige M. Gutenborg, a search-print-and-bind machine that will print and bind anything in the public domain for $8). Brand-new Derrida (despite his death) based on seminars. This type of format tends to annoy me, as it makes a good deal more sense in person than written down (especially with the tangents), but that’s the only way some things will ever come out, and there isn’t much to be done about it. But that isn’t the point. The point is that The Beast and the Sovereign seems eminently germane to my dissertation. So I bought it.

Like most Derrida, it needs to be read in small doses. This dose is from the Second Seminar, December 19, 2001.

What would happen if, for example, political discourse, or even the political action welded to it and indissociable from it, were constituted or even instituted by something fabular, by that sort of narrative simulacrum, the convention of some historical as if, by that fictive modality of “storytelling” that is called fabulous or fabular, which supposes giving to be known where one does not know, fraudulently affecting or showing off the making-known, and which administers, right in the work of the hors-d’oeuvre of some narrative, a moral lesson, a “moral”? A hypothesis according to which political, and even politicians’, logic and rhetoric would be always, through and through, the putting to work of a fable, a strategy to give meaning and credit to a fable, an affabulation – and therefore to a story indissociable from a moral, the putting of living beings, animals or humans, on stage, a supposedly instructive, informative, pedagogical, edifying, story, fictive, put up, artificial, even invented from whole cloth, but destined to educate, to teach, to make known, to share a knowledge, to bring to knowledge. (35)

The “if” in the first sentence seems to be an interrogative, a subjunctive that indicates doubt. Derrida goes on to discuss the idea of Terrorism as a fable, a real event made fabular by modern media and propaganda. By, he says specifically, the idea of media as capital to be bought, sold, released, or withheld. He refers to September 11 (only three months prior to the seminar) and the images dispersed by the media both pre- and post-government intervention.

But while Derrida claims that the functions of government can be like a fabular institution, it seems to me that they do more than merely resemble one. Of course, there are degrees of the fabular. In Early Modern England, the government (especially under James I) was based on the fable of divine right kingship, a “fictive modality” that situates the king as a mini-god, a supernatural being whose touch could heal and whose intercession or judgment was that of God. In a theocracy, too, where the monarch is head of church and state, the fabular element is clear (provided one can step back far enough from the religious element to accept the equation of religion and fable).

But what about the government of the modern Western world – specifically, of America? The added “under God” in our pledge of allegiance would seem to place us within some more democratic extension of the second example above: a nation that is not theocratic in name, but seems to be so in doctrine, particularly with the rising power of the religious right.

But that is not the way I would see it. Yes, modern America is leaning disturbingly toward certain theocratic elements, but we are not a theocracy (at least not yet). But our government and ruling ideology are predicated on a fabular construction of nationalist myth. The Founding Fathers (always capitalized) have become our new political pantheon, with the occasional inclusion of Abraham Lincoln, Teddy Roosevelt, or JFK. Our fable is the fable of pure equality, the fable of pure freedom, the fable that our nation is better than any other in the world by virtue of… what? The very illusory things that make up the fable of our nation? Can truth be based on such a fiction?

Derrida seems to think so.

And me? I think so, too. But I also think that there is danger in not recognizing a fable for what it is: fiction. Now don’t get me wrong, I think fiction is as powerful a tool of education and liberation as fact – and it can be used to achieve reason and rationalism so long as it is seen as fiction. But when fiction is assumed to be fact (as in the case of our nation’s foundational mythos or, for instance, religion), then we stop learning, stop improving, and begin to regress into a state of childish presumption in which we think that we are the hero of our fable, that everything will bend to help us in our fabular quest, and that we – as the hero – cannot truly be harmed or die.

But it is a fable. We can get hurt. We can die. And we would do well to remember that fiction may reveal fact, but fact can never be fiction.

Saturday, October 03, 2009

The Freshman Experience

My university, like many others, has recently joined the "Freshman Experience" bandwagon. Many universities have begun fretting about this so-called problem, as Brian's Coffee Breaks notes (article here). In short, universities are worried that the poor freshmen will be "lost" in their new - presumably larger - communities.

This focus on the "Freshman Experience" seems to me little more than a perpetuation of our increasing tendency to infantilize college students who, less than a generation ago, would have been considered fully capable adults. I teach freshmen. I know that most of them aren't yet fully prepared to take on complete adult responsibilities. But making them go through the "Freshman Experience" isn't going to get them there. In fact, if anything, it will retard the process.

Universities often function in loco parentis, despite the fact that most of their students are legal adults. As an instructor, I do not wish to be a babysitter, and I think it ultimately does a disservice to my students. Yes, I should offer them advice and help them develop study and life skills, but, ultimately, they need to learn to become independent.

Students who do not want to get lost in huge classes need to learn to speak up - it's as simple as that. If you don't want to be a faceless number in the back of the room, then walk your butt up to the front and introduce yourself to your professor. Raise your hand. Assert yourself - that's the best advice anyone can give to a freshman who doesn't want to be "lost."

Our tendency to want to keep our children "children" for longer is detrimental to them, and to our society. Let them learn how to grow up on their own - because trying to hand-hold them into adulthood doesn't teach them anything but to expect someone else to do all their work for them - to make their choices for them and to not be held responsible for their own lives. Offer them help for serious issues - medical, mental, etc. - but offer resources, don't plan their lives for them. After all, if everything is done for them freshman year, then you'll have to have a "Sophomore Experience," then a "Junior Experience," and so on. But once they get out of the university, they won't find a "First year at work Experience." They'll find real life.

Thursday, September 24, 2009

Grading Green

So my university (at which I both teach and am a student) has really started pushing going as paperless as possible this year.

On the one hand, this is proving to be an annoyance, as my students have shown a rather marked increase in their ability to forget due dates and assignment specifications. As a result, I continue to hand out sheets in class.

On the other hand, though, I have required all students to turn in their papers electronically. This, I love. And they seem to be pretty happy with the arrangement, as well. I get no excuses of "my printer died," for starters. I don't have to worry about papers turned in with red or blue ink (though, really, I'd rather that than nothing). No mangled pages, no coffee stains, no food on the papers, AND my cats can't attack them. And believe me, explaining to a student why there are teeth marks and little bites taken out of their papers is entertaining.

I like grading on the computer. I like being able to use macros for common errors (it goes SO much faster), being able to retype a comment when I don't like how I worded it (no erasing or scribbling out!), and I really like the fact that I type much faster than I can write anything by hand. Grading takes nearly half the time it used to.

I like the fact that I can post their comments and grades on a course website and just be done with it. I don't have to haul papers to class, don't have to feel like I'm carrying half a tree with me wherever I go, and I don't have to worry about what to do when I find I have only a red pen (and we're not supposed to use red because it traumatizes students - WTF?). I do have to carry around a laptop, which is heavy as hell, but it lets me have the enjoyment of posting to my blog when I finish my grading early (as now).

All this, and the proverbial bag of chips: we're doing something that reduces our impact on the environment. Bonus!

I know my students like the convenience of emailing papers or uploading them, rather than printing and dragging. I'm sure they like the option to "accept changes" when I make small grammatical alterations to their drafts. I'm sure they appreciate not having to decipher my handwriting.

But all that said, there is something a good deal more impersonal about electronic comments. They can't see a smiley face drawn in their margin or the deep impress of a pen when I get excited (or angry) about something they've said. They can't guess about what color pen is going to have marked their papers this time. It's just the familiar, cold, glowing text on their screen that appears on every document, every webpage, every email that comes through their inbox or browser.

But, I think, that's why the experience of the classroom is so important. In today's world, where online education is becoming increasingly popular (and common), I think the classroom is, in fact, more important than ever in the process of education. Students need to see and hear not only their instructor, but one another. They need to be able to give a gut reaction, to express their immediate confusion, to be able to recognize when they or others have something particularly interesting or exciting to say. Yes, the internet is great, but text just doesn't do it all. We need human contact - especially in education, where the information itself is impersonal at best. Classmates and teachers are PEOPLE, and they are important ones who shape the way we think and learn, and that is something that no amount of "paperless" internet education is going to be able to make up for.

Sunday, July 26, 2009

Next

Typically, I do not post about personal events on this blog, but today I'm going to make an exception. Theoretically, you only get married once, and I think it is a momentous enough occasion to warrant a post, as it will be my last with this name and social status ("single").

As is the nature of such things, I don't imagine the process will change between now and the next post. That's the way life works. Milestones appear and disappear, are overleapt or crossed or smashed to smithereens, and we just keep going down the road, waiting for the next to appear so that we can say we have passed and marked that next mile in our journey.

Sometimes, if we're very lucky, we have the opportunity to make a choice between roads. To stop and turn or keep ourselves straight, to take the "scenic route" instead of the multilane freeway. Or, if we're particularly creative, to pull over, step out of the car, and strike out into the middle of nowhere with nothing more than what we can carry on our backs.

Sometimes a detour takes us back where we started, sometimes to where we would have ended up anyway. And sometimes, we find a waterfall, a cave, a canyon, at the end of our offshoot. Sometimes we never again return to that old path and are left to wonder what surprises - or not - it held.

Sometimes, we backtrack. We take one look at that rickety rope bridge and say "screw it." And sometimes, when we say that, instead of walking back we jump into the river and see where it takes us.

Road less traveled my ass, Mr. Frost. If there's a road, it's been traveled. Maybe "less" than other roads, but you, sir, are a rebel in sentiment only. If you really want to do something interesting, get off the road altogether.

Now I'm a fan of the road, be it paved or dirt or even (much as I curse them) cobblestone. I use the roads in my life frequently. But every now and then, I get off and see what's going on in the ditch or the field or the forest.

This milestone, however, is firmly in the middle of the normal highway of life, and I'm okay with that. Contrary as it is to my nature to do the "normal" thing, this is one of the "normal" things I'm happy to do, a part of the highway I'm glad to travel. After all, striking out into the middle of nowhere is more fun with someone else along for the journey.

Saturday, July 04, 2009

Written by the Victors

In learning about history, one of my favorite things to do is dig up the little-known or "losing" side of the story. The story of the witches burned by the Inquisition, of the Spiritualists and atheists in the nineteenth century, of the Germans in WWII, of the unspoken and unsung.

That is not to say, of course, that the history written by the victors is unimportant. Or that written by the allies of the victims. But there are always more sides, more facets, to the gem of history than first appear visible to our biased gaze.

There aren't often true "villains" in history, people who truly just want to make others suffer. There is always some grain, some tiny glint of altruism or idealism that offsets the dirt and grime and corruption. Much as we hate to admit that people like Hitler, like bin Laden, have some sort of good, it is something that we should admit.

It is something we should admit not to justify their actions or excuse their cruelty, but to better understand ourselves and our own motivations. If we can see that they - the mysterious, proverbial "they" - have a modicum of goodness, then perhaps we can see that within what we see as our own virtue may be a kernel of darkness.

Sunday, June 21, 2009

When the world burns

Like many other Americans, I have been watching the protests in Iran from the safety and comfort of my home thousands of miles and an ocean away, wondering whether there is anything I can do to help them, and wondering whether - in similar circumstances - I would have the courage to face what they are facing. In my comparatively sheltered life, I have stood up for idealism more than once, but never when facing the immediate and physical threat of injury or death. That is the privilege that most people of my (American) generation have - to have grown up in a time and place where our actions of protest are enabled, authorized, and permitted to the extent that we need not fear death if we choose to speak our minds.

Certainly, there are exceptions. There have been incidents, even in my short life, in which people have been beaten by police or by other citizens, in which the military has gone too far, or in which hatred has caused injury or death to a member of a minority opinion/lifestyle/ethnic background. There was 9/11.

But, ultimately, we live in a society where rebellion is controlled because it is authorized. The people in Iran do not, but, clearly, they want to. That is not to say that they (necessarily) want to be American, but, rather, that they want to live in a society that reflects their beliefs while allowing a certain amount of flexibility - the possibility that things might change, that a system might need revision or alteration, that what was best in the past or is in the present might not be in the future.

If we, as Americans, learn nothing else from the Green Revolution, it should be that people are fundamentally far more powerful than they might believe. That the human spirit is wise and courageous, and that people from across the world and on the other side of a religious or ethnic divide are just as wonderful, just as amazing, as the people you pass on the street every day - and perhaps even more so. What we should learn is that complacency with our own lifestyle is not the path to the betterment of our world, and that permitting atrocity or even mild cruelty (or torture) is an unacceptable way of life.

Sunday, May 31, 2009

Gods and Demons

In A History of God, Karen Armstrong talks about the fact that monotheism was an unusual creation, but - further - that the refutation of other gods was a practice unique to Christianity. This is a point reiterated by Jeffrey B. Russell in A History of Witchcraft, in which he discusses paganism and the vilification of non-Christian deities.

Armstrong's point was that early Christians denied the very existence of other entities besides their god. Russell observes that what began as a denial - a policy that obviously needed revision - became a tendency to coopt:
And now Christian theologians made another important identification: the demons that the sorcerers were calling up were the pagan gods. Jupiter, Diana, and the other deities of the Roman pantheon were really demons, servants of Satan. As Christianity pressed northward, it made the same assertion about Wotan, Freya, and the other gods of the Celts and Teutons. Those who worshipped the gods worshipped demons whether they knew it or not. With this stroke, all pagans, as well as sorcerers, could be viewed as part of the monstrous plan of Satan to frustrate the salvation of the world. This was the posture of most theologians and church councils. Yet at the same time popular religion often treated the pagan deities quite differently, transferring the characteristics of the gods to the personalities of the saints. (39-40)
Since they couldn't eliminate other gods by denying their existence, Christian authorities instead labeled them either demons or, it seems, saints. Those whose myths could be refigured to include God and Christ were deemed acceptable; those who defied the kind of authoritarian strictures promoted by Christianity became its enemies.

What is perhaps most interesting is that we can clinically identify this move in historical circumstances (December 25th is the birthday of Mithras, for instance, and has nothing whatsoever to do with the birth of a nice Jewish boy in Bethlehem except for the fact that the Church needed a good day for its celebration), but that we as a society continue to cling to the religious elements that have so clearly been coopted as though they were empirical truth. Yet more evidence of our ability to believe whatever we want, even when the facts that contradict it are waving small flags directly underneath our noses.

Human society is a fascinating melange of long- and short-term traditions that have given us enormous intellectual wealth, but that doesn't mean it's a good idea to believe in them wholesale simply because your mother told you to. Look first, people, then leap. Or, better yet, keep your fool feet planted firmly on the ground unless you've put on the safety harness first.

Saturday, May 30, 2009

Airborne

As a person who travels frequently, both for work and pleasure, I find myself often in a position to consider both the nature of the airline industry and the types of people who make use of it.

Travelers come in all shapes and sizes, all ages, both genders, and a variety of self-limiting economic strata.

The most impatient are the business folk - the people who have a pressing, important, or otherwise needful reason to travel. They have a mission to accomplish that is more specific than "visit my mother" or "go home." They travel with laptops and cell phones and bluetooth headsets plugged in from the moment the airplane lands until the last possible second - those three minutes between when the plane door closes and when the flight attendant tells them to shut it all off. They are efficient in security lines and intolerant of small children or inexperienced travelers.

These are followed by people like me - frequent travelers, though ones who travel fewer than ten times a year, on average. We know the drill in security, we know where to find restrooms, coffee, and what passes in an airport for lunch. We know about flying standby to get home a few hours earlier. We also tend to hate it - we are not yet dulled to the atmosphere of the airport, which rankles us, gets under our skin and itches until we can get out of it and into the confines of an airplane, which is preferable simply because it means we are moving. But we are accustomed to the rhythm - the start and stop and stand and stretch and so on - of travel. To the requirements of the security checkpoints that make us put off our morning coffee for an extra aggravating twenty minutes, the method of packing both the carry-on and the personal item for maximum capacity and entertainment value, the appearance of nauseating versus passable food. We dislike people who slow us down, are irritated by the people who fumble their way through the process, who impede the airplane aisles, who don't know what to expect. It isn't fair of us, but the irritation is there, nevertheless.

The next species of traveler is the pleasure-traveler. The vacationer. The elderly couple or successful family. The people who have done this often enough that they are prepared, but not so often that it has become a chore. These are the pleasant people to sit by, to be behind in line, to follow through security. They are efficient enough not to irritate those of us in the frequent- and business-traveler camps. Sometimes this group includes the college student who has flown to school and home again often enough to be happy going in whichever direction they're headed, especially if it's spring break.

Finally, the group that stalls the rest of us. The first-time travelers, the families with small children who don't know how to keep them under control or entertained (I've traveled with many a pleasant family and/or small child), the people who have never been on a plane (or act like it). In terms of annoyance, I have to admit that the experienced-but-overly-entitled travelers are often worse than the rookies, but the rookies are more frazzled, more nervous, more excitable, and more prone to being in the way no matter how much they try not to be. They are well-intentioned, but nevertheless manage to be in the wrong place, to stuff their luggage in the wrong way, or to block the aisle because their stuff doesn't fit in the overhead bin.

In a system as simultaneously efficient and inefficient as the airline industry, this mix makes traveling... interesting. Because one or two are enough to clog the gears, to slow the system, and to create a fascinating domino effect that causes the whole thing to come screeching to a staggering halt.

So what is the point of this little diatribe? Simply that any system that deals with humanity is bound to have cogs, whether out of good or malicious intent, that slip and stick and get in the way.

So some of us need to be more tolerant, more patient. And others need to pay more attention and attempt to make things smoother for everyone else - either by learning how the system works through a little research or by being more willing to let well enough alone.

Monday, May 11, 2009

Divine History

This week I finished Karen Armstrong's A History of God, which catalogs the last 4,000 years of religious monotheism. She makes some interesting points that I felt were worth a comment or two. The first is our tendency as humans to anthropomorphize our deities. We want our gods - for some reason entirely unclear to me - to be like us. Personally, I think we're horrible enough a species on our own that we don't need to be endowed with divine powers, and Armstrong seems to agree:

Instead of making God a symbol to challenge our prejudice and force us to contemplate our own shortcomings, it can be used to endorse our egotistic hatred and make it absolute. It makes God behave exactly like us, as though he were simply another human being. Such a God is likely to be more attractive and popular than the God of Amos and Isaiah, who demands ruthless self-criticism. (55)

"More attractive and popular" because he "endorse[s] our egotistic hatred and make[s] it absolute"? That sounds disturbingly like my memories of high school (which was, for what it's worth, Catholic). But the idea that our deity of choice behaves like a 13-17-year-old girl ought to rightly give us nightmares, especially if we remember what it was like to be the other 13-17-year-old girl who wasn't "attractive and popular." And yes, I know that isn't quite the point Armstrong is making... in fact, her point is worse. We are attracted to deities who are cruel and exclusionary because exclusivity and ostracism makes us feel better about ourselves. What a shining endorsement.

More disturbing still is the idea that it isn't simply others that religion of this sort teaches us to condemn. In fact, later monotheism (which in Armstrong's study includes Christianity and Islam, but has moved past Judaism) encourages us to regard ourselves not as beings in need of improvement, but as fundamentally and critically flawed:

A religion which teaches men and women to regard their humanity as chronically flawed can alienate them from themselves. Nowhere is this alienation more evident than in the denigration of sexuality in general and women in particular. (124)

So, to sum up the argument thus far (and to give Armstrong her due, this book is not a rant akin to those offered by Richard Dawkins; rather, she includes these comments amid well-researched history and plenty of comments about the good religion has offered throughout history), we see monotheism alienating the subscribers of other religions, and then alienating its own adherents from themselves. Armstrong goes on to comment that "This is doubly ironic, since the idea that God had become flesh and shared our humanity should have encouraged Christians to value the body" (125). But it did not. Instead, Christianity (specifically, early to medieval Christianity) encouraged the physical and psychological debasement of human physical needs, causing repercussions that have lasted in the human psyche well into the twenty-first century.

Armstrong does not offer a solution - she isn't trying to "fix" religion, simply to explain its history. But she does remark upon something very interesting in one of her later chapters on the Enlightenment: "Once 'God' has ceased to be a passionately subjective experience, 'he' does not exist" (342). Which makes me think very carefully about the recent rise (noted in Susan Jacoby's The Age of American Unreason) of subjective opinions as "fact." It is an issue I see with my students (college freshmen), and one that appears with increasing frequency in today's media. So what do I take from this? The idea that the more subjective we are with our "facts," the more prevalent intolerant religious fundamentalism will be because if facts are subjective, then no one can be wrong, no matter how irrational or counter-factual their assertions are.

Not all spirituality causes these reactions or oppressive ideological tendencies. I recognize this. As an avowed atheist (or "Bright"), I am biased against religious views, but I know that not every spiritual person is intolerant, exclusive, or irrational. I know many spiritual people who are quite the opposite. But in today's increasingly fundamentalist world (be it Christian, Muslim, or Scientologist), reason is taking a back seat to subjective self-promotion and exclusionary racial, ethnic, and creedal oppression. And if we permit this because it exists under the blanket of "religious freedom," then God - or Reason - help us all.

Tuesday, April 21, 2009

Why we need the Devil

In Breaking the Spell, Dennett quotes Rodney Stark's assessment of the need for the demonic to counter our understanding of the divine:

he even proposes that a God without a counterbalancing Satan is an unstable concept - "irrational and perverse." Why? Because "one God of infinite scope must be responsible for everything, evil as well as good, and thus must be dangerously capricious, shifting intentions unpredictably and without reason" (p. 24). (192)

The issue raised by Stark and Dennett here is the human need to create a powerful parental figure who wishes us well - one who is benevolent and caring. But the universe as we know it is not composed entirely of positives. Humanity has therefore created for itself a scapegoat, a figure of pure and unadulterated evil whose purpose and intention is to do us harm, to cause us to give in to our darker desires. A figure on whom to blame those who do not wish to subscribe to that benevolent dictator.

We, as human beings immersed in our constructed religions, want someone to blame for the evils we have created. In Judaism, there was no devil (originally). There was only a deity whose capriciousness and violent changeability made him inherently unreliable as a father-figure. So, in subsequent years, humanity created a counterpart to this god, a way to siphon off those things that were undesirable in a deity and place them into the - necessarily weaker - body of a creature we dubbed the devil.

So what does it mean that future generations - notably including our own - have attempted to reclaim this devil from his isolation and damnation? Why have we tried so hard - and sometimes, as with Milton's Paradise Lost, against our will - to redefine and reidentify this demon as something we can understand? Something we can sympathize with? Something we can - sometimes - even love?

It is because, I think, we are coming to terms - slowly and unconsciously - with the fact that both deity and devil are contained within ourselves. If we love the divine, then we must also love the demonic, because we are both.

Sunday, April 19, 2009

Excuse me, what?!

So as we were watching The Colbert Report (a few days late, thanks to technology), I had one of those I-don't-believe-someone-really-thinks-that-way moments. Colbert was interviewing Douglas Kmiec, a Catholic professor of law at Pepperdine, about his new book, Can A Catholic Support Him? (the "Him" is Obama). My surprise was not at the book, but at a comment Kmiec made regarding gay marriage.

Kmiec's argument is that marriage should be the province of the Church and that "legal contracts" should be the province of the government. The latter should be entirely unrestricted by gender, creed, etc. The Church (or synagogue or whatever) should be allowed to put whatever restrictions it wants on who it marries or doesn't.

This was not shocking. In fact, it is one of the more rational religious stances. I think that, as a society, we have created a legal status called "marriage" that should not be restricted; but as long as religious belief continues, a church should be allowed to marry (or not marry) whomever it wants for whatever reasons it wants. That is the purview of the religion, NOT the state.

The shocking thing was when Kmiec suggested that if "legal marriage" were to be only a contract, atheists would convert to the Church to get a valid spiritual marriage.

Um... I hate to be the one to break it to you, Mr. Kmiec, but an atheist doesn't care one little bit about a "spiritual marriage," because we don't believe in the existence of god. As the internet would say: epic fail.

Saturday, April 11, 2009

Ages Pass

So it appears that I unintentionally gave up blogging during Lent. Oops. I have good reasons why this happened, but regardless, I have been remiss.

At present, I'm reading Daniel C. Dennett's Breaking the Spell, which attempts to persuade "believers" that "brights" (his adopted term for atheists and agnostics) are not out of their minds, and that "believers" should feel obligated to investigate and question their beliefs.

Given that it is the night before Easter Sunday, I feel obliged to comment.

I think what Dennett is attempting to do is a good thing; there are far too many people out there who blindly accept whatever they are told - whether in regard to religion, government, or American history. My father did me a favor as a child: he told me blatant lies on a fairly regular basis, leading me never to take him at his word and to investigate everything he ever told me.

Sometimes he was telling me the truth. Sometimes he wasn't. But I did the work and found out.

In twenty-first century America, our society retains the childlike tendency to simply accept whatever we are told - by anyone in a position of ostensibly trustworthy authority. Unfortunately, for some people, this category does not include scientists. Certainly, science has been wrong (something Dennett is more than willing to admit). However, science has also done a very good job of showing when it is wrong and fixing the issue.

Religion... not so much. Not only is religion very hesitant to admit its wrongs, but it is even more hesitant to correct them, instead claiming the all-encompassing net of "faith" or "doctrine" or - better yet - "tradition." I'm sorry, folks, but if "tradition" states that I'm not allowed to be educated simply because I lack a particular piece of genitalia... I don't think so. And yet, this is one of the things that religion does insist upon: Catholicism insists that women cannot be priests; Islam insists on the inherent weakness and inferiority of women; for that matter, all Judeo-Christian-derived religions blame women for the fall of the human race, so we must be bad.

Dennett's problem with religion is not that it insists upon "tradition," per se, but that it ignores factual information in favor of it. For example, a literal reading of the Bible puts the world at roughly 6,000 years old. Science has proven it is not. Believers in the literal truth of the Biblical text insist that science is wrong, despite the evidence confronting them otherwise.

But my biggest problem with religion - organized or not - is that it fundamentally interferes with personal and social freedom and egalitarianism. Religion insists upon a hierarchy in which all people are subordinated to something, somewhere. But at the same time, its adherents insist that they are the "chosen," the most superior [race, species, creed, gender, etc.]. They must submit themselves to a set of rules created by "god" (through the mediating power of generations of very privileged and wealthy clergymen whose personal authority and status were coincidentally increased by these rules) while gleefully condemning everyone else for not wishing to be subject to those same rules.

Religion cannot bring peace and harmony so long as there is more than one, and humanity is incapable of agreeing on a creed. (This is not to say that if the world were atheist, we would all get along. Of course we wouldn't. But we wouldn't be fighting about religion, that's for damn sure.)

Ultimately, this is a subject about which I am pessimistic (uncharacteristic for me). I cannot imagine the people who adhere so desperately to their faith giving it up, since they are clearly unwilling to consider reason at all and therefore will not be swayed by it, no matter how hard people like Dennett try. Those who are reasonable people will cherry-pick: "Okay, I'll concede this point, but you just don't have faith. You have to have faith."

And it all comes down to that: faith. You're right, folks. I don't have faith. I have logic. I have reason. I have ethics. I have a deep respect for human life and the human condition. I have insatiable curiosity and a highly active imagination. I have compassion and love and joy and sorrow and anger and boredom. I have a desire to be more than I am, but also to make a positive difference in the world. I have a passion for life. But I do not have faith. Not in god. Not in religion. Some days, not in the humanity that devotes itself to a creed (whichever it may be) that demands the subjugation and destruction of those with whom it disagrees.

What do I have?

Hope that someday, people will stop using religion as a scapegoat and an excuse. Hope that people will confront one another in terms of ethics and reason, rather than myth. Hope that logic will allow us to move forward scientifically, rationally, and technologically when we are no longer held back by the desire to waste our time and energy devoting ourselves to a fiction that we have deluded ourselves into believing is real.

I do not have faith. I have hope.

Friday, February 27, 2009

Life Moments

Weddings are unique things.

And not just because they're (supposed to be) once-in-a-lifetime experiences or because they're an opportunity for family and friends to gather together in fancy clothes and eat free food, though these things certainly contribute. Nor because the people for whom they are ostensibly a celebration spend a good deal of time, money, effort, stress, blood, sweat, and tears on them when common platitude has it that this is "their day." Nor because it is a ceremony ratified by (often) church and state, agreed upon across cultural boundaries as a necessary social institution, an opportunity to eliminate surnames (or, today, create newer, longer, more confusing ones), a method of (theoretically) determining paternity, and the chance to attain sanctioned tax breaks for no apparent reason.

No. Weddings are unique because around them has developed a culture of purity and "romance," little birds and hearts and bells and ruffles and bows and lace and... You get the idea. Because everything seems geared toward the mental fixations of a prepubescent girl. Pink. Bows. Frills. Pretty dresses. Pillows. Shiny jewelry. Flowers. Girly things.

The groom is meant - and often expected - to stand there and just let all the "femininity" explode into a flurry of petals and birds and hearts and just take it.

No wonder men hate weddings.

And the worst part is, even if the couple themselves are not so inclined, their families (99% of the time) are. The mother of the bride has fits and paroxysms of alternating joy and psychosis. The bride makes irrational demands in a shrill tone, about which she will change her mind equally shrilly in less than two hours. The bridesmaids will either endure this sullenly or squee like little girls themselves. The groomsmen will attempt to get everyone (including themselves) very drunk. And who can blame them?

The wedding has become the domain of that most horrifying of creatures, the "bridezilla." A slavering, three-headed monster that spits acid and swallows pride and testicles for breakfast. A curious beastie that may be assuaged by chocolate and falls into a swoon at an appropriately ridiculous amount of lace and tulle. Its hunger for petticoats and sparkling beads goes unsated, endless amounts of accouterments and accessories sacrificed to its bottomless stomach.

And this creature is meant to represent the desirable female? No wonder men are becoming increasingly commitment-shy.

And yet, social expectations and our own inbred desires perpetuate this tradition because of what it stands for, not because of what it is. And what it stands for - the union of two people willing to share their lives and struggles, their joys and sorrows - is a wonderful thing.

But do I really have to look like a Bavarian fruitcake in order to do so?

Saturday, January 17, 2009

The Fear of Essays

As I have posted before, I'm working on a new portfolio project this semester with my freshman writing class. I'm thrilled. I think this is a great new teaching opportunity, and I think it will be much more effective for my students, as well.

But they seem to be afraid of it. Afraid, perhaps, of staking so much of their grade (though technically only 40% of it) on a single project that produces only one paper. The idea of one big project... that's frightening. After all, isn't it just easier to do a bunch of small papers and be done with them?

Easier? Perhaps. Though my paper load is actually smaller than in the standard sequence here. But I want them to *gasp* learn! K and I have often talked about how much students could gain from learning the PROCESS of research, rather than being assigned a topic and told where to find things. And that's what I'm finally allowed to teach. And my students are terrified.

My theory is that they're actually afraid of choosing their own subject material. Sure, there are a few who are just confused and don't want to mess things up, but I think most of them are afraid of making the "wrong" choice. They want their teacher to tell them what to do. My reason for thinking this is that my repeats are all really excited about picking their own topics, because they're used to it. I made them make choices on their own last semester, and they discovered that they like it. My new kids... they don't understand how to write a paper if I don't tell them what to write about.

I have seen this every semester I teach.

Why are people so frightened of picking their own topics? Of trying something new, or at least something interesting? If I say "write on Romeo & Juliet" but don't give you a topic, why is that scarier than me telling you to write on the development of Mercutio's character? Shouldn't it be liberating? Exciting?

It makes me wonder what they're actually learning in high school. Are they really only being taught to reiterate whatever they've heard in class? If so, how is that serving them? How is it making them better thinkers? Better people? How is it making them anything but mindless drones?

Because, believe me, these kids have great ideas. They're smart, and they see things in the world around them. They recognize the inherent value in things that we - the jaded academics - don't see because we're trained not to. For instance, a student is planning to write her project comparing the tradition of violence and revenge drama to Metal Gear Solid. I think this is great. I want them to see how the "old dead white men" tradition actually plays a role in their lives. This project idea is why I want to do the portfolios. Because I want them to see the relevance of interpretive study to EVERYTHING.

Now I just have to convince the other students of this.