Tuesday, November 01, 2011

Natural Zippers

"God isn't compatible with machinery and scientific medicine and universal happiness. You must make your choice. Our civilization has chosen machinery and medicine and happiness. That's why I have to keep these books locked up in the safe. They're smut. People would be shocked if..." The Savage interrupted him. "But isn't it natural to feel there's a God?" "You might as well ask if it's natural to do up one's trousers with zippers."
-- Aldous Huxley, Brave New World

This passage illustrates - and explodes - one of the most conventional arguments for deity: "I feel that there is a God." Instinct. We rely heavily on instinct (and on what we like to call intuition, which is really just instinct given a nicer name to make it sound reasonable instead of impulsive) to guide us through the perils of our daily existence. Instinct can be good - it keeps us from falling off high objects, from walking through bad neighborhoods at night, and from being eaten by predators. But instinct has also led us very far astray. The world changes faster than our chemical and genetic makeup.

Sometimes, instinct adapts. Most of us are capable of riding in a car at 50 mph or taking an elevator to the tenth floor without panicking. Most of us understand that despite our body's craving for fat and sugar, too much of either is bad for us. We relearn our instincts. We come to think of these things as normal, as natural. Like zippers on trousers.

We want to think that there's a God because it gives things purpose - and if things have purpose, then it isn't our fault when they are out of our control. At least, we think, they're in someone's control. And if that someone happens to be benevolent and omnipotent, well, then, we don't have to feel bad about all the horrific things that take place in the world, because they're all under control. God is our psychic security blanket.

Sometimes, we need to let those instincts go. To accept that zippers are as natural as trousers, and that trousers are as natural as rainstorms.

Friday, October 07, 2011

If Ignorance is Bliss... Idiocy is Heaven

All that is solid melts into air, all that is holy is profaned, and man is at last compelled to face with sober senses his real conditions of life and his relations with his kind. -- Karl Marx & Friedrich Engels, Communist Manifesto

My question in response to this is fairly basic - why, exactly, is this a bad thing? If religion is the opiate of the masses, why object to its removal? Is the truth of human existence really so horrific that we would rather willfully deny it by covering it over with a veneer of an idyllic illusion?

Of course we would. We live in a world constructed around fantasy - the fantasy of religion, of the "American Dream," of a non-existent harmonious nuclear family. We surround ourselves with fictionalized accounts of heroism, drama, and knights in shining armor. We play games with no basis in the real world, full of dragons and aliens and space marines; we watch movies and read books about people that never existed in worlds that never could exist; we pretend to ourselves that when we die, we will go to a magical place where everyone is young and beautiful and where we'll be surrounded by our deceased pets.

This is not to say that I have any objections to fantasy itself - I'm an avid reader of fantasy, sci-fi, and fiction. I have, in fact, made a career out of it. I love movies. I love video games. But I love them in full awareness that they are fiction. Yes, at the core of every fiction is some kernel of truth and commentary about the real world, but that's just the point. They know they're fiction and have accepted that as part of their essential existence.

In Marx's view, we should be allowed to believe our illusions are real. We shouldn't go through life with the assumption that human beings are fundamentally selfish creatures. We should continue to believe that there is a reward after death, that our interactions aren't governed by self-interest, that we aren't a group of social animals that seeks ways to place itself above other groups of social animals by means of race, class, gender, or cultural choice.

We are. We need to get over a lot of that - someone's socioeconomic or ethnic background should not and does not make them a better or worse human being than I am - but we can't stop committing acts of bias and bigotry until we accept the fact that we do not treat each other as equals. We should, but we don't. And I am willing to admit that I am as guilty as the next person of considering myself "better" than someone without access to the education I've had. I dislike stupidity - but ignorance is better than willful idiocy. And to deny the fact that our "illusions" help to perpetuate classism, racism, homophobia, misogyny, and so on is willful idiocy. We know better. Now we should start acting like it.

Thursday, July 01, 2010

Truth, Faith, and Nietzsche

Over the last week or so I've been plowing my way through the Kaufmann collection of Nietzsche's writings, and several of them have struck me as particularly poignant. Now, Nietzsche has quite a reputation for coming up with evil and villainous conceptions of human nature. He has been used as an excuse for racism, Nazism, and general Antisemitism, and he has been maligned for his philosophic atheism.

Thus far, I have seen nothing racist beyond what was typical for a nineteenth-century European, and, in fact, Nietzsche defends Judaism - in a racist way, admittedly, but in a way that clearly marks him as not Antisemitic, especially not in a way that should encourage the kind of Antisemitism practiced by the Nazi party in twentieth-century Germany.

His atheism, however, is quite evident, but this I have no problem with. And, given some more recent atheistic tracts and works I have recently read, Nietzsche is downright mild and non-confrontational. He phrases his atheism specifically in terms of truth, self-delusion, and hypocrisy.

One of the early points in the book is from On Truth and Lie:
We still do not know where the urge for truth comes from; for as yet we have heard only of the obligation imposed by society that it should exist: to be truthful means using the customary metaphors - in moral terms: the obligation to lie according to a fixed convention, to lie herd-like in a style obligatory for all... (47)
First, of course, Nietzsche raises the question of whether truth - objective, singular truth - actually exists at all. The understanding of truth he presents here indicates an awareness of the subjectivity of human morality (the idea that "truth" varies according to circumstances), but it also implicitly asks how, if basic truth does not exist, we can claim that there is a higher moral truth.

Nietzsche compounds this question with the now-infamous assertion that "God is dead," but also with claims of religious hypocrisy, as when he writes, "when one opens the Bible one does so for 'edification'" (The Dawn 76). In other words, those who read the Bible - and, presumably, any holy book - do so because they already know what they think of it and are looking to it to confirm their beliefs. Of course, this applies to the non-believer as well as the believer, and it says more about the problematic nature of holy works and human contradictions than it does about the claims made by those books.

But Nietzsche is ultimately more interested in the hypocrisy of believers than he is in their books. In Thus Spoke Zarathustra, the titular philosopher says, "Behold the believers of all faiths! Whom do they hate most? The man who breaks their tables of values, the breaker, the lawbreaker; yet he is the creator" (135-136). What is most interesting about this idea is that it begins the introduction of the Übermensch (the Overman), the better future-human. Here, we see not only a critique of the faithful, but also a recognition that any creator - the deity that is worshiped, the founder of a religion, etc. - must by necessity violate the very rules of that religion.

It also establishes the idea that any future founder of something great - be it religion, scientific thought, government, etc. - must violate the rules of what already exists in order to do so. Implicitly, then, Nietzsche himself, in creating and articulating these new ideas, is a willing violator of the status quo. So break a few rules and make something new.

Thursday, April 22, 2010

The Age of Anti-Enlightenment

Although I and many of the people I know best and love are all members, graduates, or products of higher education, I have been noticing a recent trend in anti-intellectualism among politicians, society in general, and even among the members of my own family. Admittedly, those to whom I am closest (parents, cousins, aunts and uncles) are mostly in accord with my beliefs, but there is something deeply disturbing about the discovery that family and acquaintances do not share one's fundamental belief systems.

I'm not talking about religion, per se (this time). I have friends who are Christian, Jewish, agnostic, secular humanist, pagan, Wiccan, atheist, etc. They do not ask me to conform to their beliefs, and I do not ask them to agree with mine. But we do all share an affinity for knowledge - whether in terms of education or simply the desire to learn.

It is a passion that is, unfortunately, not shared by many people in our country.

People ask, often, why they should bother learning this thing or that thing. Why it matters whether something is fact or fiction. Why history is important.

This is not to say that I think everyone should learn everything - that's not possible, and we all know it. But there's no reason to actively avoid education. And there is no reason on earth why the majority of people in this country should be unaware that we were not founded on Christian principles. For goodness' sake, people, why are most Europeans better versed in our history than we are? That's just sad.

It's a symptom of what Charles P. Pierce in Idiot America terms "a war on expertise" (8). He says,
The rise of Idiot America today reflects - for profit, mainly, but also, and more cynically, for political advantage and the pursuit of power - the breakdown of the consensus that the pursuit of knowledge is a good. It also represents the ascendancy of the notion that the people we should trust the least are the people who know best what they're talking about. (8)
In other words, we can't trust a scientist to know science, a historian to know history, or a doctor to know medicine. We (speaking here in the "Idiot America" sense) should rather trust, like Sarah Palin, in our instincts to guide us, in our knee-jerk reaction against anything new or unique, in our "common sense" - which, I would like to point out, is usually light-years away from "sense," however "common" it may be - to tell us that what we've always been told is true, despite the factual evidence to the contrary staring us incredulously in the face. We should agree that the snake is evil and that the fruit it proffers us is terrible despite the fact that it will indeed make us as gods.

Because that's what the story in Genesis all boils down to. The idea that knowledge is bad. That it is somehow evil to wish to be the best we can. To know truth from falsehood. The idea that discernment and conscience - that maturity - are corrupting forces that will sully the ignorant infantilism in which we (again, as "Idiot America") would prefer to wallow, our thumbs stuffed in our mouths and a glassy, glazed look in our eyes as we gaze upward, waiting for the beneficence of a giant Santa Claus to pat us on the head and give us presents.

Because if we take a bite out of the apple we might realize that there is no Santa Claus. That we are responsible for our own actions. That with knowledge comes power, that with power comes responsibility, that with responsibility comes maturity, and that with maturity comes wisdom. But if we never take that bite out of the apple, then we remain children, and someone else is able to tell us what to do, where to go, how to live, why we exist, and even who we are. Without knowledge and all that springs from it we are trapped in servitude, not to those with knowledge, but to the bullies who choose force over knowledge and fight to keep us away from knowledge because, ultimately, knowledge - the proverbial pen - is indeed mightier than the sword.

Apple, anyone? I hear they make a tasty pie. And what's more American than that?

Wednesday, April 21, 2010

Brought to you by the letter "A"

In this case, the "A" is not a scarlet fabric representation of marital (or extra-marital) infidelity emblazoned upon my breast so that the world can read my shame and shun me accordingly. However, I am fairly convinced that in some parts of the country (and the world), the "A" to which I here refer would in fact earn me far worse treatment at the hands of the local population. Fortunately for me, however, I live in a liberal American city that permits my special brand of atrocity.

"A," as will come as no surprise to my few regular readers, is for "atheism." Over the last week or so I have been reading John Allen Paulos' irreligion (which, in a side note, has a "0" on the cover, not an "A" or even an "i"), and a few weeks past had my class discussing such hot-button topics as "evolution versus creationism," "science versus religion," and "eugenics."

Some of the results of this have been interesting. Paulos is one of the more rational, reasoned atheist writers out there (he is a mathematician and much less angry than, for instance, Richard Dawkins), since he refrains from disparaging comments about believers and uses logic and probability to make his points. This is not to say that he doesn't season his book with a good deal of snark - there's plenty of that in there - but he tries to be tongue-in-cheek rather than abrasive.

One of the more interesting points he raises, and one I haven't seen before, is this: "Embedding God in a holy book's detailed narrative and building an entire culture around this narrative seem by themselves to confer a kind of existence on Him" (62). In other words, we'd feel awfully stupid following the deistic tenets of our societies if we didn't believe in a god, because then there would be absolutely no reason for some of our laws, idiosyncrasies, and habitual practices. Put differently, we justify our belief through the traditions that have grown out of it. It's like saying that "Mommy and Daddy wouldn't put out milk and cookies if Santa Claus weren't real." The act itself neither proves nor disproves the existence of Santa Claus, just as the presence of religion neither proves nor disproves the existence of god.

And from this also springs the idea that people now have come to believe because they were not capable - as children - of making the decision not to believe, since they had not yet developed an adult's incredulity. We tend, as a species, not to convert to a wholly new religion in adulthood (it DOES happen, certainly, but it is less common than a perpetuation of childhood belief) because we are creatures of habit. As Paulos continues, "Suspend disbelief for long enough and one can end up believing" (62).

Thursday, March 18, 2010

Making History in Gamespace

What characterizes gamer theory is a playing with the role of the gamer within the game, not by stepping beyond it, into a time or a role beyond the game, but rather by stepping into games that are relatively free of the power of gamespace. The game is just like gamespace, only its transformations of gamer and game have no power beyond the battle in which they meet. In a game, you are free because you choose your necessities. In a game, you can hide out from a gamespace that reneges on its promises. In a game, you can choose which circumstances are to be the necessity against which you will grind down the shape of a self. Even if, in so choosing, you click to opt out of making history. [165]

Again from McKenzie Wark’s Gamer Theory. One of the points of contention I have with Wark’s theories is the idea that the game is cut off from influencing or doing work in the “real world,” which Wark terms “gamespace.” Here, Wark suggests that the game exists independent of gamespace and - most crucially from my perspective - that gaming removes the gamer from “making history.” The final line of the above quotation seems to give the gamer an option: to participate in the game, or to participate in gamespace and the making of history.

Wark begins his theory in what he calls The Cave, an allegorical arcade that alludes to and mimics the Platonic Cave, a place that is distinct from the gamespace of the world, removed from it, unaffected by it, and unable to affect it. And it is in this premise, I think, that Wark is wrong.

The gamer does not “opt out of making history.” The game and gamer are not in a Cave, cut off from the rest of the social machine. The game – like the works of literature and film to which Wark compares gaming – is a part of the intellectual and social milieu that is shaped by and shapes our ideological understanding of the world around us.

Games may be new media, but they are a vital part of our intellectual and ideological communication with and reaction to the gamespace of the world around us. They deserve not to be undervalued as mindless or shunted into a Cave frequented only by the basement-dwelling. Games are – as they have always been, even when analog rather than digital – a fundamental part of our lives. Games teach us socialization, competition, sportsmanship, and even encourage us to participate in and/or rebel against the socio-political gamespace that builds and reinforces the dominant ideologies of our culture.

Wednesday, March 10, 2010

Warning: Gamer at Play

My current in-process read is Gamer Theory by McKenzie Wark, which, while it certainly has its flaws in interpretation, raises some very interesting questions about games, gamespace, and gamers. For instance,

Stories no longer opiate us with imaginary reconciliations of real problems. The story just recounts the steps by which someone beat someone else - a real victory for imaginary stakes. [007]

My issue with this is that stories have always been the recounting of "the steps by which someone beat someone else" - Wark makes it sound as though this is a recent development in storytelling technology, and one that has somehow evolved through the degradation of our society's culture. But that isn't the case. Stories are always about someone else through whom we are meant to vicariously experience the events of the story. It happens that gaming - in the RPG and video game sense - permits a deeper level of this by causing the gamer (rather than the audience) to actively participate in the action of the story. The story, however, is still scripted and still controlled, even if it has alternate endings. It is still about "the steps by which someone beat someone else."

Wark continues, suggesting that the idea of "game" has come to permeate not only our narratives, but our existence:

The game has not just colonized reality, it is also the sole remaining ideal. Gamespace proclaims its legitimacy through victory over all rivals. The reigning ideology imagines the world as a level playing field, upon which all folks are equal before God, the great game designer. [008]

In this sense, gamespace is "real" space, and the concept of life as a game plays out (pardon the pun) all around us:

Work becomes gamespace, but no games are freely chosen anymore. Not least for children, who if they are to be the winsome offspring of win-all parents, find themselves drafted into evening shifts of team sport. The purpose of which is to build character. Which character? The character of the good sport. Character for what? For the workplace, with its team camaraderie and peer-enforced discipline. For others, work is still just dull, repetitive work, but they dream of escaping into the commerce of play - making it into the major leagues, or competing for record deals as a diva or a playa in the rap game. And for still others, there is only the game of survival... Play becomes everything to which it was once opposed. It is work, it is serious; it is morality, it is necessity. [011]

On the one hand, Wark captures the highly competitive understanding of the market that we see in our capitalist world. On the other hand, he seems to undervalue play for the sake of play. Yes, we have evolved into a highly competitive society that seems to understand its surroundings in terms of competition and payoff, but to say that there is no "play" anymore is to severely diminish the satisfaction that one receives from non-required competition - from a game that isn't the "gamespace of reality."

In that gamespace, Wark notes, "The only thing worse than being defeated is being undefeated. For then there is nothing against which to secure the worth of the gamer other than to find another game" [038]. In this paradigm, we are limited to the set established by gamespace, to the way in which our worth is constructed within the artificiality of the game itself, and of the god-designer. We cannot function as autonomous, individuated beings without the relational marker of the gamerscore or rank, but at the same time, we cannot be autonomous at all within the construct of the gamespace to which we (willingly?) subscribe our identities. We are powerless to escape gamespace and have sacrificed ourselves to it as mindless automatons incapable of participating in the allegory (or, in Wark's terms, "allegorithm") of the game itself.

Wark paraphrases Guy Debord's The Society of the Spectacle, but disagrees with the assessment that gaming's interactivity in fact increases the participatory - and didactic - element of storytelling:

Key to Debord's understanding of "spectacle" is the concept of separation. Some argue that the "interactive" quality of contemporary media can, or at least might, rescue it from separation and its audience from passivity. One could with more justice see it the other way around: whatever has replaced the spectacle impoverishes it still further, by requiring of its hapless servants not only that they watch it at their leisure but that they spend their leisure actually producing it. Play becomes work. Note to [111]

The question here is how Wark defines work. In the literary world, we say that a novel, play, or poem does "work" when it interacts with and comments upon the society that has shaped it (or in which it is produced). In that sense, yes, the game does "work" through "play" (a concept with which a performance theorist is intimately familiar). The gamer in fact participates in this work by allowing the game to work through him- or herself in a way similar to how an audience at a theater or a reader of a novel participates in the "work" of the performance or book. However, Wark's claim that participation "impoverishes" the spectacle and mission of the game is as ridiculous as stating that the performance of a play "impoverishes" the spectacle of the text.

There is no spectacle without a certain level of interaction. There is no spectacle in the theater without the production that creates that spectacle. Likewise, there is no spectacle in a game without the full use of not only its audio track and visual graphics, but also the complex mechanics of the game design itself. The game itself - the rules, the "algorithm" (according to Wark) - is a form of spectacle upon which the designers rely. It is something new, this "interactive spectacle" that requires the active participation (rather than passive observation) of its spectators in order to operate fully, but it is nevertheless a form of spectacle.

But where I really disagree with Wark is with the suggestion that a game is somehow restricted, partitioned off from this "gamespace of reality." Not only has Wark attempted to rob games of their unique form of spectacle, but he asserts that

the utopian book or the atopian game lacks the power to transform the world. But where signs and images may bleed off the utopian page into the world, the algorithm of the game, in which each relation depends on one another, may not. At least not yet. [122]

I respectfully disagree. A game is as capable of "bleeding off" the console or computer screen and "into the world" as a novel is - perhaps more so, by pure virtue of its interactivity. This is not to say that games are the impetus to violence (a point Wark also makes, and with which I agree: it is utterly ludicrous to suggest that games cause people to become more violent), but that the choices present in many games - such as Bioshock, Bioshock 2, Mass Effect 2, and so forth - directly involve the gamer in making a quantified moral choice (or series of choices) that impacts the outcome of the game in unforeseen ways. These are choices that, while not analogous to everyday life, hyperbolically reflect some of the types of choices a gamer in "gamespace" may have to make. The point is that the kind of seepage Wark attributes to novels is at least equally present in games - especially games that foreground the kinds of dystopian/atopian ideologies that provide an analog to the (deliberately) impossible utopian visions of More and others. Indeed, some games - Bioshock and Mass Effect 2 being prominent examples - interact with the very textual utopian visions Wark claims they cannot match: Bioshock takes on Atlas Shrugged, and Mass Effect 2 engages Shakespeare's The Tempest. These are not "texts" in the traditional sense, no, but they are participating in and actively speaking to the textual tradition that Wark (and certainly many others) finds somehow more valuable than the games that seek to respond to it.

Certainly, I am not advocating the abandonment of literature for the study of gaming, but neither can I say that gaming does not belong in the field. Certainly Wark agrees with - and does excellent work on - the need to study digital media in its own right (and has some quite interesting readings of Katamari Damacy and The Sims, among others, in this book), but it seems to me that we cannot separate these "early" games from the literature that has produced them, any more than we could suggest that Shakespeare's plays did not owe a profound debt to the centuries of poetry and medieval drama that came before him. Our literature is our intellectual past, present, and future, but it would be foolish to discount the importance of play - of games and gaming - as we trace that history from Shakespeare's stage to the Xbox screen.

Monday, February 22, 2010

How to Level Up in Class

So my husband sent me this link today from work. He thought I would be particularly interested in the portion that discusses one teacher who works his classroom on an XP (eXperience Points) system and how the students respond so well to that system.

My response: "But... that's just how grading works." And it's true. For those of us who use a numerical system to do our grading, we're - in essence - giving our students XP. Class is a game.

Here's the gist (a rough sketch in code follows the list):
- the student earns XP for every assignment, class, comment, etc.
- the student "levels-up" throughout the semester from zero (F) to whatever grade they earn at the end of the course
- as they pass each "level," they can choose to stop earning XP, or to keep going (the only possible downside to this is that many classes cannot be passed until the final assignment, so a teacher may need to start with level "Z" and work up to show progress)
- the class is designed to outfit the student for the next game chapter: a 300-level course instead of a 100-level, the second semester in a sequence, or even that most terrifying of boss battles, "real life"
- students with the most XP are the best equipped to handle future game chapters - they have the weapons of knowledge, grammar, communication skills, organization, practical skills, etc.
- when students are being chosen for teams (jobs, grad schools, law schools), they are evaluated as desirable if they have more XP - and therefore better weapons and skill-sets - than other players
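Out of curiosity, here's what that system might look like if you literally coded it up - a minimal sketch in Python. The point thresholds, level names, and assignment values are hypothetical examples of my own, not anything from the linked article:

```python
# A toy "class as XP" grader. Thresholds and point values below are
# hypothetical illustrations, not the system described in the article.

LEVELS = [  # (minimum XP, letter "level"), highest first
    (900, "A"),
    (800, "B"),
    (700, "C"),
    (600, "D"),
    (0, "F"),
]

def level_for(xp):
    """Return the letter level a student has reached for a given XP total."""
    for threshold, letter in LEVELS:
        if xp >= threshold:
            return letter
    return "F"

def award(xp_log, student, points):
    """Add XP for an assignment, class, or comment; return the new total."""
    xp_log[student] = xp_log.get(student, 0) + points
    return xp_log[student]

xp = {}
award(xp, "student_a", 450)          # papers and participation so far
total = award(xp, "student_a", 380)  # final project
print(total, level_for(total))       # 830 B - enough XP to "level up" past C
```

Swap in a real gradebook and the analogy holds: every assignment is a quest, and the final grade is just the level you've reached when the semester's clock runs out.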

And the game doesn't end with the first job. Every aspect of our lives revolves around XP - you get hired based on your experience with a particular type of problem; you have training missions designed to give you a tiny bit of XP so that you can handle the next level; once you've done the lower-level jobs, you have enough XP to move up in the job-levels of a business. You can even choose to do something games don't let you (yet, though XBOX's gamer-points are coming close), and take that XP to another game!

The key thing is - we don't think about our "lives" that way.

In the twenty-first century, the world is no longer a stage - it's a console, and all the men and women are merely players.

Friday, February 19, 2010

Spines and Bindings

So this week's theme - aside from grading - has been books. Not a surprising thing, given my choice of profession, really. But I've been reading The Book on the Bookshelf by Henry Petroski, and we recently began a lengthy process of replacing all of our mismatched shelves with matching ones.

As a bibliophile, I like owning books. I like owning pretty books, old books, and books that I enjoyed reading. To this day, my favorite Christmas present was given to me at the age of 12: my facsimile copy of Shakespeare's First Folio. I adore that book (and it's damn useful in my line of study).

But I'm not the kind of book person who can't stand the thought of marring my books. I take care of my old books, yes, but my paperbacks... I use my books. I write in them. I dogear the pages. I highlight, underline, scribble, and circle. I break their spines. I use tape to hold together the covers when they start to tear and fall off. As far as I'm concerned, a pristine book is like a new stuffed animal - pretty, but clearly unloved.

Hypocritically, of course, I hate reading books other people have marked. Not because I'm appalled at the fact that they "defaced" a book, but because the underlining and words are not mine. I'm a selfish book-scribbler. I want the only words in the book (besides the author's, of course) to be mine.

In the heady days of Kindle and Nook and the iPad - to say nothing of the yet-to-be-released Overbook - some people say that books will become passé. That paper and ink will be replaced - as papyrus and vellum were - by a screen. I think that for most pleasure reading, devices like this will become common.

But for those of us in the business of books - for students, teachers, professors - the paper copies will continue to be needed. We need texts that cannot be accidentally deleted or erased due to a bug. We need our notes to survive. We need to remember the layout of the page, to be able to flip to a passage marked with a dogear or flag, to know what we thought when we read it the first or third or tenth time over.

And some of us will always crave the feel of a book in our hands. There is something comforting, something visceral about a book that no Kindle or Nook will ever match. Not to say that I won't buy one someday myself, but somehow a small ereader just isn't the same as a paperback. The thickness that tells us how much we have left to explore. The rough softness of paper pulp in our fingers. Even the black dust of ink-stain on our fingertips. Tangible words that don't disappear into black or white when we hit a button, but stay, quietly waiting, for our eyes to release them again.

Thursday, January 07, 2010

God is a Secular Humanist?

My new book for today is John W. Loftus' Why I Became an Atheist, which details not only Loftus' conversion to and then from Christianity, but his arguments against it. I'm not terribly far into the book just yet, but came across a rather interesting argument I haven't seen in the many similar books I've read.

It concerns the notion of a moral compass and addresses the argument that atheists are amoral at best, immoral at worst, because of their lack of belief in a higher being. Usually, authors point to the fact that this isn't the case in real life, which Loftus also does, but then he takes it further. He argues that if what God says is "good," then it isn't really objectively good; it's just an order.

This makes the whole concept of the goodness of God meaningless. If we think that the commands of God are good merely because he commands them, then his commands are, well, just his commands. We cannot call them good, for to call them good we'd have to have a standard above them to declare that they are indeed good commands. But on this theory they are just God's commands. God doesn't command us to do good things; he just commands us to do things.
...
If we say, on the other hand, that God commands what is right because it is right, then there is a higher standard of morality that is being appealed to, and if this is so, then there is a standard above God which is independent of him that makes his commands good. Rather than declaring what is good, now God recognizes what is good and commands us to do likewise. But where did this standard come from that is purportedly higher than God? If it exists, this moral standard is the real God. (39)

And if God follows a higher moral standard - presumably, since that is the standard Christians claim to follow, one in the best general interests of humanity - then God is a humanist. It makes sense. Jesus' teachings are generally humanist. The teachings of our "great" preachers - Martin Luther King, Jr., Mother Teresa, etc. - are humanist. So, God is humanist.

Which means, what, exactly?

Well, it either means that God is rather self-destructive, since humans are notorious for destroying things that might control them, or it means that God is human. And if God is human (not literally, but in concept), then it means that God must be, as we are, mortal. So either we invented a deity that mimics us in appearance and morality, or Nietzsche is right and God is dead.

Either way, humanity - and humanism - is the closest thing we really have to a god. A thought both terrifying and inspiring. Because if we are god, then we'd better shape the hell up and start acting like it.

Monday, January 04, 2010

New Year, Old Directions

A few days ago, I received an email from a very distant relative who is attempting to put together his family genealogy, into which I apparently fall. After sending him the requested information, I became curious. I knew that I had some relatives from Germany, and rumors of someone from Ireland, but I didn't really know that much about them (other than one particular couple who had a suicide pact to hang themselves in their barn, which they did).

Since my husband was aware of a rather lengthy and fascinating history for his family that involved three French brothers, some Mohawk, and someone getting shot by the Iroquois, I started doing some digging on his side - both his last name and the story are unique enough to make for some decent results. I did manage to get all the way back to France on his father's side, through the French brothers (who are actually Québécois, and one of whom WAS shot by an Iroquois) to about 1630.

Then I attacked mine, using the information from the distant relative and a little more dredged up by my obliging mother. I made it to Ireland fairly quickly on my father's side, but also to Prussia. To Prussia AGAIN on my mother's side, to York in 1603, and then discovered that I have one string of relatives that is VERY old blood New England. From Milford, Connecticut (and one straggler from Boston, I kid you not). In 1605. I didn't know there even WAS a Connecticut in 1605. So I guess (as Jenno remarked) I was destined for New England. My family roots are here, after all.

But all this makes me think about why we, as a species, are so interested in our histories. Do I become at all different now that I know where my ancestors came from? Of course not. Am I fascinated by coincidences and connections (like the fact that my progenitors left England during the reign of James I - and were alive there at the same time as Shakespeare)? Of course I am. But fundamentally, none of this information makes me a different person.

What is perhaps more interesting (as K pointed out) is that we as a species seem to oscillate between a desire to trace our past and a desire to expunge it. Germans who had relatives in the Nazi army, for instance, do not talk about and wish not to remember that fact. Often, when our own ancestors immigrated to the United States, they changed their names (as in K's case) or eliminated connections with their past, choosing not to record names, dates, and facts about themselves or their parents. Which, naturally, is why it is so difficult for us to dig it up now.

But, ultimately, I think our desire to trace our past is a desire to know more about the clan that formed us, to understand why we were raised with the religion, the ideologies, the nation that we were, and to make informed decisions about where we want the road of life to take us. If we know where our forebears have been, we can decide to revisit their journeys in expectation of our own or to avoid them entirely. We can use our past as a lens through which to contextualize our present.

But it is, I think, also a way of wondering and researching what it would have been like if we had lived in that past - if we, our personalities and minds, had been born in another age, another nation, another culture. And we can do this, imaginatively speaking, through our ancestors.

It is, however, important to remember, as we embark upon this time-travel, that the future is always more important than our past, because it is into the future that we are really traveling, and while we may carry the past and present with us, we should always remember to keep our eyes forward - lest we miss seeing that rock and stumble or fall.

Thursday, December 17, 2009

End of (School) Days

The close of the semester - and the accompanying grading - always makes me ponder the nature of our current educational evaluative system and all its flaws, variants, and benefits. My husband frequently objects to the idea of "grades" as "meaningless" or "false" indicators of ability. And in some ways, he's right.

First, what any particular grade means is somewhat arbitrary from system to system - an A at one institution might be a B at another, and a third institution might not even have an A. Growing up, my grade school did not use standard grades; we had O (Outstanding), V (Very good), S (Satisfactory), U (Unsatisfactory), and N (Not acceptable). As far as we were concerned, it was the same system. It translated when we moved to high school into "normal" letters. The difference between an O and an A was simply the form of the mark on the paper. My high school used the standard system, but an A in an AP class was a 5.0 instead of a 4.0. My undergraduate university didn't use pluses or minuses, but slash grades: A, A/B, B, and so on, which meant that I might have a lower GPA with an A/B than a student who earned an A- at another institution, even though I might have been doing better work objectively speaking. These are problems with non-standardized grades.
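To make the arithmetic concrete, here's a minimal sketch of how the same near-A performance lands differently under the two scales. The point values (A/B = 3.5, A- = 3.7, and so on) are common conventions I'm assuming for illustration, not any particular institution's official mapping:

```python
# Hypothetical grade-point mappings; real institutions publish their own.
SLASH_SCALE = {"A": 4.0, "A/B": 3.5, "B": 3.0, "B/C": 2.5, "C": 2.0}
PLUS_MINUS_SCALE = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7}

def gpa(grades, scale):
    """Average the point values of a list of letter grades under a given scale."""
    return sum(scale[g] for g in grades) / len(grades)

# The same "just shy of an A" semester, recorded on each scale:
print(round(gpa(["A", "A/B", "A/B"], SLASH_SCALE), 2))      # 3.67
print(round(gpa(["A", "A-", "A-"], PLUS_MINUS_SCALE), 2))   # 3.8
```

Same work, different number - which is exactly the problem with comparing GPAs across non-standardized systems.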

And then there are the objections to an evaluative system altogether. The idea that we should simply "appreciate" the efforts of our students without giving them a grade. On one hand, I understand and, yes, "appreciate" this impulse. But on the other hand, shouldn't there be a distinction between outstanding quality of work and simply adequate quality? Shouldn't we recognize that a complete assignment can be done with more and less effort, and that those levels of effort should produce different results?

This leads to another "problem" with grading: sometimes you have a student who puts in a lot of effort and does "worse" than another student who puts in less effort. Some people are naturally talented. Does this mean they should do "better" with less work? Does this mean we are "punishing" people who put in much more work but can't produce the same quality end-product? I tend to say "yes, we can reward the better product, regardless of the amount of work put into it." Because one of these two things is objectively better. If given a choice of the final product without knowledge of the effort put into it, most people will choose the better product - in business, in art, in any field. And it is a fact - an often frustrating fact - that some people just aren't capable of producing work at the same level as others (at least in a given area).

And, finally, there is the current hot-button issue of grade inflation. The idea that students have come to expect As in all their classes for doing the bare minimum of work. An A is the new C. This bothers me, most particularly because it means that the people who DO produce better work (whether through effort or talent) are told that their superior product is the same as a much inferior product. It just isn't fair.

And that's what, I believe, grading all boils down to. One camp that wants people to feel good about their work and one that wants an objective system of evaluation. Personally, I want people to feel good about work they've done well. I want grades to mean something, whether a reward for hard work or an acknowledgment of innate talent. But we also need to recognize that there are average students, and that those students should not feel bad for getting an "average" grade.

I want my students to earn their grades. I want them to produce work worthy of the mark I give it. But I also want them not to feel inadequate for not being "the best" at something. I want them to be happy that they can improve their skills without needing to be "the best." Yes, striving for improvement is good and competition can sometimes foster that, but accepting the grade you've earned is a mark of maturity and understanding, and graciously doing so is a sign that you have learned an even more profound lesson: that you have done your best and accepted your strengths... and your limitations.

Wednesday, December 16, 2009

Things that go Bump in the Night

Recently I seem to have been unintentionally preoccupied with the supernatural - ghosts, ghouls, dead things, and so on. Not that this is entirely unsurprising, of course. My research involves plays - like, say, Macbeth or The Changeling - that contain not only gruesome deaths, but the returning (and revenging) spirits of the dead. In my other job, I work in a 323-year-old building with a crypt containing around 100 bodies, some of which are visible, with fairly serious noise and electrical "issues" that we like to blame on our resident (dead) Frenchman. I am fascinated by questions of belief, religion, and myth.

Over the last several days, I've been watching old episodes of Ghost Hunters, which seems right in line with this history. But it is something completely different. Instead of approaching the supernatural from the point of view of a literary device, a mythological belief system, or an amusing explanation for antique wiring, ghosts - for these people - are very real phenomena. Most of the time. For people who make a living hunting ghosts, they are very practical and do a good deal of "debunking" of claims.

But it makes me wonder a few things about people. First, why we want to believe in ghosts. Certainly, there is the desire to have proof that death is not the end of our existence. That there is something beyond the physical connection of synapses and cells that makes us us. I buy it. In fact, if someone could definitively prove to me that ghosts are a continuation of human existence, I'd be thrilled.

There is another part of me, however, who finds some of the possible explanations for "ghosting" to be profoundly interesting in a scientific way. Emotion- or energy-impressions, for instance. The idea that we might feel so deeply that we somehow impress some part of our consciousness or emotion into the world around us is profoundly strange. It seems to imply that there are physical laws we don't understand - laws that can explain how the energy of emotions can impact the physical world without what we understand as physical contact. Intriguing, certainly. But it also reminds us how truly ignorant we are about our own world; we don't even understand the basic rules for how we interact with our surroundings, or they with us. It also raises the question of what makes us what we are; are emotional impressions a part of us? Are they capable of feelings and intelligence of their own? Do we really leave behind semi-sentient beings when we feel that intensely?

Finally, the concept of a ghost - so long as I do not know what it actually is - is also profoundly disturbing. Is a ghost a person trapped in a cyclic pattern of emotion and behavior? Restricted to a specific path or area by virtue of some in-life connection or site of death? The idea of being stuck for what amounts to eternity is not a pleasant one.

And if ghosts aren't human at all, then what are they? What else inhabits our world that we cannot see or interact with on a regular basis? What else is out there that we fail to notice at all?

So what do I think they are? I don't. I'm open, in the true sense of an atheist. No one has managed to actually prove anything to me. I know people who believe in them, who have claims of seeing and feeling things. I believe them. I believe that the things they say happened are true. That doesn't mean I know what they are. Maybe they are the dead. Maybe they're "energy-impressions." Maybe they're something else altogether. I don't know. I choose not to try to make a choice.

Now if Pierre walks up to me and introduces himself, you can bet that just as soon as I've had a psych-evaluation I'll be more than happy to believe that ghosts are people. But until then, I'm happy to wait and wonder. Because maybe ghosts are quarks. Or quarks are ghosts. Or something equally strange but fully explainable by science that we just haven't discovered yet.

Remember, just because we don't have an answer doesn't mean there isn't one.

Sunday, November 15, 2009

GODDOG - Beast, man, and king

Although I find Derrida much more accessible now than when I was barely old enough to legally imbibe alcohol, I still hold the opinion that he was a good deal more interested in hearing himself talk than in being clear. Clarity is a good thing – and deliberate lexical obfuscation (while I’m good at it) is a personal pet peeve.

The other day I picked up The Beast and the Sovereign: Volume 1 off the shelf at the Harvard Bookstore (which, in an entirely unrelated note, is now the home of Paige M. Gutenborg, a search-print-and-bind machine that will print and bind anything in the public domain for $8). Brand-new Derrida (despite his death) based on seminars. This type of format tends to annoy me, as it makes a good deal more sense in person than written down (especially with the tangents), but that’s the only way some things will ever come out, and there isn’t much to be done about it. But that isn’t the point. The point is that The Beast and the Sovereign seems eminently germane to my dissertation. So I bought it.

Like most Derrida, it needs to be read in small doses. This dose is from the Second Seminar, December 19, 2001.

What would happen if, for example, political discourse, or even the political action welded to it and indissociable from it, were constituted or even instituted by something fabular, by that sort of narrative simulacrum, the convention of some historical as if, by that fictive modality of “storytelling” that is called fabulous or fabular, which supposes giving to be known where one does not know, fraudulently affecting or showing off the making-known, and which administers, right in the work of the hors-d’oeuvre of some narrative, a moral lesson, a “moral”? A hypothesis according to which political, and even politicians’, logic and rhetoric would be always, through and through, the putting to work of a fable, a strategy to give meaning and credit to a fable, an affabulation – and therefore to a story indissociable from a moral, the putting of living beings, animals or humans, on stage, a supposedly instructive, informative, pedagogical, edifying, story, fictive, put up, artificial, even invented from whole cloth, but destined to educate, to teach, to make known, to share a knowledge, to bring to knowledge. (35)

The “if” in the first sentence seems to be an interrogative, a subjunctive that indicates doubt. Derrida goes on to discuss the idea of Terrorism as a fable, a real event made fabular by modern media and propaganda. By, he says specifically, the idea of media as capital to be bought, sold, released, or withheld. He refers to September 11 (only three months prior to the seminar) and the images dispersed by the media both pre- and post-government intervention.

But while Derrida claims that the functions of government can be like a fabular institution, it seems to me that they do more than merely seem like one. Of course, there are levels of fabular. In Early Modern England, the government (especially under James I) was based on the fable of divine right kingship, a “fictive modality” that situates the king as a mini-god, a supernatural being whose touch could heal and whose intercession or judgment was that of God. In a theocracy, too, where the monarch is head of church and state, the fabular element is clear (provided one can step back far enough from the religious element to accept the equation of religion and fable).

But what about the government of the modern Western world – specifically, of America? The added “under God” in our pledge of allegiance would seem to place us within some more democratic extension of the second above example. A nation that is not theocratic in name, but seems to be so in doctrine, particularly with the rising power of the religious right.

But that is not the way I would see it. Yes, modern America is leaning disturbingly toward certain theocratic elements, but we are not a theocracy (at least not yet). But our government and ruling ideology are predicated on a fabular construction of nationalist myth. The Founding Fathers (always capitalized) have become our new political pantheon, with the occasional inclusion of Abraham Lincoln, Teddy Roosevelt, or JFK. Our fable is the fable of pure equality, the fable of pure freedom, the fable that our nation is better than any other in the world by virtue of… what? The very illusory things that make up the fable of our nation? Can truth be based on such a fiction?

Derrida seems to think so.

And me? I think so, too. But I also think that there is danger in not recognizing a fable for what it is: fiction. Now don’t get me wrong, I think fiction is as powerful a tool of education and liberation as fact – and it can be used to achieve reason and rationalism so long as it is seen as fiction. But when fiction is assumed to be fact (as in the case of our nation’s foundational mythos or, for instance, religion), then we stop learning, stop improving, and begin to regress into a state of childish presumption in which we think that we are the hero of our fable, that everything will bend to help us in our fabular quest, and that we – as the hero – cannot truly be harmed or die.

But it is a fable. We can get hurt. We can die. And we would do well to remember that fiction may reveal fact, but fact can never be fiction.

Saturday, October 03, 2009

The Freshman Experience

My university, like many others, has recently joined the "Freshman Experience" bandwagon. Many universities have begun fretting about this so-called problem, as Brian's Coffee Breaks notes (article here). In short, universities are worried that the poor freshmen will be "lost" in their new - presumably larger - communities.

This focus on the "Freshman Experience" seems to me to be little more than a perpetuation of our increasing tendency to infantilize college students who, less than a generation ago, would have been considered fully capable adults. I teach freshmen. I know that most of them aren't yet fully prepared to take on complete adult responsibilities. But making them go through the "Freshman Experience" isn't going to get them there. In fact, if anything, it will retard the process.

Universities often function in loco parentis, despite the fact that most of their students are legal adults. As an instructor, I do not wish to be a babysitter, and I think it ultimately does a disservice to my students. Yes, I should offer them advice and help them develop study and life skills, but, ultimately, they need to learn to become independent.

Students who do not want to get lost in huge classes need to learn to speak up - it's as simple as that. If you don't want to be a faceless number in the back of the room, then walk your butt up to the front and introduce yourself to your professor. Raise your hand. Assert yourself - that's the best advice anyone can give to a freshman who doesn't want to be "lost."

Our tendency to want to keep our children "children" for longer is detrimental to them, and to our society. Let them learn how to grow up on their own - because trying to hand-hold them into adulthood doesn't teach them anything but to expect someone else to do all their work for them - to make their choices for them and to not be held responsible for their own lives. Offer them help for serious issues - medical, mental, etc. - but offer resources, don't plan their lives for them. After all, if everything is done for them freshman year, then you'll have to have a "Sophomore Experience," then a "Junior Experience," and so on. But once they get out of the university, they won't find a "First year at work Experience." They'll find real life.

Thursday, September 24, 2009

Grading Green

So my university (at which I both teach and am a student) has really started pushing us to go as paperless as possible this year.

On the one hand, this is proving to be an annoyance, as my students have this rather marked increase in their ability to forget due dates and assignment specifications. As a result, I continue to hand out sheets in class.

On the other hand, though, I have required all students to turn in their papers electronically. This, I love. And they seem to be pretty happy with the arrangement, as well. I get no excuses of "my printer died," for starters. I don't have to worry about papers turned in with red or blue ink (though, really, I'd rather that than nothing). No mangled pages, no coffee stains, no food on the papers, AND my cats can't attack them. And believe me, explaining to a student why there are teeth marks and little bites taken out of their papers is entertaining.

I like grading on the computer. I like being able to use Macros for common errors (it goes SO much faster), being able to retype a comment when I don't like how I worded it (no erasing or scribbling out!), and I really like the fact that I type much faster than I can hand write anything. Grading takes nearly half the time it used to.

I like the fact that I can post their comments and grades on a course website and just be done with it. I don't have to haul papers to class, don't have to feel like I'm carrying half a tree with me wherever I go, and I don't have to worry about what to do when I find I have only a red pen (and we're not supposed to use red because it traumatizes students - WTF?). I do have to carry around a laptop, which is heavy as hell, but it lets me have the enjoyment of posting to my blog when I finish my grading early (as now).

All this, and the proverbial bag of chips: we're doing something that reduces our impact on the environment. Bonus!

I know my students like the convenience of emailing papers or uploading them, rather than printing and dragging. I'm sure they like the option to "accept changes" when I make small grammatical alterations to their drafts. I'm sure they appreciate not having to decipher my handwriting.

But all that said, there is something a good deal more impersonal about electronic comments. They can't see a smiley face drawn in their margin or the deep impress of a pen when I get excited (or angry) about something they've said. They can't guess about what color pen is going to have marked their papers this time. It's just the familiar, cold, glowing text on their screen that appears on every document, every webpage, every email that comes through their inbox or browser.

But, I think, that's why the experience of the classroom is so important. In today's world, where online education is becoming increasingly popular (and common), I think the classroom is, in fact, more important than ever in the process of education. Students need to see and hear not only their instructor, but one another. They need to be able to give a gut reaction, to express their immediate confusion, to be able to recognize when they or others have something particularly interesting or exciting to say. Yes, the internet is great, but text just doesn't do it all. We need human contact - especially in education, where the information itself is impersonal at best. Classmates and teachers are PEOPLE, and they are important ones who shape the way we think and learn, and that is something that no amount of "paperless" internet education is going to be able to make up for.

Sunday, July 26, 2009

Next

Typically, I do not post about personal events on this blog, but today I'm going to make an exception. Theoretically, you only get married once, and I think it is a momentous enough occasion to warrant a post, as it will be my last with this name and social status ("single").

As is the nature of such things, I don't imagine the process will change between now and the next post. That's the way life works. Milestones appear and disappear, are overleapt or crossed or smashed to smithereens, and we just keep going down the road, waiting for the next to appear so that we can say we have passed and marked that next mile in our journey.

Sometimes, if we're very lucky, we have the opportunity to make a choice between roads. To stop and turn or keep ourselves straight, to take the "scenic route" instead of the multilane freeway. Or, if we're particularly creative, to pull over, step out of the car, and strike out into the middle of nowhere with nothing more than what we can carry on our backs.

Sometimes a detour takes us back where we started, sometimes to where we would have ended up anyway. And sometimes, we find a waterfall, a cave, a canyon, at the end of our offshoot. Sometimes we never again return to that old path and are left to wonder what surprises - or not - it held.

Sometimes, we backtrack. We take one look at that rickety rope bridge and say "screw it." And sometimes, when we say that, instead of walking back we jump into the river and see where it takes us.

Road less traveled my ass, Mr. Frost. If there's a road, it's been traveled. Maybe "less" than other roads, but you, sir, are a rebel in sentiment only. If you really want to do something interesting, get off the road altogether.

Now I'm a fan of the road, be it paved or dirt or even (much as I curse them) cobblestone. I use the roads in my life frequently. But every now and then, I get off and see what's going on in the ditch or the field or the forest.

This milestone, however, is firmly in the middle of the normal highway of life, and I'm okay with that. Contrary as it is to my nature to do the "normal" thing, this is one of the "normal" things I'm happy to do, a part of the highway I'm glad to travel. After all, striking out into the middle of nowhere is more fun with someone else along for the journey.

Saturday, July 04, 2009

Written by the Victors

In learning about history, one of my favorite things to do is dig up the little-known or "losing" side of the story. The story of the witches burned by the Inquisition, of the Spiritualists and atheists in the nineteenth century, of the Germans in WWII, of the unspoken and unsung.

That is not to say, of course, that the history written by the victors is unimportant. Or that written by the allies of the victims. But there are always more sides, more facets, to the gem of history than first appear visible to our biased gaze.

There aren't often true "villains" in history, people who truly just want to make others suffer. There is always some grain, some tiny glint of altruism or idealism that offsets the dirt and grime and corruption. Much as we hate to admit that people like Hitler, like bin Laden, have some sort of good, it is something that we should admit.

It is something we should admit not to justify their actions or excuse their cruelty, but to better understand ourselves and our own motivations. If we can see that they - the mysterious, proverbial "they" - have a modicum of goodness, then perhaps we can see that within what we see as our own virtue may be a kernel of darkness.

Sunday, June 21, 2009

When the world burns

Like many other Americans, I have been watching the protests in Iran from the safety and comfort of my home thousands of miles and an ocean away, wondering whether there is anything I can do to help them, and wondering whether - in similar circumstances - I would have the courage to face what they are facing. In my comparatively sheltered life, I have stood up for idealism more than once, but never when facing the immediate and physical threat of injury or death. That is the privilege that most people of my (American) generation have - to have grown up in a time and place where our actions of protest are enabled, authorized, and permitted to the extent that we need not fear death if we choose to speak our minds.

Certainly, there are exceptions. There have been incidents, even in my short life, in which people have been beaten by police or by other citizens, in which the military has gone too far, or in which hatred has caused injury or death to a member of a minority opinion/lifestyle/ethnic background. There was 9/11.

But, ultimately, we live in a society where rebellion is controlled because it is authorized. The people in Iran do not, but, clearly, they want to. That is not to say that they (necessarily) want to be American, but, rather, that they want to live in a society that reflects their beliefs while allowing a certain amount of flexibility - the possibility that things might change, that a system might need revision or alteration, that what was best in the past or is in the present might not be in the future.

If we, as Americans, learn nothing else from the Green Revolution, it should be that people are fundamentally far more powerful than they might believe. That the human spirit is wise and courageous and that people from across the world and on the other side of a religious or ethnic divide are just as wonderful, just as amazing - and perhaps even more so - as the people you pass on the street every day. What we should learn is that complacency with our lifestyle is not the solution to the betterment of our world, that permitting atrocity or even mild cruelty (or torture) is an unacceptable way of life.

Sunday, May 31, 2009

Gods and Demons

In A History of God, Karen Armstrong talks about the fact that monotheism was an unusual creation and - further - that the refutation of other gods was a practice unique to Christianity. This is a point reiterated by Jeffrey B. Russell in A History of Witchcraft, in which he discusses paganism and the vilification of non-Christian deities.

Armstrong's point was that early Christians denied the very existence of other entities besides their god. Russell observes that what began as a denial - a policy that obviously needed revision - became a tendency to coopt:
And now Christian theologians made another important identification: the demons that the sorcerers were calling up were the pagan gods. Jupiter, Diana, and the other deities of the Roman pantheon were really demons, servants of Satan. As Christianity pressed northward, it made the same assertion about Wotan, Freya, and the other gods of the Celts and Teutons. Those who worshipped the gods worshipped demons whether they knew it or not. With this stroke, all pagans, as well as sorcerers, could be viewed as part of the monstrous plan of Satan to frustrate the salvation of the world. This was the posture of most theologians and church councils. Yet at the same time popular religion often treated the pagan deities quite differently, transferring the characteristics of the gods to the personalities of the saints. (39-40)
Since they couldn't eliminate other gods by denying their existence, Christian authorities instead labeled them either demons or, it seems, saints. Those whose myths could be refigured to include God and Christ were deemed acceptable; those who defied the kind of authoritarian strictures promoted by Christianity became its enemies.

What is perhaps most interesting is that we can clinically identify this move in historical circumstances (December 25th is the birthday of Mithras, for instance, and has nothing whatsoever to do with the birth of a nice Jewish boy in Bethlehem except for the fact that the Church needed a good day for its celebration), but that we as a society continue to cling to the religious elements that have so clearly been coopted as though they are empirical truth. Yet more evidence of our ability to believe whatever we want, even as the facts that contradict it wave small flags directly underneath our noses.

Human society is a fascinating melange of long- and short-term traditions that have given us enormous intellectual wealth, but that doesn't mean it's a good idea to believe in them wholesale simply because your mother told you to. Look first, people, then leap. Or, better yet, keep your fool feet planted firmly on the ground unless you've put on the safety harness first.