Over the last week or so I've been plowing my way through the Kaufmann collection of Nietzsche's writings, and several of them have struck me as particularly poignant. Now Nietzsche has quite a reputation for coming up with evil and villainous conceptions of human nature. He has been used as an excuse for racism, Nazism, and anti-Semitism generally, and he has been maligned for his philosophical atheism.
Thus far, I have seen nothing racist beyond what was typical for a nineteenth-century European, and, in fact, Nietzsche defends Judaism - in a racist way, admittedly, but in a way that clearly marks him as not anti-Semitic, and certainly not in a way that should encourage the kind of anti-Semitism practiced by the Nazi Party in twentieth-century Germany.
His atheism, however, is quite evident, but with this I have no problem. And compared to some of the more recent atheistic tracts I have read, Nietzsche is downright mild and non-confrontational. He phrases his atheism specifically in terms of truth, self-delusion, and hypocrisy.
One of the early points in the book comes from "On Truth and Lie in an Extra-Moral Sense":
We still do not know where the urge for truth comes from; for as yet we have heard only of the obligation imposed by society that it should exist: to be truthful means using the customary metaphors - in moral terms: the obligation to lie according to a fixed convention, to lie herd-like in a style obligatory for all... (47)
First, of course, Nietzsche raises the question of whether truth - objective, singular truth - actually exists at all. The understanding of truth he presents here indicates an awareness of the subjectivity of human morality (the idea that "truth" varies according to circumstances), but it also implicitly asks how, if basic truth does not exist, we can claim that there is a higher moral truth.
Nietzsche compounds this question with the now-infamous assertion that "God is dead," but also with claims of religious hypocrisy, as when he writes, "when one opens the Bible one does so for 'edification'" (The Dawn 76). In other words, those who read the Bible - and, presumably, any holy book - do so because they already know what they think of it and are looking to it to confirm their beliefs. Of course, this applies to the non-believer as well as the believer, and it says more about the problematic nature of holy works and human contradictions than it does about the claims made by those books.
But Nietzsche is ultimately more interested in the hypocrisy of believers than he is in their books. In Thus Spoke Zarathustra, the titular philosopher says, "Behold the believers of all faiths! Whom do they hate most? The man who breaks their tables of values, the breaker, the lawbreaker; yet he is the creator" (135-136). What is most interesting about this idea is that it begins the introduction of the Übermensch (the Overman), the better future-human. Here we see not only a critique of the faithful, but also a recognition that any creator - the deity that is worshiped, the founder of a religion, etc. - must by necessity violate the very rules of that religion.
It also establishes the idea that any future founder of something great - be it religion, scientific thought, government, etc. - must violate the rules of what already exists in order to do so. Implicitly, then, Nietzsche himself, in creating and articulating these new ideas, is a willing violator of the status quo. So break a few rules and make something new.
"Words fly up, my thoughts remain below."
black and white, Angels and demons.
We aren't two sides of the same coin.
We're the gold into which those sides are imprinted."
Thursday, July 01, 2010
Thursday, April 22, 2010
The Age of Anti-Enlightenment
Although I and many of the people I know best and love are members, graduates, or products of higher education, I have been noticing a recent trend toward anti-intellectualism among politicians, in society in general, and even among members of my own family. Admittedly, those to whom I am closest (parents, cousins, aunts and uncles) are mostly in accord with my beliefs, but there is something deeply disturbing about the discovery that family and acquaintances do not share one's fundamental belief systems.
I'm not talking about religion, per se (this time). I have friends who are Christian, Jewish, agnostic, secular humanist, pagan, Wiccan, atheist, etc. They do not ask me to conform to their beliefs, and I do not ask them to agree with mine. But we do all share an affinity for knowledge - whether in terms of education or simply the desire to learn.
It is a passion that is, unfortunately, not shared by many people in our country.
People ask, often, why they should bother learning this thing or that thing. Why it matters whether something is fact or fiction. Why history is important.
This is not to say that I think everyone should learn everything - that's not possible, and we all know it. But there's no reason to actively avoid education. And there is no reason on earth why the majority of people in this country should be unaware that we were not founded on Christian principles. For goodness' sake, people, why are most Europeans better versed in our history than we are? That's just sad.
It's a symptom of what Charles P. Pierce in Idiot America terms "a war on expertise" (8). He says,
The rise of Idiot America today reflects - for profit, mainly, but also, and more cynically, for political advantage and the pursuit of power - the breakdown of the consensus that the pursuit of knowledge is a good. It also represents the ascendancy of the notion that the people we should trust the least are the people who know best what they're talking about. (8)
In other words, we can't trust a scientist to know science, a historian to know history, or a doctor to know medicine. We (speaking here in the "Idiot America" sense) should rather trust, like Sarah Palin, in our instincts to guide us, in our knee-jerk reaction against anything new or unique, in our "common sense" - which, I would like to point out, is usually light-years away from "sense," however "common" it may be - to tell us that what we've always been told is true, despite the factual evidence to the contrary staring us incredulously in the face. We should agree that the snake is evil and that the fruit it proffers us is terrible despite the fact that it will indeed make us as gods.
Because that's what the story in Genesis all boils down to. The idea that knowledge is bad. That it is somehow evil to wish to be the best we can. To know truth from falsehood. The idea that discernment and conscience - that maturity - are corrupting forces that will sully the ignorant infantilism in which we (again, as "Idiot America") would prefer to wallow, our thumbs stuffed in our mouths and a glassy, glazed look in our eyes as we gaze upward, waiting for the beneficence of a giant Santa Claus to pat us on the head and give us presents.
Because if we take a bite out of the apple we might realize that there is no Santa Claus. That we are responsible for our own actions. That with knowledge comes power, that with power comes responsibility, that with responsibility comes maturity, and that with maturity comes wisdom. But if we never take that bite out of the apple, then we remain children, and someone else is able to tell us what to do, where to go, how to live, why we exist, and even who we are. Without knowledge and all that springs from it we are trapped in servitude, not to those with knowledge, but to the bullies who choose force over knowledge and fight to keep us away from knowledge because, ultimately, knowledge - the proverbial pen - is indeed mightier than the sword.
Apple, anyone? I hear they make a tasty pie. And what's more American than that?
Wednesday, April 21, 2010
Brought to you by the letter "A"
In this case, the "A" is not a scarlet fabric representation of marital (or extra-marital) infidelity emblazoned upon my breast so that the world can read my shame and shun me accordingly. However, I am fairly convinced that in some parts of the country (and the world), the "A" to which I here refer would in fact earn me far worse treatment at the hands of the local population. Fortunately for me, however, I live in a liberal American city that permits my special brand of atrocity.
"A," as will come as no surprise to my few regular readers, is for "atheism." Over the last week or so I have been reading John Allen Paulos' irreligion (which, in a side note, has a "0" on the cover, not an "A" or even an "i"), and a few weeks past had my class discussing such hot-button topics as "evolution versus creationism," "science versus religion," and "eugenics."
Some of the results of this have been interesting. Paulos is one of the more rational, reasoned atheist writers out there (he is a mathematician and much less angry than, for instance, Richard Dawkins): he refrains from making disparaging comments about believers and uses logic and probability to make his points. This is not to say that he doesn't season his book with a good deal of snark - there's plenty of that in there - but he tries to be tongue-in-cheek rather than abrasive.
One of the more interesting points he raises, and one that I haven't seen before, is this: "Embedding God in a holy book's detailed narrative and building an entire culture around this narrative seem by themselves to confer a kind of existence on Him" (62). In other words, we'd feel awfully stupid following the deistic tenets of our societies if we didn't believe in a god, because then there would be absolutely no reason for some of our laws, idiosyncrasies, and habitual practices. We justify our belief through the traditions that have grown out of it - like saying that "Mommy and Daddy wouldn't put out milk and cookies if Santa Claus weren't real." The act itself neither proves nor disproves the existence of Santa Claus, just as the presence of religion neither proves nor disproves the existence of god.
And from this also springs the idea that many people believe now because they were not capable, as children, of making the decision not to believe, since they had not yet developed an adult's incredulity. We tend, as a species, not to convert to a wholly new religion in adulthood (it DOES happen, certainly, but it is less common than a perpetuation of childhood belief) because we are creatures of habit. As Paulos continues, "Suspend disbelief for long enough and one can end up believing" (62).
"A," as will come as no surprise to my few regular readers, is for "atheism." Over the last week or so I have been reading John Allen Paulos' irreligion (which, in a side note, has a "0" on the cover, not an "A" or even an "i"), and a few weeks past had my class discussing such hot-button topics as "evolution versus creationism," "science versus religion," and "eugenics."
Some of the results of this have been interesting. Paulos is one of the more rational, reasoned atheist writers out there (he is a mathematician and much less angry than, for instance, Richard Dawkins), since he refrains from disparaging comments about believers and uses logic and probability to make his points. This is not to say that he doesn't season his book with a good deal of snark - there's plenty of that in there - but he tries to be tongue-in-cheek rather than abrasive.
One of the more interesting points he raises that I haven't seen in before is this: "Embedding God in a holy book's detailed narrative and building an entire culture around this narrative seem by themselves to confer a kind of existence on Him" (62). In other words, we'd feel awfully stupid in following the deistic tenets of our societies if we didn't believe in a god because then there is absolutely no reason for some of our laws, idiosyncrasies, and habitual practices. In other words, we justify our belief through the traditions that have grown out of it. Like saying that "Mommy and Daddy wouldn't put out milk and cookies if Santa Claus weren't real." The act itself neither proves nor disproves the existence of Santa Claus, just as the presence of religion neither proves nor disproves the existence of god.
And from this also springs the idea that people now have come to believe because they were not capable - as children - of making the decision not to believe, since they had not yet developed an adult's incredulity. We tend, as a species, not to convert to a wholly new religion in adulthood (it DOES happen, certainly, but it is less common than a perpetuation of childhood belief) because we are creatures of habit. As Paulos continues, "Suspend disbelief for long enough and one can end up believing" (62).
Thursday, March 18, 2010
Making History in Gamespace
What characterizes gamer theory is a playing with the role of the gamer within the game, not by stepping beyond it, into a time or a role beyond the game, but rather by stepping into games that are relatively free of the power of gamespace. The game is just like gamespace, only its transformations of gamer and game have no power beyond the battle in which they meet. In a game, you are free because you choose your necessities. In a game, you can hide out from a gamespace that reneges on its promises. In a game, you can choose which circumstances are to be the necessity against which you will grind down the shape of a self. Even if, in so choosing, you click to opt out of making history. [165]
Again from McKenzie Wark’s Gamer Theory. One of the points of contention I have with Wark’s theories is the idea that the game is barred from influencing or working upon the “real world,” which Wark terms “gamespace.” Here, Wark suggests that the game exists independent of gamespace and - most crucially from my perspective - that gaming removes the gamer from “making history.” The final line of the above quotation seems to give the gamer an option: to participate in the game, or to participate in gamespace and the making of history.
Wark begins his theory in what he calls The Cave, an allegorical arcade that alludes to and mimics the Platonic Cave: a place that is distinct from the gamespace of the world, removed from it, unaffected by it, and unable to affect it. And it is in this premise, I think, that Wark goes wrong.
The gamer does not “opt out of making history.” The game and gamer are not in a Cave, cut off from the rest of the social machine. The game – like the works of literature and film to which Wark compares gaming – is a part of the intellectual and social milieu that is shaped by and shapes our ideological understanding of the world around us.
Games may be new media, but they are a vital part of our intellectual and ideological communication with and reaction to the gamespace of the world around us. They deserve not to be undervalued as mindless or shunted into a Cave frequented only by the basement-dwelling. Games are – as they have always been, even when analog rather than digital – a fundamental part of our lives. Games teach us socialization, competition, sportsmanship, and even encourage us to participate in and/or rebel against the socio-political gamespace that builds and reinforces the dominant ideologies of our culture.
Wednesday, March 10, 2010
Warning: Gamer at Play
My current in-process read is Gamer Theory by McKenzie Wark, which, while it certainly has its flaws in interpretation, raises some very interesting questions about games, gamespace, and gamers. For instance,
Stories no longer opiate us with imaginary reconciliations of real problems. The story just recounts the steps by which someone beat someone else - a real victory for imaginary stakes. [007]
My issue with this is that stories have always been the recounting of "the steps by which someone beat someone else" - Wark makes it sound as though this is a recent development in storytelling, one that has somehow evolved through the degradation of our society's culture. But that isn't the case. Stories are always about someone else through whom we are meant to vicariously experience the events of the story. It happens that gaming - in the RPG and video game sense - permits a deeper level of this by having the gamer (rather than the audience) actively participate in the action of the story. The story, however, is still scripted and still controlled, even if it has alternate endings. It is still about "the steps by which someone beat someone else."
Wark continues, suggesting that the idea of "game" has come to permeate not only our narratives, but our existence:
The game has not just colonized reality, it is also the sole remaining ideal. Gamespace proclaims its legitimacy through victory over all rivals. The reigning ideology imagines the world as a level playing field, upon which all folks are equal before God, the great game designer. [008]
In this sense, gamespace is "real" space, and the concept of life as a game plays out (pardon the pun) all around us:
Work becomes gamespace, but no games are freely chosen anymore. Not least for children, who if they are to be the winsome offspring of win-all parents, find themselves drafted into evening shifts of team sport. The purpose of which is to build character. Which character? The character of the good sport. Character for what? For the workplace, with its team camaraderie and peer-enforced discipline. For others, work is still just dull, repetitive work, but they dream of escaping into the commerce of play - making it into the major leagues, or competing for record deals as a diva or a playa in the rap game. And for still others, there is only the game of survival... Play becomes everything to which it was once opposed. It is work, it is serious; it is morality, it is necessity. [011]
On the one hand, Wark captures the highly competitive understanding of the market that we see in our capitalist world. On the other hand, he seems to undervalue play for the sake of play. Yes, we have evolved into a highly competitive society that seems to understand its surroundings in terms of competition and payoff, but to say that there is no "play" anymore is to severely diminish the satisfaction that one receives from non-required competition - from a game that isn't the "gamespace of reality."
In that gamespace, Wark notes, "The only thing worse than being defeated is being undefeated. For then there is nothing against which to secure the worth of the gamer other than to find another game" [038]. In this paradigm, we are limited to the set established by gamespace, to the way in which our worth is constructed within the artificiality of the game itself, and of the god-designer. We cannot function as autonomous, individuated beings without the relational marker of the gamerscore or rank, but at the same time, we cannot be autonomous at all within the construct of the gamespace to which we (willingly?) subscribe our identities. We are powerless to escape gamespace and have sacrificed ourselves to it as mindless automatons incapable of participating in the allegory (or, in Wark's terms, "allegorithm") of the game itself.
Wark paraphrases Guy Debord's The Society of the Spectacle and disagrees with the assessment that gaming in fact increases the participatory - and didactic - element of storytelling:
Key to Debord's understanding of "spectacle" is the concept of separation. Some argue that the "interactive" quality of contemporary media can, or at least might, rescue it from separation and its audience from passivity. One could with more justice see it the other way around: whatever has replaced the spectacle impoverishes it still further, by requiring of its hapless servants not only that they watch it at their leisure but that they spend their leisure actually producing it. Play becomes work. [Note to 111]
The question here is how Wark defines work. In the literary world, we say that a novel, play, or poem does "work" when it interacts with and comments upon the society that has shaped it (or in which it is produced). In that sense, yes, the game does "work" through "play" (a concept with which a performance theorist is intimately familiar). The gamer in fact participates in this work by allowing the game to work through him or her, much as an audience at a theater or a reader of a novel participates in the "work" of the performance or book. However, Wark's claim that participation "impoverishes" the spectacle and mission of the game is as ridiculous as stating that the performance of a play "impoverishes" the spectacle of the text.
There is no spectacle without a certain level of interaction. There is no spectacle in the theater without the production that creates that spectacle. Likewise, there is no spectacle in a game without the full use of not only its audio track and visual graphics, but also the complex mechanics of the game design itself. The game itself - the rules, the "algorithm" (according to Wark) - is a form of spectacle upon which the designers rely. It is something new, this "interactive spectacle" that requires the active participation (rather than passive observation) of its spectators in order to operate fully, but it is nevertheless a form of spectacle.
But where I really disagree with Wark is with the suggestion that a game is somehow restricted, partitioned off from this "gamespace of reality." Not only has Wark attempted to rob games of their unique form of spectacle, but he asserts that
the utopian book or the atopian game lacks the power to transform the world. But where signs and images may bleed off the utopian page into the world, the algorithm of the game, in which each relation depends on one another, may not. At least not yet. [122]
I respectfully disagree. A game is as capable of "bleeding off" the console or computer screen and "into the world" as any novel - perhaps more so, by pure virtue of its interactivity. This is not to say that games are an impetus to violence (Wark makes this point as well, and I agree with him that it is utterly ludicrous to suggest that games cause people to become more violent), but that the choices present in many games - such as Bioshock, Bioshock 2, Mass Effect 2, and so forth - directly involve the gamer in making a quantified moral choice (or series of choices) that impacts the outcome of the game in unforeseen ways. These are choices that, while not analogous to everyday life, hyperbolically reflect some of the types of choices a gamer in "gamespace" may have to make. The point is that the kind of seepage Wark attributes to novels is at least equally present in games, especially games that foreground the kinds of dystopian/atopian ideologies that provide an analog to the (deliberately) impossible utopian visions of More and others. Indeed, some games - Bioshock and Mass Effect 2 being prominent examples - interact with the very textual utopian visions Wark claims they cannot match: Bioshock takes on Atlas Shrugged, and Mass Effect 2 engages Shakespeare's The Tempest. These are not "texts" in the traditional sense, no, but they participate in and actively speak to the textual tradition that Wark (and certainly many others) finds somehow more valuable than the games that seek to respond to it.
Certainly, I am not advocating abandoning literature for the study of gaming, but neither can I say that gaming does not belong in the field. Certainly Wark recognizes - and does excellent work on - the need to study digital media in its own right (and he has some quite interesting readings of Katamari Damacy and The Sims, among others, in this book), but it seems to me that we cannot separate these "early" games from the literature that has produced them, any more than we could suggest that Shakespeare's plays did not owe a profound debt to the centuries of poetry and medieval drama that came before him. Our literature is our intellectual past, present, and future, but it would be foolish to discount the importance of play - of games and gaming - as we trace that history from Shakespeare's stage to the Xbox screen.
Monday, February 22, 2010
How to Level Up in Class
So my husband sent me this link today from work. He thought I would be particularly interested in the portion that discusses one teacher who works his classroom on an XP (eXperience Points) system and how the students respond so well to that system.
My response: "But... that's just how grading works." And it's true. For those of us who use a numerical system to do our grading, we're - in essence - giving our students XP. Class is a game.
Here's the gist:
- the student earns XP for every assignment, class, comment, etc.
- the student "levels-up" throughout the semester from zero (F) to whatever grade they earn at the end of the course
- as they pass each "level," they can choose to stop earning XP, or to keep going (the only possible downside to this is that many classes cannot be passed until the final assignment, so a teacher may need to start with level "Z" and work up to show progress)
- the class is designed to outfit the student for the next game chapter: a 300-level course instead of a 100-level, the second semester in a sequence, or even that most terrifying of boss battles, "real life"
- students with the most XP are the best equipped to handle future game chapters - they have the weapons of knowledge, grammar, communication skills, organization, practical skills, etc.
- when students are being chosen for teams (jobs, grad schools, law schools), they are evaluated as desirable if they have more XP - and therefore better weapons and skill-sets - than other players
And the game doesn't end with the first job. Every aspect of our lives revolves around XP - you get hired based on your experience with a particular type of problem; you have training missions designed to give you a tiny bit of XP so that you can handle the next level; once you've done the lower-level jobs, you have enough XP to move up in the job-levels of a business. You can even choose to do something games don't let you do (yet, though Xbox's Gamerscore is coming close), and take that XP to another game!
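Just for fun, here's a minimal sketch of that leveling logic in Python. The thresholds, assignment names, and point values are all hypothetical - mine, not the linked article's or any real syllabus - but they show how an ordinary running point total already behaves like an XP bar:

```python
# A toy XP-style gradebook. The level thresholds and point values
# below are invented for illustration, not taken from any real course.

LEVELS = [(900, "A"), (800, "B"), (700, "C"), (600, "D"), (0, "F")]

def level_for(xp):
    """Return the highest letter-grade 'level' unlocked by this much XP."""
    for threshold, letter in LEVELS:
        if xp >= threshold:
            return letter
    return "F"

# Every assignment, class, and comment earns XP...
semester = [("essay 1", 150), ("attendance", 100), ("discussion", 100),
            ("essay 2", 200), ("midterm", 150), ("final", 300)]

xp = 0
for name, points in semester:
    xp += points  # ...and the student levels up as the total grows.
    print(f"{name:>10}: {xp:4d} XP -> level {level_for(xp)}")
```

Notice that, just as in the list above, the student can't reach a passing level until late in the semester - the final really is the boss battle.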
The key thing is - we don't think about our "lives" that way.
In the twenty-first century, the world is no longer a stage - it's a console, and all the men and women are merely players.
My response: "But... that's just how grading works." And it's true. For those of us who use a numerical system to do our grading, we're - in essence - giving our students XP. Class is a game.
Here's the gist:
- the student earns XP for every assignment, class, comment, etc.
- the student "levels-up" throughout the semester from zero (F) to whatever grade they earn at the end of the course
- as they pass each "level," they can choose to stop earning XP, or to keep going (the only possible downside to this is that many classes cannot be passed until the final assignment, so a teacher may need to start with level "Z" and work up to show progress)
- the class is designed to outfit the student for the next game chapter: a 300-level course instead of a 100-level, the second semester in a sequence, or even that most terrifying of boss battles, "real life"
- students with the most XP are the best equipped to handle future game chapters - they have the weapons of knowledge, grammar, communication skills, organization, practical skills, etc.
- when students are being chosen for teams (jobs, grad schools, law schools), they are evaluated as desirable if they have more XP - and therefore better weapons and skill-sets - than other players
And the game doesn't end with the first job. Every aspect of our lives revolves around XP - you get hired based on your experience with a particular type of problem; you have training missions designed to give you a tiny bit of XP so that you can handle the next level; once you've done the lower-level jobs, you have enough XP to move up in the job-levels of a business. You can even choose to do something games don't let you (yet, though XBOX's gamer-points are coming close), and take that XP to another game!
The key thing is - we don't think about our "lives" that way.
In the twenty-first century, the world is no longer a stage - it's a console, and all the men and women are merely players.
Friday, February 19, 2010
Spines and Bindings
So this week's theme - aside from grading - has been books. Not a surprising thing, given my choice of profession, really. But I've been reading The Book on the Bookshelf by Henry Petroski, and we recently began a lengthy process of replacing all of our mismatched shelves with matching ones.
As a bibliophile, I like owning books. I like owning pretty books, old books, and books that I enjoyed reading. To this day, my favorite Christmas present is the facsimile copy of Shakespeare's First Folio I was given at the age of 12. I adore that book (and it's damn useful in my line of study).
But I'm not the kind of book person who can't stand the thought of marring my books. I take care of my old books, yes, but my paperbacks... I use my books. I write in them. I dog-ear the pages. I highlight, underline, scribble, and circle. I break their spines. I use tape to hold the covers together when they start to tear and fall off. As far as I'm concerned, a pristine book is like a new stuffed animal - pretty, but clearly unloved.
Hypocritically, of course, I hate reading books other people have marked. Not because I'm appalled at the fact that they "defaced" a book, but because the underlining and words are not mine. I'm a selfish book-scribbler. I want the only words in the book (besides the author's, of course) to be mine.
In the heady days of the Kindle and the Nook and the iPad - to say nothing of the yet-to-be-released Overbook - some people say that books will become passé. That paper and ink will be replaced - as papyrus and vellum were - with a screen. I think that for most pleasure reading, devices like this will become common.
But for those of us in the business of books - for students, teachers, professors - paper copies will continue to be needed. We need texts that cannot be accidentally deleted or erased by a bug. We need our notes to survive. We need to remember the layout of the page, to be able to flip to a passage marked with a dog-ear or flag, to know what we thought when we read it the first or third or tenth time over.
And some of us will always crave the feel of a book in our hands. There is something comforting, something visceral about a book that no Kindle or Nook will ever match. Not to say that I won't buy one someday myself, but somehow a small e-reader just isn't the same as a paperback. The thickness that tells us how much we have left to explore. The rough softness of paper pulp in our fingers. Even the black dust of ink-stain on our fingertips. Tangible words that don't disappear into black or white when we hit a button, but stay, quietly waiting, for our eyes to release them again.
Thursday, January 07, 2010
God is a Secular Humanist?
My new book for today is John W. Loftus' Why I Became an Atheist, which details not only Loftus' conversion to and subsequent departure from Christianity, but also his arguments against it. I'm not terribly far into the book just yet, but I came across a rather interesting argument that I haven't seen in the many similar books I've read.
It concerns the notion of a moral compass and addresses the argument that atheists are amoral at best, immoral at worst, because of their lack of belief in a higher being. Usually, authors point out that this isn't the case in real life, which Loftus also does, but then he takes it further. He argues that if something is "good" merely because God says so, then it isn't really objectively good; it's just an order.
This makes the whole concept of the goodness of God meaningless. If we think that the commands of God are good merely because he commands them, then his commands are, well, just his commands. We cannot call them good, for to call them good we'd have to have a standard above them to declare that they are indeed good commands. But on this theory they are just God's commands. God doesn't command us to do good things; he just commands us to do things.
...
If we say, on the other hand, that God commands what is right because it is right, then there is a higher standard of morality that is being appealed to, and if this is so, then there is a standard above God which is independent of him that makes his commands good. Rather than declaring what is good, now God recognizes what is good and commands us to do likewise. But where did this standard come from that is purportedly higher than God? If it exists, this moral standard is the real God. (39)
And if God follows a higher moral standard - presumably, since that is the standard Christians claim to follow, one in the best general interests of humanity - then God is a humanist. It makes sense. Jesus' teachings are generally humanist. The teachings of our "great" preachers - Martin Luther King, Jr., Mother Teresa, etc. - are humanist. So, God is humanist.
Which means, what, exactly?
Well, it either means that God is rather self-destructive, since humans are notorious for destroying things that might control them, or it means that God is human. And if God is human (not literally, but in concept), then it means that God must be, as we are, mortal. So either we invented a deity that mimics us in appearance and morality, or Nietzsche is right and God is dead.
Either way, humanity - and humanism - is the closest thing we really have to a god. A thought both terrifying and inspiring. Because if we are god, then we'd better shape the hell up and start acting like it.
Monday, January 04, 2010
New Year, Old Directions
A few days ago, I received an email from a very distant relative who is attempting to put together his family genealogy, into which I apparently fall. After sending him the requested information, I became curious. I knew that I had some relatives from Germany, and had heard rumors of someone from Ireland, but I didn't really know that much about them (other than one particular couple who had a suicide pact to hang themselves in their barn, which they did).
Since my husband was aware of a rather lengthy and fascinating history for his family that involved three French brothers, some Mohawk, and someone getting shot by the Iroquois, I started doing some digging on his side - both his last name and the story are unique enough to make for some decent results. I did manage to get all the way back to France on his father's side, through the French brothers (who were actually Québécois, and one of whom WAS shot by an Iroquois), to about 1630.
Then I attacked mine, using the information from the distant relative and a little more dredged up by my obliging mother. I made it to Ireland fairly quickly on my father's side, but also to Prussia. To Prussia AGAIN on my mother's side, to York in 1603, and then discovered that I have one string of relatives that is VERY old blood New England. From Milford, Connecticut (and one straggler from Boston, I kid you not). In 1605. I didn't know there even WAS a Connecticut in 1605. So I guess (as Jenno remarked) I was destined for New England. My family roots are here, after all.
But all this makes me think about why we, as a species, are so interested in our histories. Do I become at all different now that I know where my ancestors came from? Of course not. Am I fascinated by coincidences and connections (like the fact that my progenitors left England during the reign of James I - and were alive there at the same time as Shakespeare)? Of course I am. But fundamentally, none of this information makes me a different person.
What is perhaps more interesting (as K pointed out) is that we as a species seem to oscillate between a desire to trace our past and a desire to expunge it. Germans who had relatives in the Nazi army, for instance, do not talk about that fact and wish not to remember it. Often, when our own ancestors immigrated to the United States, they changed their names (as in K's case) or eliminated connections with their past, choosing not to record names, dates, and facts about themselves or their parents. Which, naturally, is why it is so difficult for us to dig it all up now.
But, ultimately, I think our desire to trace our past is a desire to know more about the clan that formed us, to understand why we were raised with the religion, the ideologies, the nation that we were, and to make informed decisions about where we want the road of life to take us. If we know where our forebears have been, we can decide to revisit their journeys in expectation of our own or to avoid them entirely. We can use our past as a lens through which to contextualize our present.
But it is, I think, also a way of wondering and researching what it would have been like if we had lived in that past - if we, our personalities and minds, had been born in another age, another nation, another culture. And we can do this, imaginatively speaking, through our ancestors.
It is, however, important to remember, as we embark upon this time-travel, that the future is always more important than our past, because it is into the future that we are really traveling, and while we may carry the past and present with us, we should always remember to keep our eyes forward - lest we miss seeing that rock and stumble or fall.