If you aren't familiar with Extra Credits, I can't recommend it highly enough; it's hosted on the Penny Arcade site and features weekly animated shorts discussing some aspect of games from a thoughtful, academic perspective. Their critiques of games, whether artistic or social, are more numerous and more insightful than anything similar I've seen. Their most recent episode addressed the difficulty of maintaining access to games after the technologies they run on have gone away. I thought this tied in nicely with my post from earlier in the week about how technology guides our games.
The show addressed the question of what we stand to gain if we have access to older games, and what we lose, in contrast to other media, when an old game disappears. That raises a further question, though: what does the heritage of gaming look like now?
If you're anything like me, when you think about how games used to be, you probably think about what games were like when you first started playing. I remember the Super Nintendo of my youth, the N64, and so on. If I encountered games older than those, it was usually because I was seeking out an earlier entry in a franchise I loved - hunting down Metroid after I beat Super Metroid, trying to crawl through the original Zelda, and so on. I don't think this experience is unique to me. People sometimes inherit systems and games from friends and family, but except in rare cases, those come from people their own age or a little older, so they're likely not reaching that far back. If most gamers haven't really inherited much themselves - if they've only picked up what was new and observed what happened in their own lifetimes or just before - then the task of becoming caretakers of the medium seems more challenging. When you yourself have received a particular kind of cultural knowledge, you know better how to pass it down; without that experience, it's harder.
One of the biggest considerations, though, is that there isn't yet a strong structure for passing down video games. You can find a college offering a degree in English literature almost anywhere; art or art history degrees, and cinema/filmography/film history degrees, aren't much harder to find. It's certainly not impossible to find courses in video game development, but video game history? Almost impossible. Imagine taking a class on first-person shooters, on shifting views of race and social structures in the GTA series, on femininity in games. The idea would seem absurd to many (most?) people who don't play games, and probably to a high percentage of those who do. And that right there is a serious consideration. If games aren't treated as art, as vessels of important cultural information worthy of study, they will not be preserved. The same goes for the craft as well as the art: if we don't appreciate design choices in older games that worked within more limited technology, we're going to miss some important lessons. You may scoff at Super Metroid's outdated graphics, but you can learn a lot by watching how the controls work, or how the camera follows you, that you'd otherwise have to figure out on your own. Seeing good - and bad - choices made by earlier games, and seeing how those games evolved as new technologies appeared, can help us make choices that will help our games survive longer. The more we as gamers think about games in this way, the more likely others are to find that kind of value in them. When there's a perceived need to preserve these games, the ability to do so will follow.
If you're not the kind of person who has much patience for literature as an academic pursuit, this argument may not hold water for you. I've met a number of people who think that dedicating any time or energy to the study of films or novels or video games is a waste, because these aren't practical skills. From a vocational perspective, that might be reasonable. But we have to remember that as historical objects, works of fiction contain densely packed information about the society that produced them. These objects are worthy of study because of what they tell us about ourselves; if we let that information vanish after a few hardware life cycles, the medium is going to lose a lot of continuity, and our games will suffer for it.
Besides - I don't want to wake up one day, any day, and realize I won't ever be able to play the original Katamari Damacy again. I'm sure I'm not the only one who feels this way.
I was in Rochester last month and visited the International Center for the History of Electronic Games at the Strong Museum of Play, which is also considering these issues. I'd recommend a visit if you're in the area sometime.
You may have heard of this, but MOMA in New York is actually set to open a new exhibit about videogames. It sounds like they're just trying to set up a representative sample of important titles across the few-decades life of the medium, so it will be anything but comprehensive. But still: this is a big deal. (Just off the top of my head: I read they'll be "showing" Pac Man, Tetris, Myst, Sim City, and The Sims, just to name a few.)