Evening, folks. My name is Halfwit Hero, and I’m a gamer.

That’s perhaps the most asinine confession anyone has ever made here on TAY, but it’s a confession nonetheless, and I hope it will serve as a suitable way to introduce myself to this community, which I’ve vaguely lurked for some years now.


In all honesty, I strongly believe I may be addicted to gaming, and have been since my parents first introduced the slightly maladjusted primary-school-me to the wonders of 16-bit graphics via a Sega Mega Drive back in 1993. I’m fully aware that the time I’ve spent touring the Emerald Hill Zone, or Renaissance Italy, or Velen, or Thedas, or Inaba (oh God, especially Inaba) could perhaps have been put to better use in more productive pursuits. However, despite my self-awareness, I know I won’t stop picking up the controller during downtime to guiltily sink a couple more hours into my latest slice of escapism.

But should I really feel guilty for it?


The answer, as I’m sure many of you will assert, is “no”. I know it is. But I still can’t shake that vague sense of unease that comes with powering up my PS4 or Wii U. Yesterday, Calum Marsh, Arts and Culture reporter for Canada’s National Post, published one of his regular Important Questions columns, titled “Has Any Videogame Ever Qualified as Art?”. It’s a short article, which spends most of its word-count (quite rightly) applauding Silent Hill 2 for dispensing with some of the more common trappings of the video-game, but Marsh also managed to perfectly summarise my own remorsefully held perspective on gaming: “a faint halo of shame still clings to the pastime”.


But why?

For more years than I care to remember, I’ve quietly held the view that games - or at least the vast majority of them - more than qualify as a legitimate artform. I enjoyed watching Mark Rylance manoeuvre his way around the Tudor court in Wolf Hall just as much as witnessing the Hero of Ferelden manipulate the nobles of the Landsmeet against Loghain in Dragon Age: Origins. Both featured tightly-plotted, absorbing political drama, filled with well-motivated characters and believable twists, but Hilary Mantel won the Booker Prize for Wolf Hall, whereas David Gaider et al had to make do with less prestigious accolades. The script for 2007's Juno was considered witty enough to bag the film an Academy Award for Best Original Screenplay. Life is Strange told a similar, female-centric coming-of-age tale, realised in just as affecting a manner as its silver-screen counterpart. The pair are even plagued by the same problem: both are bogged down with the kind of contrived youth-speak dialogue that lends them the unfortunate shelf-life of an already-soggy lettuce.

But these are somewhat subjective, surface-level comparisons, ones that the late, great film critic Roger Ebert - the Great Enemy of gaming, who is invariably name-checked whenever the “games as art” debate resurfaces - would inevitably dismiss as moot points.

More than six years have passed since Ebert famously claimed “Video Games Can Never Be Art”, provoking the ire of gamers worldwide. The essay that followed that statement primarily served as a counter-argument to thatgamecompany co-founder Kellee Santiago’s lively 2009 TED lecture “An Argument for Game Artistry”. While Ebert’s response maintains a gentlemanly respect for Santiago’s enthusiastic approach to the subject, his tone remains condescending, and based on his step-by-step rebuttals, he unfortunately seems to miss the point. For anyone who has yet to encounter the famous journal entry, it can be found here, with YouTube footage of Santiago’s lecture embedded too.


While Ebert’s essay has its flaws, it still highlights a very legitimate problem video-games face in being accepted as a genuine, worthy artform. That is, what we know as gaming remains constricted by the language used to describe it. Games are understood as competitive pastimes, which by their very nature have “winners”, “losers” and perhaps even “runners-up”. Naturally, this is still true of a significant number of video-games, but the burgeoning market of A-list properties - not to mention the exploding indie scene - is dominated by single-player, narrative-focused experiences, in which high-scores, trophies and achievements are an ancillary bonus, rather like a pop-quiz hidden among the special features that accompany a movie’s DVD release. In the past, watching Sonic spiral down from orbit to land on Tails’ biplane for a triumphant fly-by was considered a reward for mastering the game’s systems, and perhaps posed a genuine challenge to players, but in 2017, games are a gentler affair. With the obvious exception of the likes of the Dark Souls series, the vast majority of modern titles are expected to yield their end credits without giving gamers too much trouble.

It is here that Ebert’s understanding of the medium (circa 2010) failed. By the time he’d made his argument, titles such as Braid, Flower, Shadow of the Colossus and countless others had already made an indelible mark on the industry and the format in general, paving the way for increasingly ambitious releases such as Dear Esther, The Stanley Parable, Firewatch, Everybody’s Gone to the Rapture and many more. What Ebert understood to be the defining quality of a video-game - the capacity to “win” it - had already become an obsolete notion. Flower itself is cited in Santiago’s argument for game artistry, in which she identifies its focus on “the balance of elements of the urban and the natural”. Sadly, Ebert’s response was to criticise Flower’s effectiveness as a game by asking the unanswered (and indeed unanswerable) question of whether the player “wins” by striking the “right” balance.


I could go on deconstructing Ebert’s argument, but it feels a little morbid considering how old the essay is now, how dead the author is, and how robustly other commentators must have ripped it apart in the past. However, in my experience, Ebert’s view of games as things that are “played” or “won” is still highly common beyond the dedicated audience. My own wife finds it difficult to understand that the experience of working through a finely crafted narrative or digital environment can be as rewarding as reading a particularly evocative piece of poetry. It’s this perception bias, coupled with the language we still habitually use to describe the “gaming” experience, that ends up denigrating the medium.

If we’re ever going to challenge the accepted view, which seems to - at best - tolerate game designers’ assertion that they’re artists, will we need to completely overhaul the vocabulary we use to describe games? Should we habitually refer to them as “interactive narratives” or “storytelling experiences”? I’d prefer not to. That way lies comical, cartoonish pretension. Perhaps all it will really take to gain this medium the recognition it so desperately deserves is time. Jim Davies, associate professor at Carleton University’s Institute of Cognitive Science, addresses this indirectly in his article “Video Games Do Guilt Better Than Any Other Artform”. He compares the progress of acceptance for video-games to the journey motion pictures made from the late 1890s through to the beginning of the 1920s: “If it took 30 years for [motion picture technology] to become acceptable as art, then perhaps video games are more or less right on schedule”.

Perhaps until that allotted adjustment period has passed, we early adopters and connoisseurs will just have to bide our time and wait for the rest of the world to catch up...