Could we see another video game crash like the Great One of 1983? With dozens of successful game franchises established and a hobby pulling in billions of dollars a year, it seems unlikely. Certainly, the factors that led to the original crash–perceptions of poor software quality, overcrowding of the market and the like–will not be a problem. And, as before, any such crash would likely be only a temporary setback, not something that permanently destroys the industry. But the threat, the possibility, of at least a mild economic meltdown within the video game business exists. And unlike a business akin to video rentals, where the basic model is slowly being rendered obsolete, the threat comes from within video games themselves.
Games are becoming too big. I don’t mean too big in the sense of literally containing too much content; such a thing probably isn’t possible, and titles as far back as Ultima I have offered dozens of hours of gameplay instead of the less-than-an-hour arcade style prominent in so many games of the day. I mean that the amount of resources required to make a game has gradually become immense, with credits that have, on occasion, made those of the average summer blockbuster movie look quaint by comparison. If your game is set in a 3D realm and doesn’t support an advanced ragdoll physics engine and Gouraud-shaded mip-mapping (or whatever the technical equivalent is), then you’re seen as quite primitive. You might need an orchestra to record your soundtrack. A research team might be responsible for evaluating the attributes of every item in your game as they relate to real life, or for cataloging real-life locations for the sake of your senior art director and his team.
Don’t get me wrong: I’m not opposed to games with great aesthetics. Indeed, it wasn’t until the original Soul Calibur blew my mind in 1999 that I had much interest in 3D as a gaming medium (I still prefer sprites). The seemingly inevitable trend, however, is for commercial games to require more and more staff, which in turn demands more and more revenue to make those games worthwhile. Stories like that of developer Seven45 following the release of Power Gig accentuate this–a classic tale of a game that, judged on its own merits (rather than against others in its genre), or released a few years earlier, would have flourished, but that now simply didn’t make the cut; failing to recover its considerable development costs, the developing house met the worst fate imaginable.
All this is one reason that more of today’s highly successful games are sequels rather than franchise-establishing titles. Of the top 10 selling games of 2010 in Japan, arguably only one (Wii Party, which could be considered an extension of the Mario Party line) was a franchise first instead of a sequel. Overall video game sales aren’t much different: of the top 20 selling games of 2010 across all platforms, only three (Wii Sports, Just Dance and Kinect Adventures) were not sequels. Development costs demand games that will guarantee profits; sequels to established high-selling games fulfill that. I’m not the first person to realize that sequels are a growing problem.
Can we blame companies for making what they know will sell? As consumers, are we responsible for feeding the issue by primarily buying sequels? I’m not entirely innocent. I just finished spending five days playing through Dead Rising 2. Prior to that, I spent a month delving back into Fallout 3, and I spent plenty of time on Bioshock 2 and Red Dead Redemption as well. The games I’ve probably put more time into than any others are Rock Band 2 and 3. I also have Mass Effect 2, Assassin’s Creed 2 and Brotherhood, Final Fantasy XIII and Halo Wars, among others, waiting to be played. I played original properties when they interested me–Blur and Borderlands, for example.
But the industry is changing. Slowly, very slowly, a schism is forming between larger, established development houses and their publishers on one side, and the indie game movement on the other. Created by smaller houses, sometimes just a single person, these games harken back to the days when some of my childhood favorites like Archon, Space Taxi and Krakout were created by three or fewer people, and even games acknowledged as seminal classics like Metroid, the Legend of Zelda and Metal Gear (WARNING: all three links go to the game endings so as to show the credits) were put together by a dozen people or fewer–an insane concept for a big-box game in today’s age. Indie games have the benefits of low cost and easy distribution, which could, given time, turn the tables on larger companies offering games of ever-increasing cost (I already estimated here that games for the next generation of systems will run at least $70). But if the balance shifts to a digital distribution system with no sort of quality check in place, what do we face except potentially a repeat of what we saw in 1982-1983, when rampant overpopulation of software without proper quality oversight led to gems such as this (which I owned and played) and, yes, even this (which I thankfully neither owned nor played)? Already, we can see some instances where, left to their own devices, indie game developers demonstrate the need for some editorial restraint. And those are just examples of games whose subject matter is questionable; there are likely plenty of cases where the programming quality of the games themselves is substandard. If we move to a framework of mass-released items of potentially poor quality, how different from 1983 is that?
Of course, this could all just be useless conjecture. But if digital distribution eventually becomes the new mainstream method, as many have proposed, a responsibility will remain to ensure that just because people can easily release something to a large audience doesn’t mean they should. Unlike in 1983, video gaming now has a 30-plus-year history that has helped establish it as an art form as well as entertainment. Let’s not set ourselves back another three decades for the sake of base indulgences.