Life, The Universe, And Gaming: When Did Glitches Become Acceptable In Our Games?
I used to like glitches, a long time ago. In a galaxy far, far away. I would be playing a game, read about some crazy glitch in it and then attempt to recreate that glitch for myself. It was never about actually seeing the glitch; I could YouTube it if that was all I wanted. I just wanted to be able to say I broke a game by doing something or other.
Nowadays, I don’t even have to try.
When did this become a thing?
By now you’ve already read more than enough about Assassin’s Creed III and all of its performance issues, whether it’s a shoddy frame rate, broken animations or other, far worse things, such as a mission that refuses to register as complete, or another that breaks some way in, forcing either a reload or a more drastic action which hopefully does not involve starting an entirely new game.
I personally never experienced any of this, I must say. The worst it got for me was jumping into a non-climbable tree because lol, only to have Connor pause in mid-air for a few seconds in his falling motion, then land in front of the tree and die, because the game counted that air time as falling distance. Also, there was the one time I mounted a horse — not like that — and it refused to move from where it stood, glitched out of existence for a second, then suddenly resumed life as an equestrian mammal somewhere on the Frontier.
Still, regardless of my own experiences, I would be blind and ignorant to disregard the complaints of thousands of gamers, furious at times over having lost entire saves to one glitch or another, and that was when the game was actually running smoothly for them. Apparently, sometimes it was just completely unplayable. How did we get to the point where the flagship franchise of a high-profile publisher is this broken?
Let’s move on to another flagship franchise from a high-profile publisher and talk about Call of Duty: Black Ops II. While I was playing it for my review last week, there was a point in the game where the screen suddenly flickered to a blinding white, remaining that way for the rest of a cutscene before returning to normal. Granted, this isn’t as bad as some of the glitches in Treyarch’s previous games, but why on Earth is it even a thing here?
Finally, while I haven’t played Hitman: Absolution myself, I’ve been told that sometimes, upon entering a new area, Agent 47’s character model spends a few seconds in its arms-outstretched design pose before he reverts to a more natural position. Somewhat more alarmingly, Rudolf’s game progress has been wiped out entirely, twice, at the exact same point over five hours in, after the game froze while saving a checkpoint.
I can understand it when it’s a game like Skyrim, where so much of the game might go entirely untouched by a player and there’s simply too much to cover everything in play-testing — although when you mess up the main quest (Esbern, anyone?), you’re asking for trouble, Bethesda — but not a single-player campaign that lasts under ten hours. Definitely not in a flagship franchise. At least Bethesda can lean on the open-world excuse. You might also make a case for Assassin’s Creed III, which, considering how many Ubisoft studios worked on it, is something of a mongrel game. But what does Treyarch have to lean on? What of IO Interactive? And what of the countless others I’ve not mentioned?
Glitches seem almost routine these days, and they’re not the fun sort from yesteryear, which invert the world’s colours or drop you into an empty box with a single pixel for a floor — one such glitch is actually an Easter Egg in Borderlands 2. Now we have glitches which break your game, delete your saves and sometimes even render your console unusable. That last one applies to a design issue in L.A. Noire, which causes the PlayStation 3 to overheat if a specific update is not installed.
I’m not sure what to blame here. Is it developer laziness? Bad play-testing, because publishers can do as they please as long as you’re willing to pay them for it? Or is it the beginning of gaming over-saturation, where quality gives way to quantity?
The last of these certainly makes for a compelling argument. Developers are often pushed for time, rush their games as a result and don’t afford them enough play-testing, so glitches pop up all over the place, forcing day-one patches and the like. You might also argue that publishers know you’re going to buy their games anyway, so they skimp on play-testing to save money and simply have developers release day-one patches to fix whatever issues crop up. That would also explain such things as on-disc DLC and the myriad Mario releases: as long as you keep paying a publisher for a game with such things, they will believe they can sell you more of it.
The only way I can think of to deal with this is to hit publishers where it hurts and speak with your wallets. I know this won’t even make a dent in your minds, and you will go on buying broken game after broken game, then complaining about it afterwards, but sometimes it’s just nice to know we all agree that there is a problem. Complaining also helps, I will grant; it’s the one time when it’s acceptable for a gamer to bitch and moan. You ought not to expect a game that changes your life, but you bloody well ought to expect a game that functions as it should.
So yeah. Can we maybe make more of a fuss about glitches in games before they become too commonplace? I for one would really rather not celebrate game-breaking glitches that force me to redo countless hours of gaming, or worse. By all means, tell me your own glitch stories in the comments. It would certainly make for a refreshing change of pace from the usual commenters who challenge my opinions.