Tody’s Take: Have Gamers’ Standards Lowered, Or Are We More Casual?
There we go, strike three of my headline controversy. That rhymed, and it was unintentional. So, this time around I’m not focusing on any particular game, like I did with my Call of Duty and Assassin’s Creed rants. You see, many a time in my humble life (I am the most humble Tody on the planet), I have something to rant or rage about. But translating that into the written word rather than just ranting about it is often quite a task, and one I can be too lazy to do. But I love a good debate and engaging in good conversation, so here I am with a strong topic that I’ll be tackling in two in-depth parts. Don’t worry, you won’t have to wait for some kind of sequel to this column. It’s all in here. No really.
What exactly will these two parts be about? Well, the first will delve into what I call “the sin of sequels” in games today, and this will serve as a build-up to actually answering the headline question. However, if strong opinions offend you, then please feel welcome to read ahead. I really won’t mind. Rage at me, agree with me, love me, vow to kill me, it’s all the same thing to me really. Discussion. Debate. Potato. So, who can tell me: what exactly is this so-called, weirdly named sin of sequels that I’m referring to?
Of course you don’t. I haven’t told you yet!
The Sin Of Sequels
In today’s world, game developers make a crucial mistake with sequels, and what saddens me is that many gamers, critics included, just don’t seem to mind it anymore. Perhaps it’s not really a mistake, but simply the safer and easier route. The sin of sequels is that today’s follow-ups fall victim to being just “more of the same” with some new stuff. And we accept it, hell, even I do at times, despite being the highly critical beaver that I am. However, I strongly believe that sequels are supposed to be “similar and/or spiritually the same, but better and/or reinvented”. By “better” I don’t just mean a few points up, I mean way better. A sequel is meant to outperform and top what came before it. Most importantly, a sequel needs to inject new life into a game. It needs freshness. Reasons to stay connected to the same world. Now you might be wondering what on earth I’m on about. If a game is more of the same but with new stuff, isn’t it technically better? Like Assassin’s Creed: Revelations compared to Brotherhood? The answer to that is quite simple. No, it’s not. A game isn’t only made up of how many different things you can do with the buttons you press. Prince of Persia: Warrior Within has better gameplay than Sands of Time, but many consider the latter to be the better overall experience.
More of the same isn’t always a bad thing; sometimes it’s what gamers actually want. However, the risk of plain old more of the same is that you could exhaust your own mechanics, as in the case of Revelations, mess with a good story, again like Brotherhood and Revelations, or fail to impress and run out of steam. I discussed this at length in my column about Assassin’s Creed. And of course there is the other extreme to be careful of, which is changing so much that the original material becomes unrecognisable – which is where keeping your game spiritually the same comes in. This is exactly why so many franchises go downhill: their sequels can’t find the right balance. Let me give you some examples. The original Dead Space was a good game, and perhaps fans wanted more of it. But then came Dead Space 2, which is the exact same thing, except without the freshness and story intrigue. I believe Dead Space 2 was a mediocre game, and a bad sequel. On the other end of the spectrum, to give an example of changing too much, take something like Crysis 2. It took away the level of freedom, player power, mystery and diversity that the original game had, and turned it into a generic, constricted, clichéd and linear affair. Even though Crysis 1 and 2 are very different, which is a start, the latter didn’t breathe new life into the franchise, but exhausted the gameplay.
Now to give you examples of good sequels. To start at the top, I believe that the Metal Gear Solid franchise has some of the best examples of sequels in the entire industry. No matter what you think of it, each game was highly innovative, to the point that you played a completely different game with each sequel, but all of them stayed true to the series’ roots and outdid what came before them. But let’s look at something more recent. Batman: Arkham Asylum and Batman: Arkham City are a great example of how sequels should be done, and let me tell you why. Both games are awesome, but personally I preferred the overall experience of Asylum, even though City had much better gameplay. Despite that, City was very different to Asylum, and its content and scope were much broader, yet it was completely recognisable as the same game at heart. But, most importantly, it was just fresh in so many ways, and that was its strength. Now, many of you will probably point towards games like the Uncharted series and Mass Effect trilogy, and argue that these games remained unchanged in their later sequels, but actually these are also examples of good sequels. If you look at it, from game to game the scale went up significantly, many past issues were fixed, and it was clear that the games were pushing their own boundaries. Mass Effect 3’s scale is tremendously larger than its predecessors’, and the same can be said about Uncharted 3: Drake’s Deception, or games like Dark Souls and God of War 3.
Those are examples of good sequels, and what separates them from something like Assassin’s Creed: Revelations is that Revelations never felt like a dramatic step up, but rather stayed on the same tune. Now that I’ve explained the core issue I have with sequels, let’s get on to answering the title question, because I believe it’s an important one that gamers should be getting serious about.
To Answer The Question
You must be realising right now that I’m tackling an impossible question, because there’s just so much subjectivity and opinion involved in something like this. However, there are certain aspects of it that can be disputed, and many things I just don’t like about some gamers and developers of today. There exist two opposing forces in the gaming world that I see constantly clashing, and for argument’s sake let’s call them the “Cavie” and the “Tody”. The Cavie lives to worship their favourite developer and defend what they do to the death, no matter how wrong, and basically gives rise to the concept I refer to as “absolute developer power”. They come in the form of fanboys. Absolute developer power refers to fans’ tendency to accept, or at least not speak out against, anything a developer does wrong, out of love for what they do. That must sound familiar. What about the Tody? Well, we can call them whiners; they’re haters, highly critical of games and developers, and seemingly only able to be satisfied with perfection. At the moment, there is no balance between the two, and I feel it’s causing damage to the gaming industry, because it’s clear to me that the Cavies far outnumber the Todys in this war.
I’m not saying that I want everyone to be overly critical and harsh and so serious all the time, like I am often enough. I’m quite content being the only Tody around that I know of. But I want gamers to simply expect more from an industry that is capable of producing so much. An industry capable of incredible things, such as pushing boundaries and creating entirely new forms and genres of entertainment. I’m seeing tremendous hype in the industry, but I’m not seeing the balance of high expectations that should come with it. Absolute developer power is a big factor in causing this, and I think I’ve figured out where this mentality has come from. But we’ll get to that as this column progresses. Yes, whiners can be seriously annoying, but I believe at the end of the day it’s a two-way street. It always is. Game developers would be nothing without us, and we would be so much worse off without them. We make their games famous, and we should want their games to be the best they can be as a result. Don’t we then have the right to have high expectations and complain if we’re unhappy?
What am I talking about? Expecting a higher quality. I’m not talking about how much you can “enjoy” a game, I’m talking about pure quality, something that can, to a certain degree, actually be measured. I enjoyed the original Star Wars: The Force Unleashed, being a fan, but is it a great game? No, not by a long shot. On the reverse, I don’t really enjoy Gears of War or Halo, but I’ll never deny that they are games of extremely high quality. Goodness and enjoyment are subjective, but quality can be seen in the facts. So, speaking strictly about quality, in such an amazing and exciting time like this, why exactly have gamers started lowering expectations, or why have gamers become more casual? Does marketing and media influence play that big a role, and do inflated review scores really have that big an impact? Of course they do, but I think a key issue is that the two link up. Because gaming is such a globally common hobby these days, and has become a natural part of life, there is naturally going to be a gigantic increase in the number of casual gamers, and they’re mixing in with the hardcore crowd. This should be a good thing, since there is now a much larger audience to please, but often it results in neglecting the hardcore gamers in favour of pleasing the casual gamers – and we see this so often with sequels trying to be “more accessible” to newcomers, because they’re a bigger market.
Let’s take a recent example to illustrate my problem. Today, a game like Dragon Age 2 gets praised like it’s the next Blizzard game or something, despite the fact that it’s a game of really low quality. I’m not trying to be funny, but really think about it. I’m not asking if you enjoyed Dragon Age 2, but can it be a game of high quality with its basic and hugely repetitive combat, complete lack of variety, repeated loading screens, empty story, robotic dialogue system, generic characters, graphically unimpressive and shockingly copy-and-pasted environments, underwhelming scope, and pathetic party system with no consequences for making friends or enemies of your allies? Alright, fair enough if you don’t agree with all of those flaws, but then let’s make it clearer. Can we consider Dragon Age 2 to be high quality, as critic ratings suggest, when RPGs infinitely better than it came out in the same year? What about The Elder Scrolls V: Skyrim, Dark Souls and The Witcher 2? Or how about Mass Effect and the more recent Kingdoms of Amalur: Reckoning? This is one of my biggest problems.
Dragon Age 2, to stick with that example, was praised excessively at its launch and received high critical ratings it didn’t fully deserve. I’m willing to bet that if Dragon Age 2 had been called “The Adventures of Hawke” or “Dragon Slayer” and had no ties to BioWare or Dragon Age: Origins, it would have been destroyed publicly and critically. This brings me to the next important point in answering the title question. Does name and developer status really change perception that much? Definitely. I think it plays a key part in what I call “big game immunity”. The concept is pretty self-explanatory: big game immunity is the tendency not to punish big games for flaws you would severely punish smaller games for. Context matters, surely, but Dragon Age 2 is a good example because, for argument’s sake, copy-and-pasted environments like that would be received with disgust in a non-big game. It’s the truth.
And fans wonder why sequels change so little, or “dumb down” more hardcore elements. Or they wonder why franchises end up exhausting their own mechanics and growing old before their time. Game development is exceptionally costly, that’s a given, but it can also be highly profitable when it succeeds. And like the solution to any problem, it begins with acknowledgment. A voice can only influence or create change if it’s heard in unison. If you’re going to shrug, say “who cares”, and stay happy with things exactly as they are, then obviously this column isn’t meant for you, is it? I’ve spoken at length about the problem with game critics in a previous column, and ideally, no game and no developer should be free from criticism. The best opinion to trust is your own, but I think die-hard fans are entitled to something, and there should be a greater balance between gamers and developers. Our objective should be to help game developers make the best product, and theirs should be to actually make it. I’ve seen this working with developers like Naughty Dog, CD Projekt RED, Valve, Blizzard and many others, and the key is that there needs to be good communication and a good relationship between us and them. We’re allowed to have high expectations because of what we give back.
To finally end off: I have really only given you the big-time introduction to this topic. It’s impossible to cover it in one column. But the answer to the title question, in case it wasn’t made clear, is that gamers’ standards have not necessarily lowered; rather, we have a much higher degree of casual blood in us because of the scope of today’s gaming audience. The result is a less serious approach to games. People like me are extremely passionate about gaming and have high expectations, and you know, there’s no wrong answer here. But the point of this column ultimately comes down to two things:
- There needs to be a better understanding and relationship between gamers and developers. We make their games famous and enable them to make more games, and they provide us with great entertainment and help push the entertainment industry forward. It’s a two-way street.
- Gamers should expect more from an industry that is capable of doing so much. Not every game needs to be a revolution, but at the least it should boast good quality, especially if it’s a big title.
As for the grand conclusion to this long rant, the discussion and thought is now up to you. I wrote this with the intention of provoking that. I’ve opened up the topic, so let the rage begin.