2011-11-11

A brief moment of videogame history, or: Seriously, devs... WTF?

An old foe has returned to our lands, brothers and sisters. It's been lurking in the shadows for quite a while but has recently begun to take over completely. You know what I am talking about.

In the age before time (I hear people call it the 90's) there was a transition from games that were simplistic by nature, in an arcade-style fashion, to a new line of games where the programmed cutscene suddenly came along. It was quite remarkable at the time, given the advancement of graphics engines. One day developers could even put real live-action video sequences into games! As many know, however, this was quite horrible, because actors are in general not as cool as videogame characters and cheesiness dropped out of style in the late 80's. Some adventure games had a lot of cutscenes, and often they were used to advance the storyline or given as a reward for finishing a quest of some kind. As long as they were used sparingly they were accepted as a new creative medium by consumers, while games that were primarily based on cutscenes, like half the Sega CD's library, failed horribly.

Now time went on, and so did the videogame market. In the year of our lord (FSM) 1998, gamers were blessed with Half-Life, which got a LOT of recognition for its lack of non-interactive cutscenes. Instead, the game was built on scripted events during which the player kept full control and could make (often fatal) choices. It was revolutionary. In the same way that Myst revolutionized gaming by putting the gamer, as a person, into a completely different world, Half-Life brought us interactive storytelling where you were smack-dab in the action and able to do as you saw fit. You weren't just playing as Gordon Freeman; you truly were Gordon Freeman. This was followed by games like System Shock 2, where immersion in the game world was top priority. This was the future... or so we thought.

Fast-forward the tape to the situation we have today. Pick up the nearest high-budget title with a first-person or third-person perspective. How are cutscenes used in that game?
Look at the singleplayer campaign of BF3: it is riddled with cutscenes and scripted events that you cannot do much about. Look at Deus Ex 3: whenever you do something really cool, like a takedown, the camera switches to third person and you get a cutscene. Look at the gameplay videos of the new Elder Scrolls game: guess what, cutscenes. This increase in cutscenes and scripted events is slowly stripping away the immersion that games like Half-Life gave us.

As soon as the camera flies off to a scripted position to show my character doing something cool, it isn't me doing it anymore. The feeling of the game drains away rapidly, especially when the same meaningless cutscene repeats: it was pretty neat the first two times, but after that it becomes a chore. If a game lets you create your own character, or even just customize the basic one, you want to be able to identify with that character, and that feeling is multiplied several times over when the "fourth wall" isn't broken by stupid cutscenes. A game that deserves recognition for doing this the correct way is Fallout 3. With its special attack system we get some scripted cutscene action, but it is INTERACTIVE (the word of the day, folks), and beyond that the few cutscenes that do occur work as they did in Half-Life.

This ridiculous cutscene nonsense comes either from the simplification of games in general, from the increased focus on the inferior gamepad control scheme, or from plain laziness on the part of developers. Either way, we're getting screwed over by developers, Sega CD-style, and if they manage to ruin Thief 4 with that shit I am going to barf.

So please, stop. If I want to see a movie, I'll go get a bloody movie. If I want to play a game, I want a game, not an interactive movie!
