The 'Shroom:Issue LXI/Dippy's Matilda


Dippy's Matilda

by Crocodile Dippy (talk)


Hey guys, good to be writing something relaxed for a change. If you've come here expecting to see a big, scary monster spout vulgar nonsense and harsh criticism, then don't worry; this is my serious face room, and I'm more than happy to hear feedback on what you do and don't agree with, any counterarguments you might have, inquiries, or just general chat, maybe about other issues you'd like me to discuss somewhere down the track. So as not to stretch the intro out for too long, let's get started.


In case you weren't able to tell from my review this month, I don't like cutscenes, at least not as a means to deliver narrative. I know some people really like them and think of them in the most positive light, but the way most games approach cutscenes, and by extension narrative and player engagement, is just… wrong. While I don't see the complete removal of cutscenes as necessarily a good thing, the way they're currently used is definitely taking the medium in a very bad and harmful direction. So for my first entry into this section, I'd like to examine why the current usage of cutscenes is detrimental to the overall interactive experience of games, how the relationship between story and gameplay is supposed to work, and when cutscenes are actually beneficial.

I can understand why, from a marketing perspective, developers would want to continue using pre-rendered cutscenes; they look good. Marketing relies entirely on making the product look as fancy and high-quality as possible, and it's just not possible to make real-time models look as sophisticated or good-looking as pre-rendered ones, so many developers employ them as a quick and easy promotional tool, regardless of whether those cutscenes tell the audience anything about the actual game. This ties into another issue, namely why developers shouldn't rely solely on trailers to advertise their games and should start using demos more frequently, but I'll talk about that some other day. Pre-rendered cutscenes are also a good chance for the art designers to truly bring out their artistic visions without having to sacrifice quality to the monster of hardware limitations. I'm not sure what the appeal is for the average gamer, but I'm willing to take a very loose guess that they simply enjoy films and get excited by the novelty of a game they're enjoying displaying something akin to a movie. Correct me if I'm wrong; I'm just making wild and stupid assumptions there.

“People know me as creating story-driven video games so assume that I must love story video games. Generally I'm not a huge fan of story video games as that's the problem I have with them - I want to play. I don't want to sit down and be told the story. I want to interact with the story.”
Ken Levine, co-founder of Irrational Games
Games like Final Fantasy XIII and Metal Gear Solid 4 spend more time on cutscenes than on actual gameplay.

The thing is, video games are an inherently interactive medium, which means we play them, not watch them. While I don't think any gamer will ever argue against this, there still seems to be a big debate both over how much gameplay is actually necessary to constitute an interactive experience and over how story and narrative should be presented in games. Some argue that the only way to present a good story is to take control away from the player, while others argue that the player should be involved in the story and that their actions should push the events along. The former argument strikes me as counterproductive, since excluding the player undermines the very purpose of the medium: to directly involve and engage the player. Once you've lost focus of that central element of video games, your entire game falls apart, and we wind up with predicaments such as games that are both a half-assed video game and a half-assed movie.

Delegating narrative to cutscenes cordons off story and gameplay as two entirely separate experiences, when ideally they should be one and the same; this segregation will more often than not turn the game into a string of disjointed gameplay segments loosely bridged by a story that's not really integral to the player's gaming experience. While it would be foolish of me to say that it's impossible to get immersed in a story told primarily through cutscenes, it's still not ideal to exclude the player from the direct progression of the story, even a fully linear one; otherwise they're essentially just waiting for the plot to wake up and stop blocking the road to the next interactive moment. It feels like an interruption, even if you've found yourself absorbed in the story, and it seems to me that the industry can do a lot better than this to make all factors flow perfectly as one.

“You know the thing that doesn't work for me in these games are the little movies where they attempt to tell a story in between the playable levels. That's where there hasn't been a synergy between storytelling and gaming. They go to a lot of trouble to do these motion-capture movies that explain the characters. And then the second the game is returned to you and it's under your control, you forget everything the interstitials are trying to impart you with, and you just go back to shooting things.”
Steven Spielberg

The goal is for gameplay and story to support and enrich, not oppose, one another, and it's hard for them to back each other up when they're separated by a thick wall. Contradictions have always been one of the most blatantly harmful results of relying on cutscenes to tell the story; we've all seen it before: a character is killed or near-mortally wounded by a single bullet in a cutscene when they've been soaking bullets up in the actual gameplay, or characters display acrobatics and skills that you're never actually able to use in the game itself. Games like Metal Gear Solid, Final Fantasy, Devil May Cry, and now Asura's Wrath are representative of these sorts of discrepancies. Huh… a lot of these games are Japanese, aren't they? But uhh, these inconsistencies will usually break immersion for most players, or at the very least disappoint them, since the developers have squandered potentially good gameplay mechanics in favour of gaudy pre-rendered presentation.

Half-Life was one of the first story-focussed games to severely limit the emphasis on cutscenes, which has carried over into virtually all of Valve's games.

Expanding on my earlier hypothesis, I'm sure many people like cutscenes because they give the game a cinematic tone, but there is a significant difference between feeling like a movie and being a movie. The former is just an aesthetic choice that can add a lot to the atmosphere of a game, provided it stays within the interactive confines of actual gameplay – see Left 4 Dead or Prince of Persia: The Sands of Time for examples – while the latter applies strict storytelling rules from a non-interactive medium to an interactive one that has different rules for presenting narrative. Some people go so far as to label these titles “movie games”, a label I find incredibly disturbing, since people should not be playing games to watch movies; films will always be better at being films than games ever will be, and the desire to see games replicate movies is like someone going to the movie theatre to watch text scroll by as if it were a book. As I keep pushing, games are about outstanding gameplay experiences, not cinematic experiences; the story should be core to the game itself, not to the cutscenes. Maybe there's some insecurity present here, a hope that replicating a long-standing and accepted medium will earn video games the same sort of respect; but surely the goal of video games as a serious medium, and especially as an art form, is to establish their own identity and gain approval on those merits. There is plenty we can learn from other mediums, and plenty they can learn from us, but that doesn't mean copying; it means analysing how they approach narrative and artistic expression, and working out how to apply those methods to the far more interactive gaming medium.

“But remember, cutscenes aren't there to deliver narrative. They're better used for creating context, and if they're not doing that, well they may be more trouble than they're worth.”
Daniel Floyd, narrator and editor of Extra Credits

Having said all that, cutscenes aren't an inherent evil; most developers just haven't figured out what they're best suited for. Every weapon in the developer's arsenal has some use, even if it's for something small and aesthetic. In my eyes, the best use for cutscenes in gaming is to give the player contextual framing for the story, atmosphere, and even gameplay, and even then they shouldn't be overused or run too long. This is why we always expect an opening cinematic in video games: to show us the general tone and atmosphere of the world we're about to jump into and experience. Panoramas of environments, introducing and establishing important characters, and putting the player in a position of hopelessness that's outside their control: these all fall under the use of cutscenes to establish context. Think of how much more amazing the worlds of Ocarina of Time and Okami felt when the game took five seconds of your playtime every now and then to show you a mere glimpse of the gorgeous environments you were about to explore, or how much greater the impact of the meeting with Andrew Ryan in BioShock was with control taken away from you. Think of how the opening cinematics of Fallout or Call of Duty 4 established the mood, atmosphere, and context of the experience ahead of you, and how much emptier that experience would've been had the developers not used cutscenes to set that tone. I don't think any of these could have had quite the same impact if they were exclusively in the player's control.

Now, of course, I don't purport to have all the answers about narration and story in games, and the topic only came up so much here because it sits at the centre of the cutscene argument; I'll probably dedicate another section later on to my views on how to make a good video game story. But it seems to me that if developers do wish to pursue narrative seriously, if they do want to make the player's experience in the game that much richer with a brilliant story and clever writing, then they have to learn that games are about interactivity, not about being cinematic. If we can properly assess where cutscenes fit and how best to apply them, then we can frame and enrich a great gameplay and story experience without bogging the player down or pulling them out of that experience with overdone cutscenes. Perhaps I'm not articulating myself particularly well here, but hopefully I've still managed to convey my general thoughts on the matter. With any luck, this section will inspire some pondering and conversation about the place of cutscenes in games; I'd love to hear your own opinions on the subject. See you blokes next month.