Why Video Games Are Looking More Like Movies

"The Last of Us: Part 2" is the latest big-budget game to showcase the connections between cinema and video games.

Real-life theaters are grappling with a lean summer movie season, but video games are still rolling out their big blockbusters. The latest such triple-A title is "The Last of Us Part 2," a long-awaited follow-up to the critically acclaimed 2013 game about a zombie apocalypse.

"The Last of Us Part 2" features an ambitious narrative about a grand, grim post-apocalyptic world, and like many big-budget games, it often leans on cinema to tell that story.

This is a sign of a bigger movement in technology and art: As games have become more realistic — and as movies have embraced more computer graphics — the two industries have started to converge, for better and for worse.

The developers behind "The Last of Us" have put a lot of focus on building empathy with the characters.

One tool for doing this is the game's actors — real performers whose actions and emotions are translated into the game via special motion capture suits.

Motion capture, the process of translating an actor's movements and expressions onto a digital avatar, has become a common technique in movies and games alike. The same tech that's animating "The Last of Us Part 2" is also being used to bring supervillains like Marvel's Thanos to life.

But that technology actually has its roots in video games. One of the earliest motion capture studios was set up by video game maker Acclaim in 1995.

Acclaim's systems made it much easier for animators to work with the data coming off motion capture rigs. Those systems were later used by director Peter Jackson when it came time to animate a groundbreaking computer-generated character.

Remington Scott was a director at Acclaim's motion capture studio and brought that experience to the visual effects studio WETA Digital for its work on "The Lord of the Rings" franchise.

Scott told Newsy, "The system that we had was one in which Peter would be looking at a monitor. And he talked to Gollum. It was, I think, for the first time, a director being able to talk directly to the digital character in real time and having that character respond."

Video games continue to push the boundaries of this tech.

In 2011, Team Bondi, working with publisher Rockstar, developed a new facial capture system for its detective game "L.A. Noire," which required actors to sit perfectly still in front of an array of cameras. Five years later, game developers were able to deliver a similar quality of facial animation during a live stage demonstration of "Hellblade."

And big-name actors have taken notice: modern games have featured nuanced performances from the likes of Ellen Page and Kevin Spacey.

Scott told Newsy, "The idea is that the more realistic that your digital characters can be, the more ability you have to read into nuances and subtleties in their performances. And that really creates a connection between yourself and that digital persona in a way that strengthens that engagement."

But it's not just about technology: Video games and movies are becoming more intertwined in their narratives as well. When developers first wanted to tell stories in games, they turned to movies to figure out how.

Full-motion video games emerged in the early '80s as an efficient way to tell stories with more polished visuals. Games like "Dragon's Lair" and "Night Trap" interspersed short movie clips between segments of gameplay, rewarding players for their time and investment with a dose of cinema's finest acting.

Using actual video eventually fell out of fashion as in-game graphics advanced, but the format stuck around: Most games still use non-interactive cutscenes to tell their stories.

Video games are inherently interactive: You can usually choose where your character goes and what they do. But that freedom clashes with designers' desire to tell a story: A scripted, dramatic Western showdown doesn't work if the player can't handle their horse.

This tension leads developers to add more scripted sequences to their games, even outside of cutscenes. One popular compromise is sequences that give the player some agency but keep them from straying too far from the "correct" path forward. Quick-time events, where players progress by hitting a button or two when prompted, let developers keep control of their authored moments while still giving players a scrap of interactivity.

Leaning too heavily on scripting can lead to pushback. While "The Last of Us Part 2" has garnered largely positive reviews, several critics have mentioned feeling that the game "forces" them to participate in its bloody revenge story, robbing them of agency in a supposedly interactive medium.

Ultimately, the convergence of video games and movies stems from a shared desire to tell stories. The developers of "The Last of Us Part 2" have big storytelling ambitions, and film offers the studio proven techniques to achieve them — whether players want to engage with them or not.