I’m old enough to remember video games as giant machines that required a full week’s worth of allowance in quarters to play. I also used to sit and watch my uncle play Super Mario Brothers on the original Nintendo system when he was in his 20s.
Now that I’m in my 20s, I find it humbling when I play video games with his son, especially when he pulls up his downloaded version of Super Mario Brothers on his Wii. Things like that have a way of screwing with your head a little bit.
I’m not a hardcore gamer by any stretch. I have no idea what kind of processors are in the PS3 and Xbox 360, and I barely understand the difference between 1080i and 1080p. All I really understand is what I see, and throughout my gaming life I’ve watched computer graphics evolve from Super Mario all the way up to Mass Effect.
The first thing my friends used to brag about when talking about a new game was “you gotta see these graphics!” The Nintendo 64 and its 3-dimensional environments blew our minds long before games had things like motion capture and Wii-motes.
Now, however, I’ve noticed the conversation about new and exciting games has changed. It’s less about mind-blowing graphics and more about content, gameplay and story.
The movie industry went through a similar evolution. CGI made one of its first major motion picture appearances in 1993 with Jurassic Park. At the time, computer graphics were used sparingly, but director Steven Spielberg pushed the envelope by generating full characters digitally, and the industry never looked back.
The Matrix took the next step in 1999 with its use of a 360° camera. Sky Captain and the World of Tomorrow was the first film to incorporate green screen into every shot, a method that has been heavily used since and perhaps reached its peak with 300. In 2009, James Cameron took motion capture and computer-generated characters to a new level with Avatar.
The video game industry has fed off of these advancements and incorporated all of these technologies into its games. However, much like the movie industry, technology seems to have reached a quick peak, and innovation is not as easy as it used to be.
Every summer blockbuster in 2013 will incorporate some form of CGI, green screen and/or motion capture, so there isn’t anything to technologically differentiate one from another. The separation comes in the story, which is where the gaming industry will soon find itself.
So what is the future of video games if story and gameplay become the key components?
Assassin’s Creed has its sixth installment in production, set for release this fall. The graphics of the franchise have always been good, with improvements coming largely in motion capture. But the success of the franchise isn’t in the graphics; it’s in the open-world gameplay and a strong storyline with endless possibilities.
Games will always require some form of computer graphics, but with the playing field now leveled, we’ll start to see more games built on movie-style premises (which could lead to more video games being optioned into movies).
The movie industry has influenced the gaming industry for the better part of 30 years, and I think the gaming industry will have a chance to switch it up moving forward.