
A Short History of CGI

When James Cameron’s Avatar hit theatres in 2009, critics and moviegoers alike couldn’t stop talking about the extensive and impressive use of CGI (Computer Generated Imagery) throughout the movie. Offered in 3D, Cameron’s visual effects were a step beyond lush – something otherworldly in their beauty. 

Avatar's CGI formed a big part of why the movie became one of the highest-grossing films ever. It wasn't the only film of its era to lean heavily on computer-generated imagery, and it certainly wasn't the last. CGI has quite a long history, and a surprisingly interesting one.

What is CGI?

CGI stands for Computer Generated Imagery. It's helpful to clarify that CGI is different from traditional animation techniques – although these days, the two are often confused. Traditionally, animation referred to hand-drawn, stylised art forms that were put together to create dynamic images. Of course, today the entire animation process can be done on computers, meaning that the visual difference between a computer-generated image and digital animation can be very small.

However, for clarity, CGI typically refers to computer generated images inserted into otherwise live-action movies. Think back to the example of Avatar – much of the feature film was live-action, with live actors and real sets. But interspersed throughout the movie were computer-generated visual effects and images, blurring the lines between what was “live” and what was 3D computer graphics.

It's worth noting that "animation" necessarily refers to moving images; "CGI" includes both moving and still images used in video. In this article, we'll be talking about CGI primarily in movies and films, covering some of the highlights of the more than six decades that CGI has been in use.

A Simple Timeline

The early years

The history of CGI is still relatively short; the first film to use CGI didn't appear until the late 1950s. In Vertigo, the 1958 Alfred Hitchcock thriller, animator John Whitney used a nearly half-tonne WWII-era anti-aircraft targeting computer and some simple mechanical tricks to create an endlessly swirling spiral. Hitchcock incorporated the spiral into the opening title sequence of the movie, instantly demonstrating the potential for CGI in film.

The 1960s mainly saw an increase in the complexity of CGI technology. CGI was used to model a human face, to create a 3D model, and eventually to render an entire short film. The decade also saw CGI move into technical fields; Bell Labs used a computer to demonstrate how a satellite could maintain a consistent orientation as it orbited the Earth.

In the 1970s, CGI broke into the mainstream. The seminal film Westworld used computer-processed imagery to display the point of view of Yul Brynner's android gunslinger. The raster graphics were created through a combination of existing film techniques and digital processing, resulting in an eye-catching red-and-yellow image that was intentionally pixelated to enhance the effect.
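The basic idea behind that pixelated look is easy to sketch in modern terms: average each block of pixels down to a single colour, then scale the result back up so the blocks stay visible. The snippet below is a minimal illustration using the Pillow imaging library, not a reconstruction of the actual Westworld pipeline; the input file name and block size are assumptions made up for the example.

# A minimal sketch of block "pixelation": downsample so each block becomes
# one averaged pixel, then upsample with nearest-neighbour so the blocks
# stay hard-edged. Assumes the Pillow library and a hypothetical "frame.png".
from PIL import Image

def pixelate(path, block=16):
    img = Image.open(path).convert("RGB")
    # Shrink the image so every `block` x `block` region collapses to one pixel.
    small = img.resize((max(1, img.width // block), max(1, img.height // block)),
                       Image.BILINEAR)
    # Blow it back up without smoothing, producing the chunky, pixelated look.
    return small.resize(img.size, Image.NEAREST)

pixelate("frame.png").save("frame_pixelated.png")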

Notable achievements in the remainder of the decade included architectural CGI, used both to model existing structures and to model buildings designed entirely from scratch. Most of the early CGI efforts came from leading universities and institutions with advanced technology departments; the U.S. Navy was involved in some of the work.

1980s and ‘90s: CGI goes mainstream

After 1980, CGI started to become a filmmaking mainstay. The first motion-capture CGI came in 1981, although that was admittedly only a test reel. But in 1982, Tron demonstrated to the world just some of the potential CGI held to create entirely new virtual worlds. That same year, Star Trek II: The Wrath of Khan used CGI for that very purpose, bringing alien landscapes to life.

The rest of the decade brought a whole list of breakthroughs. CGI was combined with animation techniques for the first time; Disney got into the act with The Great Mouse Detective. Interestingly, CGI was increasingly being used to create hard-to-draw items, often architectural shapes. In Disney’s case, the gears and clockwork of Big Ben, crucial to a key scene in the movie, were painstakingly rendered on the computer and then combined with traditional animation.

Moving into the '90s, the list of feature films using CGI becomes simply a list of notable films; the technique had become a standard. Terminator 2, Jurassic Park, and even the re-released Star Wars trilogy all made notable breakthroughs.

2000-2020: the present and future of CGI

Avatar came at the end of another breakthrough decade for CGI. Fantasy and sci-fi epics began to rely heavily on the technique, using CGI to flesh out the orc armies of Lord of the Rings and the fantastic beasts of Harry Potter and Avatar. 

3D CGI led a wave of interest in 3D movies, although by the latter part of the 2010s that interest had somewhat waned. The use of CGI in general continued to grow, and to become more and more realistic. Photorealistic animals appeared in The Revenant, and mainstream blockbusters like Captain America: Civil War and Guardians of the Galaxy Vol. 2 began to experiment with CGI de-aging, digitally returning older actors to their much younger selves.

As CGI continues to advance, new problems start to emerge. The term "uncanny valley" originated in the 1970s, referring to robots or androids that looked almost, but not quite, human. With CGI, the term took on a new layer of meaning, and the medium hasn't quite been able to overcome a vague sense of unease when images seem highly realistic – but somehow not entirely convincing.

CGI Beyond Film

All of the notable films mentioned above advanced the medium, but they also provided new tools for fields far beyond the cinema. Two, in particular, are worth mentioning.

CGI Art 

CGI for artistic purposes has of course found its biggest outlet in film, but that has in turn spawned an entire category of artists who work primarily with computer-generated images. The technical know-how and artistic skills spent perfecting the ripples in a pool of water are equally at home in a movie production studio and an artist’s digital canvas.

CGI Architecture 

Even more dramatic has been CGI's impact on architecture. Blueprints have not vanished, but with CGI, architects and designers gained an entirely new, dynamic process. 3D models can be created to display an entire building, allowing virtual walk-throughs and in-depth visualizations. Additionally, the models can be changed as the design changes, allowing a level of experimentation not possible with traditional blueprints.

As for the accuracy of those models, every breakthrough in film has added a tool to the chest. With advanced software packages, many of them web-based, architects and designers can incorporate realistic water (with accompanying sounds), highly-accurate textures for rock surfaces, and advanced lighting settings. 
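To give a sense of the kind of data that sits underneath those models, the sketch below writes the simplest possible "building", a single rectangular room, as a Wavefront OBJ file – a plain-text 3D format that most viewers and CAD tools can open. The dimensions, unit, and file name are illustrative assumptions rather than part of any particular architectural workflow.

# A minimal sketch of an architectural 3D model at the data level: eight
# corner vertices and six quad faces describing a box-shaped room, written
# in the Wavefront OBJ format. Dimensions (in assumed metres) and the
# output name "room.obj" are made up for the example.
def write_box_obj(path, width=8.0, depth=6.0, height=3.0):
    # The eight corners of the box as (x, y, z) coordinates.
    vertices = [(x, y, z)
                for x in (0.0, width)
                for y in (0.0, depth)
                for z in (0.0, height)]
    # Six quad faces, listed with 1-based vertex indices as OBJ requires.
    faces = [(1, 2, 4, 3), (5, 7, 8, 6), (1, 3, 7, 5),
             (2, 6, 8, 4), (1, 5, 6, 2), (3, 4, 8, 7)]
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c, d in faces:
            f.write(f"f {a} {b} {c} {d}\n")

write_box_obj("room.obj")  # openable in most 3D viewers and CAD tools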

For anyone interested in visualizing a particular concept or design, CGI animation offers an incredible array of tools to bring ideas to life without being bound by physical constraints. At the same time, CGI visualizations today are highly realistic; in the hands of skilled professionals, they can give the impression of a real-world scene without venturing into the uncanny valley.