The History of Computer Animation: From Wireframes to Photorealism

When you think of the earliest forms of animation, you picture artists working tirelessly on cartoons, drawing and painting each frame precisely so the images flow seamlessly from one to the next. That style grew rarer with each passing year, and while it hasn't disappeared entirely, it has been almost completely supplanted by computer animation.

Computer animation comes in many different forms. From CGI in live-action motion pictures to simple animated graphics, you probably won't go another day in your life without seeing some form of it. Most of us know that "Toy Story" in 1995 became the first feature-length film made entirely with computer animation, but how did we get to that point? Let's take a quick look at the history of computer animation, from wireframes to photorealism.

The Early Years

The beginning of computer animation didn't come from the cartoon world, and it might not be the form you're thinking of. When younger generations watch older movies that predate the modern computer, many wonder how those films achieved the graphics they did. Much of it was thanks to wireframe animation, created with early systems that allowed animators to draw directly into a computer.

Known as wireframe technology, it acted as a digital blueprint for what would appear in the finished product. Much of this early work wasn't seen in major motion pictures; it appeared instead in short films that people today would likely call "tech demos." At the time, computers were viewed primarily as military tools, and it wasn't until the 1970s that more modern computer animation took hold.

Blossoming Technology

1972 saw the advent of polygonal animation when Ed Catmull drew polygons on a physical model of a human hand and digitized them. While the graphics look crude by today's standards, it was revolutionary more than a half-century ago. Much of the 1970s saw this technology develop further, along with the first feature film to mix CGI with live action, 1973's "Westworld".

The following decade was when computer animation went from being dismissed as a "fad" or "gimmick" to being an absolute necessity. One man who helped spearhead the movement was George Lucas, who founded Industrial Light & Magic in 1975; the 1977 release of his film "Star Wars" was perhaps the most important film in the history of computer animation.

New Wave of Animation

After the 1980s saw films like "Tron", "The Last Starfighter", and the "Indiana Jones" series making impressive use of computer animation, movie studios began wondering whether an entire film could be made with the technology. Studios had been blending traditional animation and computer animation by the time the mid-1990s rolled around, but it wasn't until "Toy Story" in 1995 that a feature film used nothing but computer-generated imagery (CGI). It would take a while before this became an industry standard, but such films were becoming common by the start of the new millennium.

As for live-action films, CGI took a massive step forward during the 1990s. Many of the films we remember from the decade, including "Jumanji", "Titanic", and "Spawn", achieved firsts in movie history: respectively, the first photorealistic CGI animals, the first photorealistic digital water, and the first photorealistic CGI fire.

Further Advancements in Computer Animation

When the new millennium kicked off, some of us wondered how animation could possibly look more realistic than it did in movies like "The Matrix". Yet it only grew more refined and detailed: animators developed the ability to digitally de-age actors, capture entire performances with motion capture, and place those performances into other media, including video games.

Speaking of video games, the medium has seen perhaps the fastest advancement in computer animation. In the early 1990s, players were accustomed to 16-bit graphics; less than 30 years later, games were approaching photorealism with gorgeous productions like "Ghost of Tsushima", "Red Dead Redemption II", and "Death Stranding".

Movies have continued to push computer animation to the point where actors often wear motion-capture suits and perform in front of green screens, with everything else added digitally later. Think of the "Avengers" films, where the actors shot relatively few scenes that weren't dominated by green-screen-covered sets.

Now there's a rise in artificial intelligence that can create computer-generated imagery on its own. So many are adopting this technology that many believe films, music, and more will one day be completely AI-generated. It's hard to say for sure what the coming years will bring, but one thing is certain: computer animation will only keep improving.
