Why Practical is Always Better *Except when it’s not


                Visual effects are one of the hallmarks of film as a medium. From the earliest days of silent film to the most recent summer blockbuster spectaculars, film has allowed the artist to transport the audience to a world filled with the spectacular: a distant planet, an alternate universe, and terrifying monsters, just to name a few. Throughout that time, creators and craftsmen have labored to bring these visions, and many more, to life in the cinematic art form – to breathe life into pure fantasy and enliven the imagination of the viewer.

                Unsurprisingly, one of the key facets of this endeavor is the desire to bring ever greater levels of realism to the work – to discover new ways of crafting an effect, or of filming it. From physical creations to optical trickery, this has led the industry down a never-ending rabbit hole in pursuit of the ultimate dream: effects so realistic as to be completely indistinguishable from reality (with the obvious exception of those effects whose goal is to be stylized in a particular way – i.e. Pixar films, hand-drawn animation, stop motion, etc.). This decades-long process, along with the continual advancement of technology, eventually led to the birth of what we now know as CGI VFX (Computer-Generated Imagery Visual Effects).

                While primitive and limited at first, this new field quickly grew to be the preeminent discipline in the visual effects category – used for everything from creature creation to full set replacement and all points in between. While there is no doubt that this is an incredible tool, and one with near-limitless uses, the more recent overreliance on this method (to the detriment of all others) has resulted in a cinematic experience that has (ironically) moved farther away from the very realism it was striving for. Let me explain…

A Tangible History

                While the use of optical effects, such as double exposure and the like, dates back to the infancy of film, there were several limits on what could be accomplished by these methods. They could not, for instance, build an Aztec temple or breathe life into an inhuman monster. For these and other fantastical ideas to be brought to life on the screen, artists and craftspeople would need to use their ingenuity and skill to invent new methods, techniques, and materials. They would need to push aside worries of the possible and translate the ethereal into the practical.

While by today’s standards many of these early attempts might now be considered laughable, they were often revolutionary for their time. Not only did these wondrous sights delight audiences, but they also added to the common pool of knowledge that informed future films. They were, quite simply, the giants upon whose shoulders the entire visual effects industry was built.

                Of course, this did not just apply to the earliest forebears. Artists of all kinds from across cinematic history each added their own special flair. Each company and decade was the spark for a new type of effect or technique. Whether the use of stop motion, miniature building, suitmation, or the gore effects of the 1980s, each one of these art forms was crafted to aid filmmakers in translating an idea into reality – to invite the audience, just for a moment, to forget themselves in the spectacle of cinema.

                As a matter of practicality, as opposed to choice, nearly all of these techniques relied on the physical craftsmanship of individuals to create a tangible object or objects. In these instances, computing power was either non-existent or not yet suited for the task. That said, while not every effect looks as good to modern audiences as when they were first created, many of them do manage to hold up to this day. Why that is, is a very good question, and one I hope to explore in just a bit. In the meantime, change is on the way…

Enter Computers00100001

                While limited computer effects have appeared in film since the 1970s, broad-scale acceptance of the new medium didn’t occur until the 1990s. While there are many benchmarks one could set for when exactly this happened (the first fully CGI film, Toy Story, for instance), I think the clearest and most telling event was the release of 1993’s Jurassic Park. As the film is such a mainstay of pop culture, I won’t bother to recount the plot – suffice it to say, the public had never seen a conception of dinosaurs so realistic at any point in history. The fluidity of their motion and the texture of their skin aided in the ultimate illusion – for audiences, the dinosaurs were real.

                After being blown away by this unimaginable level of quality, it is unsurprising that the public (and, for that matter, the studio heads) clamored for the next computer-generated spectacle. Film after film, to wildly varying results, attempted to make use of this emerging technology. While still fairly cost and time prohibitive, the freedom it offered was something akin to the installation of the first electric lightbulb in the average household – completely revolutionary.

                That said, being only in its infancy, the CGI of the time could usually only be used to “fill in the gaps” (i.e. to do those things that would otherwise be impossible). This meant that if it could be done in miniature (or suit, or animatronic, or any other medium), it should be. While this helped keep costs down, it also meant that CGI (as a percentage of the film’s total runtime) was usually limited to only a few minutes. This interspersing, along with the addition of dark lighting or other masking effects, meant that the audience never had too long to focus on the effect – thus maintaining the illusion.

You were the Chosen One, Anakin!

                Things continued in this way for many years, with each new film bringing improvements and advancements that pushed the boundaries of realism and scope, until the early 2000s. Again, while there are many lines in the sand that could be drawn, there are two films (or, more aptly, trilogies of films) that epitomize the paths the industry would embrace – or abandon – going forward.

                The first trilogy, starting in 1999, was the long-awaited Star Wars Prequel Trilogy. Putting aside any issues one may or may not have with their storytelling or plot, these films were high-water marks for the advancement of CGI into the forefront of the filmmaking process. Not only was each individual movie revolutionary for its time and what it achieved, but each also built upon what the others had done and pushed the possibilities of what was feasible. From fully-CGI characters like Jar-Jar Binks to wondrous worlds like Naboo and Mustafar, there was seemingly no limit on what could be put to screen.

                To achieve this effect, actors might often be filmed against a giant blue or green screen, with nothing and no one to interact with. As the effects were still in development, or even nonexistent, this often meant that the performers were acting and reacting only to their imaginations. It also often led to them looking somewhat out of place in their final environments. While both elements looked good on their own, when placed together there would be a cognitive disconnect in the mind of the observer – something was just… off. Remember this, as we’ll come back to it.

Fly, You Fools.

                The second trilogy of note in this era was Peter Jackson’s adaptation of the classic J.R.R. Tolkien tale, The Lord of the Rings. These films, beginning in 2001, were a landmark accomplishment for many reasons and were subsequently showered with awards for their trouble. With astronomical budgets and a bevy of talented filmmakers, these three films were able to use each and every tool in the belt of cinematic history to bring the world of Middle Earth to life. From optical illusions like forced perspective, to completely digital characters like Gollum, from massive armies of extras dressed up in prosthetics, to elaborate miniatures the size of a swimming pool – there was no financial or creative expense spared.

                The result of all of this hard work is a trio of films that look as good – and as realistic – as anything that has ever been put to screen. From Hobbiton to Mordor, the Balrog to the Ringwraiths – it is often hard to differentiate which technique was used, which elements were physically in front of a camera and which exist only as ones and zeros. The effect of this is to fully immerse the viewer in the reality of the world. Each place and character feels fleshed out, as if they exist wholly outside the scope of the story being told.

                While the impeccable writing and incredible acting no doubt add unquantifiable value to this perception, there is no debate that even in isolation the effects work is astounding. The merging and overlap of all these various art forms gives rise to a whole that is infinitely stronger than the sum of its parts. This trilogy, in stark contrast to the nearly-entirely-CGI-crafted Star Wars Prequel Trilogy, is perhaps the greatest (and possibly last) example of the optimal utilization of all facets of visual effects.

From this point forward, with few exceptions, it has been a steady march of CGI slowly but surely overtaking and subsuming nearly all of the other disciplines. Perhaps this was a matter of the ever-lowering costs of CGI compared to crafting something practically, or perhaps it was a response to perceived audience tastes – in either case, the effect was the same: the homogenization and ‘flattening’ of visual effects.

It’s a Dinosaur! – The Tactile Effect

                While it might seem odd, I’m going to point to a scene in one of the high-water mark CGI VFX films to make an important case about practical effects. The scene in question is thus: In Jurassic Park, when the primary cast exit the tour vehicle, they soon find themselves face-to-face with both a triceratops and a large pile of its droppings. Importantly, this is the first time in the film that either we or the characters have seen a full-sized dinosaur in extremely close detail. It is also the first time (besides the tiny baby velociraptor in the hatchery – though the same points apply there) that we watch the characters interact with a dinosaur.

                The animal’s tongue is examined as Dr. Alan Grant presses his whole body to the undulating underside of the massive beast. Hands (and arms) are shoved into a giant pile of poop. We see not only how the actors affect the effects, but how the effects affect the actors – they are physically moved by the breath and their gloved arms come away with bits and pieces of feces clinging to them. Though the animal never stands to walk (or do much of anything besides lie there), the entire experience is extremely memorable due to the highly tactile nature of it. The effects interact with both the cast and the environment around them.

                You see, the human eye is smart. It has evolved over millions of years to be able to take in our environment and assess what it sees – to detect danger or to find food. In essence, to aid in our very survival. As a result, it has gotten very good at distinguishing what exactly reality looks like – the way that objects and individuals interact with the world around them and how it, in turn, interacts with them. The way that water beads off a surface or how dirt and slime get trapped in every crease and crevice; the way that light casts shadows on an object, which themselves interact with still others; the weight and mass of something, apparent by the way it moves through space – these are but a tiny handful of the millions of cues your eyes absorb every second. If even one of them is off, even slightly, you can tell. It might be subconscious, it might be subtle, but it’s always there.

They’re under the ground! – Why Practical is Always Better

                With all that said, the benefits of using practical effects become clear – whereas a computer, and an animator, must endeavor to recreate each and every one of these minuscule facets of reality to breathe life into their (photo-realistic) work, these qualities are simply inherent to the real world. There is no worrying about lighting reference or interacting with the set or cast – the physical object takes care of all of that by itself, merely by existing. Dirt, slime, water, smoke – all of these essential elements interact with the effect in unrepeatable and gloriously perfect chaos. Things that would require countless hours of computation and render time to get right, or at least as close to right as is achievable, all happen in the blink of an eye. Put simply: the things on-screen look real because they are real.

                Another example I love to tout is that of 1990’s Tremors. This Sci-Fi/Horror/Comedy sees a small desert town confront the threat of large, underground worms. While I could praise the film for days for all of the things it gets right (plot, character, tone, atmosphere), today I will only focus on the quality and longevity of its creature effects. Now, while it’s true that the stunning design of the Graboids (the monsters in the film) goes a LONG way to cementing their believability, a good design is only half the battle. Created by ADI, these incredible monsters were brought to life using a variety of pre-CGI techniques: miniatures, puppets, animatronics, and giant underground sleds, just to name a few.

                While the selection of these mediums over CGI was unquestionably influenced by its relative non-existence at the time (for this type of work, anyway), it does not hurt the film. In fact, I would argue that it is its greatest strength. With the small exception of one less-than-ideal, blink-and-you’ll-miss-it blue screen shot, each and every frame of film looks just as good today as it did when it was captured. As the creatures were all practically made (and thus interacted perfectly with the sun and dust and slime of the film), and looked photorealistic at the time of release, how could they possibly age? If something looks 100% real today (in a non-hyperbolic sense), then it will – by default – look just as real in 10, 20, or 100 years.

                And that part about ‘a non-hyperbolic sense’ is important. After all, I have seen countless examples of CGI that was described as photo-real at the time of its release only to look ‘dated’ merely a few years later (the ’90s and early 2000s are particularly egregious about this). While they no doubt did look good upon release, and relatively photo-real when compared to previous CGI effects, their description as such was merely hyperbole. Compare that to something that, when seen up close in real life, is indistinguishable from what exists in nature – i.e. a truly life-like effect. Such an effect has no chance of aging, as it has already reached the threshold for believability in reality.

                That said, this does not mean that all effects, by the very nature of them being practical, are always 100% realistic. This depends on the age of the film (i.e. whether the particular tool in question had been perfected to realistic levels at the time of production), the skill of the effects artists, and the talent of the cinematographer. A film made in the 1950s, and limited by the technology of the time, will likely never look “100% real” to the modern eye, and a camera crew that shows the wrong part of the animatronic will always give away the illusion. However, if true photorealism can be achieved with practical effects, then they will continue to hold up no matter how much time goes by. The same cannot be said of even the best modern-day CGI.

*Except when it’s not

                You might think, after the diatribe above, that I loathe CGI – that I think it is a stain on the very essence of cinema and one that has no place in the art form. Nothing could be further from the truth. In point of fact, there are a great many tasks to which CGI is suited that could not be easily accomplished (if at all) by any other method: Full-motion/full-body creature wide shots, grandiose background extensions, certain non-human characters – the list is so very long I won’t bother to embarrass myself by going on.

                In many of these cases, there is absolutely no better choice than CGI. That said, it is not with these uses that I take issue. No, my problem is with the incorrect implementation of CGI in an instance where another method would have been better suited for the job. As mentioned before, things that are real look real. Close-ups of skin or fur, interactions with materials such as water or smoke, and even the crumbling of a tower – in most instances, these things simply look better when they are shot practically. To sum up: this is not a damnation of CGI as a tool, but rather of the overuse of it.

                In fact, it is my contention that CGI is best used as a scalpel rather than a hammer. Look no further than my previous examples of The Lord of the Rings or Jurassic Park to see its optimal placement. In each of those instances, unless CGI was the only option available to bring a specific shot to life, it was not used. Moreover, its sparing use and constant juxtaposition with practical effects managed to breathe extra life into both kinds of effects. The textures and tactile nature of the giant animatronic T-rex (its eyes blinked and its pupils dilated) fed into the reality of the CGI dinosaur, in the same way that the fluidity and space of the CGI model extended the animatronic behind the screen in our minds. The CGI flash of the death of Sauron’s giant eye was the cherry on top of the practical model of his tower’s destruction.

                Look no further than the masterpiece that is Mad Max: Fury Road to see a more recent example of the perfect melding of CGI and practical effects. The explosions, the cars, the crashes, the stunts – they were all real. And, because of that fact, they all felt real. Meanwhile, CGI was used to hide the wires, expand the background or paint out crew, and craft the biggest part of the massive superstorm through which they drive. All of these uses served to enhance what was already there or to craft something that would have been otherwise impossible. The CGI did the best thing it possibly could – it was invisible. Not in the sense that we literally can’t see it, of course, but in the sense that it was only there to service the story, not to be the story.

Conclusion

                The truth is: whether you’re talking practical or CGI effects, if people are paying attention to how good the effects are and not to how the story is gripping them, then the effects – no matter how good – have failed. Story and characters should always be king, and any type of effect is only ever there to further the audience’s immersion into that world. This is the exact reason that the large-scale explosion fests at the end of modern blockbusters are so often met with such a blasé response – they exist merely as a testament to themselves for the sake of spectacle and not to truly further the tale being told.

It’s also the same reason that a giant animatronic shark with less-than-accurate proportions can still thrill us to this day in JAWS – the effects, great and flawed as they are, are all in service of telling (and showing) the story in the best way possible. If that way is CGI, wonderful! Go for it! But if it’s not, don’t allow yourself to be taken in by its more recent elevation to the status of a “jack-of-all-trades, one-stop-shop” for film effects. While CGI can be useful in a great many situations, there are just as many (and I’d argue many, many more) in which a practical effect would be even better. Coupled with the inherent benefit (in the modern age) of a lower likelihood of aging into obsolescence, and the boost in realism it will add not only to the film itself but to the computer-generated effects that are present, there is every reason in the world for filmmakers and audiences alike to push for, and embrace, a return to the old way…

The real way…

The practical way.

After all, practical is always better*

*Except when it’s not

Chris

P.S. – And yes, I am well aware that (particularly on films with considerably smaller budgets) the choice to use CGI over practical effects is an economic necessity. I would never blame a film for working with what it has. This article is more aptly pointed towards those films which have a budget that allows for such choice and choose to forgo it.

P.P.S. – And yes, I am also aware that such a push towards a return to practical effects has more recently been felt in certain franchises – the Star Wars Sequel Trilogy, for one (a reaction, no doubt, to the long-term reception of effects for the aforementioned Prequel Trilogy). For whatever else they may be criticized for, I give them a great deal of credit for that. I sincerely hope the trend spreads.