- Severance: unskippable opening titles
Opening sequences set the tone. They introduce the mood, establish themes, and sometimes tell a story before the first scene even begins. While they exist to present the cast and crew, they're also a chance to immerse the audience in the world of the film or series. A strong opening sequence can define a film's identity and, if done well, linger in the viewer's mind long after the episode ends. That's exactly what happened with Severance. The opening sequences for both seasons were created by Berlin-based digital artist Oliver Latta (Extraweg). They're a kind of nightmare that is bizarre and fascinating at the same time. Strange. Uncomfortable. And yet, so enjoyable. Why?

THE PSYCHOLOGY OF UNSETTLING DESIGN

There's a reason we can't look away from things that disturb us. Uncanny imagery, like Latta's rubbery faces, distorted bodies and generally strange art, triggers an emotional response. It generates discomfort that demands our attention. And ultimately, that's the goal: to make people stop scrolling, to keep them from hitting the "skip" button. And that is what Latta does. He's known for his eerie and often grotesque 3D animations that subvert your expectations. His Instagram is filled with hypnotic loops you don't quite know how to feel about, but you keep watching one after the other (though it's been a while since he last posted). His unique art quickly gained attention on social media, and he has since worked on various brand campaigns and music videos. "My aim is to provoke emotions and be different, outstanding, and innovative. [...] I take viewers out of their comfort zones and make them think for themselves", Latta told SIGGRAPH, adding, "I want to provoke and sometimes confuse".

AN UNFORGETTABLE INTRO

So, it's no surprise that he was approached by Severance executive producer and director Ben Stiller to craft an opening title that visually embodies the show's themes of fractured identity and corporate control.
When he reached out, Stiller had no specific treatment—just the script. With only those lines of dialogue as a reference, Latta began his research and built mood boards, exploring ways to create a surreal world that left room for interpretation. It took them nearly a year to refine the vision. "I saw this guy on Instagram called Extraweg, that had this weird animation of like babies coming out of a brain and turning into jelly. And I thought, this is amazing! And so, I reached out to him and he hadn't done an opening credit sequence, but it felt like his vibe was right for the show", Ben Stiller told Late Night with Seth Meyers during Season 1.

With Season 2 of Severance now streaming, we got to watch yet another mesmerizing opening title sequence that got people talking. "The sequence intricately explores Mark's fragmented memories and anxieties, symbolizing his struggle to reclaim his identity", as explained on Extraweg's website. There even seems to be a little use of AI when the goat appears and morphs, handled so thoughtfully that nobody seems bothered by it (very different from past scandals!). To create the sequence, Latta scanned actor Adam Scott to accurately capture his likeness. The digi-double was then manipulated—deformed, duplicated, and reshaped—to visually reflect the show's themes. For the animation, he pulled inspiration from online references but also relied on his extensive library of unpublished work, adapting existing ideas to fit Severance's unique world, as he told It's Nice That.

BTS | ©Extraweg website

THE MUSIC

Of course, you cannot talk about the opening titles without mentioning the music. You see, title sequences are a bit like moody music videos: the music is as important as the images themselves. In Severance's case, Theodore Shapiro is the Emmy-winning composer behind the series' main title, which is "so eerily satisfying" that it perfectly complements the visuals.
And this is true for many great intro sequences. Take Westworld, where Ramin Djawadi's score enhances the stunning 3D tableaux. In fact, music is so important that it can sometimes be the defining element of an intro. That's probably the case with The White Lotus. The sequence itself is visually simple—a curated wallpaper-style montage telling a story of wealth, gossip, and betrayal (for S2). But Cristobal Tapia de Veer's strange, hypnotic score turned it into something unforgettable. At first, the music felt odd, unexpected. But by the end of the season, people were obsessed. So, in the end, what makes Severance's intro stand out is the combination of both elements: Latta's unfiltered imagery paired with music that wraps everything up perfectly. It's something to think about when you're designing your own intro sequence. Think beyond the usual. The best intros aren't safe. The ones you remember are the ones that take risks. And we can help you with that.
- How (not) to slow-mo?
Zack Snyder's "Rebel Moon" has ignited discussions about the use (or abuse) of slow motion in films. Known for his stylistic, extra-slow-motion sequences, Snyder has faced criticism for employing the technique excessively in both parts of his latest epic sci-fi saga, with many going so far as to say he has "ruined it".

Rebel Moon - part 2 | ©Netflix

Let's not mince words: the director has disappointed his Snyder cult (us included), prompting reflection on where things went wrong. With "Rebel Moon", Snyder has shown that impressive visuals alone cannot compensate for a lack of compelling story and character development. To avoid repeating his mistakes, we've decided to dig deeper into the technique: what exactly is slow motion, and how can it be used effectively?

WHAT IS SLOW MOTION?

Slow motion, often abbreviated as slow-mo, is a technique that creates the illusion of time moving slower than normal. The effect is achieved by recording footage at a higher frame rate than the one it is played back at. For example, filming at 120 frames per second (fps) and playing it back at 24 fps results in a scene appearing five times slower than real time. In fact, some cameras are specifically designed for capturing high-frame-rate footage, such as the iconic Phantom camera.

Dredd | ©DNA Films

The technique was pioneered by August Musger, an Austrian priest and physicist. Initially designed to mitigate flickering in early cinema projectors, his invention unintentionally introduced the concept of slow motion. Musger patented his device in 1904, laying the foundation for what would eventually become a fundamental tool in filmmaking.

WHEN TO USE SLOW-MO AND WHY?

Enhancing visual impact might seem like the primary role of slow motion, yet its potential extends beyond that initial function. It allows viewers to immerse themselves in intricate details, amplifying the drama, action, or thematic elements of a scene.
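As a quick sanity check on that frame-rate arithmetic: the slowdown is simply the capture rate divided by the playback rate. Here's a minimal sketch in Python (the helper names are our own, purely illustrative, not from any camera or editing software):

```python
# Illustrative frame-rate arithmetic behind slow motion.
# These function names are our own invention, for the sketch only.

def slowdown_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower than real time the footage appears on screen."""
    return capture_fps / playback_fps

def screen_time(real_seconds: float, capture_fps: float, playback_fps: float) -> float:
    """Seconds of screen time produced by `real_seconds` of recorded action."""
    frames_recorded = real_seconds * capture_fps
    return frames_recorded / playback_fps

# The example from the text: shot at 120 fps, played back at 24 fps.
print(slowdown_factor(120, 24))      # 5.0 -> five times slower
print(screen_time(2, 120, 24))       # 10.0 -> 2 s of action fills 10 s of screen time
```

The same math explains why a Phantom-style camera shooting at 1000 fps stretches one second of action into well over 40 seconds at a 24 fps playback.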
However, like any cinematic technique, moderation is key to maintaining its effectiveness. Overuse can diminish its impact. Therefore, carefully select the specific moments where the effect serves a purpose beyond mere visual appeal. Here are a few instances where employing slow motion can give your movie that extra "oomph":

Enhance Emotional Impact

Slow motion enhances emotional impact by enabling the audience to absorb every detail of a powerful moment. It is particularly effective in capturing the subtle expressions and reactions of characters during pivotal scenes, such as death scenes like Gwen's in "The Amazing Spider-Man 2". Yeah… that one.

Highlight Action

In action sequences, slow motion emphasizes the choreography and intensity of movements, providing directors with an opportunity to showcase critical story elements. This can include the protagonist's technique, strategic decisions and the high stakes involved. Slow-mo transforms fast-paced action into a comprehensible visual experience for the audience, allowing us to appreciate essential details that might otherwise be missed. For example, in "Puss in Boots", slow-mo is used during Puss' first confrontation with his nemesis, Death, highlighting the moment he is injured for the first time—proving he can be defeated.

Create Suspense

By decelerating time, slow motion builds suspense and anticipation. This technique is invaluable in thriller and horror films, but its application extends beyond these genres. It intensifies the audience's anxiety as we anticipate the resolution of tense situations. In "Final Girls", a film that humorously plays with horror tropes, slow motion is employed in a long sequence where the supposedly defeated killer unexpectedly reappears, chasing our characters through the forest.

Surreal and Narrative Moments

But slow motion can also play a distinct role in narrative storytelling.
For instance, in "Dredd" (2012), a drug known as "Slo-Mo" alters perception by slowing down time, intensifying colors and creating a whimsical atmosphere. Each use of the drug transports us into a kind of surreal world, starkly contrasting with the violent scenes depicted. Similarly, filmmaker Lars von Trier uses the technique in several of his films to evoke a surreal ambiance, like the hypnosis sequence in "Antichrist".

Visual Aesthetics

It's fair to acknowledge that slow motion can enhance the visual appeal of any scene, making mundane actions appear extraordinary. However, this should not be the principal reason for its use. Choose wisely, as this is precisely where Snyder sinned. By turning everything (and we mean everything) into a visual spectacle, he lost his audience's interest. A perfect example is the now-infamous farming scene in "Rebel Moon 2". Despite Snyder's explanation that the plant is crucial to the plot, the excessive sloooooooooow-mo of people harvesting made it feel more like a bad advertisement than a cinematic experience.

Puss in Boots | ©DreamWorks Animation L.L.C

300 VS. REBEL MOON

In "300", Zack Snyder used slow motion sparingly to enhance storytelling, particularly in battle scenes where it underscored the Spartans' skill and bravery. Each slow-mo shot had a clear purpose, contributing to the film's epic and stylized tone, reminiscent of the graphic novel it adapted. Conversely, "Rebel Moon" uses slow motion excessively, seemingly just because it aligns with Snyder's stylistic preferences. Unfortunately, this overuse diminishes pacing and narrative cohesion (not that there is much to begin with, but that's another story) and actually disconnects the audience. Instead of accentuating pivotal moments, the frequent slow-mo scenes make nothing feel truly significant. They lack narrative justification.
This highlights two crucial lessons from Snyder on how not to use slow-mo:

- Visual spectacle alone cannot compensate for weak dialogue and underdeveloped characters.
- Redundancy makes it feel boring and takes away the excitement.

In conclusion, slow motion is a potent tool in filmmaking when applied judiciously. It has the potential to heighten emotional and visual impact when used purposefully and precisely. However, as seen in "Rebel Moon", excessive reliance on the effect can detract from the overall experience.
- A New AI-Era of the Uncanny Valley
The uncanny valley—a term that has haunted roboticists and VFX artists for decades—is experiencing a resurgence due to artificial intelligence video generators. Coined by Japanese roboticist Masahiro Mori in a 1970 essay, it describes the unsettling sensation we feel when encountering humanoid figures that are almost, but not quite, lifelike.

By Dietmar Höpfl - shockfactor_ai

Minor imperfections in appearance or movement trigger discomfort or even revulsion, tapping into deep psychological responses. Why does it happen? We don't really know. Some even question whether it is a scientific concept at all, but research is being conducted in the area.

ROOTS OF THE UNCANNY VALLEY

Mori's hypothesis stems from human psychology and our innate responses to realism in representation. While the exact reasons for this discomfort are debated, researchers speculate it could be due to:

- Violation of Expectation: Subtle imperfections in a hyper-realistic face or movement disrupt our expectations, creating a jarring effect.
- Evolutionary Psychology: Some theories suggest our unease arises from our ability to distinguish between alive and dead, or healthy and diseased, individuals—a potential survival mechanism.
- Empathy Gap: We may struggle to emotionally connect with near-perfect imitations that lack a true human essence (this seems to be the case with AI video generators).

"The sense of eeriness is probably a form of instinct that protects us from proximal, rather than distal, sources of danger. Proximal sources of danger are corpses, members of different species, and other entities we can closely approach. Distal sources of danger include windstorms and floods", Masahiro Mori

AI UNCANNY VALLEY

It's well known that CGI has long struggled with the uncanny valley. But today we're turning our attention to AI video generators and exploring why they can be so particularly creepy—and whether that's necessarily a bad thing.
First, keep in mind that recreating humans is one of the biggest challenges in visual effects. Our brains are finely tuned to recognize the subtleties of human movement and expression. Even minor inconsistencies stand out, and when technology tries very hard to mimic us but doesn't quite succeed, it leaves us with that uncomfortable feeling.

By Mr. Relative

Usually, this has been seen as a problem to overcome—a sign that the technology isn't advanced enough (or that, you know, "CGI is ruining movies"!). Therefore, in VFX, we try to steer away from digital humans as much as we can, but sometimes… it's just what the director wants. This gives birth to very unfortunate results, like cat-hybrids or bringing the dead back to life for a movie—a practice dubbed "digital necromancy", according to Futurism. But I digress. Back to AI.

The tool is very good at making us think, at first sight, that its output looks human. However, when you look closely, you can see six fingers or slight malformations in the images it creates. But it's when it tries to generate movement that it becomes really bizarre, going places we've never really seen. This happens because AI lacks true spatial awareness. It doesn't comprehend physical space as we do; instead, it generates content based on patterns learned from vast datasets. This can result in inconsistencies, distorted perspectives or unnatural movements.

By Lola.viscera

To overcome these challenges, developers are working on enhancing AI's spatial awareness and motion generation capabilities. For instance, in robotics, they're training robots in digital simulations (or digital twins) featuring stairs, other robots, metahumans and obstacles, so they learn how to interact with external elements and won't be a danger when placed in the real world. Similarly, in the realm of filmmaking, the aim is to imitate reality, which is why CGI has (mostly) evolved to achieve realistic results.
The same is going to happen with AI video generators, with companies looking to cut costs and create cheaper, easier-to-produce ads. However, I believe that AI's most compelling work emerges when its uncanny and weird outputs are embraced, especially in the horror genre (but not only!). Many artists on social media are leveraging these eerie qualities to create unsettling videos that we can't help but keep watching. These are the same artists whose work we've included as examples throughout this article.

By Daryl Anselmo

These uncanny visuals produced with AI challenge our perceptions of reality and elicit strong emotions. This shift raises a question: is the uncanny valley necessarily a bad thing? While many still strive to eliminate it, embracing the uncanny opens up new possibilities. As an art form, it allows creators to explore places we would not have been able to go on our own—a realm between reality and artificiality, engaging audiences in ways that are new, fascinating and definitely disturbing. What do you think about it?

By Doopiidoo
- Filming Night Scenes
Even though modern cameras are more advanced than ever at filming in low light, capturing "the night" arguably remains one of the trickiest challenges in filmmaking. When you're filming night scenes, it's not only about the technical hurdles—grainy footage, complex lighting setups and extra hours—but about capturing the right feel for your film. The vibe you want. How dark? How dreamy? How much visibility do you want?

"What you see with your eye doesn't look the same when you try to capture it on film. You have to expose the film at a certain ratio for it to react in a certain way but that is not the same way you react to it in real life", Jarin Blaschke, Nosferatu's DP.

To illustrate this, let's explore how some directors, working closely with their cinematographers, have crafted night scenes that perfectly fit their stories. But first, let's understand...

WHY IS FILMING NIGHT SCENES HARD TO GET RIGHT?

Night shoots require either actual nighttime filming or day-for-night techniques, where scenes are shot during the day and manipulated in post-production to look like night.

Lighting the house of Nope, during the night | © Universal Pictures

Each approach has its own pros and cons:

- Shooting at night requires high-powered lights while maintaining a natural feel. It also means extended hours and increased costs due to lighting equipment and crew requirements (Studio Binder even made a whole article on how to survive a night shoot).
- Day-for-night shooting, on the other hand, demands precise exposure control and color grading to avoid unnatural-looking shadows, among other concerns. But it's often the technique of choice, particularly when you have to film vast landscapes, where lighting is simply not viable.

A tip from cinematographer Hoyte van Hoytema: day-for-night works best when the light source is behind the subject (backlit), rather than shining directly at them (frontlit).
In both cases, ensuring actor visibility while keeping the scene dark enough to feel real is a constant balancing act. Skies are the biggest giveaway when filming day-for-night, often necessitating sky replacements in post.

VARIOUS APPROACHES TO NIGHT SCENES

"Nosferatu" (2024)

As with all his previous films, director Robert Eggers once again teamed up with cinematographer Jarin Blaschke for Nosferatu. And, much like in The Lighthouse (2019) and The Witch (2015), Blaschke's lighting approach remained consistent—highly motivated and naturalistic, even in night scenes, where he sometimes relied on a single candle to illuminate the space. "I'm creative, but in lighting I tend to be a little bit literal. There's a fire in the room, I'm just gonna choose where the fire is gonna be and then embellish it (...)", he told Variety. "If I want some more, I'll put some mirrors and I'll multiply the source."

So, to bring Nosferatu's night scenes to life, Eggers and Blaschke leaned on techniques from their previous films, including custom filters that create an almost monochromatic black-and-white aesthetic—without actually being black-and-white. "I used a filter to eliminate all yellow and red light as well as most of the green. What was left was mostly blue, which made everything look a certain way. In shooting, I'm just trying to recreate the same wavelengths that your eyes would see under those conditions", Blaschke told Focus Features. For his take on this new version of the iconic vampire movie, Blaschke was nominated for Best Cinematography at the 2025 Oscars.

"Nope" (2022)

To film some of the night scenes in Nope, the filmmakers used a new approach to the day-for-night technique: infrared cinematography. Instead of faking night entirely in post, cinematographer Hoyte van Hoytema developed a method that allowed him to create believable, visible night scenes in vast outdoor environments—a crucial element for the film's horror-sci-fi tone.
Infrared was something van Hoytema had previously experimented with on Ad Astra (2019), but for Nope, he was able to push it even further, thanks to director Jordan Peele's trust in his vision. To achieve the effect, they perfectly aligned two cameras: one infrared, the other shooting on Panavision System 65mm film. This setup allowed them to capture details that would otherwise be lost in extreme darkness, creating a natural yet crisp night feel even in the film's massive landscapes. "You use one camera [infrared] to tell you the relationship of the light levels between everything, and then you use the other camera to gather color information and film grain and such things", van Hoytema explains.

In-depth article about it by Noam Kroll | ©Universal

In post-production, the VFX team then merged the two images in a process that, according to Business Insider, is similar to how old black-and-white movies are colorized, but without inventing the colors. Much like Blaschke on Nosferatu, they also wanted to capture how the eye perceives the night, including how the pupil slowly adapts to complete darkness, allowing us to see the full landscape under the moonlight. "We built up a sort of fades-in that are very slow, but that simulate very much your pupil dilation", he explains. The VFX team then filmed specific plates of practical lights by night, which were later integrated into the shots. "For me, these little things are really the cherry toppings of the cake. It completes it and sort of finishes it", he concludes.

Mad Max: Fury Road (2015)

Last but not least, George Miller took inspiration from old Westerns—many of which used the day-for-night technique—and decided to shoot his night scenes entirely during the day. To do so, instead of underexposing to simulate darkness –which is what is normally done for these shots–, the crew overexposed the footage by two stops.
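A quick note on the stop arithmetic involved: each stop is a doubling (or halving) of the light reaching the sensor, so a two-stop overexposure records four times the light, which the colorist then pulls back down in the grade. A minimal sketch of that relationship (the helper names are our own, purely illustrative):

```python
# Illustrative exposure-stop arithmetic: each stop is a factor of 2 in light.
# Function names are our own invention, not from any grading or camera software.
import math

def exposure_factor(stops: float) -> float:
    """Light multiplier for a given number of stops (+ over, - under)."""
    return 2.0 ** stops

def stops_between(factor: float) -> float:
    """How many stops correspond to a given light ratio."""
    return math.log2(factor)

print(exposure_factor(2))    # 4.0 -> two stops over records 4x the light
print(stops_between(4.0))    # 2.0
print(exposure_factor(-1))   # 0.5 -> one stop under halves the light
```

That extra headroom is why Jackson's suggestion preserved shadow detail: the grade starts from a brighter, cleaner negative instead of digging into noisy underexposed shadows.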
The approach was suggested by visual effects supervisor Andrew Jackson, who argued that overexposing would keep more detail while reducing noise. Though the idea needed some convincing, a few camera tests settled it. “It was solved right there ”, Jackson told FX Guide . "This [approach] enabled me to create very graphic contrasty images with detail exactly where I wanted it, and a fall off into shadows where I didn't want it", colorist Eric Whipp, on his blog post on Lowepost . Whipp then transformed the footage in post-production, adding a high-contrast blue tint to create the film’s iconic night aesthetic. This gave the night scenes that almost surreal, graphic-novel look while maintaining clarity. A look ultimately fit for the saga. The colorist then added that " almost every D4N shot was basically roto'd and had the sky replaced to create the look. It took a few months of fiddly work, but I think the look is different and graphic". ©Warner Bros Pictures So, in the end, night cinematography is ultimately about creative problem-solving. What types of shots do you need? Is it outside or inside? Do you have a hyper-stylized look or a more realistic one? What is your budget? All these questions come into play when you’re deciding how you want your nights to feel.
- Oscars 2025: Best Visual Effects Nominees
"All practical", "no CGI", the war is still ongoing. However, it is fascinating that the Oscars don't seem to make such a distinction (well, almost...). Practical or digital? Both are encapsulated under the same category: "Best Visual Effects".

Production concepts of Wicked, shared and created by Oliver Beck

This year's nominations include two films heavily marketed as using "only" practical effects, but we already have breakdowns of all the scenes where VFX (and CG—for those making the distinction) were used. And it works! Both practical and digital effects have evolved a lot since their inception, and it's thanks to both of them that we have great movies (and some flops, of course!). On the other hand, we also have two very different takes on motion-captured ape performances, and 2024's most beloved sci-fi epic. Here's the breakdown of this year's nominees:

ALIEN: ROMULUS – FEDE ALVAREZ

Touted as a film relying entirely on practical effects, Alien: Romulus was a return to old-school filmmaking techniques. While this claim holds weight, we're glad to see behind-the-scenes footage revealing how CGI was also used to bring some of the film's most gruesome creatures and scenes to life—along with a few other things (asteroid belts, spaceships, floating acid, etc.). Practically speaking? Plenty was done in-camera, including our favorite: the swimming facehugger, specifically designed to leap from the water towards its prey.

Video from the Weta Workshop Instagram

According to IMDb, 23 companies contributed to the film's effects, covering everything from concept art and previs to practical effects, miniatures, and digital work. It's exciting to see filmmakers putting in the effort to revive the practical side of effects, despite the cost and complexity. However, the way the movie has been marketed—as though CGI were the "evil sibling"—feels unnecessary. "We went all the way to create creatures with (...)
the philosophy of the old movies but with technology of today, to create something that people don't see on screen every day", director Fede Alvarez told the Hollywood Reporter.

WICKED – JON M. CHU

Much like the Romulus campaign, Wicked has been marketed as "real" and "tangible", boasting impressive feats like planting nine million tulips. And, again, this is true—they built immense, magical sets and went all-in on practical elements. But let's not ignore the significant digital contributions that helped bring the movie to life. Fortunately, we're starting to see breakdowns and behind-the-scenes footage from Framestore, showcasing the work of thousands of artists from both the practical and digital realms. This transparency is refreshing—especially compared to Barbie's infamous behind-the-scenes "grey screens", which sparked online debates about the studios' fakeness and their reluctance to reveal just how much VFX was actually used.

DUNE: PART TWO – DENIS VILLENEUVE

The Oscar for Best Visual Effects at the 97th Academy Awards was awarded to Dune: Part Two (Paul Lambert, Stephen James, Rhys Salcombe and Gerd Nefzer). No surprises here—Denis Villeneuve's epic sequel once again delivered jaw-dropping visuals. From massive sandworms to the sprawling deserts of Arrakis, the film is full of incredible, epic scenes. A mix of techniques was used, but we already covered the ones we liked the most in this article.

Dune: Part Two | Official Trailer | ©Warner Bros

KINGDOM OF THE PLANET OF THE APES – WES BALL

Motion capture continues to evolve and innovate, and this film is the ultimate proof. With its hyper-realistic ape characters, Kingdom of the Planet of the Apes showcases how CG has reached new levels, enabling emotive, believable characters. While CG-heavy movies are (mostly) frowned upon today, the Planet of the Apes franchise has managed to sidestep this stigma.
Even though we know those apes don't exist, we remain fully invested in the story. The integration of visual effects doesn't detract from the film—it elevates it, and the artistry is undeniable. "Through those three apes' films that technology [mocap] improved, became robust. We were able to take it outside in the rain or in the snow", Eric Winquist, VFX supervisor.

BETTER MAN – MICHAEL GRACEY

"We usually do stunts and fights, and then we were thrown into dancing apes", Emma Cross, Weta FX motion capture.

Weta FX took on the challenge of creating a singing, dancing ape—a task a bit outside their usual expertise. This required them to innovate and adapt their workflow to make it work. "We had to make a lot of motion studies there, to sort of work out how you convincingly make all that sound and all that energy and all that breath come out of this CG character", Dave Clayton, Weta FX Previs & Animation supervisor.

The decision to transform Robbie Williams into an ape wasn't arbitrary. According to the director, it reflected how Williams sees himself, placing the biopic in the realm of magic realism. This creative choice brought a new layer of depth to the storytelling, offering audiences something we haven't really seen before.

Which of these movies do you think should win the Oscar for Best Visual Effects? Or is there another film you believe should have made the list? Let us know!
- The Alien Chestburster Scene throughout the years
In art, it's all about reinventing what has already been made. As Pablo Picasso allegedly said: "Good artists borrow. Great artists steal." However, in the Alien franchise, this premise doesn't quite hold—you can't just copy what the previous director did. That's not how it works. The series, now spanning seven movies (not counting the Predator crossovers), has had to continually reimagine and expand upon a lore and creature first invented in 1979. But what about its most iconic moment—the chestburster scene? How has each director managed to recreate and revitalize such a pivotal moment? Let's explore the evolution of this scene across the Alien saga and see how each film put its unique spin on it.

The original chestburster scene | ©Twentieth Century Fox

ALIEN (1979) - RIDLEY SCOTT

Ridley Scott's original Alien introduced the world to the chestburster in a scene that has since become one of the most iconic in horror history and set the standard for all that followed. The concept originated with the film's writer, Dan O'Bannon, who suffered from Crohn's disease. Apparently, after a particularly excruciating night of pain, O'Bannon conceived the idea of a creature violently erupting from a person's body—a metaphor for his own suffering.

Alien's script

The jump from script to screen was no easy feat, from the design of this version of the creature to the logistics of the scene. Bringing the chestburster to life required four takes and nearly 23 liters of fake blood. The terror was also amplified by the actors' genuine reactions—most of them were unaware of the exact nature of the scene, leading to real shock when the red liquid and actual animal guts sprayed across the set. Additionally, Scott carefully structured the film so that we have to wait over 50 minutes to catch the first glimpse of the alien, making the chestburster scene unforgettable to this day.
Ridley Scott draws his own storyboards, known as "Ridleygrams"

If you want a more in-depth understanding of how this scene was brought to life, check out this video!

ALIENS (1986) - JAMES CAMERON

The Terminator's director took a different approach in Aliens, shifting the franchise towards action while retaining its horror roots. The first chestburster moment in this film is implied through a nightmare sequence, where Ripley (Sigourney Weaver) dreams of the creature emerging from her chest. This scene deepens Ripley's PTSD from the first film and taps into the audience's lingering fear. By not directly recreating the scene, Cameron preserved the impact of the original while building tension in a new, psychologically driven way.

Nightmare sequence | ©Twentieth Century Fox

But Cameron didn't entirely shy away from a traditional chestburster moment. Later in the film, when Ripley and the Marines enter the alien hive, they discover a woman cocooned and begging for death, moments before a chestburster erupts from her. The scene is brief but significant for the characters: a reminder of what happens when you get facehugged.

ALIEN³ (1992) - DAVID FINCHER

David Fincher's Alien³ introduced a new twist by having the chestburster emerge from a different host—an animal. In the theatrical release, it bursts from a dog, while in the special edition, it emerges from an ox. This adaptation expanded the Xenomorph mythos by showcasing the creature's ability to adapt to its host. The result is a variant known as the "Runner", characterized by its agile body, which allows it to move on all fours and, oh, it can spit acid from its mouth too. Unlike the small, pale, elongated chestbursters seen before, the Runner emerges almost fully formed, with distinct traits inherited from its host. Which version did you see, the dog or the ox?
| ©Twentieth Century Fox However, in the theatrical ending, Alien³ also gives us a human chestburster moment, this time with none other than Ripley herself. In the film's climax, having been impregnated with a Xenomorph queen embryo, she sacrifices herself by jumping into a furnace. As she falls, the alien bursts out of her, but she holds it close and ensures that both she and the creature are destroyed in the flames. It's Ripley's finale. ALIEN: RESURRECTION (1997) - JEAN-PIERRE JEUNET Because "we've seen it all before", Alien: Resurrection brings a darker, more aggressive and intentional use of the creature. Near the climax of the film, Purvis—a kidnapped civilian who was part of Dr. Mason Wren's breeding project—knows the chestburster inside him will soon emerge and kill him. Seizing control of his final moments, he grabs Dr. Wren and forces his head against his chest, ensuring that the chestburster will kill both of them. This scene gives the host a sense of agency, turning the moment into a weapon and adding emotional intensity to the traditional chestburster horror. It's chaotic, messy, and... surprising! PROMETHEUS (2012) - RIDLEY SCOTT While there isn't a traditional chestburster scene in this movie, there is a harrowing moment reminiscent of it: a forced abortion. This occurs when the "Trilobite", a parasitic alien, is surgically extracted from Dr. Shaw's (Noomi Rapace) body using an automated surgical pod. Unlike the familiar chestburster, the Trilobite has a more primitive, squid-like appearance. This design choice was intentional, highlighting the creature's role as a precursor to the Xenomorph. More squid-like. More aquatic. The scene mirrors the horror of the original chest-bursting moment by exploring the concept of alien life taking root inside a human.
Abortion sequence | ©Twentieth Century Fox ALIEN: COVENANT (2017) - RIDLEY SCOTT Returning to the franchise, Ridley Scott had the laborious task of reimagining something he had created decades before. While he avoided doing it in Prometheus, he brought it back in Alien: Covenant and, this time, it happens twice. The first instance is a variation though, more of a "backburster". When a crew member becomes infected, his back begins to burst, crack and break apart during a medical exam, ultimately giving way to a Bloodburster that emerges, complete with its placenta-like sac and all. Yes, that scene is gruesome. Later in the film, there's a recreation of the classic chestburster scene: a man lies on his back, choking as an alien breaks free from his chest, tearing through his T-shirt. The difference is that the alien that emerges from the dead body is fully formed, resembling a miniature version of the Xenomorph. Also, in contrast with the scene above, the tone here is more, let's say, "cute"—for lack of a better word—rather than horrific, as it depicts David (the android, played by Fassbender) proudly watching the culmination of his work. Even the music is calmer, almost soothing. ©Twentieth Century Fox ALIEN: ROMULUS (2024) - FEDE ÁLVAREZ Last in the list is the recently released Alien: Romulus, which doesn't shy away from this now well-rooted body horror moment of the franchise. It delivers a full, classic chestburster scene with brutal realism, blending the practical effects of the 1970s with all the tropes of modern horror. In this case, the young pilot Navarro gets face-hugged and then chest-burst pretty quickly, just as we might expect. However, this time, there's no shirt in the way. This time, we see the alien literally bursting out of Navarro's chest, with ribs cracking and skin ripping off. "I approached it as if it was a nature documentary.
The direction I gave the puppeteers was stuff like, ‘The baby looks for the scent of the mother now,’ and so it raises his head to do that. That just makes it way more realistic", Fede Alvarez, for Entertainment Weekly In conclusion, each chestburster scene across the Alien saga has built upon the terror of the original, adapting to the times and the evolving expectations of horror fans. Whether through the introduction of new lore or ways to use special effects, these scenes have remained a central element of the franchise's appeal. Which one did you prefer or which one did you see first? Let us know in the comments!
- OpenAI’s Super Bowl Ad
This wasn't the article we planned to write today, but given the buzz, it felt necessary. We've covered AI advancements and controversies before, but this one stands out for its irony. The Super Bowl is known for its high-profile, creative advertisements, with each 30-second slot costing a literal fortune. This year, among the usual array of big-budget ads (including many related to AI), one stood out—not specifically for its message, but for what it revealed between the lines. OpenAI made its Super Bowl debut with The Intelligence Age, a 60-second ad that reportedly cost $14 million, according to The Verge. The spot features pointillism-inspired animation, transforming dots into iconic milestones of human progress—from fire and the wheel to DNA sequencing and space exploration. It culminates with ChatGPT assisting with everyday tasks like business planning and language tutoring, positioning AI as the next great leap in human innovation. A compelling message, sure. But what really got people talking, at least in our algorithm? The fact that this AI-powered company, championing the future of generative technology, didn't use AI to create its ad. "The ad itself is a signal of how AI can assist—not replace, but aid and enhance—a human-led creative effort", as described in OpenAI's blog about the ad. THE IRONIC TWIST For the past two years, AI-generated content has flooded the internet, with endless claims that it spells the death of traditional creative industries. Yet, when it came to its own high-profile Super Bowl moment, OpenAI opted for traditional, human-made advertising animation rather than showcasing the technology it is pushing. To produce it, they partnered with Accenture Song. According to OpenAI CMO Kate Rouch, they did use their generative video model, Sora, for early prototyping, camera animations and rapid iteration. However, the final animation was crafted entirely by human artists.
“This is a celebration of human creativity and an extension of human creativity”, OpenAI CMO Kate Rouch The decision seems, in and of itself… surprising. The marketing industry is already heavily turning to AI for generating visuals, copy and even entirely finished commercials—largely because it's cheaper. But is it better? OpenAI's choice suggests that, when the stakes are high, human creatives are still the best bet. So… is it all just Silicon Valley hype? AI ADS AND CONTROVERSY Still from Apple's "Crush!" ad We're in a moment of disruption. Backlash is inevitable. Fear is palpable. In advertising, we've already seen brands face intense criticism for their use of AI. However, OpenAI's approach was a calculated marketing move, allowing them to sidestep the kind of backlash that other AI-powered ads have faced. Oh, you don't remember? Let us give you a refresher, with our top three most controversial AI-related ads: Coca-Cola's AI-Generated Christmas Ad attempted to deliver a heartwarming holiday message, but instead, it was criticized for feeling "uncanny" and "soulless". Many argued that it lacked the warmth and authenticity that define classic holiday campaigns. Apple's "Crush" iPad Ad sparked outrage by showing creative tools—musical instruments, books, and art supplies—being crushed into an iPad. Critics called it a tone-deaf metaphor for technology replacing traditional creative methods. Google's Gemini Olympics Ad was pulled after backlash. The ad, which featured an AI writing a heartfelt letter from a young girl to her favorite athlete, was seen as diminishing human emotion in favor of automation. SO, WHAT DOES IT ALL MEAN? Generative AI is improving by the second and as we've said before, the question isn't if we should use it, but how we use it. OpenAI's Super Bowl ad makes one thing clear: AI is a tool, not a replacement. And, for once, it's nice to see that humans are still the ones shaping the (hi)story.
- Vampires that slay!
With the new Nosferatu hitting theaters (which we haven't seen yet) and a confirmed reboot for Buffy the Vampire Slayer, we thought it'd be fun to look back at some vampire interpretations or variations that stuck with us over the years. Now, a bit of disclosure first: this is a personal opinion, and while vampires come in all shapes and sizes, we seem to find ourselves drawn to those that lean more into body horror and monster design, with cool stories to match. Forget the sparkling, sun-resistant centenarians dating teenagers; these are the ones that made us remember why these blood-sucking creatures were meant to terrify us in the first place. THE ANGEL (MIDNIGHT MASS, 2021) Set in a small, isolated island community, Midnight Mass explores themes of faith, death, and how belief shapes our understanding of the world. In this series, Mike Flanagan reframes vampire mythology through a deeply Catholic lens, where (almost) every vampire trope finds its biblical mirror: communion becomes literal blood drinking, resurrection becomes vampiric rebirth and eternal life becomes less a blessing than a curse. Even the burning sun takes on apocalyptic meaning, straight from the pages of Revelation. "[...] The fifth, since you asked, the fifth bowl of god's wrath plunges the world into darkness... which won't be an issue for you, or Monsignor, will it. Why it's almost as if God is preparing you. For that", says Bev Keane, the antagonist and arguably one of the best villains since Dolores Umbridge. After all, it seems angels and vampires have much in common. It just depends on how you look at them. FLESH PHANTOMS (AHS: DOUBLE FEATURE, 2021) The "untalented" | ©FX What if vampirism came in pill form? That's exactly what American Horror Story (AHS) gave us - a black pill that turns struggling artists into creative geniuses... with a bloody catch. These vampires aren't cursed by supernatural forces but by their own ambition.
The show uses vampirism as a metaphor for addiction and the price of fame. However, the pill only works on those who have talent. If you lack that creative spark, it transforms you into a flesh phantom — feral, hobo-like creatures. Always thirsty, never satisfied or employed. These pale figures stalk the streets dressed in signature black coats with exaggerated shoulders, very much reminiscent of Nosferatu himself. “Those things you see haunting around town, they took the pill, but they’re just hacks, wannabes, dreamers”, Austin Summers, aspiring writer in the series. We must confess, Ryan Murphy's take on vampirism hits uncomfortably close to home, giving us a surprising spin on vampire mythology. THE REAPERS (BLADE II, 2002) Blade II | ©New Line Cinema We can’t talk about vampires without mentioning Blade II, the second installment in the Blade series—the franchise that gave Marvel its first big-screen superhero hit (and yes, thanks Deadpool for finally bringing him back into the cinematic universe!). Directed by Guillermo del Toro, it brought a fresh, monstrous twist to the genre. And no, we’re not here to talk about Blade himself, the black, sun-resistant vampire hunter, but about the Reapers—an evolved strain of vampires who prey on their own kind. Their signature split jaws, created through a mix of practical effects and early 2000s CGI, became instantly iconic. Admit it—you couldn’t look away, even if you wanted to. MARCUS CORVINUS (UNDERWORLD: EVOLUTION, 2006) Made with practical makeup, but CGI wings | ©Screen Gems The Underworld series gave us a complete lore with leather-clad warriors and amazing creature design for both vampires and lycans alike, straight from the mind of the amazing Patrick Tatopoulos. But it was in the second movie, Underworld: Evolution, that the universe was taken to a whole new level with the first vampire of them all: Marcus Corvinus.
The first impression of him is exactly what you want it to be: a terrifying man-bat who uses his wings as harpoons to actually impale his victims. It's just… freaking cool. “I think the take we have on this, is that we need to see some practical stuff. Even if aspects of them is turned into CGI we need to make sure we always have something real on camera. It’s more believable”, Tatopoulos. THE INSECT-LIKE PREDATORS ( PRIEST , 2011) Set in a post-apocalyptic world where humanity lives in walled cities under the control of the Church, these vampires represent a radical departure from tradition. Unlike the aristocratic bloodsuckers we're used to, these creatures are eyeless, sleek and subterranean. They evoke primal, animalistic terror, functioning more like a hive of predatory insects than traditional “guys in suits”, as explained by visual effects supervisor, Jonathan Rothbart, in an interview with FX guide . And they didn't stop at one type - there were hive guardians, drones, and even a Queen, each with their own design. “We wanted to keep them close enough to a human design so they are similar, but far enough away to where there can never be a person inside a suit”, Rothbart, VFX supervisor, featurette . The solution? Making them fully CG creatures. For some scenes, they used stand-ins and then replaced them with digital monsters (like they did with Davy Jones ), allowing these vampires to move in ways that would be impossible for any human performer. All in all, vampires endure because they are endlessly adaptable . Whether rooted in elegance or monstrosity, their lore offers infinite opportunities for reinvention. These five examples show how creative storytelling, design and innovative filmmaking can turn even the most familiar creatures into something new and unforgettable. And we'll see which kinds of blood-sucking creatures the future holds for us. And you, what is your favorite kind of vampire? Let us know in the comments.
- Why does Davy Jones’ CGI still look so good?
It’s been almost 20 years since Pirates of the Caribbean: Dead Man’s Chest hit theaters, and yet Davy Jones’ slimy, tentacled face still looks better than many modern CGI characters. Why? This is a question that many have asked us. Why does this character still look so good? Why don’t modern characters hold up even when technology has clearly advanced? Why? But why though? | ©Disney And, indeed. Why does the captain of the Flying Dutchman still reign supreme in the world of CGI characters? The short answer seems to be a series of smart decisions stacked one on top of the other. Here are a few of them for you to think about for your next idea. A CHARACTER THAT JUSTIFIES CGI Davy Jones is a character that makes sense as a CGI creation. This is something CGY pointed out in their analysis and we completely agree. Jones’ design includes a beard of 46 tentacles, each moving independently, which would’ve been a nightmare to achieve with animatronics or prosthetics. And this wasn’t done just because it looks cool. Au contraire—the tentacles are integral to the narrative. They play the organ, hide the key to the Dead Man’s Chest, kill people and even give important cues about how Jones feels. Davy Jones using his tentacles | ©Disney Now compare that to other CGI-heavy characters like Azog from The Hobbit or even Thanos. Both have few unique traits beyond their height that require them to be CGI. After all, the Lord of the Rings (LOTR) franchise is known for its clever tricks to create the illusion of size. Techniques like forced perspective convinced us that hobbits and dwarves were smaller than humans—without relying on CGI. So, why not do the same with Azog? I mean, the Uruk-hai in the original trilogy were brought to life with makeup and they’re still fan favorites. More often than not, for a main character like that, a blend of both make-up and CGI creates the perfect balance for selling the illusion to audiences.
Though, we must admit, this also comes with its own challenges, as seen with the extensive enhancements needed for Red Skull in Captain America. And this isn’t just about fully CG characters. Even replacing a limb, like a robotic arm, deserves the same scrutiny. Ask yourself: What does it bring to the narrative? Can the character do something extraordinary with it or does it add to the story thematically? In the end, if a feature isn’t inherently tied to CGI, like a tentacled beard with narrative importance, why go through all the trouble? HALF-HUMAN, HALF-SEA MONSTER Humans are wired to recognize imperfections in other humans. If a CGI character isn’t perfect, our brains pick up on it instantly. This is when you fall into the “uncanny valley”. In this case, Davy Jones’ design is great because it walks the line between human and sea creature. Davy Jones | ©Disney His eyes, eyebrows and mouth remain human, allowing for nuanced expressions and letting the audience connect with him. But a tentacled beard? We don’t know what one looks like in real life. We’ve never seen one. So, we have nothing to compare it to. This means it’s easier for us to accept it as a character that actually exists and is not digitally created. He’s human enough to empathize with, alien enough to accept as a “real” character. MOTION CAPTURE AS A FOUNDATION Motion capture (mocap) technology allowed Bill Nighy’s performance to shine through. Acting alongside his co-stars in a gray suit, Nighy delivered an emotional and quirky portrayal that animators used as the foundation for Davy Jones. But mocap isn’t a magic fix. The animation team still had to completely replace the actor with a fully CG character, as explained in Pirates of the Caribbean 2’s bonus feature, “Meet Davy Jones: Anatomy of a Legend”.
“We made the decision on this film to cast actors, to play the parts and they’re in the shots playing the characters, and we’re gonna put our CG versions on top of them”, Hal Hickel, ILM animation director. To match Nighy’s performance, the animators created over 700 shapes in the CG face—from blinking eyes to raising cheeks. And that’s without factoring in the tentacles. Each one was simulated separately, with meticulous attention paid to how they moved, writhed and collided with one another. The goal was to ensure realism—they couldn’t pass through each other or behave like frictionless objects. Instead, the tentacles had to feel sticky, almost like pasta, with a subtle resistance in their motion. Still, some tentacles had to be hand-animated for specific, “directable” shots—like the moment they reveal the key to the Dead Man’s Chest. “And then the other part of it is, we’re trying to translate all of that onto a character that has an octopus on his face”, Geoff Campbell, digital model supervisor. Did you know? They put makeup around Bill Nighy’s eyes just in case CGI couldn’t pull it off. Turns out, they didn’t need it—those eyes are 100% CG! THE SKIN DILEMMA Skin is one of the toughest challenges for CGI artists. Why? Because real skin has subsurface scattering—light penetrates and bounces around beneath the surface, creating a soft, translucent effect. Think about how light shines through your ears or when you see a glow inside your mouth. It’s subtle, but in visual effects, it’s a nightmare to replicate. Fail to do this correctly and your character can look rubbery—and this applies to both CGI and prosthetics. Davy Jones | ©Disney But ILM had more than one trick up their sleeve to overcome these challenges: Little skin Most of his body is covered by clothing. Unlike She-Hulk or Azog, Jones’ VFX team didn’t have to worry about rendering realistic arms, legs or torsos. They focused their resources on his face and beard, where it mattered most.
Reflective surfaces Davy Jones’ connection to the sea was a blessing for ILM. As a demi-god of the ocean, he’s constantly wet—and reflective surfaces are far easier to render than dry skin. The reflections obscure subsurface imperfections, giving his tentacled face a more believable and polished appearance. Lighting choices Lighting is everything in filmmaking. Same goes for visual effects. Like magic tricks, you only show enough to sell the illusion. The right lighting can elevate makeup, prosthetics or CGI to the next level. In Jones’ case, he’s often shown at night or under the harsh Caribbean sun, which naturally casts deep shadows and enhances key features. BLENDING PRACTICAL EFFECTS AND CGI Blending practical effects with CGI is always a smart move. Even if it’s not with the creature itself, it can be done with sets, costumes or other elements to create a more grounded visual experience. In Pirates of the Caribbean, practical effects played a crucial role in grounding the film’s visuals. Real ships, costumes and on-location environments gave the audience tangible elements to connect with. Take Jurassic Park as another example. The T-Rex still holds up today because the filmmakers blended animatronics with CGI—yes, a decision more driven by the limitations of the era but one that turned out to be a sweet spot. For close-ups, they used an animatronic made by none other than Stan Winston, switching to CGI for full-body shots. Again, by giving viewers a “ground truth”—something real to anchor the illusion—the CGI felt more believable. Now compare that to modern blockbusters where entire environments, characters and even suits are rendered digitally. While impressive, they can sometimes lack the authenticity that comes from combining real and digital elements. This is why we’re seeing more films return to using animatronics. It’s not because CGI has ruined movies, but because the blend of both worlds seems to truly sell the illusion.
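If you're curious what the "skin dilemma" looks like in practice, here's a tiny illustrative sketch in Python of "wrap lighting", a classic cheap approximation of subsurface scattering used in real-time graphics (to be clear, this is not ILM's actual Davy Jones technique, just a minimal way to see why a soft light falloff reads as skin while a hard cutoff reads as rubber):

```python
def lambert(n_dot_l):
    # Standard diffuse shading: light cuts off sharply at the terminator,
    # which is part of what makes CG "skin" read as hard and rubbery
    return max(0.0, n_dot_l)

def wrap_diffuse(n_dot_l, wrap=1.0):
    # "Wrap lighting": let light wrap past the terminator, mimicking the
    # soft glow of light scattering beneath the surface (wrap=0 is plain Lambert)
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

# Just inside the shadow side of a face (n_dot_l = -0.5), plain diffuse
# goes fully black while the wrapped term still lets some light through:
print(lambert(-0.5))       # → 0.0
print(wrap_diffuse(-0.5))  # → 0.25
```

Production renderers replace this with physically based diffusion models, but the intuition is the same: without that softened falloff, skin looks like painted rubber.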
FINAL THOUGHTS All in all, Davy Jones’ success boils down to one thing: thoughtful choices. From his design to his integration with practical effects, every decision was made to serve the character and the story. For filmmakers today, the lesson is clear: use the right tool for the job. CGI isn’t the answer to everything (and neither is AI). Today, as filmmakers, we have more tools than ever to craft the best illusions. But we have to learn how to use them wisely. So, choose the methods that serve your story, timeline, and budget. If in doubt, send us a probe!
- Digital Necromancy: When Hollywood Plays God
"They were so preoccupied with whether they could, they didn't stop to think if they should" - Dr. Ian Malcolm, Jurassic Park, 1993 Three decades ago, Jeff Goldblum's character warned us about the dangers of resurrecting dinosaurs, a warning that, ironically, now applies to Hollywood's fascination with resurrecting its own icons. While digital resurrection isn’t new, advances in CGI, AI and deepfake technology have made bringing actors back from the dead disturbingly easy. So, let’s dive into this practice! WHAT IS DIGITAL NECROMANCY? Digital necromancy refers to the practice of digitally resurrecting deceased performers through CGI and/or AI, enabling them to perform in roles and scenes they never filmed—or even speak lines they never uttered. This isn’t about recycling old footage anymore; it’s about creating entirely new performances. “Coined in an article in The Guardian , digital necromancy is the posthumous resurrection of digital images of celebrities and actors; they walk, they talk and – most exciting for advertising execs and studio heads – cash registers ring when the death knell tolls”, The New Daily . When done well, it can be well received, as in Furious 7 , where Paul Walker’s unfinished scenes were completed using his brothers as body doubles. The result? A heartfelt farewell that honored the actor. Weta FX breakdown on how they recreated Walker for the movie But the practice often lands in the “should we” zone. Take Audrey Hepburn, brought back more than a decade after her death in 1993 to... sell chocolate in a 2015 TV commercial. The CGI was well-executed, and while some accepted it as a nostalgic nod, reactions were still mixed. Contrast this with Bruce Lee’s resurrection to sell alcohol , which sparked outrage—not only for using his image in a commercial but also because Lee famously didn’t drink. 
NOT NEW IN THE INDUSTRY, BUT THE RULES ARE CHANGING Resurrecting deceased actors has often been driven by necessity—a way to complete a film after tragedy strikes, as seen before with Paul Walker. The first use of CGI to resurrect an actor seems to be Brandon Lee, who died in 1993 on the set of The Crow due to a prop gun accident. According to Collider, the production was nearly complete, needing only 30 seconds of footage from the actor to wrap. To finish the film, the team turned to CGI, digitally superimposing Brandon’s face onto a body double for the remaining scenes. “When the accident occurred, it was unbearable and the first reaction was: we can’t go on. But the performance itself was done and I felt compelled to finish this work as a legacy to Brandon”, Edward R. Pressman, producer. This marked a historic turning point in filmmaking. For the first time, computers were used to recreate a performance posthumously—setting the stage for what we now call digital necromancy. However, today’s digital resurrections are no longer about necessity—they are a creative choice. James Dean, who died in 1955, was controversially cast in Finding Jack (now cancelled), a Vietnam War drama. Director Anton Ernst defended the decision, saying, “We searched high and low for the perfect character… and after months of research, we decided on James Dean”. In another upcoming project, Back to Eden, Dean’s digital likeness will lead a sci-fi journey across America, further showcasing how technology now allows filmmakers to use iconic stars purely for their nostalgic and cultural appeal. “If the dead – or rather, their digital clones – are damned to an eternity of work, who benefits financially? And do the dead have any rights?”, BBC Rook in Alien: Romulus | ©20th Century Studios A similar debate surrounds Ian Holm’s return in Alien: Romulus. Digitally recreated to play Rook, a new synthetic, Holm’s mix of animatronic and CGI likeness prompted divided opinions.
Director Fede Alvarez acknowledged the challenges, admitting in Empire that the CGI had to be “fixed” for the home release. But the controversy isn’t really about whether the technology can do it realistically. The real question is: Was Holm’s return even necessary, or was it simply digital necromancy for nostalgia’s sake? Adding to the debate, the BBC explains that agencies like CMG Worldwide are leading the charge in managing the digital legacies of deceased stars. These companies broker deals for digital resurrections, ensuring estates are compensated. VOICE RESURRECTIONS: A FRENCH CONTROVERSY But AI doesn’t just recreate faces—it can bring voices back too. Like when James Earl Jones signed over the rights to Lucasfilm for AI to recreate Darth Vader’s voice. Well, the latest controversy concerns the French dubbing of Sylvester Stallone’s film Armor, which used AI to recreate the voice of Alain Dorval, who passed away in early 2024. “AI doesn’t replace the magic of human creativity—it opens new doors for it. Recreating Alain Dorval’s voice is a chance to show how technology can honor tradition while creating new possibilities in film production”, ElevenLabs CEO Mati Staniszewski for Variety. The public wasn’t convinced. When the trailer dropped, fans criticized the voice for lacking Dorval’s warmth, as reported by Paris Match. Even more, Dorval’s daughter revealed that her consent had only been for a test—not for the final release. The backlash highlights the pitfalls of AI recreations: even with consent, there’s a fine line between tribute and exploitation. THE MALCOLM MEASURE OF CREEPINESS: A STARTING POINT In the absence of apparent formal guidelines, creators like Pentex Productions on YouTube have tried to bring order to the chaos. In their essay, “A.I. See Dead People”, they propose what they call the “Malcolm Measure of Creepiness”, a framework inspired by Dr. Ian Malcolm’s iconic warning from Jurassic Park. Here’s how it works.
The framework assigns a score from 0 to 25 - the higher the score, the creepier (and less ethically sound) the digital resurrection is - based on four key factors: Time Since Death: Resurrecting someone recently deceased scores low (0-2 points), while decades later maxes out the creepiness at 5. Connection to the Role: Completing a known role scores low (0-2 points), but creating a new, unrelated performance scores 5. Consent: Direct consent from the actor scores 0, family consent 3, and no consent at all maxes out at 5. Technology Used: Recycled footage scores 1-2, while full AI/CGI recreations hit 5. A total score above 15 lands firmly in the "yikes zone", according to the framework. And here's the kicker - if it's for an advertisement, add an extra point for pure commercial exploitation. While the method isn’t official, it’s a practical starting point for understanding why some recreations feel respectful (like Paul Walker in Furious 7, scoring 0-5) and others cross the line (Alien: Romulus’ Ian Holm hitting 11-17). CAN WE, SHOULD WE? The "Malcolm Measure" reminds us that it’s not about the technology itself but about intent. Are we honoring an actor’s legacy or mining nostalgia for profit? Why not work with new actors? Recast? Create something different? While technology advances way faster than we - mere homo sapiens that we are - can keep pace with, we struggle with its implications, be they ethical, moral, psychological and more! Because it's not only the icons who will digitally come back from the dead. It's also loved ones and even Jesus (if you want an article about it, let us know in the comments!). So, again: should vs. could.
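For fun, the scoring rules of the "Malcolm Measure" described above are simple enough to turn into code. Here's a minimal Python sketch; the category weights follow the article's summary of Pentex Productions' framework, but the exact cutoffs inside each range, along with the function and parameter names, are our own guesses:

```python
# Consent: direct consent from the actor 0, family consent 3, none 5
CONSENT_POINTS = {"actor": 0, "family": 3, "none": 5}

def malcolm_measure(years_since_death, new_performance, consent, full_recreation, is_ad=False):
    """Score a digital resurrection on the 'Malcolm Measure of Creepiness' (0-26)."""
    # Time since death: recently deceased scores low, decades later maxes out at 5
    if years_since_death <= 2:
        time_pts = 0
    elif years_since_death <= 10:
        time_pts = 2
    else:
        time_pts = 5
    # Connection to the role: completing a known role scores low, a brand-new one scores 5
    role_pts = 5 if new_performance else 1
    # Technology: recycled footage scores low, full AI/CGI recreation scores 5
    tech_pts = 5 if full_recreation else 1
    score = time_pts + role_pts + CONSENT_POINTS[consent] + tech_pts
    if is_ad:
        score += 1  # the extra point for pure commercial exploitation
    verdict = "yikes zone" if score > 15 else "defensible"
    return score, verdict

# A James Dean-style case: dead for decades, brand-new role, no consent, full CGI
print(malcolm_measure(69, new_performance=True, consent="none", full_recreation=True))
# → (20, 'yikes zone')
```

As the article notes, the method isn't official; treat it as a conversation starter, not an ethics board.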
- Behind the VFX of 'The Last of Us' and '3 Body Problem'
The Neuchâtel International Fantastic Film Festival (NIFFF) stands out as a premier Swiss event for genre film enthusiasts, showcasing fantastic, action-packed, and sometimes gory films. Needless to say, it’s an event we do not miss, and not only for the movies. There are also very interesting conferences on different subjects. This Monday, we attended two sessions on visual effects (VFX) as part of the NIFFF Extended program. THE LAST OF US - STORM STUDIOS Presented by Espen Nordahl, VFX Supervisor at Storm Studios in Norway, this conference delved into the intricate work behind the VFX for the hit series "The Last of Us". Nordahl’s team was responsible for 150 shots across six episodes, with around 30 people contributing over roughly five months of post-production. They handled complex shots involving “cordyceps” tendrils emerging from the mouths of the infected, face replacements, blue screen work and creating wounds and bite marks. One key difference between the game and the series was the method of infection. In the game, characters are infected via spores, but the series opted for direct physical contact to emphasize the human element. “You’re fighting against humans, not things and that's what the directors wanted”, explained Nordahl. This decision required the Storm Studios team to conceptualize and test various prototypes for the cordyceps – you know, those weird, disgusting tendrils that come out of the mouths of the infected. Being involved very early in pre-production allowed them to test different prototypes for this very important element throughout the series. "Our motto is to fail fast, get ideas quickly”, Nordahl. This way, they can show early iterations to the directors, get feedback early on and determine which direction to take. This process was crucial in refining the cordyceps’ movement, aiming for a natural yet eerie effect.
They did lots of trials, some too aggressive, others too limp, falling out of the mouth like noodles, until they finally settled on a more plant-like, organic slow movement, reminiscent of how plant roots would behave. The final effect was achieved through a combination of hand animation and simulation, ensuring the tendrils behaved exactly as intended in specific shots. Early look development of the tendrils | Image showed in presentation ©Storm Studio The team also had to do CG face replacements. Although all the infected had SFX makeup, in some shots, like real close-ups, the prosthetics still felt a bit too rubbery, necessitating CG enhancements or replacement. Of course, the team built upon the incredible work already done by the SFX department, which allowed them to add an even more realistic and disgusting appearance. In other instances, they had to “fill in the gaps”, adding tendrils, deteriorating teeth, etc. Finally, there were stunt doubles who needed their faces replaced with the actors’. "References are key to everything when you’re making photorealistic VFX", Nordahl emphasized. 3 BODY PROBLEM - PIXOMONDO Michael Schlesinger, Lead Compositor at Pixomondo in Germany, shared insights into the VFX work for "3 Body Problem". The variety of shots in this project ranged from subtle enhancements to a big a** atomic explosion in space. In this case, there were a lot of invisible VFX involved. A prominent example of this is that they had to add cold breath in 52 shots for the first episode, which was set in a cold environment. "It sounds simple, but you have to analyze how they talk and what they are doing to understand when the gas comes out of the mouth. It was all the more complicated, as they did not speak English!", Schlesinger explained. But the team also had to work on bigger full CG sequences, like a rocket launch in Cape Canaveral and the dream sequence with a paper boat, which was filmed against a blue backdrop with the actor.
The origami boat then had to be replaced so that it had a more paper-like texture, in addition to creating the entire environment (water, sky, fog, etc.). Dream sequence | © 2024 Netflix, Inc. For the atomic bomb explosion in space, they took a different approach and built it with compositing techniques. As mentioned before, referencing real footage is key, so Schlesinger was surprised to find that actual historical footage exists from "Operation Fishbowl", a 1962 series of US high-altitude nuclear tests in which warheads were detonated as high as 400 km above the Earth. However, the team had to balance scientific accuracy with creative liberty to make the explosion visually compelling. “Scientists probably will see this and go like, what the? But we do it for the viewers, for the shot,” he joked. Another particularly tricky aspect of VFX is depicting scale, and for the Pixomondo team the difficulty was one level up: in space, there are no familiar reference points. So, to convey how big the rocket’s golden sail was, the team used techniques like reducing the contrast of distant objects and sharpening closer ones, mimicking atmospheric effects even though space has no atmosphere. This approach helped create a convincing sense of scale. These conferences highlighted the meticulous work involved in VFX, showcasing the blend of technical skill and creativity required to bring fantastical or realistic elements to life on screen. But that’s not all we have in store from NIFFF this week. Stay tuned as we cover more exciting conferences and reveal our top five must-see movies from the festival. You won’t want to miss it!
- The art of miniature making
On the second day of the NIFFF Extended program, the focus pivoted from VFX, showcased with "The Last of Us" and "3 Body Problem", to practical SFX miniature making with Simon Weisse, and it was fascinating! Simon Weisse and his miniatures (image from the NIFFF) So, first, who is Simon Weisse? He's a master of miniature making, best known for his work with Wes Anderson, including "The Grand Budapest Hotel", the Roald Dahl shorts and "Asteroid City". Over time, Weisse has carved out a unique niche in the industry. He began his career in the late 80s, working on movies like "Event Horizon" and "V for Vendetta", to name a few. But with the advent of CGI, things changed. Lewis & Clark spaceship from "Event Horizon" at Lyon's Museum of Cinema & Miniature. “Twenty years ago, I thought my career was over; that’s why I started making props. But since 'The Grand Budapest Hotel', I’ve never had more work!”, he said. This resurgence doesn't mean he's abandoned prop work: for "Asteroid City", his studio handled both miniatures and functional props. The team created meteorites in various sizes, as well as jetpacks and guns. Among the miniatures were a UFO, ingeniously assembled from everyday kitchen utensils, and the train, a particularly memorable challenge since the model arrived in pieces from the States. "I was hoping it would be like assembling something from Ikea. With a manual! It was far from it, and we had to creatively find a way to piece it together!" he recalls with a laugh. Now, let's clarify something about miniatures for cinema: these aren’t your average models you can build in your living room. Cinematic “miniatures” can be larger than a car! For instance, Weisse was asked to create a 50-meter-long road for “The Wonderful Story of Henry Sugar”. It was so massive that he couldn’t find a place large enough to film it and had to use a greenhouse in Berlin. 
By the way, for those interested in seeing these works up close, the Museum of Cinema & Miniature in Lyon, France, houses many of his creations, including a forced-perspective set from the short film "The Swan", directed by Anderson, as well as the ship from "Event Horizon". Unfortunately, miniature making remains a labor-intensive and resource-heavy process, often requiring strong support from directors and producers. "For this kind of project to be feasible, it needs the full backing of the director and sometimes even the producer. Without their support, it simply won't happen," explains Weisse. This is where filmmakers like Wes Anderson make a significant difference. For "The Grand Budapest Hotel", the director's enthusiasm for miniature effects led Weisse and his team to spend about two months in preparation and another three to four months building the maquettes, all for just a few days of shooting. When asked about the skill set needed to be a miniature maker for movies, Weisse emphasized the meticulous nature of the work, which demands patience, passion, and a diverse range of talents. His team in Berlin includes architects, carpenters, painters, and more. “You must not be afraid of getting your hands dirty,” Weisse advises, adding, “the 3D guy is also able to cut wood, you know?” Finally, Weisse addressed the ongoing debate between practical and digital effects, dismissing it as nonsensical. Often faced with "purists" who insist on one method over the other, Weisse, an enthusiast for new technologies, advocates for a synthesis of traditional and modern techniques. “Combining these old techniques with new ones is truly an asset,” he stated.