Sound is the invisible glue of VFX
- Alex Iwanoff
- Jun 18
- 4 min read
Imagine two identical balls rolling toward each other. In silence, most viewers see them simply pass through each other. But add a brief collision‑like click at the moment they overlap, and suddenly they seem to bounce off each other instead. This audiovisual trick is known as the cross-bounce illusion.

This effect was explored in a study published in Scientific Reports, which showed that even imagined collision sounds can shift perception: participants who heard, or merely imagined, a collision noise were significantly more likely to see a bounce than those who didn’t.
This reveals something simple, but powerful: the brain fuses sight and sound. We don’t process visual and auditory information in isolation. On the contrary, we integrate them to build our version of reality. In ambiguous visual scenarios, sound steers what we think we see.
And it’s not just in science labs. This idea shows up everywhere, even in written formats like comic books and scripts. Think about it: KABOOM! CRACK! POOM! You don’t actually hear them, but your brain still does something with those cues. It makes the action feel more immediate, more real.

And this is a fact one must not overlook when making a film. Yes, music sets the tone and emotion of a scene. But when it comes to grounding the action itself, especially when the action isn’t real, sound design does the heavy lifting. It tells us what’s hard, what’s heavy, what’s moving fast or slow.
And that’s where visual effects meet their invisible partner: sound design. Because when it’s done right, synchronization anchors believability.
VFX WITHOUT SOUND IS HALF THE STORY
Visual effects without the right sound?
They just don’t land. They feel hollow. Fake.
Think of a volcano erupting: the roar, the ground-rumbling bass, the crackle of debris. Without these, the scene falls flat and unbelievable. A brawl without the sound of hits landing? You might as well be watching rehearsal footage (and even that gets enhanced sound!).
“Sound design immerses you in the film’s world and stirs your emotion”, Filmustage

This is already true for every genre (I mean, what is horror without the wet, bone-crunching sound effects?), but it becomes absolutely critical when what you’re seeing on screen doesn’t exist at all.
Take the iconic sounds of Star Wars. Like the VZOOM VZOOM (yes, you heard it!) of the lightsabers or the PEWW PEWW of the laser guns. These were created by sound designer Ben Burtt, who faced the challenge of bringing an entire galaxy to life through sound alone.
To give you an example, the shriek of the TIE fighters drew inspiration from WWII German dive bombers. The Ju‑87 Stukas were equipped with mechanical sirens called Jericho Trumpets. These weren’t tactical. Au contraire, they were installed purely for psychological impact, for terror. They announced the incoming chaos.
To create the same unease, Ben Burtt blended elephant calls with the sound of car tires on wet pavement, then added a Doppler effect to sell the motion. And voilà: designed fear, echoing real-world trauma from a galaxy not so far away.
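If you’re curious what that Doppler trick actually does to a sound, here’s a minimal sketch. It’s purely illustrative, not Ben Burtt’s actual process, and every name and number in it is made up for the example: a steady tone from a source flying past the listener rises in pitch on approach and drops as it recedes.

import numpy as np
from scipy.io import wavfile

SR = 44100        # sample rate (Hz)
C = 343.0         # speed of sound (m/s)
F0 = 440.0        # pitch of the source tone (Hz)
V = 80.0          # fly-by speed of the source (m/s)
D = 10.0          # closest distance to the listener (m)
DUR = 4.0         # clip length (s)

t = np.linspace(-DUR / 2, DUR / 2, int(SR * DUR))
x = V * t                                  # source position along its straight path
r = np.sqrt(x**2 + D**2)                   # distance from source to listener
v_radial = x * V / r                       # velocity toward (-) or away (+) from the listener
f_obs = F0 * C / (C + v_radial)            # classic Doppler shift: higher on approach, lower receding
phase = 2 * np.pi * np.cumsum(f_obs) / SR  # integrate instantaneous frequency to get phase
tone = np.sin(phase) / (1 + r / D)         # fade as the source gets farther away
wavfile.write("flyby.wav", SR, (tone * 32767).astype(np.int16))

Play the resulting file and even a plain sine wave suddenly reads as something rushing past you. That pitch sweep is exactly the kind of cue that makes a fly-by feel physical.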
“Sound is what truly convinces the mind that it is in a place; in other words, ‘hearing is believing’”, Jesse Schell, video game designer and CEO of Schell Games.
So, again, when audio falls short, everything falls apart. The LA Film School warns that sloppy audio can ruin an “otherwise spectacular production”. No matter how gorgeous the simulation is or how scary your monster looks, if the audio isn’t there, credibility crumbles.
This is also why you can sometimes save a mediocre effect with strong sound, or even skip the effect altogether and just suggest it with audio. Yes, you don’t always need to show it; just let your audience hear it.
“Sound is much more violent than images. There’s something physical, immediate about it—almost hand-to-hand. To make someone hear a sound is an act of intent”, Daniel Deshays, sound engineer and sound director.
AI‑DRIVEN SOUND DESIGN
Now, with generative AI, things are evolving fast. We’re starting to see tools that generate audio and visuals together, in sync from the start.

Academic work like Oxford’s “Audio‑Visual Synchronisation in the Wild” is feeding into new tools that prioritize tight sync in real-world footage. Platforms like AutoFoley use deep learning to generate synchronized Foley for silent footage. Even ElevenLabs is exploring AI-generated sound effects (which are genuinely fun to play with). And Veo 3 and other beta-stage tools promise end-to-end pipelines where visuals are generated together with their matching audio.
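To make “sync” a little less abstract, here’s a toy sketch. It is nothing like the actual Oxford model, just the core intuition: slide an audio activity envelope against a video motion envelope and pick the offset where they line up best. The signals below are synthetic stand-ins.

import numpy as np

FPS = 25                      # both envelopes sampled once per video frame
true_offset = 7               # in this toy example, audio lags video by 7 frames

rng = np.random.default_rng(0)
video_energy = rng.random(500)                                              # stand-in for per-frame motion energy
audio_onsets = np.roll(video_energy, true_offset) + 0.1 * rng.random(500)   # delayed, noisy copy of it

lags = np.arange(-30, 31)
scores = [np.corrcoef(video_energy, np.roll(audio_onsets, -lag))[0, 1] for lag in lags]
best = lags[int(np.argmax(scores))]
print(f"estimated offset: {best} frames (~{best / FPS * 1000:.0f} ms)")

Real systems learn far richer features than motion energy and onset strength, but the goal is identical: find the offset where picture and sound agree.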
But that raises a question: just because we can generate sound, are we doing it right? And also, in the case of VFX, can AI generate a sound that does not exist?
“Sound design is not about creating noise that simply mirrors what’s in the images, but about assigning value to certain elements within the visuals”, Daniel Deshays, sound engineer and sound director.
While AI can mimic, good sound design is still about intention. In a seminar called “The mise-en-scène of sound design”, Deshays points out that audio is about gesture: how far, how fast, how forcefully a sound evolves. Because in the end, “sound isn’t reality, it’s a way of listening. It’s an interpretation of the world”, as he put it himself. So use AI tools, but use them wisely. Play with them. Mix and remix them.
In conclusion, yes, you see with your eyes. But you believe with your ears.
Sound guides the imagination, fills in the blanks and makes the digital feel physical. It shapes expectations, sells impact and builds immersion.
That’s why sound is the invisible glue that makes the illusion stick.
Don’t do VFX without budgeting for sound design.
It’s the binding agent in a chef’s dish.
The mortar between the bricks.
Without it, your effects might taste, look and feel bland.