Sonic Design / Task 4 - Audio for Silent Movie & Eportfolio

Sonic Design / Final Project – Game Audio

December 31, 2025

01.12.2025 – 31.12.2025 / Week 12 – Week 14

Ge Xianjing / 0377636

Sonic Design / Bachelor of Interactive Spatial Design (Honours)


Project Selection: Why I Chose BattleBlock Theater for Sound Design


Assignment Context

For the final project of the Game Audio module (Week 12–Week 15), students were tasked with redesigning the sound for a silent gameplay video provided by the lecturer. The focus of this assignment was not on composing background music, but on sound effects, Foley recording, ambience, and audio feedback that support gameplay and enhance immersion.

The project required students to plan and organize their own Foley recording sessions, record sounds using everyday objects and real environments, and apply sound manipulation techniques such as time and pitch adjustment, effects processing, layering, and panning. Ultimately, the goal was to synchronize sound precisely with visual actions in order to bring the gameplay scene to life.

With these requirements in mind, selecting an appropriate gameplay clip became a critical first step in the sound design process.


Introduction

For my final sound design project, my primary goal was to select a gameplay clip that offered rich sonic possibilities, particularly in terms of Foley recording and audio feedback design. Rather than choosing a visually impressive game with limited opportunities for hands-on sound creation, I wanted a clip that would allow me to actively demonstrate my skills in recording, editing, and audio-visual synchronization.

To make an informed decision, I analyzed and compared three different gameplay videos, each with its own visual style, pacing, and sound design challenges.


Comparison and Analysis of Gameplay Options

(My three shortlisted options)

Option A: Ori and the Blind Forest

Ori and the Blind Forest is visually stunning and emotionally expressive. However, its sound design relies heavily on orchestral background music and abstract, magical sound textures.

From a sound design perspective, the character’s movement is extremely fluid, which makes it difficult to identify precise synchronization points for footsteps and physical interactions. In addition, recreating “magical flow” sounds using everyday objects is highly challenging and often requires extensive sound synthesis rather than organic Foley recording. As a result, this option was less suitable for demonstrating practical Foley techniques within the scope of this assignment.


Option B: Hollow Knight

Hollow Knight features a highly refined and atmospheric soundscape, with an emphasis on silence, reverb, and subtle environmental details. The game’s audio design is minimalistic and carefully restrained, contributing strongly to its mysterious tone.

However, this sparse approach also presents limitations for a student project. The sound design depends on high-fidelity textural sounds, such as precise cave reverberation and low-frequency ambience. While technically impressive, this style leaves fewer opportunities for dynamic sound events and playful Foley experimentation, which are key learning objectives of this project.


Final Choice: BattleBlock Theater

After careful consideration, I chose BattleBlock Theater as the gameplay video for my final sound design project. This decision was based on how well the game’s visual style, pacing, and interaction design align with the technical and creative goals of the assignment.


Reason 1: Clear and Grid-Based Animation

Unlike the fluid animation found in Ori, character movement in BattleBlock Theater is snappy, rhythmic, and clearly defined. Actions such as jumping, landing, colliding with walls, and swimming are visually distinct and occur at precise moments.

This clarity makes BattleBlock Theater especially suitable for sound design, as it allows for accurate audio-visual synchronization. Each movement presents a clear opportunity for sound placement, making it easier to design responsive and tightly synced sound effects.

Reason 2: Strong Potential for Creative Foley

The comical, paper-cut-inspired art style of BattleBlock Theater encourages exaggerated and playful sound design. Rather than aiming for realism, the game embraces a stylized aesthetic that supports creative and organic Foley techniques.

This gave me the freedom to experiment with everyday objects to create expressive sounds. For example, wooden stage footsteps could be recreated by tapping fingers on a wooden box, while swimming sounds could be simulated using a spray bottle to produce crisp, cartoon-like water textures. These approaches align well with the project’s emphasis on hands-on Foley recording and sound experimentation.



Research and Learning Approach


Before starting the actual sound recording and editing, I first searched online for gameplay videos of BattleBlock Theater with the original audio. I watched and listened to these videos several times to get a better understanding of the game’s overall sound style and how sound supports the gameplay.

Instead of trying to copy the original sounds exactly, I treated them as a reference. I focused more on what each sound does in the game—such as how movement, interactions, and rewards are communicated through audio—rather than how the sounds are made. This helped me understand how audio feedback works together with the visuals.

From this research, I noticed that the sounds in BattleBlock Theater are simple, short, and highly responsive. Many of them are exaggerated and closely synced with the character’s actions. Based on this, I started planning how I could recreate similar functions using everyday objects, while still adding my own creative ideas rather than producing a one-to-one copy.


Part 1: Foley Recording Plan (The "How-To" List)


After completing my audio planning and sound list, I started preparing for the Foley recording stage. Based on my final audio list, I identified which sounds needed to be recorded in a controlled environment and which could be captured at home using a microphone.

On Tuesday, I booked a recording session in the lecturer’s recording studio. The studio environment allowed us to clearly test and compare different sound possibilities for each action in the gameplay. Instead of recording everything in one take, we focused on testing, experimenting, and refining each sound through multiple attempts.

Before entering the studio, I prepared a collection of everyday objects and materials. These items were chosen for their texture, weight, and interaction qualities, which could be translated into exaggerated and playful sound effects that match the style of BattleBlock Theater. During the session, each object was tested against different actions in the video, such as footsteps, impacts, interactions, and environmental movement.

Some sounds were recorded in the studio for better clarity and noise control, while others were recorded at home using a microphone, where I could experiment more freely with positioning and performance. This combination allowed me to balance sound quality with creative flexibility.

In addition to Foley sounds, I also wanted to include instrument-based sounds as subtle background cues or opening elements. These sounds were not intended to function as background music, but rather as atmospheric hints that support the game’s theatrical tone. For this purpose, I used the Apple iPhone music application to test and record simple instrument sounds, which were later refined during post-production.

Foley materials prepared for sound testing


  • Sound testing session inside the recording room, where each action was repeatedly tested using different materials to find the most suitable sound
Recording studio control desk


Foley recording process in the studio


  • Testing instrument-based sounds to be used as subtle background or opening audio elements, supporting the theatrical atmosphere of the game.
Instrument sound testing for background cues

  • All original sound effects in this project were recorded by myself using my personal mobile phone.
    Some recordings were captured using a mobile recording application, which exported files with a system-defined format label (e.g. cloudmusic.wav).

    To ensure clarity and originality, all recorded files were manually renamed using descriptive Foley-based naming conventions and organised into an original recordings folder.

  • Any audio files that were used only for reference or testing were excluded from the final submission folders to comply with the original-sound-only requirement.

A collection of everyday objects prepared for Foley recording, including plastic, paper, metal items, fabric, and soft materials. These were tested to recreate footsteps, impacts, and environmental interactions.

Water spray
Curtain roller sound
Pop

Door closing


Replacing Library-Based Music with Real Instrument Recordings

During the sound design process, I initially tested several music-like audio cues as placeholders to understand timing and emotional rhythm. However, after the lecturer clarified that music libraries and pre-existing musical sounds were not allowed, I decided to fully replace all such elements with real, self-recorded instrumental sounds.

Instead of using any digital music sources, I shifted my approach toward physical sound-making using simple percussion instruments, allowing the project to remain fully compliant with the “original sound only” requirement while still maintaining rhythmic and expressive audio feedback within the gameplay.

This decision helped me better align with the core learning objective of the module: sound design through recording, manipulation, and intentional use of physical sound, rather than relying on musical composition or external audio resources.


Real Instrument Recording as Sound Design Elements

To replace library-based musical sounds, I selected several simple, non-digital percussion instruments and recorded them as sound effects rather than music.

1. Small Brass Cymbals

The brass cymbals were used to produce short metallic impacts and shimmering decay.
By controlling strike strength and damping techniques, I was able to generate sounds suitable for:

  • short transitions

  • reward cues

  • momentary emphasis without forming a musical phrase

These sounds were treated as sound effects with tonal qualities, not as background music.



2. Triangle Instrument

The triangle was recorded using light and controlled strikes to capture clean, high-frequency ringing sounds.
Instead of playing rhythmic patterns, I focused on:

  • single hits

  • short resonant tails

These sounds worked well as UI feedback and alert cues, offering clarity without distracting the player.



3. Wooden Maracas

The maracas were recorded with different shaking speeds and movement intensities.
This allowed me to generate:

  • soft rhythmic textures

  • subtle motion-based sound layers

By editing and trimming the recordings in Adobe Audition, the maraca sounds were transformed into non-musical ambient textures that support gameplay pacing rather than functioning as music.


Post-Processing and Sound Design Approach

All instrumental recordings were further processed in Adobe Audition to ensure they functioned as sound effects rather than musical elements.

The post-production process included:

  • trimming recordings into short, event-based clips

  • removing rhythmic repetition to avoid musical structure

  • subtle EQ adjustments to enhance clarity

  • dynamic control to maintain consistency with other Foley sounds

Noise Reduction

Parametric EQ



Dynamics



From Raw Recording to Usable Sound (Integrated Plugin Workflow)

After all recorded audio files were exported from the studio computer, I first conducted a basic organisation and categorisation of the sound materials. The recordings were grouped into Foley effects, character reaction sounds, environmental ambience, and special effects. This initial sorting process helped me clearly identify which sounds could be used directly and which ones required further refinement.

During this stage, I noticed that several raw recordings did not fully match the visual intensity of the gameplay footage. For example, some humming and groaning sounds recorded for the cat character appeared too thin and soft in their original form, making them unconvincing when representing moments of impact or attack. Instead of simply increasing the volume, these sounds needed to be reshaped to feel heavier, rougher, and more forceful through post-processing.

To achieve this, I applied an integrated plugin workflow in Adobe Audition, including EQ adjustments, dynamic processing, distortion, and layering techniques. By reshaping the frequency balance and texture of the original recordings, I was able to enhance their perceived weight and aggression while maintaining their organic qualities.

I also realised that certain exaggerated effects, such as explosion-like impacts or burst-style visual cues, required a more constructed sound design approach rather than relying on a single raw recording. For these moments, I layered multiple self-recorded sound sources—such as air bursts, object impacts, and noise textures—and further refined them using plugins to better match the rhythm, timing, and visual style of the gameplay.

Original sound 1
Many of these sounds were recorded in multiple takes, so some of the content is repetitive


Original sound 2

Audio Storyboard and Synchronisation

Based on a detailed analysis of the gameplay video, I created an Audio Storyboard to map every sound event to its corresponding timestamp. This allowed precise synchronisation between sound and visual actions during the editing process.








Plugin Workflow and Processing Logic

Early Audio Preparation

After opening the raw recordings, I noticed that some of the sounds were recorded from a slightly far distance, which caused the overall level to be relatively low and less clear. To improve this, I first adjusted the gain of each recording, increasing the volume to make the sound more present and easier to work with in later processing. This step helped ensure that all clips had a more consistent loudness before moving on to noise reduction and sound shaping.
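To show what this gain stage does conceptually, here is a small Python sketch (numpy assumed) of peak normalisation, which brings each clip's loudest point up to a common level; the -3 dBFS target here is an illustrative value, not the exact setting used in Audition.

```python
import numpy as np

def normalize_peak(samples: np.ndarray, target_db: float = -3.0) -> np.ndarray:
    """Scale a mono recording so its loudest peak sits at target_db dBFS."""
    peak = np.max(np.abs(samples))
    if peak == 0:
        return samples  # silent clip: nothing to scale
    target_linear = 10 ** (target_db / 20)  # dBFS -> linear amplitude
    return samples * (target_linear / peak)

# A quiet recording (peak at 0.1) brought up to about -3 dBFS (~0.708)
quiet = np.array([0.05, -0.1, 0.02])
boosted = normalize_peak(quiet)
```

Normalising every clip to the same peak first made the later noise-reduction and EQ stages easier to compare across recordings.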

When I initially attempted to process the sounds using Adobe Audition, I found that its control over noise reduction and spatial reverb was relatively limited, especially when more precise adjustments were required. After consulting a senior who specializes in game voice-over and sound effect design, I began using a more professional plugin workflow to process the sounds in stages.

1. Footstep Sound Processing

The original footstep recording sounded quite muffled, mainly because it was recorded by tapping on a cardboard box. Before further editing, I first applied noise reduction to remove unwanted background noise and clean up the recording.

After that, I used a Parametric Equalizer to adjust the tonal balance of the sound. Since the footsteps lacked clarity, I slightly boosted the higher frequencies to make the sound feel more defined and realistic. At the same time, I controlled unnecessary low frequencies to prevent the sound from becoming muddy.

Through noise reduction and EQ adjustment, the footstep sound became clearer and more suitable for use in the gameplay, while still retaining a natural and believable texture.
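Conceptually, the low cut and high lift can be sketched in Python (numpy and scipy assumed). The crossover frequencies and boost amount below are illustrative stand-ins for the actual Parametric Equalizer settings, and the high-frequency lift is approximated here by mixing in a high-passed copy of the signal rather than a true shelf filter.

```python
import numpy as np
from scipy import signal

def brighten_footstep(x: np.ndarray, sr: int = 48000) -> np.ndarray:
    """Crude stand-in for the parametric EQ move: cut rumble below
    ~150 Hz, then lift the top end by mixing in a high-passed copy
    of everything above ~3 kHz."""
    # Low cut: 2nd-order Butterworth high-pass removes muddy lows
    sos_lo = signal.butter(2, 150, btype="highpass", fs=sr, output="sos")
    cleaned = signal.sosfilt(sos_lo, x)
    # High lift: add back an attenuated copy of the upper frequencies
    sos_hi = signal.butter(2, 3000, btype="highpass", fs=sr, output="sos")
    return cleaned + 0.5 * signal.sosfilt(sos_hi, cleaned)
```

The same two moves (controlling lows, clarifying highs) were applied in Audition's Parametric Equalizer with its graphical bands rather than in code.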


2. Reward Sound Design (Based on Glass Tapping)

The reward sound was created based on a glass tapping recording. I chose glass because it naturally produces a clear, bright, and “reward-like” tone that fits the visual feedback of collecting items in the game.

During post-processing, I first cleaned the recording using noise reduction, then adjusted the sound using an equalizer to boost the high frequencies and slightly reduce unnecessary low frequencies. This helped the sound feel sharper and more noticeable.

I also applied a light chorus effect to subtly widen the sound, making it feel more satisfying without becoming too heavy or distracting. The final result is a short, crisp reward sound that clearly communicates success to the player.
EQ and Chorus settings

3. Transition Sound Design

The lecturer provided guidance on this effect.

For the transition sound when the diamond appears, I used a combination of echo and reverb. By controlling delay time, feedback, and wet/dry balance, I created a smooth audio transition that connects the visual action and the reward sound naturally. This helped the sound feel more dynamic while still fitting within the game’s cartoon-style world.
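The echo part of this effect can be sketched as a single feedback delay line in Python (numpy assumed): each repeat arrives one delay time later and is scaled down by the feedback amount. The delay time and feedback values below are illustrative, not the exact Audition settings.

```python
import numpy as np

def feedback_echo(x: np.ndarray, sr: int = 48000,
                  delay_s: float = 0.18, feedback: float = 0.45) -> np.ndarray:
    """Single feedback delay line: repeats spaced delay_s apart,
    each pass `feedback` times quieter than the last."""
    d = max(1, int(delay_s * sr))
    y = np.concatenate([x.astype(float), np.zeros(3 * d)])  # room for the tail
    for n in range(d, len(y)):
        y[n] += feedback * y[n - d]  # each sample feeds a quieter repeat
    return y
```

In Audition the wet/dry balance then controls how loud these repeats sit against the original sound, and the reverb smooths the gaps between them.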


4. Character Movement Through Grass

For the sound of the character moving through grass, I chose to create the effect using a plastic bag rubbing technique. The texture and irregular noise produced by the plastic bag closely resemble the rustling sound of grass when a character walks or brushes past it, making it suitable as a Foley substitute.

During the recording stage, I rubbed and lightly squeezed the plastic bag at different speeds and pressures to capture variations in texture. These recordings were then edited and synchronised with the character’s movement in Adobe Audition to match the timing of the gameplay.

However, during the initial implementation, I noticed that the sound felt too harsh and uncomfortable, especially when layered with other sound effects. This was mainly caused by the original gain being set too high, which resulted in an overly sharp and “ear-piercing” sound.


Instead of discarding the sound, I refined it through post-processing. I reduced the overall gain, applied EQ to soften the high-frequency range, and adjusted the dynamics to ensure the sound blended more naturally with the environment. After these adjustments, the grass movement sound became more controlled and convincing without overwhelming the listener.
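As a rough Python sketch of this fix (numpy and scipy assumed), the gain reduction and high-frequency softening could look like the following; the gain amount and cutoff frequency are illustrative values, not the exact settings I used in Audition.

```python
import numpy as np
from scipy import signal

def tame_grass(x: np.ndarray, sr: int = 48000,
               gain_db: float = -9.0, cutoff_hz: float = 4000) -> np.ndarray:
    """Pull the level down, then roll off the harsh top end with a
    4th-order Butterworth low-pass filter."""
    x = x * 10 ** (gain_db / 20)  # dB gain reduction -> linear scaling
    sos = signal.butter(4, cutoff_hz, btype="lowpass", fs=sr, output="sos")
    return signal.sosfilt(sos, x)
```

The order of operations matters less here than the combination: lowering the gain alone left the texture harsh, while the low-pass alone left it too loud.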

This process helped me understand the importance of balance and restraint in sound design, especially when working with textured Foley sounds that can easily become fatiguing if not carefully shaped.

On review, the noise was still slightly too loud and needed further adjustment

5. Explosion Impact Sound

To create the explosion impact sound, I started with a physical Foley recording using a blown-up plastic tube. By rapidly striking and bursting the inflated plastic, I was able to capture a sharp, high-energy transient at the exact moment the material collapsed. This short, aggressive burst provided a strong foundation for an explosion-style sound effect.

After recording, I first applied noise reduction to remove unwanted background noise while preserving the initial impact transient. This ensured that the sound remained clean and focused before further processing.


To give the sound more weight and intensity, I used a pitch shifter to lower the pitch, which helped transform the original sharp plastic snap into a deeper and heavier explosion-like impact. This pitch reduction made the sound feel more powerful and suitable for a large-scale in-game event.
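The pitch drop can be illustrated with a naive resampling sketch in Python (numpy assumed). Unlike Audition's pitch shifter, this approach also stretches the clip's duration, which is usually acceptable for a one-shot impact; the semitone value below is illustrative.

```python
import numpy as np

def pitch_drop(x: np.ndarray, semitones: float = -7.0) -> np.ndarray:
    """Naive pitch shift by resampling: playing the clip back slower
    lowers the pitch but also lengthens it."""
    ratio = 2 ** (semitones / 12)            # -12 st = half speed, etc.
    n_out = int(round(len(x) / ratio))       # slower playback -> more samples
    old_idx = np.arange(len(x))
    new_idx = np.linspace(0, len(x) - 1, n_out)
    return np.interp(new_idx, old_idx, x)    # linear interpolation resample
```

For the explosion, the added length was actually useful, since it made the burst feel larger and slower-moving than the original plastic snap.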

Next, I applied reverb to simulate spatial depth and energy dispersion. The added reverb allowed the sound to feel as if it was occurring within a larger environment rather than sounding flat or too close to the listener.

Additionally, a subtle phase adjustment was applied to slightly alter the stereo perception of the sound. This created a wider and less centered image, helping the explosion feel more unstable and chaotic, which further enhanced its realism.

Through this combination of physical Foley recording and controlled post-processing, a simple plastic burst was transformed into a convincing explosion impact sound that effectively supports the visual intensity of the gameplay.

Reverb settings

Selected Audio Journal Breakdown

The table below provides a detailed documentation of the sound effects I designed for this project. For each selected sound, I have recorded every step of the production process—from the initial Foley recording session (using household props) to the specific editing and processing techniques applied in Adobe Audition. This breakdown demonstrates how I transformed raw, everyday sounds into stylized game audio assets through careful layering and effect manipulation.

Audio Asset List



The effects audio folder



Sound Grouping & Foley Design Overview


I originally added an ambient noise layer, but I felt it was unnecessary and later removed it


Game Audio


Gameplay video with implemented sound




Reflection

What Worked Well

Throughout the production process, layering was the most successful and effective technique in my workflow. At the beginning, my recording of the “curtain close” sound simply resembled a jacket being shaken and lacked any sense of weight or theatrical impact. However, after layering a sliding sound (recorded by pulling real curtains) with an impact sound (closing a heavy book), and applying appropriate reverb, the sound was transformed into a dramatic, stage-like event.

This experience taught me that good sound design rarely depends on a single perfect recording. Instead, it is often the result of combining multiple simple sounds to create a new and more expressive auditory outcome.
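The layering idea can be sketched in Python (numpy assumed): two recordings are summed with an offset and a gain, then normalised so the combined peak cannot clip. The function name, offsets, and gain values here are all illustrative, not taken from my actual session.

```python
import numpy as np

def layer_clips(slide: np.ndarray, impact: np.ndarray,
                impact_offset: int, impact_gain: float = 0.8) -> np.ndarray:
    """Mix a sliding texture with an impact placed impact_offset samples in,
    then re-normalise only if the summed peak would clip."""
    n = max(len(slide), impact_offset + len(impact))
    mix = np.zeros(n)
    mix[:len(slide)] += slide
    mix[impact_offset:impact_offset + len(impact)] += impact_gain * impact
    peak = np.max(np.abs(mix))
    return mix / peak if peak > 1.0 else mix
```

In practice this is exactly what stacking clips on the Audition multitrack timeline does: the slide carries the motion, the impact lands at the sync point, and a shared reverb glues the layers together.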


What Did Not Work (and How I Resolved It)

One of the main challenges I encountered was designing the grass rustling sound. My initial attempt used a plastic bag, but the resulting sound was too loud and overly high-pitched. It sounded unnatural and distracting, drawing attention away from the main character’s actions.

Resolution:
Rather than re-recording the sound, I chose to fix the issue during post-production. I significantly reduced the gain and used a Parametric EQ to cut the high frequencies by applying a low-pass filter. This softened the texture and helped the sound blend into the background as ambient foliage, rather than standing out as harsh noise.


Additional Challenges During Production

Beyond individual sound issues, I faced several broader challenges during the overall production process.

1. Audio–Visual Synchronisation
I realised early on that even a well-designed sound effect can feel incorrect if it is slightly out of sync with the visuals. Small timing mismatches made actions such as jumping, hitting, or collecting items feel weak or unnatural. As a result, I spent a considerable amount of time fine-tuning sound placement on the timeline to ensure precise synchronisation with on-screen actions.

2. Balancing Sound Layers and Clarity
When multiple sound effects played simultaneously, some sounds began to mask others, reducing clarity. To resolve this, I carefully adjusted EQ and volume levels to prioritise important sounds. For example, footstep sounds were kept within the low–mid frequency range, while reward and feedback sounds were made brighter so that they could be easily perceived by the player.

3. Noise Reduction vs. Sound Quality
Another challenge was finding the right balance between removing background noise and preserving the natural texture of the sound. Excessive noise reduction caused some recordings to sound thin or artificial. Through experimentation, I learned to remove only the necessary noise while keeping enough detail to maintain a natural and dynamic sound.


What I Would Improve

If I had more time, I would create more variations for repetitive sounds such as footsteps and jumps. Currently, the character uses only two to three footstep samples, which could become repetitive over a longer gameplay experience. Ideally, I would record five to ten variations and apply slight pitch and volume differences to create a more organic and natural result.


Conclusion

This project was a valuable learning experience that strengthened both my technical control and creative problem-solving skills. It demonstrated that sound design is not simply about creating sounds, but about constant testing, refinement, and decision-making. Through careful post-processing in Adobe Audition, even simple household objects can be transformed into functional and expressive game audio, helping to build a cohesive and immersive sonic world.


