Advanced Modeling and Animation / Weekly Learning
September 23, 2025
24.09.2025 / Week 1 - Week 14
GeXianjing / 0377636
Advanced Modeling and Animation / Bachelor of Interactive Spatial Design (Honours)
Week 2 Key Learning Points
1. Display Length Units (Real-World Scale)
In VR development, maintaining accurate real-world scale is essential for player comfort and immersion.
- Settings: Scene > Units
- Check measurements: Edit Mode > Overlays > Measurement > Edge Length
Learning this helps ensure all 3D models fit realistically in a virtual environment.
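Since Blender scenes default to metres while Unreal Engine units are centimetres, the scale relationship can be sanity-checked with a tiny helper. This is an illustrative sketch in plain Python, not part of either tool; the function name is mine.

```python
# Illustrative helper: Blender works in metres by default, while
# Unreal Engine units are centimetres, so lengths scale by 100 on export.
def blender_to_unreal(metres: float) -> float:
    """Convert a Blender length in metres to Unreal Engine centimetres."""
    return metres * 100.0

# A door modelled 2.1 m tall should read as roughly 210 units in UE5.
print(round(blender_to_unreal(2.1), 3))
```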
2. Mirror Modifier
The Mirror Modifier allows symmetrical modelling along the X, Y, or Z axes.
- Bisect: Cuts and keeps one side of the mesh.
- Clipping: Prevents vertices from crossing the mirror plane.
It is particularly useful for modelling symmetrical structures like characters or architecture.
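What the modifier does geometrically can be mimicked with plain coordinates. This is a toy sketch, not the Blender API; `mirror_x` and `merge_threshold` are names I made up for illustration.

```python
# Toy model of the Mirror modifier on the X axis: every vertex gets a
# mirrored copy, and "clipping" snaps vertices so they cannot cross x = 0.
def mirror_x(vertices, clipping=True, merge_threshold=0.001):
    clipped = []
    for x, y, z in vertices:
        if clipping and x < 0:
            x = 0.0  # a vertex may not cross the mirror plane
        clipped.append((x, y, z))
    # Mirrored copies, skipping points sitting on the plane itself.
    mirrored = [(-x, y, z) for x, y, z in clipped if x > merge_threshold]
    return clipped + mirrored

half = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (-0.2, 2.0, 0.0)]
print(mirror_x(half))
```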
3. Boolean Modifier
This modifier combines two meshes using Union, Difference, or Intersect operations.
It simplifies the creation of complex shapes that would otherwise require manual editing.
4. Bevel Edges
The Bevel tool smooths out edges and corners to create more realistic geometry.
- Segments: Controls smoothness (more segments = smoother bevel).
This technique enhances the realism of models and prevents overly sharp edges.
5. Exporting to Unreal Engine (FBX Format)
Models are exported in FBX format to be used in Unreal Engine 5 (UE5).
Proper export ensures that scale, materials, and geometry are accurately transferred into the VR environment.
6. Material Concepts in UE5
Materials in Unreal Engine are built using node-based graphs called Material Expressions.
- Scalar Parameter: Controls single numeric properties such as roughness or metallic level.
- Vector Parameter: Controls color or emissive properties (R, G, B, A).
These parameters enable real-time material customization without recompiling shaders.
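How an instance overrides its parent's defaults without a shader recompile can be modelled schematically. This is a plain-Python stand-in, not the real UE API; the `MaterialInstance` class and its methods are invented for illustration.

```python
# Schematic model of a Material Instance: the parent material supplies
# defaults, and the instance stores per-parameter overrides that can be
# swapped at runtime without recompiling anything.
class MaterialInstance:
    def __init__(self, parent_defaults):
        self.defaults = parent_defaults   # values baked into the parent
        self.overrides = {}               # real-time, per-instance tweaks

    def set_scalar(self, name, value):
        self.overrides[name] = float(value)

    def set_vector(self, name, rgba):
        self.overrides[name] = tuple(rgba)

    def get(self, name):
        return self.overrides.get(name, self.defaults[name])

glass = MaterialInstance({"Roughness": 0.5, "BaseColor": (1, 1, 1, 1)})
glass.set_scalar("Roughness", 0.05)
glass.set_vector("BaseColor", (0.2, 0.6, 1.0, 0.4))
print(glass.get("Roughness"), glass.get("BaseColor"))
```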
Reflection
Through this session, I learned how to model efficiently in Blender with proper proportions, apply essential modifiers, and prepare assets for Unreal Engine.
Understanding how materials function through scalar and vector parameters gave me clearer control over visual outcomes.
In the future, I plan to explore material instances and interactive VR elements to make my projects more dynamic and immersive.
Week 3 - My Learning Reflection
Overview
This week, I explored how materials and environmental elements work in Unreal Engine 5. It was my first time understanding how each surface in a 3D environment is defined by nodes, parameters, and lighting behavior. I learned how to use Unreal’s Material Editor to build realistic, stylized, and animated materials that react to light, transparency, and color changes in real time.
My main goal was to understand how to connect what I model in Blender with a fully interactive and visually immersive environment in Unreal Engine.
Learning About Materials
I discovered that materials in Unreal Engine define everything about a surface — from its color and shininess to its texture and transparency. The node-based system helped me visualize how light interacts with different objects. By adjusting the nodes, I could make surfaces look like metal, glass, or even soft skin.
Material Domain
I learned that each material has a specific domain:
- Surface for regular 3D objects like walls or props.
- Deferred Decal for details such as dirt or bullet holes.
- Post Process for full-screen effects like color filters or motion blur.
Blend Mode
The Blend Mode decides how transparent or solid an object is. I practiced using:
- Opaque for solid walls.
- Masked for fences or leaves.
- Translucent for glass or water surfaces.
Shading Model
Different shading models affect how light interacts with the object.
For example, Default Lit supports realistic lighting, Unlit ignores lighting completely (great for glowing effects), and Subsurface made me realize how skin or wax allows light to pass through it — a small but beautiful detail.
Exploring Material Parameters
I learned to use two main parameter types:
- Scalar Parameter — controls single numeric values like roughness, metallic strength, or opacity.
- Vector Parameter — controls color values (R, G, B, A).
These parameters can be changed instantly in a Material Instance, which saves time because I don’t need to rebuild the entire material every time I tweak a value.
Dynamic Effects — Time & Sine Nodes
One of my favorite discoveries was how to use the Time and Sine nodes to make objects pulse or flash.
By connecting these nodes to the emissive color, I created a simple but effective blinking light animation. It was exciting to see how basic math functions could create dynamic, living effects in a 3D world.
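The Time → Sine → Emissive node chain boils down to simple maths. Here is a hedged plain-Python sketch of that idea; the function and parameter names are mine, not UE node names.

```python
import math

# Plain-maths version of the Time -> Sine -> Emissive chain: the sine
# output in [-1, 1] is remapped to [0, 1] and scaled, so the emissive
# strength pulses smoothly between 0 and max_intensity.
def emissive_strength(time_s, speed=2.0, max_intensity=5.0):
    pulse = (math.sin(time_s * speed) + 1.0) / 2.0
    return pulse * max_intensity

for t in (0.0, 0.5, 1.0):
    print(round(emissive_strength(t), 2))
```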
☁️ Building the Environment
I also learned to set up the SkySphere, Exponential Height Fog, and Volumetric Clouds to enhance the atmosphere:
- The SkySphere adds the sun, clouds, and stars.
- Height Fog creates depth and realism by simulating how fog thickens with distance.
- Volumetric Clouds react beautifully to lighting changes, giving the scene a cinematic feeling.
These tools helped me understand how environment lighting affects mood, storytelling, and realism in virtual spaces.
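The way height fog "thickens with distance" can be approximated with two textbook formulas: an exponential falloff of density with height, and Beer-Lambert style accumulation with viewing distance. This is an assumption-level sketch; the parameter names are illustrative, not the actual UE5 fog properties.

```python
import math

# Hedged sketch of Exponential Height Fog: density decays exponentially
# as you go up, and fog builds up the farther you look through it.
def fog_density(height, base_density=0.02, falloff=0.2):
    return base_density * math.exp(-falloff * height)

def fog_opacity(density, distance):
    return 1.0 - math.exp(-density * distance)  # more distance, more fog

print(round(fog_density(0.0), 4), round(fog_density(50.0), 6))
print(round(fog_opacity(fog_density(0.0), 200.0), 3))
```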
My Workflow Practice
I practiced creating new materials by:
- Right-clicking in the Content Browser → Create → Material.
- Naming it clearly and opening it in the Material Editor.
- Adding nodes such as Vector Parameter, Multiply, or Sine.
- Connecting them to build custom effects like transparency or flashing lights.
Seeing my materials come to life in real time was one of the most satisfying parts of this process.
Personal Reflection
This session helped me see how technical control and creative design work together in Unreal Engine. I used to think materials were just surface textures, but now I realize they are dynamic systems that respond to light, time, and interaction.
I’m especially interested in exploring interactive lighting, animated shaders, and VR-friendly materials in my future projects. Understanding these fundamentals gives me more confidence to bring my Blender models into immersive Unreal Engine environments that truly feel alive.
Week 4
In Week 4, we mainly focused on learning how to build a basic environment within an empty level in Unreal Engine.
The lesson started with an entirely blank scene, where we gradually added essential environmental components such as the Sky Atmosphere, Directional Light, Sky Light, Landscape (or Plane), and Exponential Height Fog. Together, these elements form the foundation of a visually coherent environment.
We also learned how to apply materials to the ground, adjusting parameters like Base Color, Roughness, and Normal Map to create more realistic surface reflections under lighting. Additionally, the use of the Post Process Volume was introduced to fine-tune the overall tone and atmosphere of the scene.
This week helped me develop a clearer understanding of the fundamental workflow of environment setup — how lighting, sky, and material settings work together to build an immersive foundation for later modeling and rendering tasks.
Week 5
Week 5 was mainly reserved for continuing our Assignment 1 project. Our lecturer gave us extra time to further develop our individual works — refining model structures, adding details, and testing materials and lighting setups.
During this week, I focused on improving the two amusement ride models I had been working on and completed some unfinished parts in Blender, preparing them for import into Unreal Engine.
Week 6: UE5 Level Sequence Animation
What I Learned
In this session, I learned how to create and control animations in Unreal Engine 5 using the Level Sequence tool.
It was my first time exploring the Sequencer panel in detail, and I gained a clearer understanding of how keyframes, actors, and timelines interact to bring a 3D scene to life.
What the Lecturer Demonstrated
During the class, the lecturer demonstrated the complete process step by step:
- Creating a Level Sequence inside the Content Browser by right-clicking → Cinematics → Level Sequence.
- Naming and opening the sequence in the Sequencer panel.
- Adding actors (such as static meshes, cameras, or lights) to the sequence using the +Add → Actor to Sequence option.
- Animating actor properties, especially transform values like Location and Rotation, by setting keyframes at different frames (for example, frame 0 and frame 150).
- Previewing the result using the Play button to observe how the mesh moved smoothly between two points.
The lecturer also explained how Level Sequence works as a timeline that coordinates multiple elements—camera movement, lighting, visibility, or object motion—within the same animation clip.
My Practice and Understanding
I created a simple animation for a static mesh cube: it moved from one side of the scene to the other within 150 frames.
By experimenting with keyframes, I learned that the Sequencer automatically interpolates the transition between the start and end positions, creating smooth animation without needing to animate each frame manually.
I also noticed that adjusting the timing between keyframes can completely change the rhythm of the animation — shorter distances create quick motion, while longer timelines feel more cinematic.
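What Sequencer does between two keys can be shown with linear interpolation. This is a simplification (Sequencer's default curves are eased, not linear), and the frame/value numbers are just the 150-frame example from class.

```python
# Minimal model of interpolating between two transform keyframes:
# key A at frame 0, key B at frame 150, value held outside that range.
def interpolate(frame, key_a=(0, 0.0), key_b=(150, 600.0)):
    f0, v0 = key_a
    f1, v1 = key_b
    t = (frame - f0) / (f1 - f0)   # normalised position along the timeline
    t = max(0.0, min(1.0, t))      # hold the pose before/after the keys
    return v0 + (v1 - v0) * t

print(interpolate(0), interpolate(75), interpolate(150))  # 0.0 300.0 600.0
```

Stretching `key_b` to a later frame slows the motion down, which matches how longer timelines feel more cinematic.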
Reflection
Through this exercise, I realized that Level Sequence is more than just a timeline — it is a storytelling tool inside UE5.
It allows designers to build camera shots, control lighting, and choreograph environmental motion, which can be very useful for my future projects such as interactive installations, cinematic spaces, or immersive retail environments.
Overall, this lesson helped me connect the idea of visual narrative with technical animation control, giving me a more complete understanding of how digital environments can express atmosphere and emotion through motion.
Week 7 – Finalizing Assignment 2 & Project Refinements
This week marks the submission period for Project 2 in Advanced Modelling & Animation. My main focus was to carefully review whether my work fully meets the assignment requirements, improve the weaker parts of my animation, and complete the final scene setup before recording the gameplay.
✔️ Checking the Assignment Requirements
At the beginning of the week, I went through the assignment brief again to make sure every part of my project meets the expectations:
- Creating a theme park ride animation
- Using Blueprint, Timeline, or Level Sequence
- Adding interactive shaders or material reactions
- Ensuring smooth performance
- Presenting everything in a final gameplay video
After checking each section against my progress, I confirmed that my project includes all required components:
⭐ A central rotating ride structure
⭐ Wave-based secondary animation
⭐ Interactive light response via Blueprint
⭐ Custom shaders with emissive control
⭐ A complete gameplay recording
Improving the Animation Based on Feedback
During my first presentation, my lecturer liked the concept of my rotating axis, but encouraged me to:
- Extend the duration of the animation
- Add movement beyond simple vertical/horizontal translation
- Introduce a continuous wave-like motion for more flow and rhythm
To address this, I extended the entire animation timeline and added a phase-shifted wave pattern:
as the center rotates, the outer components rise and fall one after another, creating a dynamic “energy wave” motion.
I also synchronized the lighting and emissive effects with this wave, making the whole installation feel more futuristic and alive.
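The phase-shifted wave comes down to one sine per component, offset by its index so the rise and fall travels around the ride. A sketch of the idea, with made-up parameter values:

```python
import math

# Sketch of the "energy wave": every outer component follows the same
# sine, shifted by an evenly spaced phase, so the motion travels around
# the ride. count/amplitude/speed are illustrative example values.
def component_height(time_s, index, count=8, amplitude=50.0, speed=1.5):
    phase = (index / count) * 2.0 * math.pi  # evenly spaced phase offsets
    return amplitude * math.sin(time_s * speed + phase)

heights = [round(component_height(0.0, i), 1) for i in range(8)]
print(heights)
```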
Scene Enhancement & Material Issues
While improving the animation, I also wanted to redesign the environment.
I modified the ground material and adjusted the sky atmosphere to create a darker, more mysterious sci-fi mood.
However, when I merged the two levels together, an unexpected issue occurred: many of the building materials completely disappeared.
This happened because Unreal Engine sometimes loses material references when:
- Assets are moved or duplicated between projects
- File paths change
- Materials were migrated incorrectly
- A folder was copied directly in the OS instead of using UE’s “Migrate” function
I suspect that part of my Assignment 1 models were duplicated outside of UE, which caused their material links to break during the merge.
✨ Adding the Neon Tunnel & Moving Light Arrows
To enrich the futuristic atmosphere, I added a neon tunnel at the entrance, made of glowing stripes that animate through panning textures.
Around the installation, I placed rotating and moving light arrows to guide the movement direction visually.
These small motion cues make the environment feel more alive and help highlight the ride’s spinning energy.
The neon tunnel also creates a dramatic introduction for the player before they reach the main installation, improving the overall experience.
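The panning-texture trick behind the glowing stripes reduces to a wrapped UV offset: shift the coordinate by time multiplied by a speed, then wrap back into 0..1. A sketch, with an arbitrary speed value:

```python
# Sketch of a panning texture: the U coordinate is shifted by
# time * speed and wrapped back into [0, 1), so the glow scrolls
# endlessly along the tunnel. The speed value is arbitrary.
def panned_uv(u, time_s, speed=0.25):
    return (u + time_s * speed) % 1.0

print(panned_uv(0.0, 0.0), panned_uv(0.0, 2.0))  # 0.0 0.5
```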
Final Gameplay Demonstration
After completing the animation, interaction logic, lighting design, and environmental setup, I recorded my final gameplay video.
The final scene showcases:
- Smooth rotation and wave motion
- Emissive lighting reacting to the player
- A cohesive dark-sci-fi atmosphere
- Neon tunnel entrance and dynamic light arrows
- A complete animation cycle in action
This final demonstration brings together everything I built over the past few weeks.
Reflection
This week’s work helped me understand how animation, blueprint interaction, and environment design support each other.
I learned how important timing, lighting, and material reactions are when creating an immersive scene.
Fixing the material-loss issue also reminded me to always migrate assets properly inside Unreal Engine rather than copying folders directly.
Overall, I feel satisfied with how the final project turned out.
My ride now looks more dynamic, more futuristic, and more responsive — which is exactly what the assignment intended us to explore.
Week 10 Learning Log — Control Rig, Animation Blueprint & Final Project Export
This week’s lesson focused on several important systems inside Unreal Engine:
Control Rig, Animation Blueprint, skeletal control logic, and the final VR project export workflow.
The lecturer walked us step-by-step through how controller systems work, how they bind to skeletons, and how animations can be generated directly inside Unreal Engine. We also learned the requirements for the final VR project and the complete export process.
01. What is a Control Rig and What Can It Do?
Control Rig is Unreal Engine’s built-in procedural animation tool.
It allows us to create and control animations directly inside UE, instead of animating everything in Maya or Blender and then importing it.
The lecturer emphasized three main points:
- It enables real-time animation logic, which is more dynamic and ideal for interactive or game-based projects.
- We can use node-based scripting (Blueprint-style logic) to define how bones and controllers behave — including translation, rotation, and other parameters.
- Control Rig supports both mechanical structures and organic rigs, allowing bones to move with proportional influence.
In simple terms:
Control Rig = an animation generator inside Unreal Engine.
02. Creating Controllers and Binding Them to Bones
During the demonstration, the lecturer showed how to create controllers and bind them to multiple bones.
The basic workflow was:
- Create a new controller for the skeleton.
- Set up the “Get → Set” transform flow inside the Control Rig graph.
- Use the controller’s transform to drive the assigned bone.
- Assign different rotation ratios to different bone segments, such as:
  - Bone 1: 100%
  - Bone 2: 50%
  - Bone 3: 35%
  This creates a natural chained movement.
- The lecturer also mentioned that even if bones are not perfectly connected, we can manually establish control relationships in Control Rig to create flexible mechanical behavior.
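The proportional ratios from the demo can be sketched as a simple per-bone multiply: one controller rotation is spread down the chain with decreasing influence. The ratio values are the class example; everything else is illustrative.

```python
# Sketch of the proportional bone chain from the demo: one controller
# drives three bones with decreasing influence (100%, 50%, 35%).
RATIOS = [1.00, 0.50, 0.35]

def chain_rotations(controller_deg):
    """Spread a controller rotation down the chain by the demo ratios."""
    return [controller_deg * r for r in RATIOS]

print(chain_rotations(40.0))  # each bone gets a share of the 40-degree turn
```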
03. Animation Blueprint + Control Rig
After the controller setup was completed, we integrated it into the Animation Blueprint.
The typical workflow is:
- Call the Control Rig inside the Anim Blueprint
- Link its parameters to the logic graph
- Adjust, blend, and synchronize animations in real time
The lecturer said:
“Control Rig provides a more dynamic development method, and it integrates well with interactive systems — which is exactly what we will be using for the final project.”
04. Detailed Controller Operations (In-Class Demo)
The lecturer demonstrated the entire setup process, including:
- Locating the skeleton
- Creating the Control Rig blueprint
- Making controllers for the head, arms, etc.
- Adjusting controller colors, rotation axes, and alignment
- Allowing one controller to influence multiple bones
- Using Execution Context nodes to drive logic
Important reminders:
- Controller rotation axes must align with the skeleton
- If a controller fails to drive the bone, the issue is usually an incorrect Get/Set Transform setup or an unassigned target bone
Students tried different setups and received one-on-one feedback during the class.
05. Using Sequencer
Next, the lecturer demonstrated how to send animations into the Sequencer:
- Add created animations to the Level Sequence
- Run the animation via logic in the Level Blueprint
- Sequencer will become a key part of our final VR demonstration
06. Exporting the Project (APK) & VR Testing
The lecturer explained the final steps:
- Fully prepare the scene
- Add animations, lighting, and environment setup
- Package the project as an APK
- The lecturer will test each student’s work in VR
This export process is essential for completing the final project.