Performative Media / Weekly Learning
September 23, 2025
24.09.2025 / Week 1 - Week 11
GeXianjing / 0377636
Performative Media / Bachelor of Interactive Spatial Design (Honours)
Week 1 — Introduction to Performative Media
Human–Space–Media Interaction
This week marked my first class in Performative Media. The lecture focused on exploring how humans, physical space, and media can interact with each other, transforming the audience from passive viewers into active participants. We were also introduced to TouchDesigner, a node-based visual programming software that allows creators to design real-time visual and sound experiences through sensors like cameras, microphones, or motion detectors. It feels like “programming with LEGO blocks” — logical, yet highly creative.
We explored inspiring examples such as Team Lab’s immersive installations and OK Go’s robot performance video. Team Lab’s work impressed me deeply — the way every human movement can reshape the projection and sound makes the space feel alive. OK Go’s project, on the other hand, revealed the precision of code-driven performance art, where every movement was perfectly synchronized with the music.
Our first group activity required us to apply the Input → Process → Output model to analyze an interactive artwork. This structure helped me understand the essence of performative media — interaction. The audience provides the input, the system processes it, and the artwork responds. It’s not just about visual display; it’s about communication. I look forward to creating an installation that can respond to people, blurring the line between human and art.
| Stage | Description |
|---|---|
| Input (Sensor) | Touchpad / Vibration Module / (Possibly Camera or Microphone) |
| Process (Logic) | Interactive Rules |
| Output (Media / Experience) | Visual & Auditory Feedback |
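To make the Input → Process → Output model concrete, here is a minimal Python sketch of the loop our group described; the sensor reading, threshold rule, and printed output are all invented stand-ins for the modules in the table above.

```python
import random
import time

def read_sensor():
    # Stand-in for the input module (touchpad, vibration module,
    # possibly a camera or microphone): returns activity from 0 to 1.
    return random.random()

def process(activity, threshold=0.5):
    # Interactive rule: weak input keeps the piece idle,
    # strong input triggers a visible response.
    return "pulse" if activity > threshold else "idle"

def output(state):
    # Stand-in for the visual & auditory feedback module.
    print("artwork state:", state)

for _ in range(20):  # the Input -> Process -> Output loop, ten times a second
    output(process(read_sensor()))
    time.sleep(0.1)
```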
Week 2 — Performative Media: What I Learned from Class and Practice
Today was the second week of Performative Media. The lecturer asked us to install TouchDesigner, saying that this week would be more hands-on. As I listened to the lecture while setting up the software, I finally started to connect the abstract idea of “performative media” with something tangible and real.
What I Learned
Creative Coding / Generative Art
The teacher explained “generative” in a way that made sense to me: we create rules and parameters, and through iteration and emergence, complex forms grow from simple logic. Before today, I thought “randomness” was just randomness — but I learned the difference:
- Randomness = pure unpredictability, no pattern.
- Noise = structured randomness, smooth and controllable, perfect for natural effects like terrain, clouds, or waves.
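To see the difference in numbers, here is a tiny plain-Python sketch (not TouchDesigner code); the smoothing factor 0.3 is an arbitrary choice:

```python
import random

random.seed(7)

# Pure randomness: each sample is independent, so the sequence jumps around.
raw = [random.random() for _ in range(10)]

# Noise in the generative sense: still random, but neighbouring values are
# correlated. A simple stand-in is an exponential moving average that pulls
# each step only partway toward the next random target.
smooth, value = [], 0.5
for target in raw:
    value += 0.3 * (target - value)
    smooth.append(round(value, 3))

print("randomness:", [round(r, 3) for r in raw])
print("noise-like:", smooth)
```

The second sequence drifts smoothly instead of jumping, which is exactly why noise (rather than raw randomness) suits terrain, clouds, or waves.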
This framework fits perfectly with interactive work — the sensor collects audience data (movement, sound, etc.), the system processes it with code, and then outputs visuals or sound. That’s the classic Input → Process → Output loop.
The Three Building Blocks of TouchDesigner
I now understand TouchDesigner as three main “languages”:
- TOP — things you can see (images, pixels, GPU processing)
- CHOP — things that drive (numbers, time, devices)
- SOP — things that exist in space (3D geometry, particles)
It’s not about writing long code lines — it’s about connecting visual blocks like LEGO.
What I Did (My First Network)
My goal was simple: create a colorful, breathing noise animation.
- Added a Noise TOP, and tweaked Period and Harmonics to observe texture changes.
- Added a Ramp TOP for gradient colors, then used Lookup TOP to map the ramp’s colors to the noise values.
- Added a Null TOP and enabled its blue viewer flag to preview the output.
- Made it move: created an LFO CHOP, exported it to Translate or Rotate of a Transform TOP, then slowed Frequency to ~0.2 to make it “breathe.”
- Duplicated the Noise→Lookup chain with different parameters and used Composite TOP to layer them.
- Adjusted with Level / HSV Adjust / Bloom / Displace TOP to make it glow and feel alive.
The coolest part was learning how absTime.seconds can continuously drive parameters, creating non-repetitive smooth motion — a nice contrast to the looping rhythm of the LFO. Using different Seed and Period for the two noise layers also created a richer, more organic texture.
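The same links can also be made from a Text DAT with TouchDesigner's Python API instead of dragging exports. This is only a sketch of my network: the operator names match what I built, but the parameter names are written from memory, so treat them as assumptions.

```python
# TouchDesigner Python sketch - run from a Text DAT inside the network.
# Assumes operators named noise1, transform1, and lfo1 already exist.

noise = op('noise1')
noise.par.period = 4  # broader texture (assumed parameter name)

# Loop-based motion: reference the LFO channel in a parameter expression.
op('transform1').par.rotate.expr = "op('lfo1')['chan1'] * 45"

# Non-repeating motion: absTime.seconds grows forever, so this drift
# never loops - the contrast to the LFO's cyclic rhythm.
noise.par.tz.expr = "absTime.seconds * 0.1"
```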
Group Progress & Assignment
By Week 4, our group will present a precedent study & critical analysis. We’ll analyze five areas — context/scenario, tools & hardware, I→P→O logic, audience role, and aesthetics — and identify what we can learn for our own project. Today the lecturer walked around and asked each group to clarify their common interests and project direction.
My Reflection
- Generative art is not about letting randomness take over. It’s about balancing order and chance.
- I realized the audience isn’t watching the animation — they make it move.
- The node-based workflow encourages fast experimentation and live response.
What I’ll Do Before Next Week
- Recreate today’s network in three variations: only LFO-driven / only absTime-driven / two noise layers + different Composite modes.
- Try linking a microphone or camera input through CHOP to drive visuals directly.
- Prepare two reference artists and an IPO breakdown for Week 3 slides.
One-sentence takeaway:
I learned to create “breathing” visuals through rules and noise — and to see that, in performative media, the real heartbeat comes from the data flowing between human and system.
Week 3 — Edge Detection, Displacement & Feedback Loops
Step-by-Step Process
Step 1: Edge Detection
- Started with a simple circle1 TOP to create a base shape.
- Connected it to edge1 to extract edges and create a high-contrast outline.
👉 This step helped me understand how to transform a solid shape into a line-based visual suitable for further effects.
Step 2: Displacement
- Created a noise1 texture and used it in displace1 to distort the circular outline.
- Adjusted Period / Amplitude and Harmonic Gain / Exponent to control scale and complexity.
👉 I learned how noise textures generate natural, fluid motion.
Step 3: Feedback Loop
- Built a feedback chain: feedback1 → level1 → transform1 → blur1 → comp1 → bloom1.
- Set Target TOP in feedback1 to comp1 to close the loop.
- Learned the role of each operator (decay, subtle scaling, flow softening, glow).
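For reference, closing the loop can also be done with a couple of lines of TouchDesigner Python; the parameter name top for the Feedback TOP's Target TOP is my assumption from memory.

```python
# Point feedback1 back at the end of the chain to close the loop.
op('feedback1').par.top = 'comp1'   # Target TOP (assumed parameter name)

# level1's opacity acts as decay: just below 1.0 makes trails fade slowly.
op('level1').par.opacity = 0.95
```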
Learning Outcome
This practice helped me understand time-based layering in real-time visuals. The results resembled flowing water waves, energy pulses, and abstract biological motion. Small parameter changes drastically affect rhythm and direction.
Reflection
At first I didn’t fully understand the feedback target, but once I linked it back to comp1, the loop worked perfectly. Even a minor scale change in transform1 can flip the animation’s flow.
Overall, I gained a deeper understanding of real-time generative visuals.
Week 4 — Group Presentation: Precedent Study & Critical Analysis
This week, our group successfully completed Assignment 1: Precedent Study & Critical Analysis for the Performative Media module. Our task was to research and critically analyze how media artists design interactions in performative works using the Input → Process → Output (IPO) model.
We selected two artists — Refik Anadol and Blast Theory — who represent contrasting yet complementary approaches to interactive art. Refik Anadol visualizes data through AI-driven immersive environments, while Blast Theory explores human relationships and storytelling through participation.
Assignment 1 — Precedent Study & Critical Analysis (for the detailed content of Assignment 1, please refer to this post: https://gexianjing.blogspot.com/2025/09/performative-media-poject-1.html)
Artworks analyzed:
- WDCH Dreams & Machine Hallucinations — Refik Anadol
- Can You See Me Now? & Rider Spoke — Blast Theory
Our lecturer commented that our presentation was “very complete and well-structured,” especially in connecting theory with emotional audience experience.
I learned that interactivity is not just technology, but human engagement and perception. Emotion, participation, and media systems must align to create meaningful experiences.
CHOPs Deep Dive in TouchDesigner
We explored CHOPs and how real-world data (sound, movement, sensors) can be transformed into responsive visuals. By using Constant, Math, LFO, Pattern, and Filter CHOPs, we learned animation control and rhythmic interaction. Generative art reminded me that rules + randomness create evolving results and emotional dialogue.
Week 5 — Learning Summary: CHOPs Logic and Animation Control
This week focused on building interactive control systems using CHOPs. We used Constant, Math, Pattern, LFO, and Switch to animate shapes, manage parameters, and create smooth transitions in real time.
A key exercise involved switching multiple Constant outputs to change colors dynamically. Then we explored rotation and motion control with Pattern, Count, and Lag. LFO waveforms became a practical tool for rhythmic motion.
What I learned from nodes:
- Constant stores reusable numeric values.
- Math remaps value ranges for animation.
- Rename unifies channel names for cleaner connections.
- Lag smooths rapid changes into stable motion.
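What Math and Lag do can be written out in a few lines of plain Python; this is only a conceptual sketch of the two operations, not TouchDesigner code.

```python
def remap(x, in_lo, in_hi, out_lo, out_hi):
    """What a Math CHOP's From Range / To Range does: linear remapping."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

class Lag:
    """What a Lag CHOP does: ease toward the target instead of jumping."""
    def __init__(self, smoothing=0.2):
        self.value = 0.0
        self.smoothing = smoothing

    def step(self, target):
        self.value += self.smoothing * (target - self.value)
        return self.value

lag = Lag()
for raw in [0, 1, 1, 1, 0]:                     # a jumpy input channel
    size = remap(lag.step(raw), 0, 1, 10, 200)  # remap 0..1 to a pixel radius
    print(round(size, 1))
```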
This week deepened my understanding that animation is also mathematical — built through logical relationships and continuous feedback.
Week 6 — Noise + Audio Reactivity & Project Planning
Date: Oct 31
Focus: Noise displacement + audio analysis; concept brief setup
- Mapped motion/brightness data into noise displacement; explored routing with Analyze/Select CHOP.
- Added Palette → Audio Analysis: linked kick/low/high bands to brightness/offset/strength (Math + Lag smoothing).
- Began concept → prototype pipeline: modular build; projection framing; plan roles, sensors, IPO.
- Outcome: a responsive visual sketch plus lean concept slide outline.
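A sketch of the kind of audio links we set up, written as parameter expressions; the channel names kick/low/high depend on how the Audio Analysis component is configured, and lag1 stands for the Math + Lag smoothing stage, so all names here are assumptions.

```python
# Hypothetical TouchDesigner expressions for audio-reactive parameters.
# lag1 = Lag CHOP smoothing the Audio Analysis output (assumed names).

# Kick drives brightness pulses.
op('level1').par.brightness1.expr = "1 + op('lag1')['kick'] * 2"

# Low band pushes the noise offset; high band scales the layer slightly.
op('noise1').par.tz.expr = "op('lag1')['low'] * 0.5"
op('transform1').par.sx.expr = "1 + op('lag1')['high'] * 0.2"
```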
Week 7 — Learning Log: Gesture Interaction Practice
This week, I continued developing gesture-based interaction in TouchDesigner, focusing on turning hand-tracking data into stable, responsive visuals.
What I learned
- Using Select CHOP: isolated pinch distance and XY channels for cleaner control.
- Smoothing & Remapping: Lag for smoothing, Math for range mapping, Limit for clamping negatives.
- CHOP Reference Control: linked data to Circle TOP radius/center; adjusted center growth; added webcam displacement.
- Two Hands: used the second hand as a separate modulation source for multi-parameter control.
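The pinch value itself is just a distance between two tracked points. Here is a plain-Python sketch of the measurement plus the clamping the Limit CHOP handled for us; the landmark coordinates are made up.

```python
import math

def pinch_distance(thumb, index):
    """Distance between thumb tip and index tip in normalized 0..1 coords."""
    return math.hypot(index[0] - thumb[0], index[1] - thumb[1])

def clamp(x, lo=0.0, hi=1.0):
    """What the Limit CHOP did: no negative or runaway values."""
    return max(lo, min(hi, x))

# Hypothetical tracked landmarks from one video frame.
thumb, index = (0.42, 0.55), (0.47, 0.61)
radius = clamp(pinch_distance(thumb, index) * 4)  # scale pinch to a 0..1 radius
print(round(radius, 3))
```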
Notes on Final Project
The lecturer reminded us that this prototype leads into the final installation. We should keep the physical setup small, modular, and easy to transport, and we can contact the lecturer anytime for feedback.
Week 8 — Interaction Mapping & Prototype Planning
This week shifted the way I think about interactive artworks. I used to focus mainly on visuals — brighter colors, prettier particles, stronger bloom — but through Activity 1 and Activity 2, I realized interactive work is built from a relationship: a chain from the human, to the system, to the meaning behind behavior.
Activity 1 — Interaction Mapping Sprint
User Action → Input Sensor → Processing → Output Behaviour → Intended Meaning
Writing this line forced me to examine the logic bones of the work. Instead of “how can I make this look cool?” I had to ask: “Why should the system react this way?” “What does the user’s action mean?”
My Interaction Map: Echo Flow Logic Chain
- User Action: audience raises hands, waves, or makes sound, becoming participants.
- Input Sensor: hand tracking (direction/position/speed) + audio analysis (amplitude/frequency).
- Processing: mapped into force strength, turbulence, brightness/bloom, frequency-based layers.
- Output Behaviour: particles swirl/stretch/scatter; bright zones burst; feedback creates trails.
- Intended Meaning: light feels soft, responsive, alive; the audience co-shapes it.
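Writing the map as data also made the chain easy to audit; every string in this sketch simply restates the list above, so it is illustrative rather than an implementation.

```python
# The Echo Flow logic chain as a checkable data structure (illustrative).
INTERACTION_MAP = [
    {
        "user_action": "raise hand / wave",
        "input": "hand tracking (direction, position, speed)",
        "process": "speed -> force strength, position -> turbulence centre",
        "output": "particles swirl, stretch, scatter",
        "meaning": "the audience physically steers the light",
    },
    {
        "user_action": "make sound",
        "input": "audio analysis (amplitude, frequency)",
        "process": "amplitude -> brightness/bloom, bands -> layers",
        "output": "bright zones burst, feedback leaves trails",
        "meaning": "the space answers the voice",
    },
]

for link in INTERACTION_MAP:
    print(" -> ".join(link[k] for k in
                      ("user_action", "input", "process", "output", "meaning")))
```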
Activity 2 — Prototype Priority Setting
Don’t polish visuals before core interaction is stable. If the chain is broken, even the prettiest bloom won’t save the piece.
1. Core Interaction (must be first)
Gesture / sound → force + brightness mapping → particle behaviour change
- gesture tracking cannot jitter
- audio input must be clean
- particles must respond smoothly
- the loop cannot break
2. Secondary Layer (optional)
- alternative colour modes
- additional gesture mappings
- mode switching by frequency differences
3. Aesthetic Polish (last)
- smoother particle trails
- cleaner color grading
- balanced bloom
- consistent layers and composition
Personal Reflection
I now understand meaning comes from how the system interprets intention. Fast movement should explode with energy; slow movement should feel gentle. Soft sound makes light tremble lightly; loud sound bursts the field outward. This isn’t control — it’s relationship.
An interactive artwork is not about control; it is about resonance.
Next Steps
- Improve smoothing for hand tracking inputs
- Refine low / mid / high frequency mappings for audio
- Fix feedback overexposure issues
- Test proximity-based particle density
- Apply polish only at the very end
Week 9 — Weekly To-Do
1. Main Goal of the Week
To present our early prototype, gather feedback, and clarify next steps so the interaction becomes understandable and immediate for the audience.
2. What We Will Test / Improve
- clarity of input–output relationships
- visibility and speed of system response
- rhythm and pacing of demo video (current feels too slow)
- scale of movement — gestures must trigger bigger changes
- communication of the idea behind the interaction
📌 In short: The audience should understand what’s happening within the first 2 seconds.
3. One Concrete Task
Re-edit the demo video and adjust particle behavior so gestures and sound produce stronger, more noticeable visual changes. 👉 Essential task: Make the interaction clear and immediately readable.
4. Quick Reflection
✔ What worked well:
- shorter video with only essential clips
- stronger force / noise / particle spread makes gestures clearer
- simpler explanation helps visuals speak
✔ What to review next week:
- speed up feedback and reduce lag
- consider simplifying inputs to hand gestures + voice
- make visual language more intuitive (contrast, bloom, directional flow)
ECHO FLOW – Week 10 Progress|From Digital Energy Field to an Immersive Fabric-Light Installation
During Week 10, our group shifted Echo Flow from a digital concept (Assignment 2) into a tangible spatial installation for Assignment 3. With feedback from our lecturer and deeper discussions among our group members, we began reconsidering one crucial question:
If our particle system truly “exists,” what physical form should it take? And how should audiences enter, feel, and disturb it?
This blog entry records our Week 10 development — including lecturer feedback, our upgraded installation plan, and the next steps toward building an immersive light-fabric environment.
01|Lecturer Feedback: Strong visuals, but the installation is still too abstract
Our lecturer gave positive comments on the TouchDesigner prototype:
- Strong visual presence
- Clear gesture/sound reaction
- The particle system already feels expressive
But the main issue was also very clear:
The work is still too abstract. It lacks a physical object or spatial structure that audiences can enter, approach, or engage with.
We were encouraged to strengthen:
- The meaning behind the concept
- A concrete physical form to anchor the visuals
- A defined audience flow and spatial behaviour
- A complete installation sketch and equipment plan
This feedback made us realise that moving toward Assignment 3 means letting the piece fall “out of the screen” and truly occupy physical space.
02|Group Discussion: Making the energy field physical, letting audiences walk inside
After feedback, our discussion focused on how to transform Echo Flow into a hybrid of physical and digital elements. Several key decisions emerged:
① The Energy Core — a cosmic metaphor made physical
We plan to anchor the particle system within a real object:
- A transparent dome or acrylic sphere
- Internal LED or fibre-optic lighting
- Natural elements (plants, stones, soil) to shape the surrounding space
- Projected visuals forming the “outer energy field”
The core represents a living, reactive energy source.
When the audience moves, it “breathes,” pulses, expands, or contracts.
② Adding ‘Spatial Memory’: the system remembers where people walked
We expanded the interaction logic. Using TouchDesigner’s Feedback TOP, Optical Flow, and camera-based position tracking, we plan to record the viewer’s walking path.
This produces short-lived glowing trails — as if the space remembers their presence. It deepens the theme: even after a person leaves a position, their “energy trace” remains in the field.
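One lightweight way to prototype this memory is to log positions into a Table DAT and let the feedback loop render fading marks from its rows. A sketch in TouchDesigner Python, where the operator names and the tx/ty tracking channels are all assumptions:

```python
# Execute DAT callback: log the tracked viewer position once per frame.
# Assumes a CHOP 'track1' with tx/ty channels and a Table DAT 'trail1'
# whose first row is a header (x, y, time).

def onFrameStart(frame):
    table = op('trail1')
    x = op('track1')['tx'].eval()
    y = op('track1')['ty'].eval()
    table.appendRow([x, y, absTime.seconds])

    # Forget old traces: keep roughly the last ten seconds of movement,
    # so the "energy trace" fades after a person leaves.
    while table.numRows > 1 and absTime.seconds - float(table[1, 2].val) > 10:
        table.deleteRow(1)
```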
③ The major breakthrough: Fabric screen + reflective surfaces
This became our most important advancement this week.
Instead of projecting on a flat screen, we decided to use:
✔ Semi-transparent fabric (sheer curtains) as the projection medium
Fabric allows the particle visuals to appear:
- floating, vapor-like
- layered through multiple fabric sheets
- drifting with air movements
Most importantly:
The audience can step inside the fabric layers, as if entering a cloud of light or an energy corridor. When people walk between the curtains, the fabric moves gently, and the projected particles distort physically, bonding digital imagery with real-world motion.
✔ Mirrors / reflective surfaces to shape light behaviour
Mirrors are not for showing the audience’s reflection, but for:
- bending the projection
- breaking the particle lines
- redirecting light into unexpected paths
This reinforces the cosmic idea — like gravitational lensing or distorted starlight.
03|Initial Exhibition Layout (Week 10 Draft)
(1) Spatial Structure Draft
┌─────────────────────────────┐
│ Overhead soft lighting │
│ (simulated cosmic glow) │
└─────────────────────────────┘
[Fabric Curtain] [Fabric Curtain]
│ │
│ Audience enters │
│ and walks inside │
● Energy Core (acrylic dome)
← Camera & microphone are hidden in the structure →
(2) Audience Journey (explainable within 30 seconds)
- Viewer enters between the fabric layers
- Raises hand → particle distortions ripple across the fabric
- Makes sound → pulses of light expand from the core
- Walks → trails of light follow their movement
- Stands still → the energy field slowly returns to equilibrium
The entire experience feels like:
Walking inside a river of light, where movement and voice reshape the digital universe.
04|Next Steps (for Week 11)
✔ 1. Fabric projection test
- Buy sheer fabric (white or grey)
- Test projection distance & brightness
- Try double-layer fabric for depth
✔ 2. Prototype of the Energy Core
- Acrylic dome or plastic sphere
- LED or inner lighting structure
- Stability + mounting test
✔ 3. Sensor positioning
- Camera placement
- Hidden microphone setup
- Ensure tracking still works when the audience enters the fabric space
✔ 4. TouchDesigner projection layering
- Core particle layer
- Outer flowing layer
- Ground or wall memory trails
✔ 5. Exhibition mockup
- Top / Front / Side views
- A near-final conceptual render
05|Conclusion: From particles on screen to an immersive energy field
Week 10 helped us understand that:
Interaction becomes meaningful only when experienced physically.
The new direction of Echo Flow for Assignment 3 transforms it into:
An immersive field of energy that audiences can walk through, disturb, reshape,
and leave temporary traces within.
Fabric, mirrors, particle systems, and spatial memory together create a space where:
- the audience becomes an active participant
- their presence has visual impact
- their movement leaves light behind
- the digital environment is no longer flat, but alive
In this new iteration, viewers are not “watching a work” — they are entering a small universe that reacts to them, recording their existence through momentary echoes of light.
WEEK 11 — Direction Refinement & Installation Exploration
Based on our lecturer’s feedback from last week, we began to re-evaluate the physical direction of our installation. Although our original idea using sheer fabric and projected particles created an atmospheric and immersive effect, the lecturer pointed out that the concept was still too abstract and lacked a tangible physical anchor.
Fabric Layout Discussion (Initial Direction)
At the beginning of the week, our group focused on using multiple layers of sheer fabric to shape the spatial layout. After class, we continued discussing our ideas through WhatsApp and created several rough sketches to visualise how the audience might walk into the projection instead of simply observing it.
We explored references such as fabric-based light tunnels, layered projection corridors, and soft material installations to understand how particles could “float” across overlapping fabric layers. Our sketches included:
- approximate projector and sensor positions
- the audience walking path
- fabric height and layering structure
This direction helped us imagine a dreamy, fluid visual atmosphere—but it still lacked a strong physical component.
Shift Toward Water-Surface Interaction (Updated Direction)
After further consultation, the lecturer suggested that we explore a more material-driven approach, especially one that allows a clearer physical-digital relationship. The idea of interacting through water resonated strongly with our concept.
We realised that water’s natural properties—ripples, reflection, and surface distortion—could visually and conceptually support our particle behaviour much better.
Thus, we began refining a new setup:
- finding a container capable of holding water
- placing a screen or projection surface beneath the water layer
- designing the container’s outer frame to match the theme
- allowing the audience to interact directly on the water surface
In this configuration, gestures on the water—such as tapping, stirring, or making ripples—would distort the projected visuals below, generating light ripples, particle scattering, and pulsating effects.
This approach offers a clearer physical anchor, a more intuitive interaction, and a sensory connection that expands the meaning of our original TouchDesigner particle design.
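As a first technical sketch for this direction, the surface agitation could be measured by comparing camera frames of the water and averaging the difference; every operator name below is a placeholder, and the chain (camera → one-frame cache → difference → analyze) is an assumption we still need to test.

```python
# Ripple-detection sketch (TouchDesigner Python; all names are placeholders).
# analyze1 averages a frame-difference image of the water surface down to
# a single pixel, so one sample gives a 0..1 "agitation" value.

r, g, b, a = op('analyze1').sample(x=0, y=0)
agitation = r  # 0 = still water, higher = stronger ripples

# Stronger ripples = brighter, more scattered particle visuals.
op('level1').par.brightness1 = 1 + agitation * 2
op('transform1').par.sx = 1 + agitation * 0.3
```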