Technical explanation, aesthetic intent, and peer feedback
This piece is a fullscreen WebGPU fragment shader built with gulls.js. It combines the current webcam frame with the previous rendered frame, which is stored in a persistent backbuffer. Per pixel it blends them as out = v * 0.05 + last * 0.95, where v is the current webcam sample and last is the previous frame's color at that point, while a sin/cos UV warp (driven by the feedback-intensity slider) shifts where each sample is taken. On top of that I add an FBM layer built from value noise (5 octaves), sampled with both spatial and time offsets to produce soft turbulence.
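To make the sampling decisions concrete, here is a plain-JavaScript mirror of the per-pixel math described above. The actual shader is WGSL; the helper names (hash2, valueNoise, fbm, warpUV, blend) and the specific constants inside the warp and octave loop are my illustrative assumptions, not the project's code. Only the blend weights (0.05 / 0.95) and the 5-octave count come from the writeup.

```javascript
// Illustrative JS sketch of the shader math; helper names and constants
// other than the 0.05/0.95 blend and the 5 octaves are assumptions.

// Deterministic hash -> [0, 1), a common value-noise basis.
function hash2(x, y) {
  const s = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return s - Math.floor(s);
}

function smooth(t) { return t * t * (3 - 2 * t); } // smoothstep fade

// Bilinearly interpolated value noise at (x, y).
function valueNoise(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const u = smooth(x - xi), v = smooth(y - yi);
  const a = hash2(xi, yi),     b = hash2(xi + 1, yi);
  const c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
  return a + (b - a) * u + (c - a) * v + (a - b - c + d) * u * v;
}

// 5-octave FBM, sampled with both spatial and time offsets.
function fbm(x, y, time) {
  let sum = 0, amp = 0.5, freq = 1;
  for (let o = 0; o < 5; o++) {
    sum += amp * valueNoise(x * freq + time * 0.1, y * freq + time * 0.07);
    amp *= 0.5;
    freq *= 2;
  }
  return sum; // stays in [0, ~0.97) with these amplitudes
}

// sin/cos UV warp scaled by the feedback-intensity slider.
function warpUV(u, v, intensity, time) {
  return [
    u + intensity * 0.01 * Math.sin(v * 10 + time),
    v + intensity * 0.01 * Math.cos(u * 10 + time),
  ];
}

// Per-channel feedback blend: 5% new webcam sample, 95% previous frame.
function blend(v, last) { return v * 0.05 + last * 0.95; }
```

Because the blend keeps 95% of the previous frame, any warp applied before sampling the backbuffer compounds frame over frame, which is what produces the fluid smearing rather than a single static distortion.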
The composition is controlled by six inputs: four sliders (feedback intensity, noise scale, chromatic shift, color rotation) plus two toggles (video on/off, noise on/off). Chromatic separation is done by sampling the R and B channels at opposite offsets around the warped UV. A lightweight vignette keeps the eye centered. If camera capture fails, the project automatically switches to a webcam-independent shader variant (noise + backbuffer echo) while preserving the controls, so the piece still runs and can be critiqued on any machine.
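The chromatic separation and vignette can be sketched in the same spirit. This is an assumed JavaScript model, not the shader source: sampleRGB stands in for a texture fetch, and the falloff constants are illustrative. The structure (R and B fetched at opposite offsets, G at center, radial darkening) is from the writeup.

```javascript
// Assumed sketch: sampleRGB(u, v) stands in for a texture fetch
// returning [r, g, b]; offsets and falloff constants are illustrative.

// R and B sampled at opposite offsets around the (warped) UV;
// G stays at the center, which gives the fringe its direction.
function chromaticSplit(sampleRGB, u, v, shift) {
  return [
    sampleRGB(u + shift, v)[0], // red pulled one way
    sampleRGB(u, v)[1],         // green anchored at center
    sampleRGB(u - shift, v)[2], // blue pulled the opposite way
  ];
}

// Radial vignette: 1.0 at center, falling toward the corners.
function vignette(u, v, strength = 0.5) {
  const dx = u - 0.5, dy = v - 0.5;
  const r2 = dx * dx + dy * dy;            // squared distance from center
  return Math.max(0, 1 - strength * r2 * 4); // reaches 0 at the corners
}
```

Tying the split to the already-warped UV (rather than raw screen coordinates) is what makes the fringing move with the feedback field instead of sitting on top of it like a filter.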
My visual goal was to treat the webcam as unstable matter instead of identity documentation. I wanted the image to feel like a membrane: recognizable for a second, then absorbed into a noisy fluid field. The project leans into controlled disorientation—color channels separate, edges drift, and texture breathes in cycles. Rather than maximizing glitch, I aimed for a narrow zone between beauty and signal loss.
I constrained myself to one full-screen plane and a dark interface, so all complexity had to emerge from temporal modulation and sampling decisions. The palette comes primarily from the camera itself, but chromatic offsets and rotation skew it into synthetic tones. The result should feel intimate and slightly uncanny: users see themselves, but only as a transforming light event.
My classmate said the piece felt "meditative until the feedback goes high, then suddenly chaotic," which matched my expectation and confirmed that the slider range has a meaningful tipping point. They especially liked the RGB split because it made motion feel deeper than the flat screen. One mismatch with my expectation was that color rotation read as subtle to them; I expected it to be more immediately obvious. I may amplify this parameter in a future revision.
On compatibility: camera capture worked on my machine and in Chromium, but my classmate initially blocked permission and got the fallback mode. They were still able to evaluate interactions and noise behavior, which validated the fallback path as a practical submission safeguard.
Files: a3.html, a3.js, index.html