Mindful Motion

Mindful Motion is my vibe-coded working prototype exploring how gesture-based gameplay can double as a micro-wellness reset.

Modern mindfulness tools often rely on audio guidance, rigid timers, or onboarding flows that create friction when users most need calm. Born from my curiosity about combining touch interaction with breathing practice, Mindful Motion reimagines guided breathing as an interactive, gesture-first journey. Eight interconnected visual experiences map touch directly to structured breath patterns. These range from box breathing to extended exhales, guiding users through tension, stabilization, integration, and expansion without accounts, tutorials, or spoken prompts.

  • Eight interconnected breath practices forming a full nervous system reset arc
  • Gesture-driven interactions that directly map physical motion to breath rhythm
  • Visual metaphors (alignment, bloom, drift, horizon, gravity) embodying regulation states
  • Seamless stage-based interface with responsive design across desktop and mobile
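The gesture-to-breath mapping above could be sketched roughly as follows; the function name, shape of the gesture sample, and the vertical-drag convention are my assumptions for illustration, not the app's actual code:

```typescript
// Hypothetical sketch: map a vertical drag to breath phase and progress.
// Dragging up fills the inhale; dragging down releases into the exhale.
type BreathPhase = "inhale" | "exhale";

interface GestureSample {
  startY: number;      // pointer y at gesture start (px)
  currentY: number;    // current pointer y (px)
  stageHeight: number; // height of the interactive stage (px)
}

// Returns the phase plus a 0..1 progress value, clamped so overshoot
// past the stage bounds never exceeds one full breath.
function breathFromDrag(g: GestureSample): { phase: BreathPhase; progress: number } {
  const delta = g.startY - g.currentY; // positive = upward drag
  const progress = Math.min(1, Math.abs(delta) / g.stageHeight);
  return { phase: delta >= 0 ? "inhale" : "exhale", progress };
}
```

Keeping this mapping a pure function of pointer position makes it trivial to share between touch and mouse input paths.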

  • Distinct breath patterns per game to avoid repetitive inhale mechanics
  • Visual language differentiation (radial, linear, vertical, panoramic, alignment fields)
  • Breath-paced animation timing grounded in physiological regulation patterns
  • Responsive gesture mapping for touch and mouse parity
  • Soft gradients, noise diffusion, and high-DPR rendering to prevent banding
  • Minimal UI: titles and instructions outside the stage to preserve immersion
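Breath-paced animation timing of this kind can be sketched as a simple phase clock. The box-breathing durations below (4s inhale, hold, exhale, hold) are illustrative defaults, not the app's actual configuration:

```typescript
// Hypothetical sketch of a breath phase clock driving animation timing.
interface Phase { name: string; seconds: number }

const BOX_PATTERN: Phase[] = [
  { name: "inhale", seconds: 4 },
  { name: "hold",   seconds: 4 },
  { name: "exhale", seconds: 4 },
  { name: "hold",   seconds: 4 },
];

// Given elapsed seconds, return the active phase and 0..1 progress
// within it, wrapping so the cycle repeats indefinitely.
function phaseAt(elapsed: number, pattern: Phase[] = BOX_PATTERN) {
  const cycle = pattern.reduce((sum, p) => sum + p.seconds, 0);
  let t = elapsed % cycle;
  for (const p of pattern) {
    if (t < p.seconds) return { name: p.name, progress: t / p.seconds };
    t -= p.seconds;
  }
  return { name: pattern[0].name, progress: 0 }; // unreachable
}
```

An animation loop can then drive any visual (radial bloom, vertical drift) from `progress` alone, which keeps each game's rendering decoupled from its breath pattern.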

  • React + TypeScript for component architecture and state orchestration
  • Canvas/WebGL rendering for high-performance animated visual systems
  • DevicePixelRatio scaling + dither overlays to prevent gradient banding
  • Pointer event abstraction for unified touch and mouse gesture handling
  • Config-driven game registry for scalable addition of new breath modules
  • Performance-optimized animation loops with clean unmount lifecycle management
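A config-driven game registry along these lines might look like the following sketch; the `GameConfig` interface and the example entries (borrowing the "bloom" and "drift" metaphor names from above) are assumptions for illustration:

```typescript
// Hypothetical sketch: each breath module registers a config object
// rather than being hard-coded, so new games are added by config alone.
interface GameConfig {
  id: string;
  title: string;
  pattern: number[]; // phase durations in seconds, e.g. box = [4, 4, 4, 4]
}

const registry = new Map<string, GameConfig>();

function registerGame(config: GameConfig): void {
  if (registry.has(config.id)) throw new Error(`duplicate id: ${config.id}`);
  registry.set(config.id, config);
}

registerGame({ id: "bloom", title: "Bloom", pattern: [4, 4, 4, 4] });
registerGame({ id: "drift", title: "Drift", pattern: [4, 6] }); // extended exhale
```

With this shape, a stage component only needs to look up a config by id, which is what makes adding a ninth or tenth module cheap.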

My role and key decisions

I defined the interaction architecture, breath-pattern mapping, and visual system across all eight modules. A key decision was structuring the app as a somatic progression rather than a set of standalone exercises. I unified gesture logic across touch and mouse, raised rendering fidelity with high-DPR scaling and dithering, and reframed each mechanic as a regulation practice, balancing expressive interaction with physiological grounding.

Impact

Mindful Motion translates evidence-based breathing patterns into embodied visual rituals. By removing onboarding friction and replacing passive instruction with responsive interaction, the system lowers barriers to self-regulation. The structured arc supports transitions from agitation to stability and expansion, positioning the app as both a wellness tool and a design-led exploration of nervous system awareness.
