Developed a gesture-driven interaction system using gyroscope-based sensors, mapping body movements in real time to the control of virtual vehicles, exploring embodied interaction and human-machine mapping strategies.
The core challenge was creating an intuitive mapping between physical gestures and virtual vehicle control while keeping the interaction responsive and natural. The system needed to accurately capture subtle body movements and translate them into precise vehicle commands.
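One common way to make such a gesture-to-command mapping feel natural is to combine a deadzone (so small unintentional movements don't move the vehicle) with normalized scaling. The exact mapping used in the project isn't specified; the function below is a hypothetical sketch of this pattern, with the tilt range and deadzone values chosen for illustration.

```python
def map_tilt_to_steering(tilt_deg, deadzone=3.0, max_tilt=30.0):
    """Map a body tilt angle (degrees) to a steering command in [-1, 1].

    Tilts inside the deadzone produce no output, so sensor noise and
    small postural shifts don't jitter the vehicle. Beyond the deadzone,
    the output scales linearly up to full steering at max_tilt.
    """
    if abs(tilt_deg) < deadzone:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    magnitude = min((abs(tilt_deg) - deadzone) / (max_tilt - deadzone), 1.0)
    return sign * magnitude
```

For example, a slight 2° lean returns 0.0 (no steering), while a full 30° lean returns 1.0; tilts beyond `max_tilt` are clamped so extreme gestures can't over-command the vehicle.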
Implemented a sensor fusion algorithm that combines gyroscope data with accelerometer readings for a stable orientation estimate. Designed custom gesture recognition patterns that feel natural and provide immediate feedback through haptic and visual cues.
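The specific fusion algorithm isn't detailed here; a common minimal approach to fusing gyroscope and accelerometer data is a complementary filter, which trusts the gyroscope for fast changes and the accelerometer (via gravity) for long-term drift correction. The sketch below illustrates that idea under assumed sensor conventions (accelerometer axes and a fixed update rate are hypothetical).

```python
import math

def accel_pitch(ax, ay, az):
    """Estimate pitch (degrees) from accelerometer gravity components.

    Noisy per-sample, but drift-free over time.
    """
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse integrated gyro rate with the accelerometer angle.

    alpha weights the gyro path (smooth, but drifts); (1 - alpha)
    weights the accelerometer path (noisy, but anchored to gravity).
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Each update integrates the gyroscope rate over the timestep and nudges the result toward the accelerometer's absolute angle, so the estimate stays both responsive and drift-corrected; `alpha` trades responsiveness against noise rejection.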