Entry 03 / 09
Dec 2025
Multimodal Hand Tracking for Stroke Rehab
Patent-pending framework that unifies iPhone and Apple Vision Pro into a single tracking surface for hand-eye coordination assessment in stroke rehabilitation. Co-invented at Interactive 3D Lab; built in Swift and RealityKit; validated in controlled studies with healthcare practitioners at 89% cross-device accuracy. Research paper in preparation.
We wanted to assess hand-eye coordination in stroke patients without wearable sensors or a lab rig. Vision Pro tracks hands beautifully, but only inside its own world space; the iPhone sees the room from the outside. The hard part wasn't the geometry, it was the latency budget: frame sync between the two devices had to stay tight enough that a fast reach didn't desync the two tracking surfaces, and that single constraint dictated the whole architecture. The clinical sessions were where the design pressure became real: practitioners don't have time to recalibrate, so the system had to stay accurate without intervention for a full assessment.
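The frame-sync constraint boils down to pairing frames from the two devices by timestamp and dropping any pair whose gap blows the latency budget. A minimal sketch of that idea, assuming both streams share a clock domain; the names (`DeviceFrame`, `pairFrames`, `budget`) are illustrative, not the project's actual API:

```swift
import Foundation

// Hypothetical frame type: a timestamped sample from one device,
// with timestamps already mapped into a shared clock domain.
struct DeviceFrame {
    let timestamp: TimeInterval  // seconds
    let id: Int
}

/// Pairs each frame from stream `a` (e.g. Vision Pro) with the nearest
/// frame from stream `b` (e.g. iPhone), dropping pairs whose timestamp
/// gap exceeds the latency budget. Both streams must be sorted by time.
func pairFrames(_ a: [DeviceFrame], _ b: [DeviceFrame],
                budget: TimeInterval) -> [(DeviceFrame, DeviceFrame)] {
    var pairs: [(DeviceFrame, DeviceFrame)] = []
    var j = 0
    for fa in a {
        // Advance j while the next b-frame is at least as close to fa.
        while j + 1 < b.count,
              abs(b[j + 1].timestamp - fa.timestamp)
                <= abs(b[j].timestamp - fa.timestamp) {
            j += 1
        }
        guard j < b.count else { break }
        // Keep the pair only if it fits inside the latency budget.
        if abs(b[j].timestamp - fa.timestamp) <= budget {
            pairs.append((fa, b[j]))
        }
    }
    return pairs
}
```

A fast reach shows up here as a stretch where the gap exceeds the budget and pairs get dropped rather than fused, which is exactly the failure mode the architecture had to keep rare.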