miVON: Mobile Immersive Virtual Outreach Navigator
HCI Team - Samsung R&D Center Silicon Valley (2009)
The goal of the project was to design and develop a novel method for interacting with 3D content on mobile platforms (e.g., cellphones and tablets), providing position-dependent rendering (PDR) of a 3D scene such as a game or virtual world. The project began in 2008, and I joined it in June 2009. I developed a smartphone-based prototype that combined inertial and vision-based sensing for 6DOF egomotion detection. The system disambiguates shifting (translation) and rotating motions based on vision-based pose estimation.
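The core disambiguation idea can be sketched roughly as follows. This is a minimal, hypothetical illustration (not the actual prototype code): under a small-angle approximation, a pure rotation shifts image features by an amount predictable from the gyroscope alone and independent of scene depth, while translation produces depth-dependent residual motion. The function names, focal length, and pixel threshold below are illustrative assumptions.

```python
import numpy as np

def predicted_rotational_flow(gyro_yaw_rad, focal_px):
    # Small-angle approximation: a pure yaw of theta radians shifts
    # every image feature by roughly f * theta pixels, regardless of
    # scene depth. (Illustrative model, not the prototype's actual code.)
    return focal_px * gyro_yaw_rad

def classify_egomotion(gyro_yaw_rad, observed_flow_px,
                       focal_px=500.0, tol_px=3.0):
    # Attribute observed image motion to rotation vs. translation.
    # Flow that the gyro-predicted rotation does not explain is
    # treated as translational ("shift") motion.
    rot_flow = predicted_rotational_flow(gyro_yaw_rad, focal_px)
    residual = observed_flow_px - rot_flow
    if abs(residual) <= tol_px:
        return "rotation"        # gyro fully explains the image motion
    if abs(rot_flow) <= tol_px:
        return "shift"           # image moved with no measured rotation
    return "combined"            # both rotation and translation present
```

For example, a 0.02 rad yaw with a 500 px focal length predicts about 10 px of flow, so observing 10 px of flow classifies as pure rotation, while 10 px of flow with a stationary gyro classifies as a shift.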
Marti, S.; Kim, S.; Chae, H. (2009). Position dependent rendering of 3D content on mobile phones using gravity and imaging sensors. Paper presented at the Samsung Best Paper Award 2009. [PDF]
Detecting ego-motion on a mobile device displaying three-dimensional content. [US Patent #8310537]