Mobile virtual interior stylization from scale estimation
We present a new feature for AR/VR applications on consumer mobile devices equipped with a video camera (e.g., a smartphone). Direct or indirect scale estimation of the scene or of objects is necessary for realistic rendering of virtual objects in a real-world environment. Standard approaches usually rely on 3D vision with sensor fusion (e.g., visual SLAM) or on pattern recognition (e.g., AR markers, reference object learning), and suffer from various limitations. Here, we argue that, by combining inertial measurements with visual cues, the problem reduces to estimating a single 1D parameter: the distance from the device to the floor. In particular, we discuss robust solutions to the absolute scale estimation problem for indoor environments.
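The reduction to a 1D parameter can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes the IMU provides a gravity (down) direction in camera coordinates, that a monocular reconstruction yields 3D points known only up to scale, and that a majority of the reconstructed points lie on the floor. The function names and the use of a median for robustness are choices made here for the example.

```python
import numpy as np

def floor_distance_up_to_scale(points, gravity):
    """Distance from camera to floor in the reconstruction's arbitrary units.

    points  -- (N, 3) array of 3D points from monocular vision, scale unknown
    gravity -- 3-vector pointing "down" in camera coordinates (from the IMU)
    """
    g = gravity / np.linalg.norm(gravity)
    # Signed distance of each point from the camera along the gravity axis.
    d = points @ g
    # Robust 1D estimate: if most points lie on the floor plane, the median
    # of these projections approximates the camera-to-floor distance.
    return np.median(d)

def absolute_scale(points, gravity, true_height_m):
    """Metric scale factor, given the true device height above the floor
    (e.g., from a prior on how users hold a phone, or from user input)."""
    h_unscaled = floor_distance_up_to_scale(points, gravity)
    return true_height_m / h_unscaled
```

Once the 1D distance is fixed, multiplying the whole reconstruction by the returned factor gives metric coordinates, which is what realistic virtual-object rendering requires.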