Show Something: Intelligent Shopping Assistant Supporting Quick Scene Understanding and Immersive Preview
In this paper, we introduce an intelligent shopping assistant system supported by quick scene understanding and augmented-reality 3D preview. By understanding the scene a user is looking at, the system recommends related products that are not present in the current scene but could potentially interest the user. Leveraging existing speech recognition techniques, the system extracts voice commands and keywords and responds in real time, allowing users to search for and filter specific products by voice alone. Once potential target products are found, the system provides an augmented-reality preview: using spatial understanding, it automatically places life-size three-dimensional virtual products in a suitable space in front of the user, who can then manipulate them with two-handed gestures. Through our system, users can discover products strongly related to their current environment and intuitively preview them in the scene via automatic placement and two-handed manipulation, helping them make shopping decisions.