The ubiquity of personal mobile devices gives consumers access to a wealth of online product and shopping information at their fingertips. Even while holding a product in a retail shop, consumers can browse detailed product information, look up reviews, and compare prices online through their devices. But in terms of the interactive experience, there is still a distinct gap between what users physically see and touch in the shop and the digital content they read on their mobile device. To narrow this division between physical and digital information spaces, we attempt to combine the two, making digital content accessible directly on physical objects.
Mixed Reality (MR) is an interactive technology that combines real and virtual content, often through the use of head-mounted displays (HMDs). The idea is that text, sound, or video from the online world can be fused with, and made to respond to, objects in our immediate surroundings, enabling a much more immersive experience. Hardware such as the Microsoft HoloLens has become increasingly popular recently as it becomes more portable and untethered, giving users more freedom while wearing it.
In the near future, when hardware devices become even more compact and less cumbersome, for example in the form of a normal pair of glasses, MR will become even more ubiquitous. It is this future vision that our work is aimed towards.
Here, we describe the design and implementation of our MR-Shoppingu system in detail.
To create an interactive in-store shopping experience that enhances physical products with augmented online content, we envision users interacting with physical products naturally, without the need for any special input: the system continuously detects these user actions and reacts by augmenting the physical products with relevant information. To achieve this, our system is guided by three design requirements:
- Continuous Context Awareness - The system is continuously aware of the context surrounding its users and the products they are interacting with.
- Natural User Actions - Users only need their natural gestures and physical actions, without having to learn a new interface or interaction method. The system should react to users' everyday activities and actions.
- Incorporate Online Capabilities & Content - Make use of digital information and functionality currently available only online (e.g. virtual bookmarks and reviews), combining them with the physical shopping experience and bridging the gap between the physical and digital worlds.
Using these design requirements, we aim to provide relevant information and recommendations to the user at the appropriate time.
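To make these requirements concrete, the core behaviour can be pictured as an event-driven mapping from detected user actions to content overlays. The sketch below is purely illustrative: the action names, product fields, and `handle_action` function are our own hypothetical shorthand, not the actual implementation.

```python
# Hypothetical sketch: mapping natural user actions to MR content overlays.
# Action names and product fields are illustrative, not the real system's API.

PRODUCT_DB = {
    "coffee-beans-001": {
        "name": "Peru Single-Origin Beans",
        "price": "1,200 yen",
        "rating": 4.6,
        "reviews": ["Rich aroma", "Smooth finish"],
        "video_url": "https://example.com/roasting-process",
    }
}

def handle_action(action, product_id):
    """Return the overlay content for a detected user action on a product."""
    product = PRODUCT_DB[product_id]
    if action == "approach":   # user walks up to the shelf
        return {"highlight": product["name"]}
    if action == "pick_up":    # user lifts the product
        return {"price": product["price"],
                "rating": product["rating"],
                "reviews": product["reviews"]}
    if action == "rotate":     # user turns the product around
        return {"video": product["video_url"]}
    if action == "put_down":   # user returns the product to the shelf
        return {"recommendation": "highlight a similar product nearby"}
    return {}
```

In this framing, continuous context awareness supplies the `action` and `product_id` events, and the mapping decides which online content to overlay, so the user never issues an explicit command.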
The MR-Shoppingu system
MR-Shoppingu draws upon Rakuten’s vast experience with e-commerce. As shown in the video, Dr Kelvin Cheng, a research scientist at the Rakuten Institute of Technology’s Computational Interaction Group showed a glimpse of the future, using Microsoft HoloLens and some cans of Yona Yona Ale. As he picked up the can, a glowing purple circle appeared around it in 3D. As he rotated the can, its different faces triggered different kinds of displays, all of which appeared magically alongside the product. Pricing information, customer reviews and even a video of the beer being poured into a glass could be seen through the headset – making for a far more compelling pitch than, say, a poster showing a glass of frothy beer beaded with condensation.
“A day with Mixed Reality”
We envision how our system could work in the future through a user scenario:
- Taro goes into a café for a coffee and a slice of cake.
- After having an enjoyable time at the café, he walks over to the shelves to have a look at products that the café is selling.
- As he approaches the shelf, the MR-Shoppingu system immediately shows him, through his Mixed Reality glasses, the beans used for the coffee he was just drinking.
- When Taro picks up the bag of beans, the system shows its price, user ratings, and reviews.
- Taro turns the bag around to find a video playing, showing him how the coffee was grown in Peru and how the beans were roasted.
- As Taro puts the bag down, the system suggests another product by highlighting another bag of beans on the same shelf.
Building an interactive world
MR-Shoppingu provides an interactive in-store shopping experience that enhances physical products by combining continuous context awareness, natural user actions, and augmented online content relevant to the user and their context at that particular time. This combination may help increase purchase efficiency and certainty, and enables a more personalized and entertaining experience for consumers.
Eventually, HMDs will become as lightweight and unobtrusive as a pair of glasses, and they will be able to understand our surroundings and our behaviour better than ever before, changing the way we interact with and experience the world around us, and the way we communicate with each other.
As our next step, we aim to conduct a user study to investigate whether users prefer shopping with MR-Shoppingu and how effective it is in helping consumers shop in physical retail stores. In the long term, we hope to make the system more flexible and robust by automatically detecting objects and recognizing products through real-time searches of online product databases.
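As a rough illustration of that long-term goal, recognizing a product might amount to matching text detected on the object against an online catalogue. The catalogue, the `recognize` function, and the word-overlap heuristic below are all hypothetical simplifications of what a real-time product-database search would involve.

```python
# Hypothetical sketch of real-time product recognition: match a text label
# detected on the physical object against an online catalogue.
# The catalogue and the matching heuristic are illustrative only.

CATALOGUE = [
    {"id": "beer-001", "title": "Yona Yona Ale 350ml"},
    {"id": "coffee-001", "title": "Peru Single-Origin Coffee Beans"},
]

def recognize(detected_text):
    """Return the catalogue entry whose title shares the most words
    with the text detected on the product, or None if nothing matches."""
    words = set(detected_text.lower().split())
    best, best_score = None, 0
    for entry in CATALOGUE:
        score = len(words & set(entry["title"].lower().split()))
        if score > best_score:
            best, best_score = entry, score
    return best
```

A production system would of course replace this with computer-vision object detection and a live query against the product database, but the shape of the problem, turning a physical observation into a catalogue identity, is the same.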
MR-Shoppingu was introduced at the IFIP International Conference on Entertainment Computing (ICEC 2017) in September 2017, where it received the Best Poster award. For more details of this work, please see Rakuten Today and our published paper.
- Azuma, R.T.: A survey of augmented reality. Presence 6(4), 355–385 (1997)
- Rakuten Today: "How mixed reality could spice up bricks-and-mortar shopping"
- Cheng, K., Nakazawa, M., Masuko, S.: MR-Shoppingu: Physical Interaction with Augmented Retail Products Using Continuous Context Awareness. In: Munekata, N., Kunita, I., Hoshino, J. (eds.) Entertainment Computing – ICEC 2017. Lecture Notes in Computer Science, vol. 10507. Springer, Cham (2017)