Usually, when you want to find out where to place a product in a store (where in the building, which shelf, and how high), you either ask customers to carry out tests at home, or invite them to a mockup of a shop floor.
It’s a time-consuming process, particularly if you have to take people to a physical location, and gives a limited insight into their decision-making – you can see what a person ultimately chose, but not the process that brought them there.
In a project for Kellogg’s, Accenture and Qualcomm have shown how cutting-edge extended-reality (XR – a combination of virtual and augmented reality) and eye-tracking technology could change that, making it possible to see what captures the customer’s attention and leads them to make their choices.
We spoke to Raffaella Camera, who is part of Accenture’s global Extended Reality Group, about the project. Camera and her team use XR to solve tough challenges in different parts of the value chain.
“Once we have identified the big solution areas – the specific areas where we think the value chain might have some challenges – then we create some offerings that package up platforms and services needed to create a solution for our clients,” she explained.
“One of the value chain areas that we have identified was merchandising, [which] is the art of figuring out optimum placement in a store, on a shelf to make sales for both retailers and brands.”
The limitations of reality
Camera and her team found that traditional merchandising tests were not only slow – they were also expensive due to the cost of recreating a store, or modifying one to create different experiences.
“It also has limited consumer reach, because you have to move the participants to a specific geographical area, and even more importantly, the communication between the retailers and the brands is often poor,” she said. “It’s difficult to communicate the results back and forth, and to see what those really are, and to agree on the optimum placement.”
To solve those problems, Camera and her team decided to reinvent the way brands and retailers perform research and tests.
“We wanted to use VR – specifically mobile VR, so untethered headsets – and even more so, we wanted to add something new to that, which was eye-tracking, and specific detailed analytics around eye-tracking,” she said.
“So we did a couple of things – on one side we partnered and we worked with Qualcomm, who provided us with a reference headset powered by the Snapdragon 845, with added eye-tracking technology from Tobii. And on the other, we used Kellogg’s […] to use our solutions in a real use-case scenario.”
Testing in VR
Kellogg’s was launching a new product, Pop-Tarts Bites, and wanted to find out the best position for it in stores to maximize sales, while avoiding cannibalizing the sales of the brand’s other products. It had already carried out in-home user tests and surveys, so Camera and her team set out to discover if carrying out the same tests in a virtual store would yield different results.
“When we created the [virtual] shelf, basically we were allowing customers to shop as they would if they were in the store,” she said. “So they could look at the shelf, they could pick up products, they could put them in the cart, they could put them back, they could turn them around and look at the ingredients – do everything they normally would.
“We also made sure that the shelf would be exactly the way it would be in real life, and exactly the way that it was during the test. And what we found out in doing it is that, basically we tested two different hypotheses. In one hypothesis we placed the product higher up on the shelf on the left, and in another test we placed it lower on the right. And […] we came to two different conclusions, including some specific data points that came from eye-tracking that led us to different end merchandising solutions.”
The results of the VR and 2D tests were very similar (customers expected to find new products higher on the shelf), but the addition of eye-tracking meant the researchers were able to gather a raft of extra data about how people make their choices – and discover that, because of the other products surrounding the Pop-Tarts Bites, total sales for the brand increased by 18%.
“We were able to tell, when they were picking up a specific product and putting it in the cart, how they got to that point,” said Camera. “What did they look at before? And when they were looking at that specific thing, were they actually picking it up? What else did they do?”
“The only way you’d be able to get [this behavior before] would be to stop the test every time and ask the participant ‘What were you doing?’ And sometimes they might not even know, because a lot of these things are unconscious.”
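The kind of analysis Camera describes – reconstructing what a shopper looked at, and for how long, before reaching for a product – can be sketched in a few lines of code. The snippet below is purely illustrative (the data format, product names, and function names are assumptions, not part of Accenture's actual analytics pipeline): it takes a stream of timestamped gaze samples and derives per-product dwell times plus the gaze sequence preceding a pick-up event.

```python
from collections import defaultdict

def dwell_times(gaze_samples, sample_ms=20):
    """Sum total gaze time per product from (timestamp_ms, product_id) samples.

    product_id is None when the gaze falls on no product (e.g. the aisle floor).
    Each sample is assumed to represent sample_ms of attention.
    """
    totals = defaultdict(int)
    for _, product in gaze_samples:
        if product is not None:
            totals[product] += sample_ms
    return dict(totals)

def gaze_before_pickup(gaze_samples, pickup_time, window_ms=3000):
    """Products glanced at in the window before a pick-up, in first-glance order."""
    seen = []
    for t, product in gaze_samples:
        if pickup_time - window_ms <= t < pickup_time and product and product not in seen:
            seen.append(product)
    return seen

# Hypothetical gaze log: the shopper glances at two products, then away.
samples = [(0, "pop_tarts_bites"), (20, "pop_tarts_bites"),
           (40, "rival_cereal"), (60, None), (80, "pop_tarts_bites")]
print(dwell_times(samples))                 # total attention per product
print(gaze_before_pickup(samples, 100, 100))  # what preceded a pick-up at t=100
```

In a real headset the `product_id` would come from intersecting the Tobii gaze ray with shelf geometry; the aggregation logic, though, is this simple.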
The right hardware
For the VR test, Camera and her team used a reference headset from Qualcomm rather than an off-the-shelf device from Oculus or HTC.
“Every year we release a reference design based on the latest and greatest flagship chipset,” explained Patrick Costello, senior director of business development at Qualcomm. “In this case it was the 845 chipset, and the primary reason we elected to use our reference headset was that, [although] it obviously isn’t available to commercial consumers, we integrated the eye-tracking solution from our partner Tobii into the reference design.
“That was a very critical element to this proof of concept that we wanted to showcase – the data and analytics that you could collect from an eye-tracking solution, and how it could help business decision-makers make their decisions for VR – in this case for merchandising.
“So while it’s not commercially available hardware to consumers, we’re working directly with OEMs to enable this functionality on their commercial devices, and we expect the hardware to be available in the second half of this year.”
What else can eye-tracking do?
VR and eye-tracking technology have clear benefits for merchandising, but they have the potential for much more.
“With eye-tracking there’s something called foveated rendering, which brings higher resolution to wherever the fovea [the part of the eye that provides the clearest vision] is looking,” Costello explained.
Rendering at full resolution only in that small foveal region – and at reduced resolution in the periphery – enables developers to achieve better frame rates and give users a better experience on the same hardware.
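A back-of-envelope estimate shows why this saves so much GPU work. The sketch below is illustrative only – the display resolution, foveal radius, and periphery scale factor are assumed values, not figures from the Qualcomm reference design:

```python
import math

def foveated_pixel_cost(width, height, fovea_radius_frac=0.2, periphery_scale=0.5):
    """Estimate shaded-pixel cost per eye: full resolution inside a circular
    foveal region, periphery rendered at periphery_scale of native resolution
    in each dimension (so periphery_scale**2 of the pixels)."""
    total = width * height
    # Foveal circle radius as a fraction of the shorter screen dimension.
    radius = fovea_radius_frac * min(width, height) / 2
    fovea_area = min(math.pi * radius ** 2, total)
    periphery_area = total - fovea_area
    return fovea_area + periphery_area * periphery_scale ** 2

# Hypothetical per-eye panel of 2880 x 1600 pixels:
full = 2880 * 1600
foveated = foveated_pixel_cost(2880, 1600)
print(f"shaded pixels: {foveated:,.0f} of {full:,} ({foveated / full:.0%})")
```

With these assumed numbers, only about a quarter of the pixels need full shading – which is where the frame-rate headroom comes from.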
“It can [also] be used, for example, for creating better avatar engagements,” Costello said. “So if you’re in an avatar and you’re communicating in social VR, it’s better to have natural eye movement between the avatars, and that can happen with eye-tracking. It can be used for things like doing automatic IPD [inter-pupillary distance] adjustment for better focus in the device. It can be used for biometric identification for logging into a device.”
It can also be used for training. One VR app already using eye-tracking effectively is Ovation – a training tool that helps users improve their public speaking, awarding points for making eye contact with the audience.
“Even other applications where you might be showing a product or showing an experience to people in a specific situation,” added Camera. “Being able to gather any sub-conscious data and being able to analyse that, currently, has not been able to be done in any other way.”