Fireflies in Visual Search
What is Visual Search?
Instead of using text or speech as input, visual search uses the camera or a photo to find what you are looking for online.
In the context of Amazon, visual search is mostly dedicated to helping customers complete their shopping missions. It comes in handy when customers are looking for something in front of them that is hard to describe in words.
I've been working on this product since I joined the team in 2015. We went through several generations of redesigns to deliver more powerful and straightforward experiences.
To try out this product, download the Amazon Shopping mobile app on your phone, then look for the camera icon in the top-right corner of the app.
The next project in my portfolio covers the design iterations that brought the product to its current form. Here I want to focus instead on a specific exploration I did: developing one visual language for all the camera features, with the goal of helping customers better understand how to use the product.
The Blue Dots - A Current Visual Language on Camera
Below is a video showing what the blue dots look like on the live camera. They represent and follow feature points in every frame, so customers can tell at a glance what the system is looking at.
Study the Customer Flow
Breaking the Experience Down Into Standardizable Moments
Once I dove deep into the customer flow and broke it down into small moments, I could identify the ones where introducing a visual language would solve customers' problems.
Select a Visual Language to Visualize These Moments
While talking to engineers from our computer vision team to understand the basic technology behind visual search, the Harris corner detector came up often as a common theme across the different recognition services we used. To some extent, some of the visual cues on the interface today are already loosely based on Harris corners. I believe a product needs to be true to itself and its characteristics need to come from within, so Harris corners have the potential to be developed into the visual language for a computer-vision-based product.
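For background on why those corners make good anchor points: the Harris detector scores each pixel by how strongly image gradients vary in two directions. It builds a 2×2 structure tensor M from the gradients in a small window and computes the response R = det(M) − k·trace(M)². A minimal sketch of that scoring step, using hand-made tensors for a flat patch, an edge, and a corner (the k = 0.05 value is a common default, not anything specific to Amazon's services):

```python
# Harris corner response: R = det(M) - k * trace(M)^2, where M is the
# 2x2 structure tensor of image gradients summed over a small window:
#   M = [[sum(Ix*Ix), sum(Ix*Iy)],
#        [sum(Ix*Iy), sum(Iy*Iy)]]
# Two large eigenvalues -> corner, one -> edge, none -> flat region.

def harris_response(m, k=0.05):
    """m is a 2x2 structure tensor [[a, b], [b, c]]."""
    (a, b), (_, c) = m
    det = a * c - b * b
    trace = a + c
    return det - k * trace * trace

flat   = [[0.0, 0.0], [0.0, 0.0]]      # no gradient anywhere
edge   = [[100.0, 0.0], [0.0, 0.0]]    # gradient in one direction only
corner = [[100.0, 0.0], [0.0, 100.0]]  # strong gradients in two directions

print(harris_response(flat))    # 0.0
print(harris_response(edge))    # -500.0  (negative: edge)
print(harris_response(corner))  # 8000.0  (large positive: corner)
```

Only pixels with a large positive response survive as feature points, which is why the dots naturally cluster on visually distinctive parts of a product.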
Some groundwork had already been done by the design team that defined the Firefly experience on the Fire Phone a couple of years ago. To understand the thinking and design decisions that went into that experience, I talked to the design director and the interaction and motion designers from that team.
All this research, done before I started any visual or animation work, allowed me to stand on the shoulders of giants.
Defining Fireflies' Behavior
The life cycle of a firefly.
I used a particle system at first to define the fireflies' behaviors. However, after testing it live in a prototype and showing it to others, it was obvious that the behaviors were too limited for the variety of information they needed to carry. So I ended up building custom logic from scratch to get effects that successfully convey the message to customers.
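To make the distinction concrete, here is a hypothetical sketch of the kind of per-particle logic involved: each firefly carries its own state (position, age, a target feature point) rather than obeying one global particle-system rule, so different moments in the flow can drive different behaviors. All names and parameters here are illustrative, not the production code:

```python
class Firefly:
    """One firefly with its own life cycle: spawn, seek a feature point, fade."""

    def __init__(self, x, y, target, lifetime=60):
        self.x, self.y = x, y
        self.target = target      # (tx, ty) feature point to drift toward
        self.age = 0
        self.lifetime = lifetime  # frames until the firefly dies

    @property
    def alive(self):
        return self.age < self.lifetime

    @property
    def opacity(self):
        # Fade out linearly over the life cycle.
        return max(0.0, 1.0 - self.age / self.lifetime)

    def update(self, ease=0.2):
        # Ease toward the target feature point; per-firefly targets are
        # what a single global particle rule couldn't express.
        tx, ty = self.target
        self.x += (tx - self.x) * ease
        self.y += (ty - self.y) * ease
        self.age += 1

# Simulate one firefly seeking a detected feature point over its lifetime.
f = Firefly(0.0, 0.0, target=(10.0, 10.0))
while f.alive:
    f.update()
print(round(f.x, 2), round(f.opacity, 2))  # 10.0 0.0
```

Because each firefly owns its target and life cycle, behaviors like "swarm to the detected object" or "scatter on failure" become a matter of assigning different targets, which is hard to express in a one-rule-fits-all particle system.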
Below is a video of the prototype I developed to illustrate the happy path of a search experience.
Using the Visual Language Across the Board
Since many of the moments defined above occur across features throughout the product, I did an exercise replacing the visual elements in other features with fireflies, trying to build a cohesive story while still conveying understandable messages.
Below is a slide show of examples where fireflies are used in AR and other scenarios to connect moments along the journey into a seamless flow.