Google Project Soli

How do you interact with an invisible computer?

That was the essence of the brief we received from Google’s ATAP team, who partnered with AllofUs to explore and develop the most compelling use cases for their miniature radar technology.

Officially launched at the 2016 I/O Conference, Soli is the result of an intense and generative collaboration to “…envision a future in which the human hand becomes a universal input device for interacting with technology.” In other words, it’s a tangible interface, opening up near-limitless interaction possibilities with the simple rub of our fingertips: as a button, a slider, a dial.
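
To make that “virtual tools” idea concrete, here is a minimal sketch of how those micro-gestures might map onto a familiar control. It is written against an imagined event interface: every name in it (SoliGesture, VolumeControl, onGesture) is an illustrative placeholder, not the actual Soli SDK API.

```ts
// A sketch of mapping Soli-style micro-gestures to a familiar control.
// All names here are illustrative placeholders, not the real Soli SDK.

type SoliGesture =
  | { kind: "button" }                    // thumb taps the side of the index finger
  | { kind: "dial"; delta: number }       // thumb rubs across the index finger
  | { kind: "slider"; position: number }; // thumb slides along the index finger

interface VolumeControl {
  toggleMute(): void;
  adjustBy(delta: number): void;
  setTo(position: number): void;
}

// Dispatch each recognised micro-gesture to the matching control action.
function onGesture(gesture: SoliGesture, volume: VolumeControl): void {
  switch (gesture.kind) {
    case "button":
      volume.toggleMute();            // press: the fingertip "click" acts as a button
      break;
    case "dial":
      volume.adjustBy(gesture.delta); // turn: relative rotation of the virtual dial
      break;
    case "slider":
      volume.setTo(gesture.position); // slide: absolute position on the virtual slider
      break;
  }
}
```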

Immerse

Working alongside Google’s ATAP engineers, we ran countless hardware tests and experiments to define the technical capabilities that could be translated into sets of core gestures and inputs.

In parallel, we mined the innovation landscape to identify opportunities and white space for radar technology. We also took a deep dive into the language of gestural communication and ergonomic principles, defining a set of standard models for the ATAP team to test with the hardware.

Frame

To envision possible use cases, we facilitated a series of co-creative workshops that imagined connections and relationships between user archetypes such as Teens, the Differently-Abled, and the Elderly, across a variety of contexts including Education, Health, Fitness, and Home Life. This allowed us to define the primary interaction models for the sensor, culminating in a set of four characteristics that gave Google a clear view of the areas worth further investigation and future development.

Make, Test, Evolve

From sketches to animations to hardware prototypes, our interaction principles were then stress-tested against a selection of tasks with different users. We developed high-fidelity working prototypes of the most strongly validated use cases, which the ATAP team could integrate into the ongoing development of the hardware.

Ultimately, our discovery work manifested in three key embodiments of the radar technology: as a “precious” device (like a Google Home or a Nest thermostat), as an embedded component within another device, or as a ubiquitous presence (radar everywhere!). Each embodiment was packaged with its own set of interaction principles and gestural language.
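
As a rough illustration (not part of the project deliverables), those three embodiments could be encoded as interaction profiles, each bundling its own principles and gestural vocabulary. The field names and values below are hypothetical:

```ts
// A hypothetical encoding of the three embodiments described above.
// Principles and gesture names are illustrative, not from the Soli dev kit.

type Embodiment = "precious-device" | "embedded-component" | "ubiquitous";

interface InteractionProfile {
  embodiment: Embodiment;
  principles: string[]; // guiding interaction principles for this embodiment
  gestures: string[];   // the subset of the core gestural language it uses
}

const profiles: InteractionProfile[] = [
  {
    embodiment: "precious-device",    // a standalone object, like a smart thermostat
    principles: ["direct manipulation", "rich feedback"],
    gestures: ["button", "dial", "slider"],
  },
  {
    embodiment: "embedded-component", // radar inside another product, e.g. a wearable
    principles: ["glanceable feedback", "one-handed use"],
    gestures: ["button", "dial"],
  },
  {
    embodiment: "ubiquitous",         // radar distributed through an environment
    principles: ["proximity awareness", "implicit interaction"],
    gestures: ["presence", "reach", "button"],
  },
];
```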

Currently, Google has released its alpha dev kit for Soli, and evidence of our gestural language and principles has begun to appear in a range of consumer devices, from wearables to domestic products, proving that “your hands are the only interface you need.”