AllofUs partnered with Google’s Advanced Technologies and Projects team (ATAP) to explore and prototype use cases for Google Soli, their micro-radar technology.
Though the task at hand was to prototype applications and use cases for Google Soli, we were conscious that technologies like Soli don't sit in silos. So we began the project with a deep dive into the language of gestural communication and ergonomic principles, looking across a range of interactions from voice to gesture and touch. This helped us identify opportunities and white space for Google Soli's radar technology, which in turn helped us define a set of standard models for the Google ATAP team to test with the hardware.
To envision possible use cases, we facilitated a series of co-creation workshops exploring the connections and relationships between user archetypes, such as teens, the differently abled, and the elderly, across a variety of contexts including education, health, fitness, and home life. This helped us define the primary interaction models for Google Soli, culminating in a set of four characteristics that helped Google understand the potential areas for further investigation and future development.
Ultimately, the work we did identified three broad strategic implementations for Google Soli:
Each implementation was packaged with its own set of interaction principles and gestural language.
Google Soli is now in full commercial development within Google and is available as an alpha dev kit.
Soli is rumoured to be embedded in the new Pixel 4 phone, due to be released on 15 October 2019.