Google’s Advanced Technologies and Projects team (ATAP) partnered with AllofUs to explore and prototype use cases for Google Soli, their micro-radar technology.
Though the task at hand was to prototype applications and use cases for Google Soli, we were conscious that technologies like it don't sit in silos. So we began the project with a deep dive into the language of gestural communication and ergonomic principles, looking across a range of interactions from voice to gesture and touch. This helped us identify opportunities and white space for Google Soli's radar technology, which in turn helped us define a set of standard models for the Google ATAP team to test with the hardware. We then worked with Google's engineers on a series of hardware tests and experiments to turn these models into workable, usable user interactions.
To envision possible use cases, we facilitated a series of co-creation workshops that looked at connections and relationships between user archetypes such as Teens, the Differently-Abled, or the Elderly across a variety of contexts including Education, Health, Fitness, and Home life. This helped us define the primary interaction models for Google Soli, culminating in a set of four characteristics that helped Google understand the potential areas for further investigation and future development.
We took the work from the research phase and built it into a range of prototypes and working proofs of concept. Using an iterative make, test, learn approach, we tested and validated the use cases we'd identified; the most successful went forward for further development with the Google engineers.
Ultimately the work we did identified three broad strategic implementations for Google Soli:
- As an embedded component within a device: fixed in a product
- As a ubiquitous presence: fixed in an environment, e.g. a room
- As an always-on "precious" device that reacted to your context: radar everywhere, e.g. a wearable that knew you were driving a car and offered the right tools for that occasion
Each implementation was packaged with its own set of interaction principles and gestural language.
Google Soli is now in full commercial development within Google and is available as an alpha dev kit. Evidence of our gestural language and principles has started to appear in a range of consumer devices, from wearables to domestic products.