In the brand-new book, SuperSight: What Augmented Reality Means for Our Lives, Our Work, and the Way We Imagine the Future, David Rose writes: “SuperSight is this decade’s convergence technology. It inherits the last thirty-plus years of enabling technologies like machine learning, computer vision, wearables, edge computing, 5G wireless, deep personalization, affective computing, and new interaction paradigms like gesture and voice—packaged in the familiar wear-all-day form of glasses.”
In the latest episode of The Resonance Test, Rose—a friend and former EPAM Continuum colleague—unpacks that statement with producer Ken Gordon, pulling out a long chain of colorful conversational insights.
SuperSight is an integrated technology, Rose says, that can “orchestrate and help simplify and tune and customize a lot of other systems, as long as there's open standards for how things talk to each other.” Operating at the systems level can help us in a variety of almost magical ways, such as personalized digital coaching, more accurate medical diagnosis, and augmented learning.
But it’s not all good news—or a simple story. Rose walks through the garden of dramatically named SuperSight Hazards—Social Insulation, State of Surveillance, Cognitive Crutches, Persuasive Persuasion, Training Bias, and SuperSight for Some—taking the time to explain the real dangers of this developing tech.
Rose shuttles us all around the SuperSight universe, talking about creating prototypes to help people with handwashing during the pandemic (one of which involved “using cuteness to seduce people into washing for 20 seconds”), the possibilities of glanceable commerce (will we go from eye tracking to the shopping cart?), the challenges of diminished reality, SuperSight city planning, even using AR to read his book.
So listen to this SuperSight-flavored conversation. It’ll augment your intelligence.
Host: Kenji Ross
Editor: Kyp Pilalas
Producer: Ken Gordon