The Resonance Test 41: David Rose

“I’d Like to Focus on the Positive Valence Stuff”

November 8, 2019
by Toby Bottorf and Ken Gordon

What happens when a renowned futurist joins EPAM Continuum? Here’s what happens: things get interesting. You feel this immediately by listening to a just-recorded conversation between our new colleague, David Rose—author of Enchanted Objects, former MIT Media Lab savant and Warby Parkerite—and our less-new but equally articulate colleague, Toby Bottorf. What’s special about Rose? It’s not just his informed outlook on augmented reality (“The new term is ‘spatial computing’”) or his colorful experiences in tech and business. Yes, Rose casually deploys a variety of interesting words in conversation—“the topology of your face,” “pupillary distance,” “e-com funnel,” “a Louis XV gold sapphire brooch,” “vomiting rainbows on the plane of your face,” and “pre-attentive processing”—but that’s not it either. It is the multiple possibilities Rose’s optimistic imagination brings to our work. “I tend to think that there are more journalists, whistle-blowers, and people who are willing to be alarmist in the world, rather than people who are trying to create the desirable future states,” he says. “And so I’d like to focus on the positive valence stuff.” We’re with him! Listen up, and you just might be, too.

Host: Pete Chapin
Editor: Kyp Pilalas
Producer: Ken Gordon

The Resonance Test 41: David Rose

TOBY BOTTORF

We are talking today with David Rose, one of our newest colleagues at EPAM Continuum, a guy who spends a lot of time thinking about the future—but he’s [also] got a pretty interesting past. So, David, you’re new to EPAM Continuum. You’ve come to us from Warby Parker and, before that, the MIT Media Lab. Tell us a little bit about what you just finished doing.

DAVID ROSE

Well, at Warby Parker, as you know, the company is known for its home try-on service. So, if you’re inspired to buy a pair of affordable glasses, they will ship five to your home, without prescription lenses in them. You can try them on for your friends and family, and [you might] decide that you fall in love with lens number two or frame number two—and from there, you can finish the order online without ever having to go into a store. And that presented a big problem for the company, because many people, once they’ve made a frame choice, are asked: “So what’s your prescription?” It’s not like we keep our prescriptions in the bottom file drawer. I mean, does anybody keep file drawers anymore? Not only was it hard to lay your hands on a prescription because it was at the nearby optometrist, who was not eager to give [the] Warby Parker call center operators that prescription because they would lose the sale and they knew it… but also, there’s a regulation in this country that, in many states, you have to get a new script every couple of years (every two years), despite the fact that your eyes don’t actually change very quickly.

TOBY BOTTORF

My prescription, like a prescription for a drug, expired—

DAVID ROSE

—Right, right.

TOBY BOTTORF

—after two years.

DAVID ROSE

Well, by design.

TOBY BOTTORF

Uh-huh.

DAVID ROSE

There’s a lobby for that.

TOBY BOTTORF

OK.

DAVID ROSE

So, we tried to solve that problem by building a new business that needed to be built, which is to verify and correct your prescription from the comfort of your own home. And if you think about an eye test, it’s not that complicated. There has to be a stimulus, also called an optotype: those tumbling Es or Landolt Cs, or sometimes even pictograms designed for kids [who] are pre-literate. You have to be a certain distance away from the unknown stimulus, the optotype. And you have to test people to the point of failure, so that they can’t see the 20/15 line, the 20/10 line.

We built a system that used a computer, your laptop for example, to put up the optotypes. Your phone [would] detect, very precisely, how far away you were from said eye chart, and then your phone [would] indicate what you saw. So, you swipe with the phone, as if it were a remote control, in the direction of the opening of a Landolt C, which can be one of eight different orientations. And then, with that information, we ask you a set of questions about the history of eye disease in your family and other conditions. We can send that data to an ophthalmologist in your state, [as required] by law, and then they can issue a new prescription. It was my job to put together a team to prototype this, to iterate on the user experience, and ultimately to roll it out in 20 states, which make up about 80% of their customers. And it’s perceived as a real success because it removes a really important point of friction in that user journey.
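
[Ed. note: For the curious, here is a minimal Python sketch of the test-to-failure loop Rose describes: eight Landolt C orientations, swipe matching, and progressively harder lines. The thresholds, line names, and `read_swipe` hook are illustrative assumptions, not Warby Parker’s actual implementation.]

```python
import random

# The eight possible Landolt C gap orientations, in degrees.
ORIENTATIONS = [0, 45, 90, 135, 180, 225, 270, 315]

# Acuity lines shown in order, easiest to hardest.
ACUITY_LINES = ["20/40", "20/30", "20/25", "20/20", "20/15", "20/10"]

def swipe_matches(swipe_deg, gap_deg, tolerance=22.5):
    """True if the swipe direction points at the C's gap, within tolerance."""
    diff = abs((swipe_deg - gap_deg + 180) % 360 - 180)
    return diff <= tolerance

def run_test(read_swipe, trials_per_line=5, pass_ratio=0.6):
    """Show progressively harder lines, testing to the point of failure.

    `read_swipe(gap_deg)` stands in for the phone reporting the user's
    swipe direction for a randomly oriented optotype.
    """
    best = "worse than 20/40"
    for line in ACUITY_LINES:
        gaps = random.choices(ORIENTATIONS, k=trials_per_line)
        correct = sum(swipe_matches(read_swipe(g), g) for g in gaps)
        if correct / trials_per_line < pass_ratio:
            break  # tested to failure: this line can't be read reliably
        best = line
    return best

# A user who always swipes exactly at the gap reads every line:
print(run_test(lambda gap: gap))  # -> "20/10"
```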

TOBY BOTTORF

You’re wearing glasses. I wear glasses. Certainly the eye test is one of two pain points for me. The other is, if I take off my glasses, I can’t see very well. To try on a new pair is kind of like, “I don’t know what I look like in these.” Because I can stand in front of a mirror but I can’t see clearly. There may be some remedies for that with interesting technology, too, right?

DAVID ROSE

When I was at Warby, the iPhone X came out, which has a front-facing camera [that] can unlock your phone. So that technology, which I think Apple acquired, turned out to be perfect for the second use case, which is: “Show me glasses on my face in a really convincing way—we call it ‘virtual try-on’—that will inspire so much confidence in what I will look like with these new glasses that I will be compelled to buy them.” And luckily, because Apple, in that system, has a[n] infrared projector and an infrared camera that’s dedicated just to reading the topology of your face as being unique to you, we could use that point cloud, that 40,000-point cloud, in order to understand your pupillary distance, your nose-bridge height, the width of your face, and then really convincingly put a 3D model of glasses, with textures and shading and transparency, on your face.
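
[Ed. note: The measurements Rose lists reduce to distances between 3D landmarks. A minimal sketch follows, assuming hypothetical landmark names pulled from a face point cloud; this is not Apple’s or Warby Parker’s API.]

```python
import numpy as np

def dist_mm(a, b):
    """Euclidean distance between two 3D points, in millimeters."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def fit_measurements(landmarks):
    """Derive frame-fitting measurements from depth-camera face landmarks.

    `landmarks` maps hypothetical names to (x, y, z) points in mm,
    as might be extracted from a dense face point cloud.
    """
    return {
        "pupillary_distance_mm": dist_mm(landmarks["left_pupil"],
                                         landmarks["right_pupil"]),
        "face_width_mm": dist_mm(landmarks["left_temple"],
                                 landmarks["right_temple"]),
        "nose_bridge_height_mm": dist_mm(landmarks["nose_bridge_top"],
                                         landmarks["nose_tip"]),
    }
```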

That’s been a huge boon to the business because a lot of that population, as you might guess, does have iPhone X[s] or now iPhone 11s, and they’re trying on glasses rabidly and buying through that experience. That whole idea of a virtual try-on, I think, is one of the best use cases for augmented reality. There are many, many use cases; some people say there are maybe 15 or 18 things you could use this mixing of virtual with real [for]. But for many businesses, where the rubber hits the road is being able to be in the e-com funnel. “Let me see what those glasses look like on my face. Let me see what the shoes look like on my feet. Let me see what the haircut looks like on my head. Let me see what the makeup looks like [on my face] or even something like [trying] jewelry [on my body].” My friend works at Cartier in New York, and they have all these shop window displays on Fifth Avenue that millions of people walk by. After 5:00 p.m., when Cartier, for safety reasons, takes all the jewelry out of the window … we spoke about, “Well, that would be a great opportunity for virtual try-on: You’re walking by the window; the window sees your face; it reflects your face; and now [it reflects your face] with an amazing new Louis XV gold sapphire brooch.”

TOBY BOTTORF

For folks [who] haven’t applied social media filters to their pictures (they might not know what we’re talking about): “Augmented reality” is a couple of big words. What are we talking about? Also: What aren’t we talking about?

DAVID ROSE

With augmented reality—the new term is actually “spatial computing,” and I kind of like “spatial computing” more—it’s taking the base plane of the real world and superimposing a digital layer on top of that base plane. So, if the base plane is what is revealed to you in your front-facing selfie camera, it’s putting other ornaments or decorations or costumes or vomiting rainbows or whatever on the plane of your face, and allowing you to see that, oftentimes in real time. If you take the other camera on the phone, it’s pointing that camera at the world and being able to recognize where you are. For example, Google Nav now has an application where, if you walk out of the subway in New York and you’re lost—like most of us are in that moment—rather than looking down at a plan-view map, now a fox pops up. And it’s a red fox, kind of from the Pokémon vocabulary, and you just follow the fox. That is a use of having the outward-facing camera recognize where you are in Manhattan, based not on putting QR codes on any buildings but on just recognizing the skyline, because all of those buildings have now been trained on, so it can determine that you’re at 42nd and 8th. And then they superimpose, on the view through the camera, the red fox, which is now walking down off to the right—so now you follow that fox.

TOBY BOTTORF

Sounds both charming and easier to follow.

DAVID ROSE

[Laughs] Well, I think a lot of people are debating: What is the killer app for this new mixed-reality world? Is it navigation (the following-the-fox example)? Early signs of what AR will become are now found in lots of high-end cars, right? You throw the car into reverse; not only do you see, through the rear-facing camera of the car, what you shouldn’t hit or what you might hit, but also there’s a superimposition of, “If you turn the wheel to the left, here’s an arc of how the car will move out of the parking spot,” for example. Or, in your front-facing heads-up display, you can see a path superimposed on the road in front of you in order to figure out where to turn next. This, generalized, will become what is baked into glasses.
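
[Ed. note: That backup-camera arc is simple geometry. Here’s a rough Python sketch using the textbook bicycle model (turning radius = wheelbase / tan(steering angle)); the numbers are illustrative, not any carmaker’s implementation.]

```python
import math

def reverse_arc(wheelbase_m, steer_deg, length_m=5.0, n=20):
    """Points along the arc a reversing car sweeps at a fixed steering
    angle, in the car's frame (meters), ready to draw over the camera
    image. Positive steer_deg = wheel turned left.
    """
    if abs(steer_deg) < 1e-3:
        return [(0.0, -length_m * i / n) for i in range(n + 1)]  # straight back
    radius = wheelbase_m / math.tan(math.radians(steer_deg))
    return [(radius * (1 - math.cos(s / radius)),
             -radius * math.sin(s / radius))
            for s in (length_m * i / n for i in range(n + 1))]

# 2.7 m wheelbase, wheel turned 20 degrees left:
for x, y in reverse_arc(2.7, 20.0, n=4):
    print(f"{x:6.2f}, {y:6.2f}")
```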

TOBY BOTTORF

Google Glass. I remember trying those on when they were brand new and I felt: “Man, this is like a Segway for your face.” Not something I wanted.

DAVID ROSE

There’s a very bright line between glasses that show disembodied information floating in front of your head (when you turn your head, it turns with your head; that was Google Glass) and the new technology that is part of HoloLens, part of Magic Leap, and part of 50 other glasses companies that are coming out soon to e-commerce sites near you, which is a technology called “simultaneous localization and mapping,” otherwise known as SLAM. This SLAM technology uses front-facing cameras that are embedded in the frames of glasses in order to image the world in front of you, which then gives you the ability to actually superimpose information on the physical world, in places where it’s appropriate to put information in the physical world. So if there’s an open space on a wall, or an open space on a table, you might superimpose information [on] those places. And when you turn your head, the information stays right there.
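
[Ed. note: The “stays right there” property falls out of the math: SLAM estimates the camera’s pose, so content is stored in world coordinates and merely re-projected each frame. A minimal sketch of that idea, not any headset’s actual pipeline.]

```python
import numpy as np

def world_to_view(point_world, cam_rotation, cam_position):
    """Express a world-anchored point in the headset camera's frame.

    `cam_rotation` is the 3x3 camera-to-world rotation SLAM estimates;
    `cam_position` is the camera origin in world coordinates. The anchor
    never moves in the world; only its view-space coordinates change
    as the wearer turns their head.
    """
    offset = np.asarray(point_world, float) - np.asarray(cam_position, float)
    return cam_rotation.T @ offset

# A note pinned to an open patch of wall, in world coordinates (meters):
anchor = [2.0, 1.5, 3.0]

theta = np.radians(30.0)  # wearer turns head 30 degrees about the vertical
turned = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])

print(world_to_view(anchor, np.eye(3), [0.0, 1.5, 0.0]))  # looking straight on
print(world_to_view(anchor, turned, [0.0, 1.5, 0.0]))     # same anchor, new view
```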

TOBY BOTTORF

That sounds really promising to me, in terms of lightening the cognitive load, because if the information has object permanence, I can go back to it. The flip side is that if the information is not connected to the world, it has to have some other information architecture to be accessed and re-found—and then you gotta learn that, you know, cyber-information architecture, as opposed to [it] just being in the world.

DAVID ROSE

That’s right. I founded a company, 19 years ago now, called Ambient Devices, where the big idea was: Could you take this insight that cognitive psychologists have had for a long time, which is called pre-attentive processing—this is the ability for your brain to process things in peripheral vision, in parallel, in less than a quarter of a second, without any cognitive load—and design more information that can exist in your visual periphery that you love having around? Because it’s just like having a window around: without even glancing, you can tell it’s raining outside, or it’s getting dark, or lots of other information that our reptilian brains can perceive in the periphery. We were making dedicated objects to render information as light or as a pattern or as an angle or other pre-attentive phenomena. But now, with AR, you can imagine painted pixels or a digital dashboard that tells you about the things that you most care about in your periphery. Right? Everyone has a set of things that they care about. For some of us, these things are jumping onto wearables. Like all of the things that are on your watch—“Are you late for your next meeting? Is there an important notification from someone that you care about? Are your servers down? How’s the mood of a loved one?”—all of these status indicators. But now they don’t have to be on your phone. Now they can be kind of spread thinly in the world around you. And that will happen through augmented reality.
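
[Ed. note: “Pre-attentive” rendering means collapsing data onto one glanceable channel such as color, angle, or pattern. A toy sketch of the kind of mapping an Ambient Devices lamp might use; the scale and numbers are assumptions.]

```python
def value_to_hue(value, lo, hi):
    """Map a scalar (say, chance of rain) onto a single pre-attentive
    channel: hue, from 240 (calm blue) down to 0 (alert red), so the
    brain can read it peripherally with no cognitive load.
    """
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return 240.0 * (1.0 - t)

print(value_to_hue(0.10, 0.0, 1.0))  # 10% chance of rain -> deep blue (216.0)
print(value_to_hue(0.90, 0.0, 1.0))  # 90% chance of rain -> near red (24.0)
```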

TOBY BOTTORF

So they’re a little bit less needy of attention.

DAVID ROSE

Well, that’s where design comes in. A lot of what has been prototyped or speculated about the coming AR future is a world of immense clutter, where you can imagine [the] superimposition of reputation systems, or navigation, or advertising just totally polluting our visual field. But that’s kind of the dystopian view of it. I think, with design and with understanding people’s needs and understanding people’s psychological states and really deeply understanding people, we can start to provide services that can exist in your visual field that are things that you want.

TOBY BOTTORF

I’m glad you brought that up. If we’re talking about what the killer app for spatial computing or augmented reality might be, I want to also make sure we spend a minute thinking about our current understanding of the worst-case scenario[s], because a lot of new technology has unintended consequences. I think there’s general consensus that the dystopian view of this is that it’s just billboards everywhere; that again, your attention has been monetized.

DAVID ROSE

The popular fear is that this coming world of kind of augmented vision is going to be mostly undesirable and that the big players that are monetizing attention today, through advertising, will have a brand-new rich field to clutter with promotions of all sorts. And they’ll be able to do it in a much more insidious way because, not only will they know where you are, but they will also know what catches your eye and what, specifically, your gaze vector is looking at. As you walk by a sign or a store display and you dwell for 400 milliseconds on something, well, then that’s going to be retargeted at you.
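
[Ed. note: The retargeting mechanism Rose worries about needs only a gaze target and a timer. A minimal dwell-detection sketch under those assumptions, not any vendor’s eye tracker.]

```python
def dwell_events(gaze_samples, threshold_ms=400):
    """Flag targets the gaze rests on for at least `threshold_ms`.

    `gaze_samples` is a list of (timestamp_ms, target_id) pairs, one per
    frame, where target_id is whatever the gaze vector currently hits
    (or None). Each dwell is reported once.
    """
    events, current, start = [], None, 0
    for t, target in gaze_samples:
        if target != current:
            current, start = target, t
        elif target is not None and t - start >= threshold_ms:
            events.append((t, target))
            start = float("inf")  # don't re-report the same dwell
    return events

samples = [(0, "brooch"), (150, "brooch"), (300, "brooch"),
           (450, "brooch"), (600, None), (750, "watch")]
print(dwell_events(samples))  # [(450, 'brooch')] -> off to the ad exchange
```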

TOBY BOTTORF

Gonna follow you home.

DAVID ROSE

Yeah, we’ve been working here on a set of hazards. I’m calling this whole field, the intersection between computer vision and wearables, SuperSight, because that is the promise. That is the fantasy. That is the superpower. [We’re talking about] X-ray vision and being able to see in time-lapse, being able to see in slow-mo, being able to see through things, being able to see the meta cloud of information that now surrounds everything.

TOBY BOTTORF

The glasses that are sitting on my face were invented close to 700 years ago. There haven’t been a whole lot of upgrades. They’re super-hip. They’re called “progressives,” which is a much nicer word than “bifocals,” but there hasn’t been a lot of technological advancement.

DAVID ROSE

And those convex lenses were hip, even then. Actually, glasses have been fashionable since the invention of glasses. Part of my job at Warby Parker was to think about what new sensors and display technologies might be embedded in the temples and the frame—literally, the frames of glasses. And some of the things we were looking at, which I think are still intriguing use cases for many types of companies, [were the following:] One is to just make them AirPods. This open-ear audio is something that Bose has recently commercialized in their Bose AR frames. It’s just another form factor for headphones—and a microphone—and I think those get more and more interesting as you find specific service design scenarios that take advantage of the open-ear-ness: the fact that you could be biking as well as listening and still hear the traffic. You could be doing an outdoor yoga session, and these frames, because they have an accelerometer and a compass in them, will know that you’re doing downward dog and not just faking it. If you’re running a coaching app, like Aptive, they will know what your pace is. They recently enabled a new gesture, which I think is actually brilliant because it’s a familiar gesture: if you look skyward, they tell you what the AccuWeather forecast is. Isn’t that nice? That’s not a new, novel gesture that you have to learn. That’s just learning from how people move their heads around, and doing something sympathetic to an existing gesture.
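
[Ed. note: The look-skyward gesture is essentially a pitch threshold on the gravity vector. A hedged sketch, assuming a made-up axis convention where +z points out through the lenses; Bose’s actual firmware is unknown to us.]

```python
import math

def looking_skyward(gravity_xyz, pitch_threshold_deg=35.0):
    """Detect a 'look up' head gesture from a glasses accelerometer.

    `gravity_xyz` is the measured gravity vector in a hypothetical frame
    coordinate system where +z points straight ahead out of the lenses;
    tilting the head back rotates gravity toward that axis.
    """
    x, y, z = gravity_xyz
    pitch = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pitch > pitch_threshold_deg

print(looking_skyward((0.0, -9.8, 0.0)))  # level head -> False
print(looking_skyward((0.0, -6.9, 6.9)))  # head tilted back 45 deg -> True
```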

TOBY BOTTORF

It means for me that when I’m trying to remember something, I’m going to be told the weather. [Rose laughs] Because that’s my habit.

DAVID ROSE

Well, maybe for you, they need to be personalized to actually tell you the thing you’re trying to remember.

TOBY BOTTORF

We went off on a nice interesting tangent. I want to loop back to what you were describing as a part of SuperSight and these hazards.

DAVID ROSE

We’ve described six buckets or clusters of concern. For each of them, it’s my intention not just to raise them as issues or catastrophize about the problem, but also to propose a remedy, a way out—either a design way out or [a] legislative way out—of the woods. The first one is social insulation: in the same way that we have filter bubbles today, different people are experiencing different realities in terms of the news. I think that’s going to be one of the most profound effects. You could be walking hand-in-hand with someone through the city, and he could be interested in another set of things [from you], perhaps superheroes jumping around, like Spider-Man weaving around buildings. I could be interested in architectural history. I could be getting my glasses for free, subsidized by ads. We would [each] be experiencing a totally different stroll. What does that do to social interaction between two people? So, I think there need to be new syncing gestures that allow people to see the same thing. That’s another opportunity for design.

TOBY BOTTORF

The same thing was true of cell phones, when people started talking to themselves in public. The first impression was: “Uh-oh, keep a wide berth.” Now it’s become a little bit more normal. I do think that there’s going to be a problem if you can’t tell what somebody else is seeing. Are they looking at you or looking through you at something else?

DAVID ROSE

Thad Starner was a proponent of wearables when I was at the Media Lab in the ’90s. It was very hard to talk to him because he was always staring past you and doing something else, using a chording keyboard in his pocket. Something was always going on. In addition to talking to you, he was also coding. I have a pair of the North glasses, which have a little pico projector in one of the temples, and they just launched a Twitter feed this week, much to my chagrin. [Laughs] So, in addition to getting a heads-up display for your talk, so you can see what to say for each slide, or notifications, or when your Uber is coming, or what Spotify song is playing, now you can see your Twitter stream right there in your glasses. That’s going to be good for human interaction [laughs].

Let’s talk about another issue. In the same way that GPS and calculators, a long time ago, gave us these cognitive crutches, I think that [these crutches are] going to be another thing to deal with, in terms of having everything in front of you labeled. So, even after having been at EPAM Continuum for about four weeks, I still struggle with: “Who’s who? What are their roles? What projects are they working on? What superpowers do they have? Who’s an expert at what? Who aspires to what?” Remembering [all] that, times 130 people or so—I’m going to be at that for months.

TOBY BOTTORF

Can I tell you what my killer app for augmented reality is?

DAVID ROSE

Sure.

TOBY BOTTORF

Name tags.

DAVID ROSE

Do you worry that if everyone has a superimposed name tag, either where you’d put a real one or maybe larger, floating over their head, it will make you dependent upon that technology?

TOBY BOTTORF

So, if GPS is a precedent, then yes, most people’s sense of direction has really atrophied. Personally, I would imagine it would be a learning tool, and that it would be the kind of thing I could modulate a little bit, or have on a gradual fade, so that I could say, “Whoa—bring it back.”
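
[Ed. note: Bottorf’s “gradual fade” could be as simple as label opacity that decays with familiarity and snaps back on a recall miss. A purely hypothetical sketch; no such product is described in the conversation.]

```python
import math

def nametag_opacity(times_seen, recall_failures, fade_rate=0.25):
    """Hypothetical 'gradual fade' for an AR name tag.

    The label fades as you meet someone repeatedly, but each
    "Whoa—bring it back" moment (a recall failure) restores it, so
    the overlay trains memory instead of replacing it.
    """
    exposure = max(0, times_seen - 5 * recall_failures)
    return max(0.0, min(1.0, 1.0 - fade_rate * math.log1p(exposure)))

print(nametag_opacity(0, 0))   # stranger: fully visible (1.0)
print(nametag_opacity(20, 0))  # familiar face: mostly faded (~0.24)
print(nametag_opacity(20, 4))  # you blanked on them: back to full (1.0)
```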

DAVID ROSE

I think that’s the way out of these cognitive crutches. In the same way that pilots, because of autopilot, are probably becoming worse and worse the less they have their hands on the yoke of the plane. Maybe instead of just hanging out and drinking coffee, they should use the context of the cockpit to do simulations as they fly.

TOBY BOTTORF

There’s a great book, The Glass Cage, that talks about the perils of automation. Pilots are the best example because we’re dividing up the labor where we’re taking away the repetitive stuff that keeps [pilots’] skills sharp and asking [them] to only intervene very rarely and in an emergency. It’s the worst possible combination.

DAVID ROSE

Do you remember what other professions [Glass Cage author Nicholas Carr] spoke about? I think one was architecture, and he said the loss of hand sketching is homogenizing the buildings that we design in AutoCAD.

TOBY BOTTORF

That’s super resonant, because here at EPAM Continuum everybody, regardless of whether sketching is their medium or something else, thinks and sharpens their thinking by sketching. You can sketch in an Excel spreadsheet. You can sketch in 3D foam. But make a thing. Figure out what you’re really talking about by making.

DAVID ROSE

That’s kind of a perfect segue to what I’m doing here—

TOBY BOTTORF

—Yeah.

DAVID ROSE

—which is making things. I know that we have an amazing tradition of deep customer research and of being reserved and appropriately slow at inventing solutions before we understand context and people deeply. I think that the culture of the Media Lab is different, and maybe interestingly different. To me, the tradition there is more about quickly prototyping, almost blindly, quickly prototyping as a way of learning and understanding what a new material might be. If you liken electronics or sensing or augmented reality to a new material, there’s this [sense of]: Jump right in, try it; you will be wrong in lots of ways about what the abilities of this new material are! And if you’ve ever thrown a pot—have you ever thrown a pot on a wheel?

TOBY BOTTORF

I taught ceramics at summer camp one year when I was in college, having never thrown a pot before that. That was an interesting week before the kids showed up.

DAVID ROSE

So, the materiality of clay is such that it behaves very differently from what you might imagine. You can’t just do a sketch and then produce that thing, because you don’t have the ability to pull up the edge of a pot in the way that you would imagine. So, it’s about getting in there, experiencing the material, experiencing the plasticity of it, and the limitations of it. I think we can apply that same metaphor to this material of AI. You know: What will it be good at? How will it fail, interestingly? And where will it be brittle? And I firmly believe that the best way to understand that is to have an idea for something, start building it, start failing, and then you’ll discover. I think this notion of prototyping a concept quickly is maybe most true for new materials, where you really don’t have a sense of the capabilities and the limitations. Once you become an expert, then you know how to apply those materials to specific client situations. And you become a master craftsperson in that respect.

TOBY BOTTORF

I think there’s an ideal middle ground. I think our designers and strategists are super careful not to predetermine a technology or solution, which I think is right. But maybe we sometimes over-index on that, and I can totally see what you’re describing. (We reach for vision words, interestingly, even when we’re not talking about vision.) I see what you’re talking about, in terms of the difference between having a technology as a solution in search of a problem, trying to solve the wrong thing with it again and again and again (which is a stupid way of failing fast), and investigating it a little bit more open[ly] and curious[ly], so you can understand: What is its essential quality? What is it good at? What does it want to be? And you’re still going to go back to people and understand their needs, but with a greater fluency about potential solutions.

DAVID ROSE

Right: What is buildable? Just in terms of the categories of benefits that augmented reality might be able to deliver for us: I’ve been trying to catalog some of these wishes that people have and some of the types of benefits that [this technology] might deliver. In addition to the labeling of the world (big nametags [for people], [a taxonomy of] plants and animals, architecture, whatever you need labeled), the ability for this technology to look back at us and help us understand how we perform and what our state is, to me, is fascinating, because we’re now used to this idea of personal digital assistants. We’re used to those [digital assistants] answering Wikipedia [queries], gathering shopping lists, playing music for us. The question of where these digital assistants will proliferate is kind of answered, right? Amazon just launched rings and glasses and other things that will have this technology embedded. But if we can actually use this technology to look at ourselves, and kind of be the perfect coaches that help us play sports later in life, with more confidence and less injury; that can notice how we react in different situations; that can stimulate us when we’re about to fall asleep while driving; that can soothe us when we’re [stressed]; that can [reorient] us when we’re lost; that can seek help before we make mistakes—I think that’s really powerful: to think about how these things will be our companions, desirable companions—probably shut-off-able companions—as we go through all these tasks throughout our day.

TOBY BOTTORF

I like that you call them “companions,” because there’s a certain amount of life and relational stuff going on there. This conversation has been really cool for me, to get a sense of how you think about the future.

We’ve talked about dystopian scenarios and the list you just gave of, I thought, really optimistic scenarios. How do you balance optimism about the future—and where does that come from—with some guardedness around unintended consequences and potential negative use cases, dark patterns?

DAVID ROSE

My philosophy is that most new technologies can be used in positive or negative ways. Most of the things that we love about our spouse are also the things that drive us crazy. There are always two Janus sides to any new tool. The way that I want to be in the world is to optimize for and look for and design and build the things that help us understand each other more deeply, that connect [us] to each other more deeply, that help us understand more about the environment, that function in the world in a way that’s desirable. I think, at the same time, we do need to paint the provocative dark scenarios that will help us do things like privacy-by-design architectures as we build these things.

But I tend to think that there are more journalists, whistleblowers, and kind of people who are willing to be alarmist in the world rather than people who are trying to kind of create the desirable future states. I’d like to focus on the positive valence stuff.

Read the Dialog Box version of this conversation.
