Weekly Metaverse #132: AR will enable a push model for information
And that'll usher in a change comparable to what smartphones brought.
Google Moves AR Forward
One of my big questions about Apple’s AR strategy has always been what killer apps they plan to launch with. Well, Google may be solving that problem for them - this week they announced new AR capabilities for Maps that are pretty much what I consider to be the prototypical use case for consumer AR.
When you search for a store or location in your vicinity, you'll be able to see relevant results in your field of view when you're looking at Live View. It sounds like an experience that's kind of pointless on a phone, but the phone obviously isn't the point - on a pair of AR glasses, this is going to be how you search your surroundings.
They're also improving the app's AR translation abilities. Having just come back from South Korea, I can say that even in its current form, Google Translate's AR mode is spectacularly useful. Still, it's definitely awkward to pull out your phone, load the app, point it at something, and wait for the results to load. As a result, I only used it where it was obviously valuable (menus were the main use case), while plenty of other text wasn't worth the hassle.
When you’ve got AR glasses, you can enable translation at the OS level, so when you look at something in a foreign language it translates anything that enters your field of view. At that point, all of the information in a foreign language around you is available by default, instead of you having to take action for each thing you want to read. That’s the power of a push model (where information is put in front of you when it’s relevant, without action from you) vs. a pull model (where you have to go get the information when you want it). If this is available by default in Google Maps on Apple’s forthcoming AR headset, it just might be enough that you see tourists wearing it in foreign countries.
And that’s a nice transition to the next part of today’s newsletter…
AR in the ER and the Power of Push
This week, I came across this depressing (but good) read about a guy who spent a meaningful amount of time in an ICU because his dad was there. It's worth a few minutes of your time if you want to take a pause for some non-metaverse reading, but if not you can just skip past it and get to the part where I wax poetic about the metaverse in healthcare (and, more generally, about how AR will enable a push model for information - a change comparable in significance to the one smartphones wrought).
One of the big issues that the author raises about the ICU is the lack of continuity of knowledge: “Second, there’s no institutional memory beyond the medical chart. While the medical chart says what was prescribed, it never says anything beyond that. So, for example, my dad’s poor reaction to Haldol was not included on the chart. We had to inform each new nurse not to give him Haldol, and then still had a new nurse give him Haldol when he was agitated.”
I can understand why this happens, because it mirrors a problem I sometimes face: information I want to remember at a particular time or place in the future. Sometimes there are good ways to do this - if I want to remember to take something with me when I leave in the morning, I put it in front of my door.
Other times, there are not. I have a small safe in my house (please don't come rob me, there's nothing worth much in it) with an electric keypad that's powered by an AA battery, which is inside the safe. Recently, I tripped over my attention-seeking dog and accidentally kicked the safe. I later tried to open the safe to retrieve an important document, only to realize that my kick had dislodged the battery, so the safe wouldn't open. Bear with me here, I'm getting to a point.
The makers of the safe, being the wise people that they are, included a secret keyhole covered by a piece of plastic along with a set of keys, presumably to be used if the keypad stopped working because the battery died. All I had to do was find those keys. Thankfully, after an hour of searching, as I got in my car to go to Home Depot and get some tools to get the damn thing open one way or another, I had an epiphany - the backup keys were in my glove box. (I put them there so if someone broke into my house, they wouldn’t find them. I don’t stand by my logic.)
So you’re probably wondering what the point of this rather mundane story is. That’s fair. This is it: AR (in combination with AI) is going to give us a superpower that’s going to improve the way we interact with the world - the ability to have a piece of information delivered to you when you need it.
The mobile internet gave this to us in a pull format - if you’re out in the world and you need to find a coffee shop, you get out your phone and look. Though it doesn’t seem like it these days, that was a huge development! Suddenly, instead of picking a coffee shop because you can see it from where you’re standing, which is not a great selection method if you want to optimize for good coffee, you could pick one based on all of the information on the internet. By pulling out the right piece of information at the right time, we could make vastly more informed decisions.
What AR/AI will do is allow information to be pushed to us. Back to our ICU example - the dad in this story shouldn’t have had Haldol, but where can you practically note that such that a nurse will find it at the right time? If you just put it as a note on the patient’s chart, you’re basically requiring every nurse to read every chart in its entirety every time they interact with a patient, just in case it contains information relevant to that interaction. If you’ve ever been in a hospital, you probably realize that there isn’t time for that.
Pop on your AR glasses, though, and when the little camera on them sees you reaching for the Haldol, you suddenly get a big red alert that says “DON’T GIVE THIS PATIENT HALDOL” smack dab in the middle of your vision. Important information, pushed to you at the exact moment it’s useful.
The hospital is an ideal place for this, because the amount of information you could have about a given patient could cover reams of paper, and a variety of different hospital staff might interact with that patient. There was a comment on the hospital post in which a nurse said that one thing that commonly gets lost is whether a patient has one arm that's significantly better for drawing blood. If a nurse figures out your left arm is way better, then when she goes off shift, the next nurse may well end up jabbing your right a bunch of times.
That sounds bad, but again - there’s no great way to convey that information. You can’t put a little area on the chart for “best arm to draw blood from” because there are so many pieces of information like that that the chart would be longer than Infinite Jest. If you tell your handy-dandy AI patient manager that the left arm is best, though, then the next time a nurse goes to draw blood, “USE THE LEFT ARM” will be right there.
There are plenty of more mundane uses for this:
You put your backpack in the trunk of an Uber. When you arrive at your destination, you get a notification reminding you to get your backpack out of the trunk.
Your significant other texts you and asks you to pick something up from the grocery store on your way home from work. If you're like me, about 50% of the time you'd go into autopilot, arrive home, get reminded that you were supposed to get something, and then go back out and get it. Instead, you get a reminder when you're near the grocery store.
You saw the perfect gift for someone in August and bought it, then you put it away in the garage for Christmas. When you start doing Christmas shopping, you’re reminded that you bought it and exactly where it’s stored.
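All three of the reminders above reduce to the same trigger: fire a note when the wearer comes within some radius of a tagged location. Here's a toy sketch of that check under assumed coordinates and names (due_reminders, the 200 m radius) that are illustrative, not any real AR SDK:

```python
# Location-triggered "push" reminder sketch: a reminder fires when the
# wearer is within radius_m of its anchor point. Names and coordinates
# are illustrative only.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def due_reminders(position, reminders, radius_m=200):
    """Return the texts of reminders anchored within radius_m of position."""
    lat, lon = position
    return [text for (rlat, rlon, text) in reminders
            if haversine_m(lat, lon, rlat, rlon) <= radius_m]

reminders = [(37.7749, -122.4194, "Pick up groceries")]
print(due_reminders((37.7750, -122.4195), reminders))  # near the store: fires
print(due_reminders((37.8000, -122.4500), reminders))  # far away: []
```

In practice the OS would run this check for you (mobile platforms already expose geofencing for exactly this), which is what makes it push rather than pull: you set the rule once and the information comes to you.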
Some of these lean more on AI than others, but at this point, given the speed at which AI is progressing, I’m kinda figuring we’ll all have our own version of Jarvis by the time we have high-quality AR glasses in the form factor of my Ray-Bans.
Metaverse News
Niantic shows off AR headset prototype powered by Snapdragon AR2 Gen 1: Niantic's an intriguing one. On the one hand, they're primarily known for software. On the other hand, they've arguably done more than anyone to bring AR mainstream. They clearly know what makes AR engaging, and I certainly look forward to seeing what they can do with control of both the hardware and software sides of the experience.
Are Augmented Reality Menus the Future of Restaurants?: Instead of reading about a dish, you can see it in 3D and watch a video of the chef preparing it.
Apple’s new job listings point to 3D metaverse world: Apple’s hiring for VR and AR - no surprise there.
Bloomingdale's opens multi-brand metaverse store for holidays: It's an interesting concept, though I don't think clothing is the right category for this kind of thing yet. VR furniture shopping feels like it'd be a much better fit.
Magic Leap CEO details the future of augmented reality: Always worth hearing from a longtime leader in the space.