The metaverse – everyone has different ideas about what exactly it is, but what most agree on is that it’s the “next level” of the internet. More immersive than the current flat, 2D web, and more connected than current social media, it promises us an escape into digital realities where we can work, play, and live unfettered by the constraints of geography or the physical world.
One area of uncertainty – or certainly an area where there are differences of opinion – is how exactly we will interface with this brave new world. Mark Zuckerberg has renamed his company Meta to signify how important he thinks the idea is. He is throwing his money behind virtual reality (VR) headsets, which he clearly feels will be the default gateway into a world of 3D, fully immersive experiences. Others – such as those behind the popular Sandbox and Roblox platforms – say there is a lot of mileage still in flat displays, such as computers and mobile phones. After all, most of us already have them, removing the friction of requiring an expensive new hardware purchase simply to join in the fun.
A third option – which in some ways is really a middle ground between two extremes – revolves around new applications of augmented reality. Rather than fully immersing the user in computer-generated visuals, AR overlays digital imagery on the user’s view of the actual world. It can work via smartphones, as demonstrated by what is probably still the “killer app” for the concept, Niantic’s Pokemon Go, or Instagram and Snapchat’s plethora of “funny face” filters. But I believe it will also become increasingly common to access AR via headsets or glasses. This is an application of the technology that’s already been adopted in industry and could soon be found in all manner of consumer use cases.
Why AR in the metaverse?
The reason that I believe AR will find its most valuable use cases in persistent, online realities is that the most exciting metaverse ideas involve blurring the boundaries between the real and the virtual. While we can certainly look forward to fully immersive 3D environments in which to play games, work, and socialize, to most of us it won’t be practical or desirable to spend all of our time wearing a clunky headset cutting us off from the real world.
AR – as well as its hybrid cousin mixed reality (MR) – means we can stay in touch with what is happening around us as we carry out tasks that still inevitably require contact with the real world. For work, this means we will be able to hold meetings where colleagues we are co-located with appear alongside remote workers, allowing everyone to collaborate as if they are in the same room. Current VR-based co-working solutions, such as Meta’s Horizon Workrooms, treat all participants as if they are located remotely, even if some happen to be sitting in the same room. Interacting physically with a participant who is sitting next to us requires us to remove the headset and drop out of the virtual environment. This is a barrier that wouldn’t necessarily exist in an AR environment designed to remove the boundaries between physical and virtual.
An AR metaverse retail experience might involve visiting a real outlet to try on physical products in an actual dressing room after we’ve shortlisted them by browsing the shop’s entire inventory virtually. At the same time, we will be able to get advice and opinions from a friend who is tagging along via their own glasses, which they are wearing at home.
And AR has already started to establish a foothold in gaming thanks to the previously mentioned Pokemon Go as well as other games featuring well-known brands such as Harry Potter, Jurassic Park, and Batman. By throwing more metaverse elements into the mix, such as social integration and persistent, personalized worlds that players can build and explore together, we can expect to see entirely new gameplay experiences emerging as the technology matures.
Glasses – the ideal metaverse device?
It’s easy to understand why many believe that glasses are likely to present the most streamlined and user-friendly way of bringing many of these ideas to life. I recently spoke to David Jiang, who, after spending time at Harvard and with Google, where he was one of the first people in the world to experience Google Glass, has now co-founded a high-tech immersive glasses startup called Viture.
He told me that he sees the journey towards eventual uptake of glasses – not just to access the metaverse but to carry out many of the functions that we rely on mobile phones for today – as the next step in a natural progression that mirrors how the way we use many other devices has evolved. Clocks and timepieces, for example, progressed from immobile devices like sundials to bulky mechanical clocks, to pocket watches that we carry with us, and then to wristwatches – the first true “wearables.”
“This is a trend that will never go backward,” he says. “A device like a computer goes from immobile to mobile, and then wearable – desktops, to mobile phones, and later to wearable devices like AR glasses.”
Jiang also sees glasses and slimline head-mounted displays as the device form factor that will allow developers to unleash the potential of the metaverse. This is another way in which they represent the next evolution of the user interface after the mobile phone. The smartphone was, after all, the form factor that enabled a new breed of web 2.0 apps, built around giving users the ability to upload their own content and interact directly with each other. Without smartphones, it’s unlikely that the likes of Uber, Instagram, and Airbnb would have become as ingrained in our lives as they did. It’s certainly possible that AR and MR glasses will be the enabler of whatever it is that we come to see as the defining application of web3 and metaverse technology.
So does this mean the smartphone has had its day? Well, it’s certainly possible. If the metaverse is really going to change our lives in all the ways that it’s predicted to, is it likely that we’ll want to continue interacting with it through a tiny screen we have to carry around in our pockets? Or will we prefer a screen that can instantly expand to any size we need it to be, filling our entire field of vision when we are watching a movie or playing a game, or shrinking to postage-stamp size when we want to talk to someone as we go about our day-to-day business?

AR allows information about what we are seeing to be overlaid directly onto our view of the real world. Looking at our car might instantly tell us how much charge is left in the battery, while looking at a fellow attendee at a business networking event or singles speed-dating event might tell us who they are and what they are interested in talking about.

From movies and gaming to shopping, working, and relaxing, the form factor of the device we are using plays a big role in the possibilities we have as we communicate and consume information in the digital realm. While there have been false starts before, such as the original iteration of Google Glass, it may simply be that the hardware appeared before the use cases that made it worthwhile. With the age of the metaverse fast approaching – if we are to believe what we are told – it could be a different story for the upcoming generation of devices.
Click here to see my webinar with Viture CEO and co-founder David Jiang, where we discuss his Viture One mixed reality glasses which recently raised over $3 million via Kickstarter, as well as dive deeper into his vision of how headsets and glasses will shape the metaverse.