It’s time for companies to embrace the immersive metaverse


Two of the strongest innovation trends today are immersive experiences and the development of the metaverse. Forrester predicts that 2022 will be the year organizations’ investments in immersive experiences will turn browsing into virtual living. The consumer technology industry’s own take on the metaverse, including speech-generated virtual worlds, is making headlines daily.

What is not discussed so often is how the blending of immersive and metaversal technologies and experience designs will amplify the effects of each other. With immersive and metaversal technology development accelerating, organizations don’t have time to wait and see what happens. Instead, they should focus on driving the change now.

When thinking about metaversal experiences, it’s important to realize that while the metaverse as a specific set of experiences offered inside a single technology company’s own space is in the early stages of development, the broader metaverse has quietly expanded over time. Most of us just don’t think of the technology-driven, connected experiences we already have as metaversal, but they are.

Consider the philosophical question of where the mind ends and the rest of the world begins. It is arguable that any way of capturing, sharing and analyzing data, information, knowledge and wisdom (DIKW) outside of people’s own minds and personal interactions is metaversal. By that standard, humans have slowly built up a metaverse over millennia, from cave paintings to the printing press to the telephone.

Now, thanks to the Internet, wireless connectivity, and new technological manufacturing capabilities, the pace of metaverse expansion is accelerating. Advances in technology also have the potential to make the metaverse less intrusive and more seamless. For example, a field service technician today can point a tablet at a device and get an AR image with an arrow pointing to the part they need to remove.

That’s useful, but it can go even further. Just as landline phone users were limited to making calls from specific locations where a phone was available, today’s metaversal experiences require an intrusive headset to deliver an immersive experience. Future experiences will be more like interacting with a smartphone, wherever you are and whatever you’re doing.

The differences between tomorrow’s immersive metaverse and today’s experience-sharing tools are the number of senses involved and the friction or intrusiveness of the wearables accompanying the experience. In an immersive metaverse experience, the technician’s glasses will draw their vision to a part, and when they look at that part, they will hear a pleasant sound or get a pleasant smell. The gloves they wear will direct their hands to the correct tool and help them use that tool correctly. If the technician does something that could injure them, the wearables will use the right senses to keep them safe. In the immersive metaverse, the technician intuitively interacts with reality to do their job safely and efficiently.

Right now, there are some obvious use cases for immersive metaversal experiences beyond cool and engaging brand engagement. The first is task handling, such as our immersive metaverse example involving the field technician. In these cases, customer experiences can be enhanced, optimized and made more cost-effective at scale with immersive metaversal solutions.

Perhaps the bank that provides customer service via app and phone calls will develop an immersive process that alerts customers to potential issues in real time and guides them through a solution on whatever device or platform the customer prefers.

Immersive, metaversal customer experiences can also accelerate the pace of B2B e-commerce and large-scale consumer purchases. We are already seeing car manufacturers display 3D models of builds as the customer selects options online. What if the customer could choose their options and then virtually walk around the car, look under the hood, sit in the driver’s seat and smell the leather? On the B2B side, what if a factory manager could have the same kind of immersive interaction with an industrial engine, instead of having to fly or drive to the manufacturer to see it before placing an order?

Businesses need to understand what’s possible in the metaverse, what’s already in use, and what customers or employees will expect as more organizations create immersive experiences to differentiate their products and services. The opportunities could include improvements in what companies do now, as well as revolutionary changes in the way companies operate, connect and interact with customers and employees to increase loyalty.

How can leaders start identifying opportunities in the metaverse? As always, start with low-hanging fruit like commerce and brand experiences that can benefit from immersive support. Also think about the technology that makes what you need possible. From an architectural point of view, it is helpful to think of immersive experiences as a three-tiered cake. The top layer is where users access through systems of engagement. The middle layer is where messages are sent, received and routed to the right people through integration systems. The bottom layer includes the databases and transactions – the registration systems.

As companies consider new options for user interfaces (UI), user experience (UX), and customer experience (CX), they need to think about how evolving metaversal technology and user expectations can affect those systems of engagement. For example, future engagement could include desktop, mobile, and wearable experiences, as well as experiences that have not yet been developed, such as headset experiences or an experience synthesized across all of these devices.

The possibility of rapid and dramatic changes in experiences requires thinking outside of silos. How will users move fluidly between entry points to those systems of engagement? Organizations that can figure that out can transform the way users interact with a more immersive, continuous experience. We are already seeing some organizations adopt this approach.

For example, customers who had a problem with a product or service called the company’s customer service for years as a first step. Few people enjoy calling customer service because it takes time and can be frustrating. Businesses don’t like to handle a lot of customer service calls because it’s an expensive way to resolve customer issues. Now, some organizations have moved most customer service processes to their app so that customers only need to talk to a service representative if they have a problem that the app can’t solve — and they can call from within the app. Expect to see that kind of fluidity grow across different touchpoints, especially as more immersive technology becomes available.
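The app-first flow described above, where self-service handles most issues and a human call is the in-app fallback, can be sketched as follows. The issue names and messages are illustrative assumptions, not any real company's service catalog.

```python
# Automated resolutions the app can handle without a human (hypothetical).
SELF_SERVICE = {
    "reset password": "Password reset link sent.",
    "check balance": "Balance shown in app.",
}


def handle_request(issue: str) -> str:
    """Try self-service first; escalate to a representative only on a miss."""
    if issue in SELF_SERVICE:
        return SELF_SERVICE[issue]
    # Escalation happens from within the app, so the customer never has to
    # switch channels, which is the fluidity the article describes.
    return "Connecting you to a service representative from within the app."
```

As more immersive touchpoints appear, the same routing logic can sit behind each of them, so the escalation path stays consistent across devices.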

The most important concept for organizations to keep in mind is that we already live, work and play where the metaverse meets reality. Now we wait to see how these new technologies will make the metaverse less intrusive and more immersive, but the fundamental building blocks for creating and delivering richer metaversal user experiences already exist for visionary organizations to work with.

Andy Forbes is the Salesforce solution architect at Capgemini Americas. Michael Martin is the enterprise architect of mobile solutions at Capgemini Americas.

DataDecisionMakers

Welcome to the VentureBeat Community!

DataDecisionMakers is where experts, including the technical people who do data work, can share data-related insights and innovation.

If you want to read about the very latest ideas and up-to-date information, best practices and the future of data and data technology, join us at DataDecisionMakers.

You might even consider contributing an article yourself!

Read more from DataDecisionMakers
