Just over a year ago, Meta Platforms (formerly Facebook) CEO Mark Zuckerberg outlined a vision for the future of the social media giant: the metaverse. "We believe the metaverse will be the successor to the mobile internet; we'll be able to feel present, like we're right there with people no matter how far apart we actually are," he said. He bet a huge amount of money on that vision. In November, he laid off thousands of employees working on the project.

I recently moderated a Web panel discussion on the Industrial Metaverse. I wanted to establish a definition. I failed. Perhaps the best short description of the metaverse is the point where the physical and digital worlds come together.

English technology analyst Benedict Evans wrote, "Sometimes it seems like every big company CEO has read the same article about the same tech trend, and sent the same email to their team, asking 'What's our strategy for this?!' A couple of years ago there were a lot of emails asking for a 5G strategy, and now there are a lot of emails asking about metaverse."

Yes, it seems like only yesterday I was swamped with press releases and requests for interviews about 5G. But that was yesterday, and yesterday's gone (to quote singers Chad and Jeremy).

Evans continued, “Answering the 5G email was actually pretty easy, partly because almost no-one needs a 5G strategy at all, but also because we knew what 5G meant. We probably don’t know what ‘metaverse’ means. More precisely, we don’t know what someone else means. This word has become so vague and broad that you cannot really know for sure what the speaker has in mind when they say it, since they might be thinking of a lot of different things.”

The metaverse needs a digital twin: a digital representation of physical reality. It also needs sensory input, plus visualization of the digital output, perhaps through AR glasses or VR headsets. Oh, and the application must make sense as part of the mix, or else why do it?
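The digital-twin half of that mix can be sketched in a few lines of code: a software object mirrors a physical asset, sensor input updates its state, and a visualization layer (a dashboard, AR overlay, or VR scene) queries it. This is a minimal illustration, not any vendor's implementation; the names (`PumpTwin`, `update_from_sensor`, the 80 °C limit) are all hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class PumpTwin:
    """Digital representation of a hypothetical physical pump."""
    asset_id: str
    temperature_c: float = 0.0
    rpm: float = 0.0
    history: list = field(default_factory=list)

    def update_from_sensor(self, temperature_c: float, rpm: float) -> None:
        """Mirror the latest physical-world readings into the twin."""
        self.temperature_c = temperature_c
        self.rpm = rpm
        self.history.append((temperature_c, rpm))

    def needs_attention(self, temp_limit_c: float = 80.0) -> bool:
        """A simple rule a dashboard or AR overlay might visualize."""
        return self.temperature_c > temp_limit_c


# Sensor readings stream in; the twin tracks the asset's current state.
twin = PumpTwin("pump-07")
twin.update_from_sensor(temperature_c=65.0, rpm=1450)
twin.update_from_sensor(temperature_c=85.5, rpm=1500)
print(twin.needs_attention())  # True: the twin flags the overheating pump
```

Real industrial twins are physics-based and far richer, but the pattern is the same: the digital object stays synchronized with its physical counterpart, and everything else (analytics, visualization, control) works against the digital copy.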

At the recent Rockwell Automation event, Automation Fair, the word metaverse did not come up until my third day, and even then it was used ironically. Rockwell did have an exhibit with a virtual reality headset. The application about to be released, however, was on flat screens accessible by all.

The metaverse experience that Siemens showcased at its Web Summit builds on the company's digital twin technology—physics-based, real-time, photorealistic digital twins. By offering a completely digital model of the underwater farm, this technology enables the Nemo's Garden team to develop, adapt, and control its underwater biospheres at scale. The photorealistic metaverse experience at Web Summit is implemented with Unity, a cross-platform game engine.

With VR glasses on, visitors experience underwater farming with four senses: they can see and smell the vegetables and fruits, they can hear the sound of the underwater environment, and they can even feel the vegetables and fruits through full-feedback gloves. Visitors not wearing VR glasses can follow the action on a big screen and influence what is seen in the metaverse.

Molex and Arrow Electronics conducted market research and returned an optimistic look at the value of the metaverse in industrial applications. They see manufacturers among the first to benefit from the convergence of physical and virtual processes to improve product-design cycles and factory-of-the-future initiatives.

Some of the technologies commonly thought of as part of the metaverse include AR/VR, digital twins, robotics, machine learning, artificial intelligence, and predictive analytics.
