The metaverse conjures a slew of virtual worlds similar to those depicted in novels such as Snow Crash and Ready Player One. However, the industrial metaverse may very well become a reality first.
A solid show of results comes from the 150 businesses that have joined forces with Nvidia on Omniverse, the simulation environment used to build metaverses big and small. The industrial metaverse will consist of digital twins of things such as automobile factories.
BMW has built a digital twin in the virtual world so that it can perfect the design before constructing the real factory. Another example is Deutsche Bahn, the large German rail operator, which is modeling over 5,000 train stations so that it can virtually monitor its real-world rail system. What is interesting about this debate over whether the gaming or the enterprise metaverse will happen first is that each is galvanizing the other.
At the GTC fall event, we discussed such topics with Matthew Ball, CEO of Epyllion and author of The Metaverse; Rev Lebaredian, Nvidia's vice president of Omniverse and simulation; Peggy Johnson, CEO of Magic Leap; Tony Hemmelgarn, CEO of Siemens Digital Industries Software; and Inga V. Bibra, head of IT for Mercedes-Benz research and development.
Here's a transcript of our panel discussion.
Dean Takahashi is the lead writer for GamesBeat at VentureBeat.
Matthew Ball: I'm an investor, author, and producer, mostly interested in the metaverse.
Rev Lebaredian: I lead the Omniverse and simulation teams at Nvidia, which are focused on all of this metaverse stuff, and more specifically the industrial metaverse.
Peggy Johnson: I'm the CEO of Magic Leap, which makes a head-worn augmented reality device that allows you to intelligently integrate digital content into the physical world. We're not building the metaverse, but we strive to give you a window into it.
Tony Hemmelgarn: I am the president and CEO of Siemens Digital Industries Software. We develop software for product design, manufacturing, and other kinds of things. Much of the work we've done over the years is in what we've called the digital twin, where the virtual represents the real world, or vice versa.
Inga V. Bibra: I am the head of IT for Mercedes-Benz research and development. I'm particularly interested in how we can utilize the metaverse in the industrial context, in particular on the engineering side: engineering, product conception, planning, production, all the way through the life cycle of the digital twins that arise.
VentureBeat: Since we've discussed the metaverse and Omniverse at several GTCs before, I'd like to start with a progress report on where things are. How are metaverse projects, applications, and developments progressing? Let's start with Tony.
Hemmelgarn: We work with the digital twin a lot. It's a well-known concept. However, the value of the digital twin lies in how close the virtual world can be to the physical world. If you can make decisions confidently, knowing that your digital twin representation is comprehensive, then the customers who use our software can move a lot faster when designing the products they're working on.
The metaverse has given us the ability to make that more realistic, more photorealistic. The work we do with Nvidia makes it instantaneously photorealistic, so you can see exactly what's going on. It covers everything from constantly monitoring production or factory operations onward: you get notified of a production problem, and we simulate and optimize solutions.
This isn't an animation. An animation is fine, but it's not enough. We need to be able to simulate the physical properties of something to say, "If I make a change, what happens?" We're working on those kinds of use cases with our customers today. We're already showing the value of what they may be.
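The distinction Hemmelgarn draws between an animation and a simulation is the ability to ask "if I make a change, what happens?" The sketch below is a hypothetical toy model, not Siemens' or Nvidia's software; the class, parameters, and throughput formulas are invented purely to illustrate comparing a baseline configuration against a changed one virtually.

```python
from dataclasses import dataclass, replace

# Hypothetical, highly simplified stand-in for a digital twin's physics
# model. All names and formulas are invented for illustration.

@dataclass(frozen=True)
class LineConfig:
    conveyor_speed: float   # meters per second
    station_cycle_s: float  # seconds per unit at the slowest station

def simulate_throughput(cfg: LineConfig) -> float:
    """Units per hour, limited by the slowest element of the line."""
    conveyor_rate = cfg.conveyor_speed * 3600 / 2.0  # units spaced 2 m apart
    station_rate = 3600 / cfg.station_cycle_s
    return min(conveyor_rate, station_rate)

def what_if(baseline: LineConfig, **changes) -> tuple[float, float]:
    """'If I make a change, what happens?' Simulate the baseline and the
    changed configuration side by side, without touching the real line."""
    changed = replace(baseline, **changes)
    return simulate_throughput(baseline), simulate_throughput(changed)

baseline = LineConfig(conveyor_speed=1.0, station_cycle_s=45.0)
before, after = what_if(baseline, station_cycle_s=30.0)
print(before, after)  # 80.0 units/hour before, 120.0 after
```

Real digital twins model far richer physics, but the pattern is the same: the change is evaluated entirely in the virtual world before anything is touched on the floor.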
There are also instances of doing it in product design. Customers want to see the product in a photorealistic mode. We can do that at the time of design, demonstrate it in real time, and help them make decisions. We've got a number of use cases in mind.
Bibra: That's a good start for me as well. The metaverse is altering a lot of how we'll interact in our personal lives, but also it's a significant shift in the way we'll collaborate in the future. If you look at automotive and automotive engineering, you'll see that we'll have this immersive, real-time environment where we can collaborate.
Imagine if you're making a change in your automobile as an engineer. You have your production planning colleague immediately seeing that change and being able to modify parameters of manufacturing equipment and feed it back to the engineers. These are closed loops with the physical simulation capabilities. Our intention is to stay in the virtual space as long as possible.
We're still working on a few specific applications, like a virtual tour, driving the car virtual, and having a real experience of that. We're learning that way to see what's possible and where we might also have obstacles in the way of moving to the virtual stage of engineering.
Johnson: In many ways, Magic Leap is the original player in this space. Dean had a comment about how Magic Leap was criticized for spending so much, but maybe we didn't spend enough money. It's an exciting time to be in the metaverse.
Matthew stated in his book that there are many technologies that need to be integrated in order to enter a new era. These things are beginning to jell. We're seeing useful use cases for the technology. Initially, for us, we're focusing on the enterprise metaverse, because the devices are still a little bit big for consumers. Eventually, with further silicon integration, we'll return to that glasses format.
Lebaredian: We've been working on this metaverse thing for a long time. At Nvidia, all of the technologies and all the things we need to build are just beginning to come together now, or that's what it feels like. Siemens has certainly been working on that, as have many others. You need to reach a certain threshold of things happening for it to really pop.
We started working on Omniverse many years ago, perhaps four or five years ago. We anticipated that early adopters would come from smaller niche industries, from media and entertainment to visual effects. Eventually, we would expand into architecture, engineering and construction (AEC), and then follow that into manufacturing and more industrial use cases.
What we've found is that the opposite has happened. Judging from the people who are coming to us and the demand that we're seeing, it's mostly from the industrial sector. Companies that build things are realizing that, given the complexity involved, the only way they'll be able to operate efficiently moving forward is by first simulating the things they build. They need digital twins: a way to iterate and design without having to do it in real life first.
This is a fantastic moment. We were surprised that firms like Mercedes had already realized this. We're still at the beginning of it, but already there's been significant progress.
VentureBeat: I'd add a few more things here. Omniverse began as a result of robotics work, but has since grown to span everything up to supercomputing. It's also interesting to see different companies come out and say that the metaverse could be a $5 trillion opportunity by 2030.
I'd like to discuss the progress of digital twin projects a bit further and maybe provide some additional information here. How are companies like Siemens, Magic Leap, Nvidia, and Mercedes constructing these digital twins? What are your techniques?
Hemmelgarn: I mentioned this a moment ago: Siemens' 3D software spans manufacturing simulation and computational analysis, whether that's computational fluid dynamics or showing fluid flow. All these things are part of what we do within our software.
This is why our software has grown so rapidly in the development of digital twins with our customers. Products are very complex. An automobile, an airplane, or whatever has tens of thousands of requirements. How do you alter one requirement and know how it affects everything else, in a virtual way? If you can't represent the software, the electronics, or better yet the manufacturing and automation, and all the things that go into making that product, you can't even simulate it.
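The traceability problem behind "tens of thousands of requirements" can be made concrete with a tiny dependency graph. This is an invented toy, not Siemens' requirements engine: the requirement names and links are hypothetical, and the point is only that a change must propagate transitively through everything downstream.

```python
from collections import deque

# Invented example graph: which requirements feed into which others.
DEPENDS_ON = {
    "battery_capacity": ["range", "curb_weight"],
    "curb_weight": ["braking_distance", "range"],
    "range": [],
    "braking_distance": [],
}

def impacted_by(changed: str) -> set[str]:
    """All requirements transitively affected by changing one,
    found by a breadth-first walk of the dependency graph."""
    seen, queue = set(), deque([changed])
    while queue:
        req = queue.popleft()
        for downstream in DEPENDS_ON.get(req, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen

affected = impacted_by("battery_capacity")
print(affected)  # range, curb_weight, and braking_distance are all touched
```

At automotive scale the graph has tens of thousands of nodes, which is why Hemmelgarn argues the propagation has to happen in software rather than in anyone's head.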
Our software has been doing this for many years, allowing us to connect digital twins with our customers to make things happen much faster. As Peggy mentioned before about COVID-19, the companies that were able to get through COVID-19 much faster than others were the ones with a more complete digital twin, because they couldn't just meet in person to look at a product concept. That's what our software has done for many years.
When it comes to software like that, you want to keep pushing forward. What we're doing with Nvidia is making it more real. If we can get photorealism instantaneously, that changes things. For years, I'd say, "Go to the cave." That's what I'm thinking about when I think about Omniverse: making it more widespread throughout the whole design process. That's what we've been doing and where we anticipate taking it going forward.
Bibra: That's absolutely correct. We'll never finish building the industrial metaverse. But you need to start somewhere. You need to bring data sources together. You need to make it photorealistic. You need to build up, let's say, the technologies that are required for that. We need to deal with enormous data, with APIs and other technologies.
The challenge we all face is bringing these different data sources together, while at the same time focusing on specific areas where we may actually experience what it's like working in the metaverse. This way, you can get a lot of efficiency, but also speed.
When it comes to data, it's also very important. We're very conscious of data privacy, but we also need to ensure that we develop the mindset to work in the virtual world. It's just getting started. We're just starting to realize the potential. Building that digital trust is also one of our main objectives.
Johnson: As a manufacturer of a device that looks into the metaverse, our focus has been on the ecosystem of solutions that we can add to the device. We've worked with all kinds of companies and platforms like Omniverse, which has been fantastic, but mostly we're bringing these solutions to the device so our end customers can have their choice. We have companies like NavVis that do reality capture and visualization in factory-based solutions, and another company called Taqtile that does a lot of workflow processes inside factories.
When you see something as a digital twin, it's easier cognitively; you're able to see what's going on inside the machine. We're just discovering the psychology of the help that these digital twins can offer employees, particularly new employees, when they first learn about all of the many components in something like an automobile. There's a lot of excitement ahead in that ecosystem.
VentureBeat: This is GTC, so I suppose we must think about computer architecture in some way. Matthew, what kind of computer architecture or computing power do we need for the metaverse?
Ball: This is a great place to summarize a bunch of thoughts. McKinsey projects $5 trillion by the end of the decade for the value of the metaverse. That's actually modest. Goldman Sachs, Morgan Stanley, and KPMG all estimate $13 trillion to $16 trillion by the end of the decade. Of course, Jensen has stated as much as half of world GDP, and that's based on this year alone; dated into the future, it might be $70 trillion or $80 trillion or more.
This is a question of allocation. What we're really seeing is that the entire world, or much of it, will run real-time simulation. This has been in progress for decades, but the complexity required such costly computing devices and runtimes that almost no one could access them.
Tony discusses the importance of car requirements, while Rev is talking about Earth 2.0. What we're referring to is making the entire world more legible to software, simulating it in that software, and doing it in real time, including trillions of dollars in real estate and architectural assets, as well as a vehicle fleet.
We don't have the computing power for that today, certainly. Intel has estimated that a thousandfold increase in computing efficiency is required. Meta has said more than 500 times today's. But what's important is to recognize that it's a continuum. We're slowly closing that gap. While there are certain use cases where we say, "If we achieve level X or deployment Y, we can do thing A or Z," that skill set, that set of applications, is expanding daily.
Lebaredian: Inga and Tony touched on the importance of remaining virtual as long as possible in the digital world. Where we're going is essentially staying virtual forever, even after you build the thing in the real world. You've built your factory to produce your Mercedes cars. There's still a lot of value, if not the most value, in having that digital twin still exist alongside the real thing.
If you have a method to connect the two so that you may see the current state of your factory inside the digital version, you will get a lot of superpowers here that computation and software can give you. This is why Siemens comes in handy. It allows anyone to go into the past by looking at everything that was previously recorded, or go into the future to try it.
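The "go into the past" superpower Lebaredian describes boils down to a twin that keeps a timestamped history of the real system's state. This sketch is a hypothetical minimal structure, not Nvidia's or Siemens' implementation; the class and the recorded states are invented for illustration.

```python
import bisect

class TwinHistory:
    """Digital twin state log: ingest real-world snapshots over time,
    then replay the state at any past moment."""

    def __init__(self):
        self._times = []   # sorted timestamps (seconds)
        self._states = []  # state snapshots recorded from the real system

    def record(self, t: float, state: dict) -> None:
        """Ingest the current real-world state into the twin."""
        self._times.append(t)
        self._states.append(state)

    def state_at(self, t: float) -> dict:
        """Replay: the most recent recorded state at or before time t."""
        i = bisect.bisect_right(self._times, t) - 1
        if i < 0:
            raise ValueError("no state recorded before that time")
        return self._states[i]

twin = TwinHistory()
twin.record(0.0, {"robot_arm": "idle"})
twin.record(10.0, {"robot_arm": "welding"})
past = twin.state_at(5.0)  # what the factory looked like at t=5
```

Going "into the future" is then the same history fed into a forward simulation from any recorded starting point.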
Doing all of this is clearly extremely powerful, if you can do it. The computational need is great, and there are some ideas about how to make the most of it. Where the processors doing the computation are located matters, because you run up against the speed of light. It's important to have a distributed, heterogeneous supercomputer that can process huge amounts of data, whether at a distance or at the same location.
We're busy developing a lot of these technologies. It's not just about the chips and the computers. It's also about a whole new kind of network, new kinds of networking that can safely and securely transport huge amounts of 3D data and spatial data.
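The speed-of-light constraint Lebaredian raises is simple arithmetic: no network can beat distance divided by signal speed, so processor placement puts a hard floor on round-trip latency. The distances below are arbitrary examples, not figures from the panel.

```python
# Signal speed in optical fiber is roughly 2/3 of c in vacuum.
C_FIBER_M_S = 2.0e8

def round_trip_ms(distance_m: float) -> float:
    """Best-case round-trip time to a processor distance_m away,
    ignoring all switching and processing overhead."""
    return 2 * distance_m / C_FIBER_M_S * 1000

nearby_edge = round_trip_ms(50_000)        # edge site 50 km away
far_datacenter = round_trip_ms(1_500_000)  # data center 1,500 km away
print(nearby_edge, far_datacenter)  # 0.5 ms vs. 15.0 ms
```

A 15 ms physical floor is already a large share of a 60 Hz frame budget, which is why latency-critical work has to sit on or near the body while heavier simulation can live farther away.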
VentureBeat: Peggy, what kind of computing power is required for the metaverse? But also, what computing architecture or platforms must still be developed?
Johnson: I look at it in a couple of ways. First, I couldn't agree more with Rev. What we learned from Magic Leap One was the need for real-time rendering, turning data around quickly, making instant decisions. The previous team built an OS from the ground up. But it did give us an advantage in terms of extensibility.
If we hadn't done it, we might not have been able to develop an AR device. This was an area no one had ever imagined. We had to come up with a whole device without much of a road map. That was a good move. We're also an open platform. People use many different solutions, and we have to be able to support that.
We can't do all of this on the device, though certain things need to happen with very little latency. It's going to take the power of something like CloudXR from Nvidia. That's going to open up the possibilities of what the metaverse can be, to have that kind of compute at our fingertips.
VentureBeat: What are the boundaries of what will happen for the various experiences, such as XR, digital twin, and persistent simulations?
Ball: It's a fun question. Building on Peggy's remarks, it's important to recognize that these questions are often answered in isolation ("What is the computing power requirement?") without considering the ways all of these things trade off against one another on the device. We're talking about wearables on the face. We're talking about the device's weight and how it impacts the user's mobility. You can reduce the size of the device, or you can increase the battery life; it's a trade-off.
Because the right question isn't necessarily compute power. It's not even necessarily this issue of where you place certain activities, as Rev said. It's all about looking at all of these individual points while also tailoring them to the intended application. This is why it's appropriate to start talking about these as industrial versus consumer use cases or prosumer use cases. The question and the answer are always going to differ in this case.
First and foremost, these industries have access to higher-speed local computing devices. They can be doing rendering work or calculations on a supercomputer that might be 100 or 1,000 feet away. Additionally, a worker might be using a local relay station, or possibly wearing extra horsepower on their back.
This is why we think of this as a complicated, difficult-to-answer problem. We know that most of the individual contributors aren't quite there, but it's more about figuring out the puzzle pieces and the right solution for the problem and the user than about a single answer. Certainly, we can tell that some mix of these three different locations will be required: on-device, then edge, and then much farther away data centers to support that work.
VentureBeat: Rev, what foundations have already been built, in your mind, for the metaverse?
Lebaredian: It all begins with designing computers that are capable of doing real-time simulations. That's what Nvidia has always done. The GPUs we build were, at the start, mostly designed for rendering, which is itself a physics simulation: the physics of how light interacts with matter. Years later, we introduced programmability so that you could use our GPUs to perform other kinds of simulation. That's when we moved into high-performance computing and supercomputing.
The combination of this with the new interconnects and networking capabilities that we're designing is giving us the chance to tackle problems that we previously believed were completely impossible. We're still working on it; Omniverse is our way of doing that. These technologies just largely haven't been fully developed yet. We're just demonstrating what's possible.
Moore's Law no longer applies in computing. The types of computers we have to construct to keep moving forward are not just going to be many times faster than the ones we've made in previous decades. These computers must be distributed. Many applications we're running sit right on your body, just as Peggy was talking about.
We've already made some great improvements, but we're only just getting started. This is the most difficult computer science challenge of all time: simulating everything in the world and integrating the virtual versions with the real world. We're finally at a point where we can see a direction to where we want to go.
VentureBeat: How do you feel about Moore's Law passing just in time for the metaverse? We have the Khronos Group, which recently helped form the Metaverse Standards Forum. We have glTF and USD. What do you think about standards?
Johnson: I'm glad that even though this is still a fairly new technology, those standards are beginning to jell. Because that's where I grew up, I'll always use analogies from the mobile phone industry, but remember when you could only send a text message to someone who owned a Verizon phone? We have to get beyond that from the start. USD [universal scene description], for one. There's so much value in having a cohesive industry out there.
SentiAR, a company based in the United States, is putting a live heart right in front of your eyes, much more accurately than when you see something on a 2D screen.
However, it will take everyone joining together to fulfill these early expectations. We don't want to have any walled gardens. We need to transcend those days and have open platforms where we can all work together. The solutions are there, but we'll never realize them unless we collaborate on them.
Hemmelgarn: I've been doing this for a long time, especially when it comes to 3D geometry and these types of things. There are many standards out there, right? But my opinion is, standards are really dependent on use cases, where you go, and what you do. For example, when I'm doing a design review of a camera being designed with my software, somebody says, "Yeah, but can you show me what it is like?"
For years, we have used the JT format, which is almost a de facto standard in many industries, like automotive and others. But people wanted to do a bit more with it. They wanted to be able to cut sections and do measurements, but still maintain it at a high level, not as detailed as a CAD model might be. We used it for supply chain collaboration and those kinds of things.
We'll support the standards that are driven by our customers. Then it's back to use cases. I fully support the idea of USD and all of these standards. But there's going to be more that come into play when you talk about a metaverse and the idea of all of the things we've discussed today. It's much more about what you do for simulation and the physics behind it to make decisions.
VentureBeat: Inga, how are industrial customers navigating new adoption hurdles, especially considering that corporations have very stringent constraints?
Bibra: At Mercedes-Benz, we're well-known for pushing industry standards, particularly in the German automotive association. We have large working groups working on that, because it's crucial for manufacturing. The goal has always been to be able to share data across many parties in a uniform format.
It's also important to continue to push that in a way that's case-based and open. When we communicate with one another, we need to learn the same languages. We need software providers, data providers, and computing providers to push exactly that, whether it be JT standards or USD formats.
Ball: I'd add that as long as we face fundamental, and indeed extraordinary, limitations on computation, among other resources like network infrastructure, that case-by-case approach is going to be required. An activity that is already pushing the limits of what a device, network, user, or company is capable of doesn't have the freedom to adopt the wrong standards, or excessive standards, for the use case it's deploying.
In the global economy, and in commerce and industry, there is rarely a single answer. We have the US dollar and the euro, increasingly the renminbi as China becomes stronger, as well as the pound. Even the most compatible infrastructure we have today, the intermodal shipping container, isn't universal; depending on the use case, it's just fairly widespread.
That's where we end up: a slew of different standards, with partial compatibility and incompatibility, which, while scaling internationally, ends up being the most efficient way for everyone to work together.
VentureBeat: Rev, how do you see the speed of the metaverse's development, whether it's the industrial metaverse or the metaverse in general? How fast do you see all the different parties moving here?
Lebaredian: I was surprised by how much interest there is in the industrial world. I thought it would take a long time for companies like Mercedes to realize how powerful these kinds of technologies are. Over the next five years, everyone will undergo a significant change. The companies that do this will have a tremendous advantage, and those that don't will be at a tremendous disadvantage.
VentureBeat: Matthew, how do you see the speed of development and progress between industrial applications and the gaming and consumer applications of the metaverse, which we hear a lot more about?
Ball: I'd echo that same sentiment from reading the headlines this morning. In all of the categories we've mentioned, four of the largest corporations on earth are still considering digital twins for design rather than operation, and without a clear roadmap for what Rev mentioned, which is spatial or 3D internet interaction. That's critical.
If the answer were that the metaverse had arrived, to whatever extent we want to say it, and everyone now had to choose solutions, acquire them, learn to use them, and become proficient at them, we'd be talking about a much longer timeline. But most corporations have realized that the future isn't just in simulation; it's real-time simulation in 3D with interconnection. They're now deploying these solutions so that those standards become established.
The average observer might overlook this transition's most crucial elements. Even if we're still a step away from actually connecting all of those individual facilities and disjointed simulations, the plumbing and the electrical system are being laid right now.
VentureBeat: I think a sure sign of progress would be to begin seeing a lot of reuse, things that have been created for one application being used for another. I wonder if there are any tools, platforms, or standards in common among these different industries of industrial, gaming, and consumer?
Lebaredian: It's becoming increasingly evident in the gaming and entertainment industries. There are a number of large libraries and marketplaces that you can use to obtain 3D assets that are more than just geometry and triangles. You can go to Sketchfab, Shutterstock, CG Trader, and even the game engine marketplaces like the Unreal marketplace or Unity marketplace. All of them have either moved or are moving toward supporting USD as the primary means of interchange.
That's because the representation of the data and assets that you need on the industrial side is far more complex. There is no standardization yet, or very little, for a lot of that complexity. However, we anticipate it will grow over time.
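USD's role as an interchange format is easier to see given that its .usda encoding is plain, human-readable text. The snippet below hand-writes a minimal file following the basic layout of the format; the prim names are invented, and real pipelines would generate this through Pixar's OpenUSD (pxr) API rather than by string-building.

```python
# A minimal hand-written USD layer in the text-based .usda encoding.
# "Cart" and "Wheel" are made-up example prims.
minimal_usda = """#usda 1.0
(
    defaultPrim = "Cart"
)

def Xform "Cart"
{
    def Sphere "Wheel"
    {
        double radius = 0.3
    }
}
"""

# Any USD-aware tool (game engine marketplaces, Omniverse, DCC apps)
# can open a file like this, which is the interchange point being made.
with open("cart.usda", "w") as f:
    f.write(minimal_usda)
```

The same scene graph can also be stored in the binary .usdc encoding; tools treat the two interchangeably, which is what lets assets move between marketplaces and engines.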
Ball: This is going to happen in the gaming industry in particular. In some respects, as we get years into this transition, we have a large consumer base that has spent, at this point, tens of billions of dollars on virtual goods. They don't have an infinite appetite to keep buying these things. That's going to require the rest of the industry to begin building integrations with those providers who already have the assets and the file formats, and who manage the entitlements.
If that applies to gaming firms or a Spider-Man outfit, you can imagine what will happen to those who have invested huge sums in 3D imaging of their environment, their possessions, and their infrastructure. As they look to innovations and integrations, and productizing some of these for 3D space, that same business case will exist.
Lebaredian: Recently, Lowe's, the home improvement retailer in the United States, released a large set of assets for the products that it sells. It published them as USD and put them out there for the research community to experiment with. We're going to see a lot more of that, where all of the goods we buy will come with a digital twin of the item that you can place in your virtual world.