Nvidia’s digital twin platform will change the way scientists and engineers think


Nvidia has announced several major upgrades to its scientific computing platform for digital twins and has released these capabilities for widespread use. Highlights include the general release of Modulus, a physics-informed AI framework, new Omniverse integrations, and support for a 3D AI technique called adaptive Fourier neural operators (AFNO). Both Modulus and Omniverse are available for download today.

These developments promise to change the way engineers think about simulation, from an occasional offline process to operational models baked into running operations, Dion Harris, Nvidia chief product manager of accelerated computing, told VentureBeat.

These efforts complement other recent announcements, such as the intent to create Earth-2, ongoing collaborations with climate change researchers, and work to simplify engineering design, testing and development within the metaverse. Nvidia has also partnered with leading climate research organizations such as the European Centre for Medium-Range Weather Forecasts (ECMWF) on Destination Earth (DestinE).

Highlights of Nvidia's digital twin announcement

Nvidia announced Modulus at GTC last fall, and it is now generally available. It is a framework for building physics-informed neural networks, which train models of complex systems using the governing physical equations as constraints rather than labeled data alone. This can improve climate simulations and help explore physical, mechanical and electrical trade-offs in product and building design. It also helps accelerate the creation of AI-based surrogate models that abstract physical principles from real-world data.
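To make the physics-informed idea concrete, here is a minimal sketch in PyTorch of a tiny network trained to satisfy a toy differential equation, with the physics residual rather than labeled data supplying the training signal. It illustrates the general technique only and is not the Modulus API; every name in it is hypothetical.

```python
# Toy physics-informed training loop (illustrative only; not Nvidia's Modulus API).
# The network u(x) is trained to satisfy du/dx = -u with u(0) = 1, so the
# governing equation, not labeled data, supervises the model.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x = torch.rand(256, 1, requires_grad=True)              # collocation points in [0, 1]
    u = net(x)
    du_dx = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    pde_residual = du_dx + u                                 # du/dx + u should be zero
    boundary_residual = net(torch.zeros(1, 1)) - 1.0         # u(0) should equal 1
    loss = (pde_residual ** 2).mean() + (boundary_residual ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The same pattern scales up: in real use the residual comes from the partial differential equations governing fluid flow, heat transfer or weather, and the trained network becomes the fast surrogate model.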

The new Omniverse integration allows teams to feed the output of these AI physics models into Omniverse. This makes it easier to combine better AI models with the visualization tools built into Omniverse. More importantly, these new models are much faster than conventional physics solvers, making it practical to run them in real time or to explore more variations as part of scenario planning. “It creates a different operating model for how you would interact with these datasets and simulation workflows,” Harris said.

The integration with Omniverse will make it much easier for engineers to weave digital twin capabilities into existing workflows. Nvidia has built a variety of connectors that allow engineers to incorporate models from existing product engineering, architecture, and simulation tools. Omniverse also helps teams ingest data from AI models.

Omniverse provides a centralized hub for collecting data and collaborating interactively across datasets and disciplines. It ingests data from various sources and uses the Universal Scene Description (USD) format to organize data on the platform. For example, a better climate research model could combine atmospheric data, geospatial data and human-interaction data. Harris said there is still work to be done on building USD plugins for different platforms, which is one reason Omniverse is free for developers.
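As a rough illustration of how heterogeneous data can share a single Universal Scene Description file, the snippet below uses Pixar's open-source pxr Python bindings (installable, for example, via the usd-core package). The file name, prim paths and the resolutionKm attribute are hypothetical, and this is generic USD rather than an Omniverse-specific API.

```python
# Minimal USD sketch (generic pxr API, not Omniverse-specific; names are hypothetical).
from pxr import Usd, UsdGeom, Sdf

stage = Usd.Stage.CreateNew("climate_scene.usda")   # one shared scene file
UsdGeom.Xform.Define(stage, "/World")

# A placeholder prim that one tool might populate with atmospheric data.
atmosphere = UsdGeom.Sphere.Define(stage, "/World/Atmosphere")
atmosphere.GetRadiusAttr().Set(6371.0)              # hypothetical radius in kilometers

# A custom attribute another tool or AI model could read back later.
prim = atmosphere.GetPrim()
prim.CreateAttribute("resolutionKm", Sdf.ValueTypeNames.Float).Set(18.0)

stage.GetRootLayer().Save()
```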

Another major upgrade is support for adaptive Fourier neural operators (AFNO), a technique for training neural networks that capture 3D spatial states. AFNO is part of a broader class of novel approaches that includes Fourier neural operators (FNO) and physics-informed neural operators (PINO). These techniques encode 3D spatial relationships based on partial differential equation models, allowing teams to create more accurate surrogate AI models. Traditional AI models that use convolution or other pixel-based approaches encode the arrangement of 3D objects less accurately.
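The sketch below shows a simplified 2D spectral layer in the spirit of Fourier neural operators: the input field is transformed into the Fourier domain, learned weights mix a limited number of low-frequency modes, and the result is transformed back, giving every output location a global view of the field. It is a minimal illustration of the operator-learning idea, not Nvidia's AFNO implementation, and the layer sizes are arbitrary.

```python
# Simplified 2D spectral convolution in the spirit of FNO (not Nvidia's AFNO code).
import torch
import torch.nn as nn

class SpectralConv2d(nn.Module):
    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes  # number of low-frequency Fourier modes to keep
        scale = 1.0 / channels
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        x_ft = torch.fft.rfft2(x)                            # to the Fourier domain
        out_ft = torch.zeros_like(x_ft)
        m = self.modes
        # mix channels on the retained low-frequency modes only
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weights
        )
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])      # back to physical space

# usage: an 8-channel 64x64 field, e.g. a coarse weather state
layer = SpectralConv2d(channels=8, modes=12)
y = layer(torch.randn(4, 8, 64, 64))
```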

Better climate models with AI

Nvidia also announced early results from applying these tools to climate research as part of the FourCastNet project, a collaboration between Nvidia and leading climate researchers from Purdue University, Lawrence Berkeley National Laboratory, the University of Michigan and others. FourCastNet is an AI surrogate model used to perform medium-range climate predictions on a global scale. The research paper describes how the team uses AFNO to produce a model that is fast enough, and accurate enough, to be used for some of these medium-range forecasts.

In climate and weather research, resolution is characterized by the size of grid squares, measured in kilometers, which resemble pixels; the smaller the squares, the better. State-of-the-art first-principles models such as ECMWF's Integrated Forecasting System (IFS) can achieve a resolution of 9 km. FourCastNet is faster, but less accurate, than the best models built using traditional first-principles approaches.

Today, FourCastNet can achieve 18-km resolution 45,000 times faster, and with 12,000 times less energy, at the same accuracy as IFS. Previous surrogate models topped out at a resolution of 25 km. One factor constraining further improvements is the huge amount of data required to train surrogate models compared to traditional approaches. For example, improving the resolution from 18 km to 9 km requires about 30 times as much training data.

Weather and climate research centers operate at two scales: there are about 17 larger climate change centers and about 175 smaller regional weather research groups. The smaller centers have tended to focus on regions with well-defined boundaries, neglecting the impact of adjacent weather phenomena. The new FourCastNet model enables these smaller weather centers to simulate weather patterns moving across those borders.

“This will democratize climate change research,” Harris said.

One caveat is that the model was trained on 40 years of climate data, which required a lot of processing time and energy. But once trained, it can run on much cheaper hardware. For example, the FourCastNet researchers were able to run a simulation on a two-node Nvidia cluster that previously required a 3,060-node supercomputer cluster.

Harris expects that first-principles models and surrogate models will coexist for some time to come. First-principles approaches will serve as a kind of ground truth, while surrogate models will allow engineers to iterate on simulation scenarios much faster. Nvidia has been working on ways to improve both. For example, it has tuned its software to accelerate the Weather Research and Forecasting (WRF) model and the Consortium for Small-scale Modeling (COSMO) model.

An ensemble of Earths

This FourCastNet work complements Nvidia's Earth-2 announcement at last fall's GTC. Earth-2 is a dedicated system that Nvidia is building to accelerate climate change research. It will combine Nvidia's hardware advances with Modulus and Omniverse in a cohesive platform. The Omniverse integration makes it easier to incorporate AI models, climate data, satellite data and data from other sources to create more accurate representations from all of these inputs.

“The Earth-2 system will integrate everything we build into a cohesive platform,” Harris said.

This makes it easier to combine different scientific disciplines, research techniques and models into one source of truth. The collaborative aspect of Omniverse will help researchers, policy planners, executives and citizens work together to solve some of the world’s most pressing problems.

Discovering new insights

Faster simulations also mean that researchers can investigate how slightly different assumptions within a model affect its results. Climate change researchers use the term ensemble to describe the process of running multiple simulations with small variations. For example, they can run a simulation 21 times to investigate the impact of minute variations in assumptions on the overall projection. FourCastNet allows researchers to run 1,000-member ensembles, providing much greater confidence in the prediction.
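The toy example below illustrates the ensemble idea: the same fast surrogate is run many times from slightly perturbed initial conditions, and the spread across members becomes the uncertainty estimate. The surrogate_step function is a hypothetical stand-in for a trained model such as FourCastNet; the member counts simply mirror the figures above.

```python
# Toy ensemble forecast with a stand-in surrogate model (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def surrogate_step(state: np.ndarray) -> np.ndarray:
    # placeholder dynamics; a real surrogate would be a trained neural network
    return state + 0.1 * np.sin(state)

def run_ensemble(initial_state: np.ndarray, members: int, steps: int) -> np.ndarray:
    forecasts = []
    for _ in range(members):
        # perturb the initial conditions slightly for each ensemble member
        state = initial_state + rng.normal(scale=0.01, size=initial_state.shape)
        for _ in range(steps):
            state = surrogate_step(state)
        forecasts.append(state)
    return np.stack(forecasts)

initial = rng.normal(size=(32, 32))                     # tiny stand-in for a global field
small = run_ensemble(initial, members=21, steps=10)     # conventional ensemble size
large = run_ensemble(initial, members=1000, steps=10)   # feasible with a fast surrogate
print(small.std(axis=0).mean(), large.std(axis=0).mean())  # spread = uncertainty estimate
```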

Harris said: “It’s not just about making the models run faster. You can also run it more to get a more accurate estimate of the result. You get a new understanding of how to think about it when you see this complex system in motion in 3D space.”

Siemens already had similar models in use, but only in the design phase. These faster simulation techniques allowed them to continuously run similar types of models during operations. For example, Siemens has used these techniques to more efficiently model heat transfer systems in a power plant and the performance of wind turbines. A new wind performance surrogate model is expected to lead to optimized wind farm layouts that can produce up to 20% more power than previous designs.

“We’re seeing digital twins being applied in everything from medical to manufacturing, scientific and even entertainment applications,” Harris said.

