Nvidia NeRF: AI technology turns 2D photos into 3D objects faster


NVIDIA is currently working on a new technology that can convert ordinary 2D images into 3D objects. The technology is based on artificial intelligence and is called Neural Radiance Field (NeRF). The method can be used in many fields.

As NVIDIA writes in a blog post, the latest NeRF approach takes just a few seconds to train a model. Once a 3D scene has been created, only milliseconds are needed to render images from it. The technology relies on multiple photos that capture the object in question from different positions. However, subjects in the photos should not move too much during capture; otherwise the result risks being wrong.
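
To make the capture requirement concrete, here is a minimal, hypothetical sketch in Python (not NVIDIA's code) of how each pixel of a posed input photo can be turned into a ray through the scene; a NeRF model is trained on rays like these, collected from many viewpoints. The pinhole-camera parameters used here are illustrative assumptions.

```python
import numpy as np

def pixel_to_ray(cam_pos, cam_rot, x, y, focal, width, height):
    """Map pixel (x, y) of a pinhole camera to a world-space ray."""
    # Ray direction in camera coordinates (camera looks along -z).
    d = np.array([(x - width / 2) / focal,
                  -(y - height / 2) / focal,
                  -1.0])
    d_world = cam_rot @ d               # rotate into world space
    d_world /= np.linalg.norm(d_world)  # normalize to unit length
    return cam_pos, d_world

# Example: the center pixel of a camera sitting at the origin.
origin, direction = pixel_to_ray(np.zeros(3), np.eye(3),
                                 x=400, y=300, focal=500.0,
                                 width=800, height=600)
print(origin, direction)
```

If an object moves between shots, the rays from different photos no longer agree on where its surface sits, which is why the result can come out wrong.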




The developers call their version “Instant NeRF” and stress that it is currently the fastest method of its kind. Traditional approaches take several hours to create a 3D scene; Instant NeRF is said to complete the task up to 1,000 times faster than other NeRF approaches. The technology uses a neural network to infer the color and density of light at any point in the scene, allowing it to fill in views that were never photographed.
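
As an illustration of what such a network is asked to do, the following hypothetical Python sketch performs NeRF-style volume rendering along a single ray: it samples points, queries a stand-in “scene field” for color and density at each point, and alpha-composites the samples into one pixel color. The toy scene (a red sphere) and all parameter values are assumptions for demonstration only; speeding up millions of such queries is exactly where Instant NeRF's gains come in.

```python
import numpy as np

def render_ray(origin, direction, field, near=0.1, far=4.0, n=64):
    """Composite color and density samples along one ray into a pixel."""
    t = np.linspace(near, far, n)            # sample depths along the ray
    pts = origin + t[:, None] * direction    # 3D sample points
    rgb, sigma = field(pts)                  # color + density per point
    delta = np.append(np.diff(t), 1e10)      # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)     # opacity of each segment
    trans = np.cumprod(np.append(1.0, 1.0 - alpha[:-1]))  # light surviving so far
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)

def toy_field(pts):
    """Dummy stand-in for a trained network: a soft red sphere at the origin."""
    dist = np.linalg.norm(pts, axis=-1)
    sigma = np.where(dist < 1.0, 5.0, 0.0)   # dense inside the sphere
    rgb = np.tile([1.0, 0.2, 0.2], (len(pts), 1))
    return rgb, sigma

color = render_ray(np.array([0.0, 0.0, 3.0]),
                   np.array([0.0, 0.0, -1.0]), toy_field)
print(color)  # approaches the sphere's red as opacity accumulates
```

In a real NeRF, `toy_field` is replaced by a trained neural network, and Instant NeRF's contribution is making both the training and the evaluation of that network fast.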

Can be used in many fields

Instant NeRF could be used to improve autonomous vehicles in the future, since the technology lets systems recognize the size and shape of real-world objects. The process is also likely to play a role in the entertainment sector and in architecture: a few photos would suffice to create a 3D model of an environment, which can save developers and architects a lot of time.
