As graphics processing unit (GPU) chipmaker Nvidia continues to capitalize on its newly confirmed status as the world’s most valuable company, the organization has not let the limelight distract it from rolling out key increments and extensions to its core product roadmap.

This month has seen the company detail Nvidia Omniverse Cloud Sensor RTX, a collection of microservices designed to enable what the company calls “physically accurate” sensor simulation for fully autonomous machines.

Beyond autonomous vehicles, this technology is also relevant to robotic arms, mobile robots, humanoids and wider digitally enabled smart spaces.

The Nvidia RTX brand relates to ray tracing (RT), a rendering technology – found in the modern age of video games including Cyberpunk 2077, Tomb Raider and even Minecraft – used to handle lighting effects by tracing light as if it were made of true photons. Because it can replicate the visual fidelity of the real world with logic grounded in the agreed laws of physics, this technology can also be applied to autonomous machines and their need to capture, encode and process images of their surroundings.
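
To make that concrete, the core primitive is simple enough to sketch in a few lines of Python – the toy example below (illustrative only, not Nvidia’s implementation) fires a single ray at a sphere and shades the hit point by how directly it faces the light, which is the photon-like logic that RTX hardware accelerates at massive scale.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit-length, so the quadratic's a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Lambertian shading: brightness is how directly the surface faces the light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: no light bounced back
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# One ray fired straight down the z-axis at a sphere lit head-on: prints 1.0.
print(shade((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0, (0, 0, -1)))
```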

(Data) Hungry Humanoids

The GPU giant reminds us that sensors working at this level, deployed inside industrial manipulators (and the other robotic systems listed above), now form a multi-billion-dollar industry.

Although the ‘how it works’ manual for this tool would run to many pages, the short version is that software developers can use Nvidia Omniverse Cloud Sensor RTX technology to test sensor perception and associated AI software at scale in physically accurate, realistic virtual environments before real-world deployment.
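
The shape of that test-at-scale loop is easy to sketch. The Python below is purely hypothetical – the scene generator, lidar renderer and perception model are invented stand-ins, not Nvidia’s actual microservice API – but it captures the pattern: generate many simulated scenes with known ground truth, run the perception stack against each, and count the disagreements.

```python
# Hypothetical sensor-in-the-loop testing sketch; every interface here is an
# illustrative stand-in, not Nvidia's real Omniverse Cloud Sensor RTX API.
import random

def make_scene(seed: int) -> dict:
    """Stand-in for a procedurally generated test scene with known ground truth."""
    rng = random.Random(seed)
    return {"obstacle_distance_m": rng.uniform(0.5, 10.0)}

def render_lidar_frame(scene: dict) -> list[float]:
    """Stand-in for a physically accurate simulated lidar sweep of the scene."""
    rng = random.Random(scene["obstacle_distance_m"])
    return [scene["obstacle_distance_m"] + rng.gauss(0.0, 0.05) for _ in range(360)]

def perception_model(frame: list[float]) -> bool:
    """The perception stack under test: is there an obstacle within 2 m?"""
    return min(frame) < 2.0

# Sweep thousands of simulated scenes before any real-world deployment.
failures = 0
for seed in range(10_000):
    scene = make_scene(seed)
    predicted = perception_model(render_lidar_frame(scene))
    actual = scene["obstacle_distance_m"] < 2.0
    if predicted != actual:
        failures += 1

print(f"{failures} perception failures across 10,000 simulated scenes")
```

Because the simulator knows the true state of every scene it generates, edge cases near the decision boundary surface automatically – in simulation, rather than on a factory floor.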

That physically accurate competency means, in theory, that your robot buddy is now smarter, slicker in operation and, crucially, that much safer.

Building Digital Planet Earth

“Developing safe and reliable autonomous machines powered by generative physical AI requires training and testing in physically based virtual worlds,” said Rev Lebaredian, vice president of Omniverse and simulation technology at Nvidia. “NVIDIA Omniverse Cloud Sensor RTX microservices will enable developers to easily build large-scale digital twins of factories, cities and even a [digital version of] Earth — helping accelerate the next wave of AI.”

Built on the OpenUSD framework and powered by Nvidia RTX ray-tracing and neural-rendering technologies, Omniverse Cloud Sensor RTX promises to accelerate the creation of simulated environments by combining real-world data from videos, cameras, radar and lidar with synthetic data.
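
OpenUSD itself is open source and scriptable, which gives a flavor of how such simulated worlds are described. Below is a minimal sketch, assuming the open-source usd-core Python package; the factory-cell scene contents are invented for illustration, not Nvidia’s sensor-simulation assets.

```python
# Minimal OpenUSD scene description; requires the open-source
# usd-core package (pip install usd-core). The scene itself is an
# invented example, not Nvidia's sensor-simulation content.
from pxr import Gf, Usd, UsdGeom

stage = Usd.Stage.CreateNew("factory_cell.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

# A world root holding one conveyor-belt stand-in and one obstacle.
UsdGeom.Xform.Define(stage, "/World")
belt = UsdGeom.Cube.Define(stage, "/World/ConveyorBelt")
belt.GetSizeAttr().Set(1.0)
obstacle = UsdGeom.Sphere.Define(stage, "/World/Obstacle")
obstacle.GetRadiusAttr().Set(0.25)
UsdGeom.XformCommonAPI(obstacle.GetPrim()).SetTranslate(Gf.Vec3d(2.0, 0.0, 0.25))

stage.GetRootLayer().Save()
print(stage.GetRootLayer().ExportToString())  # human-readable .usda text
```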

While the image-rendering technologies of yesterday could put the ‘skin’ on Lara Croft’s digitally defined skeleton and make her look like a human, the neural-rendering technologies of this era use deep learning neural network models to decide whether the sunlight creeping in through a tomb is bright enough to look like dawn’s sunrise without a hint of midday glare. That is how far we are now extending the capability of these data engines.
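
A toy sketch shows the shape of that idea – this is in the spirit of neural radiance field models generally, not any specific Nvidia architecture – where a small network learns to map a 3D position and viewing direction to color and opacity, replacing hand-written shading rules with learned ones.

```python
# Toy neural-rendering field in the spirit of NeRF-style models;
# illustrative only, not any specific Nvidia architecture.
import torch
import torch.nn as nn

class TinyRadianceField(nn.Module):
    """Maps (x, y, z, viewing direction) -> (RGB radiance, density)."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 3 color channels + 1 density
        )

    def forward(self, position: torch.Tensor, view_dir: torch.Tensor):
        out = self.net(torch.cat([position, view_dir], dim=-1))
        rgb = torch.sigmoid(out[..., :3])   # colors squashed into [0, 1]
        density = torch.relu(out[..., 3:])  # non-negative opacity
        return rgb, density

# Query the field at four sample points along one camera ray.
field = TinyRadianceField()
points = torch.rand(4, 3)                   # sample positions in space
dirs = torch.tensor([[0.0, 0.0, 1.0]] * 4)  # shared viewing direction
rgb, density = field(points, dirs)
print(rgb.shape, density.shape)  # torch.Size([4, 3]) torch.Size([4, 1])
```

Trained against real captured imagery, a field like this learns lighting behavior from data rather than from explicit rules – which is what lets simulated sensor feeds close the gap with real ones.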

Lebaredian and team actually cite rather more fundamental real-world use cases, saying: “Even for scenarios with limited real-world data, the microservices can be used to simulate a broad range of activities, such as whether a robotic arm is operating correctly, an airport luggage carousel is functional, a tree branch is blocking a roadway, a factory conveyor belt is in motion, or a robot or person is nearby.”

Real-World Deployment

The Omniverse Cloud Sensor RTX announcement comes at the same time as Nvidia researchers won an award at the Computer Vision and Pattern Recognition (CVPR) conference for autonomous vehicle control. The team’s winning end-to-end autonomous driving workflow can be replicated in high-fidelity simulated environments with Omniverse Cloud Sensor RTX, giving robotics developers the ability to test scenarios in physically accurate environments before deploying them in the real world.

Next time you go for a drive, just remember: 

Mirror, signal, is there a neural-rendering, ray-tracing-empowered vehicle coming up unexpectedly on the left… and then maneuver.