Will AI robots see the world the way humans do? Probably not. Advances in digital vision systems suggest that robots may perceive the physical environment around them in ways more akin to animal vision than to human vision. And while robotic eye development now lies within the province of human engineers, there are hints that digital entities can evolve eyesight on their own over time.

The evolutionary eyeful came about when researchers at Sweden’s Lund University and the Massachusetts Institute of Technology created artificial animals that began developing vision from scratch, generation by generation.

“We have succeeded in creating artificial evolution that produces the same results as in real life,” says Professor Dan-Eric Nilsson, sensory researcher and evolutionary biologist at Lund University. “It’s the first time AI has been used to follow how a complete vision system can arise without instructing the computer how it should come to be.”
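
The researchers’ actual model isn’t detailed here, but the underlying idea, selection acting on small random mutations generation after generation, can be illustrated with a toy evolutionary loop. Everything in the sketch below (the two-trait “eye,” the fitness formula, the population size) is an invented assumption for illustration, not the Lund/MIT setup:

```python
import random

def fitness(eye):
    # Toy trade-off: more receptors sharpen the image, while a wider
    # aperture gathers more light. Real fitness would come from a
    # simulated visual task, not a formula like this one.
    resolution = eye["receptors"] / (1.0 + eye["aperture"])
    sensitivity = eye["aperture"]
    return resolution * sensitivity

def mutate(eye, rate=0.1):
    # Small random changes between generations.
    return {
        "receptors": max(1, eye["receptors"] + random.choice([-1, 0, 1])),
        "aperture": max(0.1, eye["aperture"] + random.uniform(-rate, rate)),
    }

# Start from a barely functional "eye spot" and let selection act:
# keep the fitter half each generation, refill with mutated copies.
population = [{"receptors": 1, "aperture": 0.1} for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:25]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(25)]

best = max(population, key=fitness)
print(f"after 200 generations: {best['receptors']} receptors, "
      f"aperture {best['aperture']:.2f}")
```

Run long enough, even this crude loop drifts from a single light-sensitive spot toward a higher-resolution, light-hungry eye, with nothing in the code telling it what an eye should look like.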

While the initial goal was to gain insight into human evolution, the work has implications for engineering as well. By studying how evolution solves problems, engineers can build bio-inspired systems that perform better in the real world.

“Using AI, we can explore potential evolutionary futures and see which solutions are waiting around the corner, long before nature itself gets there,” says Nilsson. The hope is that engineers can develop robust, efficient technical systems that are as adaptable as biological solutions often are.

Robotic vision systems inspired by biology may veer in surprising directions. Scientists at the Chinese Academy of Sciences say they have engineered an artificial compound eye, inspired by the fruit fly, that allows robots to see and smell simultaneously. As reported by TechXplore, the Chinese scientists packed 1,027 visual units into a tiny sensor package just 1.5 millimeters across that gives robots a 180-degree field of vision. Most robots need to turn their heads to see sideways; the new system means a robot can detect threats from the front and the sides much as an insect would. And much like a fly, the sensor includes tiny hairs called setae between the lenses to keep them clear and prevent moisture buildup in humid environments.
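
The report doesn’t describe the sensor’s internal layout, but the geometry of a compound eye is easy to sketch. The snippet below is a deliberately simplified one-dimensional model that borrows the reported numbers (1,027 facets, a 180-degree field) and shows why such an array registers sideways stimuli without any head movement:

```python
NUM_OMMATIDIA = 1027     # visual units, per the TechXplore report
FIELD_OF_VIEW = 180.0    # degrees

def facet_for(azimuth_deg):
    """Return which facet 'sees' a stimulus at the given azimuth.

    Azimuth runs from -90 (far left) to +90 (far right); 0 is dead ahead.
    """
    if not -90.0 <= azimuth_deg <= 90.0:
        return None  # outside the field of view
    fraction = (azimuth_deg + 90.0) / FIELD_OF_VIEW
    return min(int(fraction * NUM_OMMATIDIA), NUM_OMMATIDIA - 1)

print(FIELD_OF_VIEW / NUM_OMMATIDIA)  # each facet covers ~0.175 degrees
print(facet_for(0.0))                 # a central facet (513)
print(facet_for(89.0))                # a facet near the right edge
print(facet_for(120.0))               # None: behind the sensor
```

In this toy model each facet covers less than 0.2 degrees of the panorama; the real sensor spreads its units across a curved two-dimensional surface, but the principle is the same.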

The new sensor still faces engineering hurdles related to image resolution and to syncing the processing speeds of the visual and smell sensors. Once perfected, the sensors are likely to have a major impact on autonomous machine intelligence. One expected application is emergency response drones that could navigate collapsed buildings and detect invisible gases.
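
The article doesn’t say how the rate mismatch will be solved, but timestamp alignment is one standard way to fuse a fast sensor stream with a slow one. The sketch below uses invented sample rates and simply pairs each slow chemical reading with the nearest visual frame:

```python
from bisect import bisect_left

def nearest(timestamps, t):
    """Index of the reading whose timestamp is closest to t (sorted list)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Assumed rates: visual frames every 10 ms, chemical readings every 100 ms.
vision_ts = [i * 0.010 for i in range(100)]  # 100 frames over one second
smell_ts = [i * 0.100 for i in range(10)]    # 10 readings over one second

# Pair each slow smell reading with the closest fast vision frame.
for t in smell_ts[:3]:
    frame = nearest(vision_ts, t)
    print(f"smell reading at {t:.2f}s pairs with vision frame {frame}")
```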

Biological eyes are also the inspiration for robotic vision being developed at the University of North Carolina’s Department of Applied Physical Sciences. The goal is an artificial vision system that doesn’t rely on software to clean up images after the fact but instead adapts physically in real time, the way a real eye does. UNC researchers are using shape-shifting liquid metal to create a curved artificial pupil that opens or closes under changing light conditions while providing a 180-degree field of view.
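
The liquid-metal actuation itself is a materials problem, but the control behavior, opening in dim light and closing in bright light, can be sketched as a simple feedback loop. The proportional controller, gain, and units below are illustrative assumptions, not UNC’s design:

```python
def adjust_pupil(diameter_mm, lux, target_lux=500.0, gain=0.002,
                 min_d=1.0, max_d=8.0):
    """Open the pupil when it's too dark, close it when too bright."""
    error = target_lux - lux              # positive means too dark
    new_d = diameter_mm + gain * error    # proportional response
    return max(min_d, min(max_d, new_d))  # respect physical travel limits

diameter = 4.0
for lux in [100, 100, 2000, 2000, 500]:  # an assumed light sequence
    diameter = adjust_pupil(diameter, lux)
    print(f"ambient {lux:4d} lux -> pupil {diameter:.2f} mm")
```

The point of doing this in hardware rather than in post-processing is that the sensor never captures a blown-out or underexposed image in the first place.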

Perhaps the most striking feature of the UNC system is its ability to mimic the pupils of animals, not just humans. By controlling different sections of the liquid metal independently, a robot could have a round pupil, or a vertical slit like a cat’s for better night vision. More exotic shapes might include pupils like those of a sheep or a cuttlefish. The idea is to develop robotic vision tailored to specific tasks. For example, a robot built to navigate open landscapes might benefit from a wide, horizontal pupil that emphasizes the horizon, while a robot with a narrow pupil would be better at tasks requiring greater depth of field and focus. The ability to physically adjust an artificial eye also means a robot can identify objects more quickly and easily. While the UNC system needs further refinement, potential applications include autonomous vehicles, which often struggle to cope with changing light conditions.
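
To make the idea of independently controlled sections concrete, here is a hedged sketch in which the pupil is a small grid of individually addressable segments; the grid size and the shape definitions are invented for illustration, not taken from the UNC work:

```python
def pupil_mask(shape, grid=9):
    """Return a grid of booleans: True = segment open to light."""
    c = grid // 2
    mask = []
    for row in range(grid):
        y = row - c
        cells = []
        for col in range(grid):
            x = col - c
            if shape == "round":              # human-like default
                cells.append(x * x + y * y <= 9)
            elif shape == "vertical_slit":    # cat-like, cited for night vision
                cells.append(abs(x) <= 1)
            elif shape == "horizontal_bar":   # sheep-like, emphasizes the horizon
                cells.append(abs(y) <= 1)
            else:
                raise ValueError(shape)
        mask.append(cells)
    return mask

for name in ("round", "vertical_slit", "horizontal_bar"):
    print(name)
    for cells in pupil_mask(name):
        print("".join("#" if is_open else "." for is_open in cells))
```

Switching tasks would then be a matter of rewriting the mask, rather than swapping lenses or cameras.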

It might sound like science fiction, but the bottom line is the same across the board: engineers are no longer treating robot vision systems as passive, camera-like collectors of images to be processed by software. The research points to a near future in which artificial eyes actively respond to changing light conditions, just as living eyes do. These artificial eyes, however, may not see the world the way humans do.