AI ON THE HORIZON

Artificial intelligence aims to help liberate drivers.
More than ever, companies are determined to get vehicles moving in ways they haven’t before. Though we are still a long way from the technology we could once only dream about in science-fiction movies, the automobile has become increasingly sophisticated: a powerful computer on wheels, thanks to advanced onboard systems with autonomous capability. This technological progress aims to improve our lives by addressing persistent challenges in society.
Already set to impact many industries, including retail and healthcare, artificial intelligence (or AI) is the heart and soul of future technology, evolving as we do. It can change how a product is sold to a consumer, can help prevent disease or can allow us to create smarter city infrastructure—all driven by data.
In the automotive world, AI learns how we behave on the road, analyzing driving conditions the same way a human being would. It senses how other drivers behave and draws on other key data, such as weather and traffic, with the goal of helping us avoid the mistakes that so often lead to collisions. “We want to develop solutions that can help us to do things we otherwise would be unable to do,” said Wolfgang Rother of Audi Communications. “It’s the start of a new age of mobility.”
A PERCEPTIVE REVOLUTION
AI is becoming such an important part of our near future that it has become the focal point of discussion at CES®. NVIDIA, a leader in AI innovation and a key collaborator with Audi on its autonomous efforts, delivered a keynote about how it plans to redefine what it means to be on the road.
CEO Jensen Huang announced plans for the world’s first autonomous machine processor, DRIVE Xavier; platforms that incorporate both AI and augmented reality for improved real-time information; and self-driving ride-sharing vehicles. Huang said, “One of the most important industries that AI is going to revolutionize is transportation, including mobility services and trucking.”
As a transportation innovator, Audi has created a robust portfolio of mobility technologies to help relieve drivers of the stress of workday commutes. Audi has also used AI to explore new ways of further simplifying drivers’ experiences on the road. Audi Artificial Intelligence (Audi AI) is designed to be capable of self-learning and thinking via distinctive, dynamic, intelligent and empathetic systems. The freedom that Audi AI gives drivers represents a new kind of premium experience.
Through this valuable, decade-long collaboration with NVIDIA, Audi has been able to engineer technological breakthroughs that include the Audi virtual cockpit, Audi connect®, the current range of infotainment systems and the capability for autonomous driving. The most recent developments appear in the new Audi A8 sedan, with various available NVIDIA-powered features, including rear-seat entertainment.
“Over time, the network gradually learns what a car, tree, person and a house is.”
[Interactive graphic: camera view and depth map]
3D BRILLIANCE
One important collaborator on Audi’s artificial intelligence advancements is Audi Electronics Venture GmbH (AEV), which has developed a mono camera technology project that uses AI to generate a precise 3D model of a vehicle’s surroundings at all times. A front-facing camera positioned on the car’s front hood acts as the sensor, capturing the area ahead of the car within a 120-degree angle and delivering 15 images per second at 1.3 megapixels.
The images are processed by a neural network, which classifies them into object categories. The system then distinguishes between vehicles, buildings, pedestrians, road markers and traffic signs. “We show the network input scenes to allow it to learn,” said Maximilian Muehlegg, who works in software development and machine learning at AEV. “Over time, the network gradually learns what a car, tree, person and a house is, and once that training is complete, it can be shown different scenes it hasn’t seen yet.”
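As a rough illustration of that classification step (not AEV’s own model or code), the sketch below runs a single camera frame through a generic, publicly available segmentation network from torchvision; the file name is a hypothetical stand-in for one image from the front camera.

```python
# Illustrative sketch only: classify each pixel of one camera frame into object
# categories with an off-the-shelf segmentation network (not AEV's network).
import torch
from PIL import Image
from torchvision.models.segmentation import (
    deeplabv3_resnet50, DeepLabV3_ResNet50_Weights)

weights = DeepLabV3_ResNet50_Weights.DEFAULT
model = deeplabv3_resnet50(weights=weights).eval()
preprocess = weights.transforms()

frame = Image.open("front_camera_frame.jpg").convert("RGB")  # hypothetical frame
batch = preprocess(frame).unsqueeze(0)                        # [1, 3, H, W]

with torch.no_grad():
    logits = model(batch)["out"]      # [1, num_classes, H, W]
class_map = logits.argmax(dim=1)      # per-pixel object category index
print(weights.meta["categories"])     # e.g. person, car, bus, ...
```

The production network is of course trained on driving-specific data and categories, but the flow is the same: one frame in, one category label per pixel out.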
The AI module uses neural networks to calculate distance information, creating ISO lines (virtual boundaries that each mark a constant distance) which, when combined with the depth estimates, produce the 3D model of the vehicle’s surroundings. During training, the neural network is fed multiple videos recorded with a stereo camera so that it sees a wide range of road situations. From these, the network independently learns the rules it needs to produce the same 3D information from images taken with the mono camera alone.
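As a minimal sketch of what those two outputs could look like in code, with made-up camera intrinsics and placeholder depth values standing in for the network’s estimate, the example below marks constant-distance ISO bands and back-projects every pixel into a simple 3D point cloud.

```python
# Minimal sketch, not AEV's implementation: derive ISO lines (constant-distance
# boundaries) and a 3D point cloud from a per-pixel depth map.
import numpy as np

h, w = 480, 640
depth = np.random.uniform(1.0, 60.0, size=(h, w))  # placeholder depth map, metres

# ISO lines: pixels lying approximately at fixed distances from the camera.
iso_distances = [5.0, 10.0, 20.0, 40.0]             # assumed example distances (m)
tolerance = 0.25                                     # band half-width (m)
iso_masks = {d: np.abs(depth - d) < tolerance for d in iso_distances}

# 3D model: back-project each pixel through an assumed pinhole camera model.
fx = fy = 1000.0                                     # assumed focal lengths (pixels)
cx, cy = w / 2.0, h / 2.0                            # assumed principal point
us, vs = np.meshgrid(np.arange(w), np.arange(h))
x = (us - cx) / fx * depth
y = (vs - cy) / fy * depth
points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)  # one 3D point per pixel
```

Overlaying the ISO masks on the camera image draws the constant-distance boundaries, while the point cloud can be rendered from any virtual viewpoint around the vehicle.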
“Our goal is to train AI to determine how far away an object is, similar to the way the human eye works to gauge distance,” Muehlegg said. “The two cameras provide images of the same scene to let the network learn, matching one image to the other. Once it learns how to match well enough, you can remove one camera. Just as we can still say ‘roughly’ how far away an object is when we cover one eye, that is essentially how this system works.
“Once you know what’s in front of you, the next step is figuring out how to react, and we can construct a virtual 3D environment—like a video game—to ‘turn’ the camera to avoid potential hazards,” Muehlegg said. “The beauty is we only need a mono camera, whereas multiple sensors are traditionally required to achieve the same feat. It’s astonishing to realize what the sensor is capable of when we try to get 100% information out of it.”
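To make the two-camera training principle Muehlegg describes a little more concrete, here is a heavily simplified, Monodepth-style sketch of stereo self-supervision: a toy network sees only the left image and predicts disparity, and the training signal is how well the right image, warped by that disparity, reconstructs the left one. The network size, image dimensions and disparity scaling are illustrative assumptions, not AEV’s actual system.

```python
import torch
import torch.nn.functional as F

def warp_right_to_left(right, disparity):
    """Sample the right image at x - disparity to reconstruct the left image."""
    n, _, h, w = right.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                            torch.linspace(-1, 1, w), indexing="ij")
    # disparity is assumed to be a fraction of image width, which spans
    # 2 units in grid_sample's [-1, 1] coordinate system
    xs = xs.unsqueeze(0) - 2.0 * disparity.squeeze(1)
    ys = ys.unsqueeze(0).expand_as(xs)
    grid = torch.stack((xs, ys), dim=-1)          # [n, h, w, 2]
    return F.grid_sample(right, grid, align_corners=True)

# A toy stand-in for the disparity network (real systems are far deeper).
disp_net = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1), torch.nn.ReLU(),
    torch.nn.Conv2d(16, 1, 3, padding=1), torch.nn.Sigmoid())

left = torch.rand(1, 3, 64, 128)                  # placeholder stereo pair
right = torch.rand(1, 3, 64, 128)

disparity = 0.3 * disp_net(left)                  # predicted from the mono image only
reconstruction = warp_right_to_left(right, disparity)
photometric_loss = (reconstruction - left).abs().mean()
photometric_loss.backward()                       # this error is what trains the network
```

At inference time only the single mono camera and the trained network are needed; once disparity and the camera geometry are known, per-pixel depth follows and the rest of the pipeline above applies.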
[Interactive graphic: segmentation and 3D model of the surroundings]