Update: Our annual GPU Technology Conference will be virtual. This post has been updated to reflect the sessions that will be available online during GTC Digital.
The progress of self-driving cars can be seen in test vehicles on the road. But much of the heavy lifting in autonomous driving development is making tracks in the data center.
Training, testing and validating self-driving technology requires enormous amounts of data, which must be managed by a robust hardware and software infrastructure. Companies around the world are turning to high-performance, energy-efficient GPU technology to build the AI infrastructure needed to put autonomous driving deep neural networks (DNNs) through their paces.
At GTC Digital, automakers, suppliers, startups and safety experts will discuss how they’re tackling the infrastructure component of autonomous vehicle development.
By attending sessions on topics such as DNN training, data creation, and validation in simulation, attendees can learn the end-to-end process of building a self-driving car in the data center.
Mastering Learning Curves
Without a human at the wheel, autonomous vehicles rely on a wide range of DNNs that perceive the surrounding environment. To recognize everything from pedestrians to street signs and traffic lights, these networks require exhaustive training on mountains of driving data.
Tesla has delivered nearly half a million vehicles with AI-assisted driving capabilities worldwide. These vehicles gather data while continuously receiving the latest models through over-the-air updates.
For NVIDIA’s own autonomous vehicle development, we’ve built a scalable infrastructure to train self-driving DNNs. Clement Farabet, vice president of AI Infrastructure at NVIDIA, will discuss Project MagLev, an internal end-to-end AI platform for developing NVIDIA DRIVE software.
The session will cover how MagLev enables autonomous AI designers to iterate training of new DNN designs across thousands of GPU systems and validate the behavior of these designs over multi-petabyte-scale datasets.
Virtual Test Tracks
Before autonomous vehicles are widely deployed on public roads, they must be proven safe for all possible conditions the car could encounter — including rare and dangerous scenarios.
Simulation in the data center presents a powerful solution to what has otherwise been an insurmountable obstacle. By tapping into the virtual world, developers can safely and accurately test and validate autonomous driving hardware and software without leaving the office.
Having data as diverse and random as the real world is also a major challenge when it comes to validation. Nikita Jaipuria and Rohan Bhasin, research engineers at Ford, will discuss how to generate photorealistic synthetic data using generative adversarial networks (GANs). These simulated images can be used to represent a wide variety of situations for comprehensive autonomous vehicle testing.
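To give a flavor of the adversarial training idea behind such synthetic-data pipelines, here is a deliberately tiny sketch — a hypothetical 1-D toy, not Ford's or NVIDIA's actual code. A generator (here just an affine function of noise) learns to imitate a "real" data distribution by fooling a discriminator, which in turn learns to tell real samples from generated ones. Real GAN image generators use deep convolutional networks, but the training loop has the same shape.

```python
# Toy GAN sketch (hypothetical example): the "real data" is a 1-D Gaussian,
# and both generator and discriminator are simple affine functions trained
# with hand-derived gradients. Illustrates the adversarial loop only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Generator G(z) = w_g*z + b_g; discriminator D(x) = sigmoid(w_d*x + b_d)
w_g, b_g = 1.0, 0.0
w_d, b_d = 0.1, 0.0
lr, batch, steps = 0.05, 128, 3000
real_mean = 4.0  # the "real" distribution the generator must imitate

for _ in range(steps):
    xr = rng.normal(real_mean, 1.0, batch)   # real samples
    z = rng.normal(0.0, 1.0, batch)          # generator input noise
    xf = w_g * z + b_g                       # generated ("fake") samples
    p_r = sigmoid(w_d * xr + b_d)
    p_f = sigmoid(w_d * xf + b_d)

    # Discriminator step: maximize log D(real) + log(1 - D(fake))
    w_d -= lr * (np.mean((p_r - 1) * xr) + np.mean(p_f * xf))
    b_d -= lr * (np.mean(p_r - 1) + np.mean(p_f))

    # Generator step (non-saturating loss): maximize log D(fake)
    p_f = sigmoid(w_d * xf + b_d)
    w_g -= lr * np.mean((p_f - 1) * w_d * z)
    b_g -= lr * np.mean((p_f - 1) * w_d)

fake = w_g * rng.normal(0.0, 1.0, 1000) + b_g
print(f"generated mean ~ {fake.mean():.2f} (target {real_mean})")
```

After training, the generator's output distribution drifts toward the real one — the same mechanism, scaled up to images, that lets a GAN produce photorealistic driving scenes for validation.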
In addition to these sessions, GTC attendees will hear the latest NVIDIA news and experience demos and hands-on training for a comprehensive view of the infrastructure needed to build the car of the future. Register before Feb. 13 to take advantage of early rates and receive 20% off with code CMAUTO.
The post Look Under the Hood of Self-Driving Development at GTC 2020 appeared first on The Official NVIDIA Blog.