NVIDIA's third regional GPU Technology Conference kicked off today, and NVIDIA co-founder and CEO Jen-Hsun Huang unveiled technology that will accelerate the deep learning revolution.
"GPU computing is at the beginning of something very, very important, a brand new revolution, what people call the AI revolution, the beginning of the fourth industrial revolution," Huang told a crowd of scientists, engineers, entrepreneurs and press, gathered at Amsterdam’s gleaming waterfront music hall. "However you describe it, we think something really big is around the corner."
Huang unveiled Xavier, NVIDIA's next-generation system-on-a-chip for powering self-driving cars; announced an agreement with TomTom, the Dutch mapping and navigation group, to use AI to create a cloud-to-car mapping system for self-driving cars; detailed our DriveWorks Alpha 1 release; and highlighted the work NVIDIA is doing with European startups and research labs.
Xavier, an all-new SoC based on NVIDIA's next-generation Volta GPU architecture, will be the processor in future self-driving cars. Xavier combines high performance and energy efficiency with the deep-learning capabilities important to the automotive market. A single Xavier-based AI car supercomputer will be able to replace today's fully configured DRIVE PX 2, which pairs two Parker SoCs with two Pascal GPUs.
"This is the greatest SoC endeavor I have ever known, and we have been building chips for a very long time," Huang said. "Just imagine what an autonomous vehicle can do in the near future with Xavier."
The Xavier SoC integrates a custom eight-core CPU architecture and a new computer vision accelerator. The processor will deliver 20 TOPS (trillion operations per second) of performance while consuming only 20 watts of power. As the brain of a self-driving car, Xavier is designed to comply with critical automotive standards, such as the ISO 26262 functional safety specification.
Packed with 7 billion transistors and manufactured using 16nm FinFET process technology, a single Xavier AI processor will do the work of that dual-SoC, dual-GPU configuration at a fraction of the power consumption.
Xavier samples will be available in the fourth quarter of 2017 to automakers, tier 1 suppliers, startups and research institutions developing self-driving cars.
Huang also detailed the Alpha 1 release of our DriveWorks software, which incorporates a number of new modules: free space detection, which helps a self-driving car determine where it's safe to drive; distance detection; lane detection; and 3D bounding boxes, which determine the size and shape of objects around the car.
Huang showed how a new neural network, PilotNet, will enable the handling of more challenging situations, such as construction sites, night driving and foul weather. Another neural network, OpenRoadNet, will compute free space and build the occupancy grid that helps cars determine where they can safely drive.
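The idea behind an occupancy grid can be pictured with a short, purely illustrative sketch. This is not the DriveWorks API; the grid dimensions, sensor returns and helper function below are all hypothetical, chosen only to show how free-space and obstacle observations fill in a map of drivable cells around the car.

```python
import numpy as np

# Illustrative occupancy grid (hypothetical, not NVIDIA's implementation).
# Cells start "unknown" (-1); free-space observations mark cells drivable (0),
# detected obstacles mark cells occupied (1).

GRID_SIZE = 10   # 10 x 10 cells around the vehicle
CELL_M = 0.5     # each cell covers 0.5 m x 0.5 m

grid = np.full((GRID_SIZE, GRID_SIZE), -1, dtype=np.int8)

def to_cell(x_m, y_m):
    """Map metric coordinates (car at origin) to grid indices."""
    return int(x_m / CELL_M), int(y_m / CELL_M)

# Hypothetical perception output: free-space and obstacle points, in meters.
free_points = [(0.2, 0.2), (1.0, 0.4), (2.2, 1.1)]
obstacle_points = [(3.1, 1.2)]

for x, y in free_points:
    grid[to_cell(x, y)] = 0   # drivable
for x, y in obstacle_points:
    grid[to_cell(x, y)] = 1   # occupied

# A planner can now query whether a cell ahead is safe to drive through.
print(grid[to_cell(3.1, 1.2)])  # 1 -> occupied
```

In a real system the grid would be updated continuously from multiple sensors and carried forward as the car moves; the sketch only shows the data structure the planner consults.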
"Together we will work as an industry to move autonomous driving forward. This is going to be an area of research and development for years to come," Huang said. DriveWorks Alpha 1 will be released to early partners in October.
Huang also announced that TomTom will port and run its localization and mapping software on DRIVE PX 2 AutoCruise. In addition, NVIDIA DriveWorks software will integrate support for TomTom's HD mapping environment.
TomTom is working to create high-definition maps of the world's driveable roads. "You want to localize your car to within a centimeter of accuracy, because you don't want to miss by 20 centimeters when you have a self-driving car," explained Alain De Taeye, of TomTom's management board.
"People used to believe creating navigable maps was unaffordable. Now people believe HD maps, which are very detailed and very accurate, are unaffordable. They're not: you need to be clever about it and use AI and AI platforms to automatically create and maintain them," De Taeye said.
Huang announced two of Europe’s top AI research centers will collaborate with NVIDIA to ramp up their efforts in the field.
Huang also announced that SAP is now using DGX-1 AI supercomputers at its operations in Potsdam, Germany, and in Israel, where teams are building machine learning solutions for enterprises.
Over the last two months, DGX-1 has been adopted by AI labs around the world, including those based at UC Berkeley, Stanford University and OpenAI.
DGX-1 packs some 170 teraflops of computing power, equal to 250 conventional servers, into a single box. It uses eight Pascal-powered NVIDIA Tesla P100 accelerators, interconnected with high-speed NVIDIA NVLink technology, and includes a range of deep learning frameworks.
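As a quick back-of-the-envelope check on those figures, dividing the system total by its eight accelerators gives the per-GPU contribution. The per-GPU number below is our arithmetic from the stated total, not a quoted spec.

```python
# Sanity check using the figures from the announcement:
# ~170 teraflops across eight Tesla P100 accelerators.
NUM_GPUS = 8
DGX1_TFLOPS = 170.0

per_gpu_tflops = DGX1_TFLOPS / NUM_GPUS
print(per_gpu_tflops)  # 21.25
```

That works out to roughly 21 teraflops per P100, which lines up with the GPU's half-precision (FP16) peak rate rather than its lower FP32 figure.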
Huang also called out four European startups — among the more than 1,500 worldwide — that are using GPU-powered AI.
BenevolentAI, Nvidia's first DGX-1 customer in Europe, is using AI to help medical professionals understand the vast amounts of medical research published every year.
Smilart is using GPU-powered AI to recognize faces, even when a person's appearance has changed, or when an image is captured in low light or from a challenging angle.
Intelligent Voice uses AI to not only recognize speech, but distinguish between speakers and even detect a speaker’s emotions.
Sadako Technologies uses AI to train robots to sort trash. So far it has saved more than 60,000 tons of plastic from going to landfills.