In this video from ISC 2019, Gilad Shainer from Mellanox describes how HDR technology is proliferating across the TOP500 list of the world's most powerful supercomputers.
"Today Mellanox announced that the company’s InfiniBand solutions accelerate six of the top ten HPC and AI Supercomputers on the June TOP500 list. The six systems Mellanox accelerates include the top three, and four of the top five: The fastest supercomputer in the world at Oak Ridge National Laboratory, #2 at Lawrence Livermore National Laboratory, #3 at Wuxi Supercomputing Center in China, #5 at Texas Advanced Computing Center, #8 at Japan’s Advanced Industrial Science and Technology, and #10 at Lawrence Livermore National Laboratory."
HDR 200G InfiniBand, the fastest and most advanced interconnect technology, makes its debut on the list, accelerating four supercomputers worldwide, including the fifth-ranked supercomputer in the world, located at the Texas Advanced Computing Center, which is also the fastest supercomputer built in 2019.
Mellanox InfiniBand and Ethernet solutions connect 296 systems, or 59% of all TOP500 platforms, demonstrating 37% growth in the 12 months from June 2018 to June 2019. This total includes all 126 InfiniBand systems on the list and most of the 25 Gigabit and faster Ethernet platforms. Mellanox’s Ethernet solutions are used in 170 Ethernet systems, or 63% of the total Ethernet platforms, demonstrating 124% growth in 12 months.
The TOP500 list has evolved in recent years to include more hyperscale, cloud, and enterprise platforms in addition to HPC systems. More than half of the systems on the June 2019 list can be categorized as non-HPC application platforms interconnected with Ethernet, and a large share of these systems belong to US, Chinese, and other hyperscale infrastructures.
"We are proud to connect nearly 60% of the TOP500 supercomputers with our InfiniBand and Ethernet interconnect technologies, demonstrating growth of 37% from last year’s list. We are also proud to see the first HDR 200 Gigabit InfiniBand supercomputers on the TOP500 list, including the fifth most powerful supercomputer in the world and the fastest supercomputer built in 2019, at the Texas Advanced Computing Center. The innovations built into our HDR InfiniBand solutions, along with their performance advantages, make them the interconnect of choice for the world’s leading compute and storage infrastructures. There are multiple HDR InfiniBand systems around the world today serving scientific and research activities, not all of which are on the TOP500 list, and we continue to see strong momentum for this technology across all geographies," said Eyal Waldman, president and CEO of Mellanox Technologies.

"We are excited to see 170 of the Ethernet systems utilize Mellanox’s 25 Gigabit or faster Ethernet solutions, demonstrating an increase of 124% year over year, and the growing adoption of our Ethernet adapters, switches and cables for hyperscale, cloud and enterprise infrastructures. The need to analyze ever-growing amounts of data is driving the design of several leading supercomputers with multiple network adapters per node for higher data throughput. The new NVIDIA DGX SuperPOD supercomputer utilizes ten 100 Gigabit EDR InfiniBand adapters per node, for a total of one terabit per second of data throughput per node. We have already started designing our NDR 400 Gigabit InfiniBand generation, which will power the next generation of supercomputing and machine learning platforms."
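As a quick sanity check on the per-node figure in the quote, here is a minimal Python sketch that multiplies the adapter count by the EDR line rate. The two input values (ten adapters, 100 Gb/s each) come straight from the announcement; the script itself is purely illustrative and does not represent any Mellanox or NVIDIA tooling.

    # Back-of-the-envelope check of the per-node throughput quoted above.
    # Adapter count and EDR line rate are taken from the announcement;
    # the conversion factor (1 Tb/s = 1000 Gb/s) is standard.

    adapters_per_node = 10   # EDR InfiniBand adapters per DGX SuperPOD node
    line_rate_gbps = 100     # EDR InfiniBand rate per adapter, in Gb/s

    aggregate_gbps = adapters_per_node * line_rate_gbps
    aggregate_tbps = aggregate_gbps / 1000

    print(f"Aggregate per-node throughput: {aggregate_gbps} Gb/s "
          f"({aggregate_tbps:.0f} Tb/s)")
    # Prints: Aggregate per-node throughput: 1000 Gb/s (1 Tb/s)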
Learn more: http://top500.org and http://insidehpc.com/newsletter
"Today Mellanox announced that the company’s InfiniBand solutions accelerate six of the top ten HPC and AI Supercomputers on the June TOP500 list. The six systems Mellanox accelerates include the top three, and four of the top five: The fastest supercomputer in the world at Oak Ridge National Laboratory, #2 at Lawrence Livermore National Laboratory, #3 at Wuxi Supercomputing Center in China, #5 at Texas Advanced Computing Center, #8 at Japan’s Advanced Industrial Science and Technology, and #10 at Lawrence Livermore National Laboratory."
HDR 200G InfiniBand, the fastest and most advanced interconnect technology, makes its debut on the list, accelerating four supercomputers worldwide, including the fifth top-ranked supercomputer in the world located at the Texas Advanced Computing Center, which also represents the fastest supercomputer built in 2019.
Mellanox InfiniBand and Ethernet connect 296 systems, or 59% of overall TOP500 platforms, demonstrating 37% growth in 12 months, from June 2018 to June 2019. This includes all of the 126 InfiniBand systems, and most of the 25 gigabit and faster Ethernet platforms. Mellanox’s Ethernet solutions are used in 170 Ethernet systems on the list, or 63% of the total Ethernet platforms, demonstrating 124% growth in 12 months.
The TOP500 list has evolved in recent years to include more hyperscale, cloud, and enterprise platforms, in addition to the HPC systems. More than half of the systems on the June 2019 list can be categorized as non-HPC application platforms, interconnected with Ethernet, where a large number of these systems represents US, Chinese and other hyperscale infrastructures.
We are proud to connect nearly 60% of the TOP500 supercomputers with our InfiniBand and Ethernet interconnect technologies, demonstrating growth of 37% from last year’s list. We are also proud to see the first HDR 200 Gigabit InfiniBand supercomputers on the TOP500 list, including the fifth most powerful supercomputer in the world and the fastest supercomputer built in 2019 at the Texas Advanced Computing Center. The innovations built into our HDR InfiniBand solutions, along with their performance advantages, make it the best interconnect of choice for the world’s leading compute and storage infrastructures. There are multiple HDR InfiniBand systems around the world today serving scientific and research activities that are not all on the TOP500 list, and we continue to see strong momentum for this technology across all geographies,” said Eyal Waldman, president and CEO of Mellanox Technologies. “We are excited to see 170 of the Ethernet systems utilize Mellanox’s 25 Gigabit or faster Ethernet solutions, demonstrating an increase of 124% year over year, and the growing adoption of our Ethernet adapters, switches and cables for hyperscale, cloud and enterprise infrastructures. The need to analyze ever-growing amounts of data drives designing several leading supercomputers with multiple network adapters per node for higher data throughput. The new NVIDIA DGX SuperPOD supercomputer utilizes ten 100 gigabit EDR InfiniBand adapters, for a total of one Terabit of data throughput per node. We have already started designing our NDR 400 gigabit InfiniBand generation that will power the next generation of supercomputing and machine learning platforms.”
Learn more: http://top500.org
and
http://insidehpc.com/newsletter