Nvidia is one of the leading forces in machine learning and artificial intelligence. The company has been building core pieces of technology that enable data scientists and users to leverage AI and ML. We hosted Kevin Deierling, SVP of NVIDIA Networking, to talk about this work and to take a deep dive into the Data Processing Unit, or DPU, which he feels is at the core of AI/ML. Here are some of the topics we covered in this episode:
• The work Nvidia is doing in the networking and AI/ML space
• The role the data center, and the ability to process data quickly and efficiently, will play in the cloud-native world
• How Kevin defines a Data Processing Unit (DPU)
• What is driving the emergence and adoption of DPUs
• How Kevin defines the data center in 2021, when everyone is moving to the cloud and procuring GPUs and DPUs can be a cost and time challenge
• What the architecture of a DPU looks like
• How Nvidia plans to put its technology into the hands of developers so they can experiment with it
• The use cases for DPUs, and who the early adopters are
• How DPUs leverage cloud-native technologies like Kubernetes