In May, Nvidia announced its A100 GPU, its first high-end AI chip since 2017. The chip is fast, hot, and big, and it sets the stage for AI hardware for the next few years. To discuss its implications, we’ve invited two of the most original voices covering semiconductors today: Paul Teich, principal analyst at Liftr and a semiconductor veteran who spent over 20 years at AMD, and Dylan [last name omitted by request], a data scientist by trade who writes about the semiconductor industry on his website semianalysis.com.
Key Points From This Episode:
- The current landscape of AI chips in the data center and Nvidia’s competitive positioning
- What’s new in Nvidia’s A100 GPU
- What it takes for an AI chip startup to deploy in today’s public cloud environments
- The importance of Nvidia’s software stack beyond CUDA
- How Nvidia will leverage its Mellanox acquisition to further expand into the data center
More FYI Podcasts: https://ark-invest.com/research/podcast
Learn more about ARK: https://ark-invest.com/
Disclosure: http://bit.ly/1C5DBVL