Optimizing Performance for HPC: Part 2 – Interconnect with InfiniBand
Thursday, July 23rd, 2009
Following up on the first webinar in this series (Optimizing the Nehalem for HPC) covering HPC performance and optimization for the new Intel Xeon 5500 series (Nehalem), we examine aspects of the InfiniBand interconnect and its impact on clusters.
Join speakers from IBM, Intel, and QLogic as they discuss the capabilities of InfiniBand, the benefits it brings to Nehalem HPC clusters, improvements in processor and interconnect utilization, and benchmark results.