After the Big Show: A Report from SC07

After a fun but exhausting week at SC07, HPC editor Doug Eadline returns to report on the show.

I’ve attended my fair share of Supercomputing (SC) shows, and I believe it is no accident that the show is held the week before the U.S. Thanksgiving holiday. You need a certain recovery time after four days of all things HPC, including some very late nights. A short work week to clean out your bag, sort through the business cards, make a few calls, and you are home free with a long relaxing weekend in front of you. That is, unless you have to write a newsletter column.

Every year I ask people (and people ask me): what was the big thing at SC this year? Despite the claims you may have seen in press releases issued during the show, there was nothing really big and overshadowing at SC07. There were clusters, lots of clusters. Blue Gene stayed in the Top500, SiCortex had a running machine, and there was even a small booth from a quantum computing company in the disruptive technology area (at least in this universe anyway).

For me, the biggest thing was the unrelenting traffic on the show floor. The show typically ends at 4 p.m. on Thursday. In previous years, the exhibit hall started to clear out around 1 p.m., save for vendors checking out the competition and some swag wranglers. The place got pretty dead. So dead, in fact, that many vendors started packing up parts of their booths.

Not this year. I was standing near the front of the hall at about 3:45 p.m. on Thursday, and the traffic was still strong. I saw vendors still talking to possible customers, and people still scanning the exhibit hall for one more hit of high tech. I also saw plenty of smiling vendors. A good week for peddling HPC, it seems.

If you missed the show, or if you did not get to see everything you wanted to see, fear not. This year, the crack Linux Magazine team was armed with a video camera and microphone. Here’s the scary part: I was one of the people with the microphone. Keep your eye on Today’s HPC Clusters for the videocasts. We will be bringing you SC07 coverage packed with high-level interviews and low-level HPC hi-jinks. Not to be missed.

Two other items I wanted to mention before I go buy a turkey. First, SC07 marked the introduction of the Green 500 List. I wanted to mention this last week, but the list was not yet posted at press time. As the name implies, this list looks at the power usage of HPC systems and rates them in terms of MFLOPS per Watt.
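The metric itself is simple arithmetic; here is a quick sketch of the idea (the function name and inputs are illustrative, not the Green 500's official methodology, which has its own rules about how power is measured):

```python
def mflops_per_watt(linpack_gflops: float, avg_power_watts: float) -> float:
    """Efficiency rating in the Green 500 style: MFLOPS per Watt.

    linpack_gflops  -- sustained Linpack performance in GFLOPS
    avg_power_watts -- average system power draw in Watts
    """
    mflops = linpack_gflops * 1000.0  # convert GFLOPS to MFLOPS
    return mflops / avg_power_watts

# Example: a hypothetical 10 TFLOPS system drawing 50 kW
print(mflops_per_watt(10_000.0, 50_000.0))  # 200.0 MFLOPS/Watt
```

A system can climb this list either by going faster or by sipping less power, which is exactly the trade-off the list is meant to highlight.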

Of course, IBM Blue Gene systems took nine of the top ten spots. The best result was 357.23 MFLOPS/Watt for the Science and Technology Facilities Council, Daresbury Laboratory, UK. By the way, the Daresbury system was number 121 on the Top500 list, proving once and for all that you can be less filling and taste great all at the same time.

Second, last week also marked the launch of the Multi-core Cookbook. I wanted to mention this again because I have multi-core issues. I believe multi-core is a huge shift in the way we will be computing for the foreseeable future. Parallel computing is here to stay, whether we like it or not.

In my opinion, we really do not have a handle on programming anything that involves more than one core (previously referred to as a CPU). The Multi-core Cookbook is a way to understand the issues and get up and running quickly. The site will be filling out over the next several weeks, so check back often. Of course, feedback is always welcome.

In closing, most people I talked to did not care for Reno all that much. The mountains were nice, but the word that came to mind was skeevy. Next year, SC08 is in the somewhat weird, but less skeevy, Austin, Texas.
