SC12 Recap - Where Were the Labs?

Barbara Murphy

CMO

I recently read a quote from U.S. Secretary of Energy Steven Chu: “The nation that leads the world in high-performance computing will have an enormous competitive advantage across a broad range of sectors.”  I couldn’t agree more, which made it all the more disappointing to see so many national government laboratories absent from Supercomputing 2012.  Sadly, many of the big national labs pulled out of the show, partly because of federal budget cutbacks and partly in the wake of the scandal involving the General Services Administration.  Not exactly the recipe for leading the world in high-performance computing.

The annual Supercomputing show is a critical event for HPC users to share information and learn about the latest and greatest technologies.  It is hard to believe that only a short time ago we were all trudging through snow in Salt Lake City.  With over 10,000 attendees, the show continues to showcase compelling HPC developments, and the inclusion of commercial HPC customers continues to be a dominant trend.  Finally, we are seeing a plethora of HPC technologies (primarily funded by the US government) address the broad commercial market for big data applications in design, discovery and analytics.  The 7th Annual Parallel Data Storage Workshop, for which Panasas founder and chief scientist Dr. Garth Gibson was one of the lead organizers, drew a packed house, highlighting how critical parallel file systems have become for HPC.

The major theme on the show floor was big data.  Contrary to the view that big data is primarily created from social media sites, the greatest source of big data is internally generated data, produced when solving problems with tools such as simulation and visualization.  This was confirmed at the show by Addison Snell from Intersect360 Research.  He has established the annual Supercomputing Conference tradition of handing out HPC trivia cards, featuring insights from the research his firm conducted during the prior year.  For Panasas, it was an endorsement of our perspective that the biggest big data applications are found in markets traditionally called HPC.  You can read more about this in our big data white paper – http://www.panasas.com/sites/default/files/uploads/docs/bigdata_wp_lr_1093.pdf

Panasas contributed to the show buzz by announcing Hadoop support on the recently launched hybrid SSD/SATA storage platform, ActiveStor 14.  Judging from the packed booth every day, interest in parallel storage for big data is running high.  Our goal in adding Hadoop support to the ActiveStor family was to provide a trusted, secure platform for HPC users who have already made significant investments in their compute and storage clusters.  Now the same data used for design and discovery applications can be leveraged for analytic workflows, all managed on a single POSIX-compliant platform.
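For readers curious what that looks like in practice, here is a minimal sketch of the general pattern of running Hadoop against a shared POSIX mount instead of copying data into HDFS.  This is not Panasas’s actual integration; the /panfs mount point and file path are hypothetical, while fs.defaultFS and the file:// scheme are standard Hadoop:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PosixMountRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point Hadoop at the local (POSIX) file system instead of HDFS.
        // Because the parallel file system is mounted identically on every
        // node, each task sees the same shared namespace.
        // (Older Hadoop 1.x releases call this property "fs.default.name".)
        conf.set("fs.defaultFS", "file:///");

        FileSystem fs = FileSystem.get(URI.create("file:///"), conf);

        // Hypothetical path: results written directly by a simulation job
        // onto the shared mount, with no copy into HDFS required.
        Path input = new Path("/panfs/simulations/run42/results.csv");

        // Read the simulation output through Hadoop's FileSystem API.
        try (FSDataInputStream in = fs.open(input)) {
            byte[] buf = new byte[4096];
            int n = in.read(buf);
            if (n > 0) {
                System.out.println(new String(buf, 0, n));
            }
        }
    }
}
```

The design point this illustrates is that because every compute node sees the same namespace through the parallel file system, simulation and analytics can work from a single copy of the data.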

Lastly, between juggling, customer meetings, press interviews, and conversations with new prospects, Panasas thought leaders swarmed the video channels.  The now-famous red-hatted Rich Brueckner had an opportunity to interview Panasas founder Garth Gibson to get his insights on Hadoop – http://insidehpc.com/2012/11/19/panasas-chief-scientist-on-where-hpc-meets-big-data-and-hadoop/

 

[Video: HDD vs. SSD]

Meanwhile, veteran Panasas architect, chief juggler and CTO Dr. Brent Welch; senior director of marketing Geoffrey Noer; and yours truly shared industry, product and technology insights in various video outlets – http://www.youtube.com/watch?v=ESj6Ha_YyY4&feature=youtu.be

The year is almost over and we are already planning for our next SC show in the Mile High City.  Let’s hope the country avoids going over the fiscal cliff so that we can continue to fund the great work of our national laboratories and enjoy their participation at Supercomputing 2013.