Hewlett Packard Enterprise has become an integral part of the Winter Classic. After four years, it's hard to imagine a Winter Classic competition that doesn't begin with HPE training the students on how to run and optimize LINPACK and HPCG on an HPE/Cray cluster.
And the training! HPE pulls out all the stops with separate sessions covering:
- The system and environment the students will be using
- Using Slurm and compilers
- Cray MPI and scientific libraries
- Separate in-depth overviews of HPL and HPCG
Their training materials are clear, concise, and follow a logical path, giving these students what is usually their very first introduction to HPC concepts, to working on a real supercomputer, and to running key benchmarks.
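To give a flavor of what the Slurm-plus-HPL portion of that training covers, here's a minimal sketch of a batch script for an HPL run. The partition name, core counts, and paths are hypothetical placeholders, not the actual competition environment:

```shell
#!/bin/bash
# Hypothetical Slurm batch script for an HPL run.
# Partition, node/core counts, and binary path are placeholders.
#SBATCH --job-name=hpl
#SBATCH --partition=compute     # placeholder partition name
#SBATCH --nodes=2               # competition runs were two-node
#SBATCH --ntasks-per-node=64    # match the cores per node on your system
#SBATCH --time=01:00:00

# HPL reads its problem size (N), block size (NB), and process grid
# (P x Q) from HPL.dat in the working directory -- tuning those values
# is where most of the optimization work happens.
srun ./xhpl
```

Picking an N that fills most of available memory, and a sensible NB and P×Q grid, is typically what separates a pro-level score from a default run.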
They also staff a Slack channel to answer student questions and offer tips and tricks along the way.
They have a tight-knit and highly skilled mentor team, as you can see from the mentor interviews below. What shines through is that they truly enjoy working with the students and are doing their best to give them a warm introduction to HPC.
2024 Winter Classic: HPE
2024 HPE Mentor Interview
When we revealed the 2024 Winter Classic HPL and HPCG scores, we were shocked at how close they were. The top teams were separated by a tiny number of points, as you can see in the video.
In this episode of our show, I talk to the HPE mentors who guided the students through their first step in the competition.
There's a lot of work that goes into being a Winter Classic mentor organization and we want to recognize that, plus get a behind the scenes look at what happened, of course.
As usual, the HPE mentor team did a great job of taking the students through the opening of the competition. Take a look at what they did and how they did it in the video below.
2023 Winter Classic: HPE
2023 HPE Mentor Interview
In our most recent update, “Triumph and Tragedy with HPL/HPCG”, we detailed how our dozen 2023 Winter Classic Invitational cluster competition teams dealt with their Linpack/HPCG module, mentored by HPE.
In this episode of our incredibly popular 2023 Winter Classic Studio Update Show, we interview the mentors behind the event, the folks who readied the systems, trained the students, and fielded their questions during the weeklong challenge.
We want to shine some light on the mentor organizations who are critical to making this competition possible. It's not an easy job: the mentors have to provide clusters for the teams, give them logins to the boxes, teach them how to use the system, bring them up to speed on the operating environment, train them on the application(s), and, well, lots of stuff.
HPE did an exemplary job in this, their second year of mentoring students on Linpack and HPCG. They provided a Frontier-like configured set of training/practice clusters for the students to work out on and shepherded the teams through the entire process.
2023 Winter Classic: Triumph & Tragedy with HPL/HPCG
HPL. HPCG. Bookends. One will show you the best possible performance from your cluster while the other will show you the worst. Running and optimizing these two foundational HPC benchmarks was the task for the twelve 2023 Winter Classic Invitational Student Cluster Competition teams.
The mentor organization, HPE in this case, provided the students with everything they needed to run and optimize the two benchmarks. This included training, access to virtual clusters, and answers to the questions that popped up during the one-week practice period. We’ll be interviewing the HPE mentor team to get a behind the scenes look at how this module went.
In this latest version of our Studio Update Show, Dan Olds and Addison Snell take you through the results and how they changed the leaderboard (there were big changes).
2022 Winter Classic: HPE
Justin Hotard, HPE's GM of HPC and AI, joins Dan and Addison on the show to break down the competition and discuss its importance. He also drops a bombshell by more than doubling the Brueckner Award Scholarships, adding $12,000 from HPE and $6,000 PERSONALLY to the scholarship fund. This is an amazing gesture and hugely appreciated!
HPE Mentors 2022 Winter Classic Field
As part of our continuing coverage of the 2022 Winter Classic Student Cluster Competition, we want to shine a light on the first mentor organization in the competition.
Hewlett Packard Enterprise really stepped up to the plate by teaching the twelve student teams how to use an HPE/Cray cluster plus how to run and optimize the LINPACK and HPCG benchmarks.
Just to refresh your memory, the Winter Classic competition is a marathon event that exclusively features Historically Black and Hispanic universities competing in a virtual cluster competition spanning eight weeks. In addition to working with HPE on HPL/HPCG in the first week of the competition, student teams will also work with NASA, Oak Ridge National Lab, and AWS in coming weeks.
The goal of the competition is that by the end of it, students will be able to say that they have worked on real-world supercomputers, running real-world applications and benchmarks, and that they know how to optimize them. This should help pave the way for them to work in HPC, which is a great thing.
HPE did one of the best mentoring jobs we've seen in the two years of this competition. Just about every team turned in results for both benchmarks, which doesn't often happen in these competitions. To highlight their contribution, Episode 4 of our increasingly popular "2022 Winter Classic Student Cluster Competition Studio Update Show" featured the HPE team and covers why they got involved with this competition, how they configured the systems, and the training they provided to the students.
2022 Winter Classic: First Results are in!
The first results are in! Student teams turned in their HPL (Linpack) and HPCG benchmark results and we got ’em.
Twelve student teams spent last week under the tutelage of HPE learning about HPC and how to run (then optimize) HPL and HPCG. The results were outstanding. Nearly all the teams completed the task and some of their numbers were pro level in terms of, for example, Linpack efficiency.
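For readers new to the metric: Linpack efficiency is simply the measured HPL result (Rmax) divided by the machine's theoretical peak (Rpeak). A quick sketch of the arithmetic, using the 6,631 GFLOP/s winning score from below and a purely hypothetical Rpeak (the real peak depends on the actual node specs, which aren't given here):

```shell
#!/bin/sh
# Linpack efficiency = Rmax / Rpeak.
RMAX=6631        # measured two-node HPL score, GFLOP/s (from the results)
RPEAK=7372.8     # HYPOTHETICAL theoretical peak, GFLOP/s, for illustration
awk -v rmax="$RMAX" -v rpeak="$RPEAK" \
    'BEGIN { printf "Efficiency: %.1f%%\n", 100 * rmax / rpeak }'
```

With these illustrative numbers the script prints an efficiency of roughly 90%, which is the kind of "pro level" figure referred to above.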
You’ll have to watch (or fast forward) through the video below in order to get all of the results and details. But let me whet your appetite with some tidbits:
- One of the Texas Tech teams took home the Linpack crown with a two-node score of 6,631 GFLOP/s. It was an extremely close battle. Two other teams were within two points of the leader.
- The HPCG results were a great story. Tennessee State University took the win through either a highly skilled approach or sheer luck. We’ll ask them when we interview them. Watch the video to see what they did to top the other teams by 30% or more.
Oh yeah, that video I keep referring to? Here’s the link to it: