After working in Antarctica for the Antarctic Meteorological Research Center (AMRC) at the University of Wisconsin-Madison and then for CIRES, Shelley Knuth traded in her expeditionary gear to use her data skills and experience in CU Boulder’s supercomputing lab.
After five seasons deployed in the field for AMRC and CIRES (the Cooperative Institute for Research in Environmental Sciences), where some days she battled gale-force winds and temperatures of minus 20 degrees, Knuth was ready for a change of environment. In 2014 she came to CU’s supercomputing lab as a data management specialist. She is now the assistant vice chancellor of research computing.
“I got my start hanging off the side of weather towers in Antarctica while I was doing research for my master’s,” Knuth said. “When you do research on the weather, you go where the data is. I installed instruments out on weather stations. If they stopped working, I had to go out there and fix them. The Antarctic network supports a lot of research across the globe. Its data gets put into climate models that are addressing so many important scientific questions. That was a great start to my career.”
While earning her master’s degree in atmospheric and oceanic sciences at the University of Wisconsin, Knuth worked for AMRC, which operates the U.S. Antarctic Automatic Weather Station Network, the largest U.S. network of automatic weather stations in the Antarctic. The stations capture basic weather data: temperature, wind speed and direction, air pressure and relative humidity. AMRC also manages data from weather satellites, and Knuth played a role on that team collecting and modifying satellite imagery for use by the public.
In 2009, she came to CU Boulder to work at CIRES, a partnership between the National Oceanic and Atmospheric Administration (NOAA) and CU, on a project investigating energy transfer between the ocean and the air in Antarctica. There she worked for Professor John Cassano and was again deployed to Antarctica.
In 2014, she earned a PhD in atmospheric and oceanic sciences at CU Boulder.
Research computing at CU Boulder
Knuth brings to the supercomputing lab the knowledge and skills she obtained while working on her degrees and in the field.
“It was really cool working in Antarctica, but it’s a hard job,” she said. “I was ready to try something new. My work with Research Computing is appealing because I don’t have to specialize in just one discipline. I can be a jack of all trades now.”
Her team of 20 people in the supercomputing lab operates a supercomputer, data storage for research projects, a new hybrid cloud, and a friction-free network for large data transfers. It is also developing a new platform for securing research data and contributes to infrastructure support for courses.
The department also provides training and consulting on topics related to high-performance computing, including data management, programming and basic supercomputer use.
“Our niche is large-scale computing,” Knuth said. “We’ve been around for 12 years now. Our on-site supercomputer service, Alpine, is the third-generation system at CU. The first was Janus, funded through an NSF grant. Our second system was Summit, which came online in 2017, also funded by an NSF grant. Systems are usually replaced roughly every five years.”
The Alpine service will come online for the community in the next few weeks. It’s already in the early adopter stages with some users doing beta testing.
“Alpine is technologically at the forefront of what our researchers need in order to be successful in their disciplines,” Knuth said. “We’re excited to be offering it. We designed it in a different way from our previous system, so it can scale better. We’re incorporating technology that will allow researchers to implement innovative workflows such as machine learning and artificial intelligence. A lot of our researchers are doing impressive work on smaller scales, so we want to make sure we’re addressing their needs, too.”
The Research Computing group supports CU researchers through large-scale computing projects. The PetaLibrary, which stores 4 petabytes of data, supports large-scale data storage. A system called Blanca is available for faculty members who want to purchase their own compute hardware, which Research Computing then operates for them. The group helps researchers stay competitive in their fields when it comes to crunching data, be more effective when applying for grants and, in general, keep functioning at the top of their fields.
“In the past year we’ve hired a team that is coming online to meet compliance requirements that are being set forth to obtain grants from the Department of Defense,” Knuth said. “We’re building an enclave to support that. It’s about keeping research data secure. And the need for data storage continually expands, so we must keep on top of that as well. Additionally, we are excited to bring CU’s cloud service into our group and help our researchers explore the possibilities with that infrastructure.”
In addition to her CU responsibilities, Knuth is one of two executive directors of the Center for Research Data and Digital Scholarship and chairs the Rocky Mountain Advanced Computing Consortium (RMACC), which collaborates on cyberinfrastructure projects throughout the Rocky Mountain region. She is also a leader in the RMACC Women in High Performance Computing group and will be the general co-chair for the Practice and Experience in Advanced Research Computing (PEARC) 2023 conference.
“Science has come to the point where we need these advanced technological tools to further understand how the physical world and all its systems work,” Knuth said. “It’s about math and it’s about big data. My department helps to make the millions and millions of bytes of data collected in the field meaningful and useful.”