
Academic Research

Storage that accelerates your discoveries.


Powering Data-Intensive Academic Research

Panasas ActiveStor® storage solutions running the PanFS® file system power innovative research in agriculture, astrophysics, bioinformatics, climate research, computational chemistry, genome analysis, geophysics, high energy physics, machine learning, materials science, molecular biology, and beyond. See what Panasas makes possible.

A Long, Rich History of Academic Partnerships

For over two decades, universities and research institutes around the globe have trusted Panasas to deliver next-generation high performance computing (HPC) storage technologies. We have proudly provided reliable, cost-effective, high-performance storage access to thousands of researchers at more than a hundred higher education institutions.

Born from the brain power at Carnegie Mellon University’s Parallel Data Lab, the PanFS® parallel file system enables top university HPC facilities to advance their data-intensive research by maximizing HPC cluster application processing. It was designed to provide the unparalleled reliability research teams need to keep their projects on track and to eliminate the management burdens associated with problematic roll-your-own and open-source storage solutions.


Uncomplicated Storage for the Most Complex Projects

Small IT teams, multiple research groups, lean budgets, massive datasets, mixed workloads – these cease to be problems with Panasas as your data foundation.

Increase productivity and enable groundbreaking insights with simple-to-manage, scalable, high-performance storage. Panasas solutions offer the flexibility for scratch-only or consolidated scratch, home directories, and project storage, with single-tier mixed-workload performance to support the requirements of growing numbers of academic researchers. With thousands of users requiring constant large-scale, high-throughput data access, low-touch administration keeps the solution affordable and the administrator's workload light.


Unparalleled Uptime

Academic HPC facilities choose PanFS for its built-in failure prevention and automated recovery logic, which ensure maximum uptime for their communities of researchers.

Flexible

With hundreds of research groups working on a multitude of data-intensive projects, the single-tier PanFS architecture is ideal for mixed file size workloads and storage consolidation projects.

Low-Touch Administration

Limited IT staff and budget constraints call for a reliable, cost-effective, and especially easy-to-manage high-performance storage platform for frustration-free monitoring and performance management.

Why Academic Institutions Partner with Panasas

Here’s what some of our academic partners have to say about working with us.

Minnesota Supercomputing Institute

“The Panasas partnership gives us access to the kind of high-performance technology that is critical to tackle an increasing variety of sophisticated use cases with MSI’s ‘Agate’ cluster. This long-term partnership has provided our researchers with the top-flight technology and technical support they need.”

Dr. Jim Wilgenbusch

Director of Research Computing at UMN


University of Wollongong

“This will allow us to accelerate workloads and store important data for our various research projects. We’re going to get really fast access to files, to the active layer. This collaboration between two very high-value players coming together in true partnership will support Australian science in general, and more specifically, the University’s role in furthering our understanding in these important areas of science.”

Dr. James Bouwer

Director of Cryo-EM at Molecular Horizons, University of Wollongong


MINES ParisTech

“With Panasas ActiveStor, researchers save time since they no longer have to move or copy data files before they can process them. They gain productivity since there are no more HPC bottlenecks to access the storage system. Bandwidth is four to five times higher than before.”

Gregory Sainte-Luce

IT Manager at the Materials Research Center, MINES ParisTech


UC San Diego Center for Microbiome Innovation

“While most of the world sees research in microbiology being made in flasks or tubes, a lot of modern methods generate immense amounts of data that require powerful storage such as Panasas ActiveStor.”

Dr. Sandrine Miller-Montgomery

Executive Director of the UC San Diego Center for Microbiome Innovation


University of Texas at Dallas, School of Arts, Technology, and Emerging Communications (ATEC)

“The Panasas user interface and intuitiveness exceeds those [EMC Isilon and Dell Compellent storage tools] products. Once I assign the students to their storage resources, ActiveStor is pretty much hands off for me for the rest of the semester.”

Jeff Smith

Systems Administrator and Architect, ATEC at the University of Texas at Dallas


California Institute of Technology (Caltech)

“We tried other storage solutions with parallel file systems in the past. Some were just not worth the continued investment. Users complained about slow response times, and the file systems were a hassle to manage. The Panasas file system just works.”

Sharon Brunett

Senior Scientist at Caltech Center for Advanced Computing Research (CACR)


Academic Research References

Many academic research projects, papers, and publications have used and cited Panasas. The following is a list of some of those projects.

University of Notre Dame

Tovar et al. (2022, Feb). Dynamic Task Shaping for High Throughput Data Analysis Applications in High Energy Physics. IPDPS 2022.

Stanford University

Jia et al. (2018, May 2). Isometry: A Path-Based Distributed Data Transfer System. ICS’18.

Warsaw University of Technology

Pomeran et al. (2019, Sep). Tentative oblique subduction high resolution models, lead to reducing costs of HPCC usage. Balkan Geophysical Society.

Harvard University

Hermann et al. (2018, Nov 2). Implementing the DICOM Standard for Digital Pathology. Journal of Pathology Informatics.