October 21–22, 2025
Redwood Rooms
SLAC National Accelerator Laboratory
Scientific user facilities (SUFs) at the U.S. Department of Energy (DOE) drive scientific discovery and innovation by delivering world-class experimental capabilities that expand the frontiers of biology, chemistry, physics, and materials science. Over the next five years, upgrades at SUFs will generate over an order of magnitude more data, promising to accelerate the pace of scientific innovation if harnessed effectively. However, this flood of data poses challenges for the scientific community, despite continued growth in high-performance computing (HPC) hardware performance. The current state of the practice, and tools optimized for HPC, are insufficiently flexible and productive to address the high stakes, short timelines, and rapidly evolving requirements of highly dynamic scientific user experiments. Additionally, traditional HPC software tools demand expertise that most SUF users cannot realistically apply within the pace and pressures of modern experiments, underscoring the need for more accessible, high-productivity approaches. Emerging AI/ML technologies, though promising, do not address these needs on their own and will not lead to a productive, high-performance software ecosystem without decisive action.
This workshop will explore the research challenges and opportunities in building a highly productive, high-performance software ecosystem for large-scale scientific data analysis at the SUFs. The goal of the workshop is to identify key research directions that, if addressed, would substantially change the status quo and deliver an order-of-magnitude increase in productivity and performance for users of SUFs across the DOE complex.