Better Software for Reproducible Science

a tutorial presented at

The International Conference for High-Performance Computing, Networking, Storage, and Analysis (SC25)

on 8:30 am - 12:00 pm CST Sunday 16 November 2025

Presenters: David E. Bernholdt (Oak Ridge National Laboratory) and Anshu Dubey (Argonne National Laboratory)


This page provides detailed information specific to the tutorial event above. Expect updates to this page up to, and perhaps shortly after, the date of the tutorial. Pages for other tutorial events can be accessed from the main page of this site.




Description

Producing scientific software is a challenge. The high-performance modeling and simulation community, in particular, faces the confluence of disruptive changes in computing architectures and new opportunities (and demands) for greatly improved simulation capabilities, especially through coupling physics and scales. Simultaneously, computational science and engineering (CSE), as well as other areas of science, are experiencing an increasing focus on scientific reproducibility and software quality. Large language models (LLMs) can significantly increase developer productivity through judicious off-loading of tasks. However, models can hallucinate, so a sound methodology is needed to get the most benefit out of this approach.

In this tutorial, attendees will learn about practices, processes, and tools to improve the productivity of those who develop CSE software, increase the sustainability of software artifacts, and enhance trustworthiness in their use. We will focus on aspects of scientific software development that are not adequately addressed by resources developed for industrial software engineering. In particular, we will offer a strategy for the responsible use of LLMs to enhance developer productivity in scientific software development, incorporate testing strategies for the generated code, and discuss reproducibility considerations in the development and use of scientific software.


Agenda

Time (CST) | Title                                                                  | Presenter
8:30 AM    | Introduction                                                           | David E. Bernholdt (ORNL)
8:45 AM    | Motivation and Overview of Best Practices in HPC Software Development | David E. Bernholdt (ORNL)
9:15 AM    | Improving Reproducibility Through Better Software Practices           | David E. Bernholdt (ORNL)
10:00 AM   | Morning break                                                          |
10:30 AM   | Responsible Software Development with LLMs                             | Anshu Dubey (ANL)
12:00 PM   | Adjourn                                                                |

Presentation Slides

The slides will be published at https://doi.org/10.6084/m9.figshare.30394186.

Note that the DOI will become active once the presentations are published.


How to Participate


Hands-On Activities

Introduction

The hands-on activities for this tutorial involve using a large language model (LLM) to generate tests and code according to specifications (prompts) you will develop. Participation in these activities is encouraged, but not required. After interested participants have had some time to try the exercise on their own, the instructor will review their prompts and code with the class, and these materials will be made available to all participants.
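To give a flavor of the exercise, the hypothetical example below pairs a short specification (prompt) with the kind of code and pytest tests an LLM might produce in response. The function and test names are illustrative only and are not part of the tutorial materials.

```python
# Hypothetical example of what the exercise produces: a short specification
# (prompt) and LLM-style generated code plus tests. Names are illustrative.

import math

# --- code that might be generated from a prompt such as:
# "Write a Python function trapezoid(f, a, b, n) that approximates the
#  integral of f over [a, b] using the composite trapezoidal rule." ---
def trapezoid(f, a, b, n):
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

# --- tests that might be generated from a follow-up prompt asking for pytest tests ---
def test_trapezoid_linear_is_exact():
    # The trapezoidal rule is exact for linear integrands.
    assert math.isclose(trapezoid(lambda x: 2 * x + 1, 0.0, 1.0, 4), 2.0)

def test_trapezoid_converges_for_sine():
    # The error should shrink as the number of subintervals grows.
    exact = 2.0  # integral of sin(x) over [0, pi]
    coarse = abs(trapezoid(math.sin, 0.0, math.pi, 8) - exact)
    fine = abs(trapezoid(math.sin, 0.0, math.pi, 64) - exact)
    assert fine < coarse
```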

Attendees can participate in the hands-on section in two modes: using the LLM’s web interface, or using CodeScribe, a tool that enables chat-completion through the LLM’s API. The main objectives of the hands-on can be met using the web interface alone. The advantage of using CodeScribe is exposure to the chat-completion technique and familiarity with a tool that can be very handy for writing code.
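For attendees unfamiliar with the API mode, the following is a minimal sketch of what a chat-completion request looks like when made programmatically rather than through the web interface. It is independent of CodeScribe and assumes the `openai` Python package (version 1.0 or later) and an OPENAI_API_KEY environment variable; the model name and prompt are illustrative.

```python
# Minimal chat-completion sketch via an LLM API (assumes openai>=1.0 and
# an OPENAI_API_KEY environment variable; model name is a placeholder).
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model your API access provides
    messages=[
        {"role": "system", "content": "You are a careful scientific software assistant."},
        {"role": "user", "content": "Write pytest tests for a trapezoidal-rule integrator."},
    ],
)

print(response.choices[0].message.content)  # the generated code/tests as text
```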

Advance preparation

If you wish to participate in the hands-on activities, we strongly encourage you to do a bit of preparation before the tutorial starts. This is especially true if you want to use CodeScribe, which may require some advance interaction with the tool developers to integrate new LLM APIs.

Preparation for using the LLM web interface

  1. You will need access to an LLM chat tool. The instructor will be using ChatGPT, but any comparable LLM, including institutionally-supported tools, should work.

Preparation for using CodeScribe

It is important that you do this preparation with enough lead time that we can assist you if necessary before SC25 starts. We will not be able to provide any support for CodeScribe usage issues during the tutorial itself.

  1. You will need API access to an LLM tool. This is a level beyond the web interface and incurs an extra charge on some platforms. However, many institutionally-supported LLMs offer API access at no additional cost. Participants will be responsible for any additional costs incurred. The instructor will be using ChatGPT, but any comparable LLM should work. CodeScribe also supports the freely downloadable Llama model: <???>

  2. CodeScribe is written in Python, so you will need a working Python installation on a system that you will be able to access during the tutorial (remote access is fine).

  3. Download and install CodeScribe from https://github.com/adubey64/CodeScribe
     a. Installation instructions are provided in the README file: https://github.com/adubey64/CodeScribe?tab=readme-ov-file#installation
     b. You are encouraged to watch the two tutorials on the installation and use of CodeScribe in this Box folder: https://anl.app.box.com/folder/336154643880?s=zv3zdbphqprdz8rjh1c84xpeqd8yg32u. They are 19 minutes and 11 minutes long, respectively.

  4. You will need to integrate your CodeScribe installation with the API of your LLM of choice. If your model is not already supported, you may need to add support for it yourself: look at the file https://github.com/adubey64/CodeScribe/blob/development/code_scribe/lib/_llm.py, copy the class that most closely resembles your target model, adapt it, and create a PR (a generic sketch of this kind of adapter appears after this list). With enough lead time we will do our best to help make it work.
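For orientation only, the sketch below shows the general provider-adapter pattern such integrations typically follow: each class wraps one vendor's chat API behind a common interface, and a new provider is added by copying and adjusting the closest existing class. This is a hypothetical illustration and does not reproduce CodeScribe's actual class names or signatures; consult `_llm.py` itself for those. The institutional endpoint URL and model name below are placeholders.

```python
# Hypothetical provider-adapter sketch (NOT CodeScribe's actual code): every
# backend exposes the same chat(messages) method, so the rest of the tool does
# not care which vendor serves the completion.
from abc import ABC, abstractmethod

class ChatBackend(ABC):
    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Send a list of {'role': ..., 'content': ...} messages and return the reply text."""

class OpenAIBackend(ChatBackend):
    def __init__(self, model: str = "gpt-4o-mini"):  # placeholder model name
        from openai import OpenAI  # assumes the openai package and OPENAI_API_KEY
        self._client = OpenAI()
        self._model = model

    def chat(self, messages: list[dict]) -> str:
        resp = self._client.chat.completions.create(model=self._model, messages=messages)
        return resp.choices[0].message.content

# To target a different provider, copy the closest existing backend and adjust
# the constructor and chat() call to that vendor's client library.
class MyInstitutionBackend(OpenAIBackend):
    def __init__(self):
        # Hypothetical: many institutional gateways expose an OpenAI-compatible
        # endpoint, so only the base URL and model name need to change.
        from openai import OpenAI
        self._client = OpenAI(base_url="https://llm.example.edu/v1")  # placeholder URL
        self._model = "institution-model"  # placeholder model name
```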

During the tutorial

At an appropriate point in the tutorial, we will make available the prompts the instructor used and the generated code for your reference. Links will be provided here.


If you’re interested in this tutorial, you may also be interested in some of the other software-related events taking place at the SC25 conference.


Stay in Touch


Requested Citation

The requested citation for the overall tutorial is:

David E. Bernholdt and Anshu Dubey, Better Software for Reproducible Science tutorial, in The International Conference for High-Performance Computing, Networking, Storage, and Analysis (SC25), St. Louis, Missouri, 2025. DOI: 10.6084/m9.figshare.30394186.

Note that the DOI will become active once the presentations are published.

Individual modules may be cited as Speaker, Module Title, in Better Software for Reproducible Science tutorial…


Acknowledgements

This tutorial is produced by the Consortium for the Advancement of Scientific Software (CASS).

This work was supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Next-Generation Scientific Software Technologies (NGSST) and Scientific Discovery through Advanced Computing (SciDAC) programs.

This work was supported by the U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing Research (ASCR), and by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of the U.S. Department of Energy Office of Science and the National Nuclear Security Administration.