eScholarship
Open Access Publications from the University of California

Evolving to Find Optimizations Humans Miss: Using Evolutionary Computation to Improve GPU Code for Bioinformatics Applications

(2024)

GPUs are used in many settings to accelerate large-scale scientific computation, including simulation, computational biology, and molecular dynamics. However, optimizing codes to run efficiently on GPUs requires developers to have both a detailed understanding of the application logic and significant knowledge of parallel programming and GPU architectures. This paper shows that an automated GPU program optimization tool, GEVO, can leverage evolutionary computation to find code edits that reduce the runtime of three important applications (multiple sequence alignment, agent-based simulation, and molecular dynamics) by 28.9%, 29%, and 17.8%, respectively. The paper presents an in-depth analysis of the discovered optimizations, revealing that (1) several of the most important optimizations involve significant epistasis, (2) the primary sources of improvement are application-specific, and (3) many of the optimizations generalize across GPU architectures. In general, the discovered optimizations are not straightforward even for a human GPU expert, showcasing the potential of automated program optimization tools both to reduce the optimization burden for human domain experts and to provide new insights for GPU experts.
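
As an illustration of the underlying search, the following minimal Python sketch evolves a population of edit sequences against a toy stand-in for a kernel: the "IR" is a list of (cost, required) instructions, "runtime" is their total cost, and validation checks that every required instruction survives. None of this is GEVO's actual API; real GEVO mutates LLVM IR and times compiled kernels on a GPU.

```python
# Minimal sketch of an evolutionary code-optimization loop, with toy
# stand-ins for compilation, correctness testing, and GPU timing.
import random

BASELINE = [(3, True), (1, False), (4, True), (1, False),
            (5, True), (2, False), (6, True), (2, False)]  # toy kernel IR
POPULATION, GENERATIONS = 20, 40

def apply_edits(ir, edits):
    """Apply a sequence of deletion edits to a copy of the IR."""
    ir = list(ir)
    for i in edits:
        if ir:
            ir.pop(i % len(ir))
    return ir

def fitness(edits):
    """Lower is better; invalid programs are scored as unfit."""
    ir = apply_edits(BASELINE, edits)
    required_kept = sum(req for _, req in ir)
    required_total = sum(req for _, req in BASELINE)
    if required_kept < required_total:   # stand-in for output validation
        return float("inf")
    return sum(cost for cost, _ in ir)   # stand-in for measured runtime

def evolve():
    pop = [[random.randrange(len(BASELINE))] for _ in range(POPULATION)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness)
        survivors = pop[: POPULATION // 2]
        children = [p + [random.randrange(len(BASELINE))] for p in survivors]
        pop = survivors + children       # mutate the fittest half
    return pop[0]

best = evolve()
print("baseline runtime:", fitness([]), " evolved runtime:", fitness(best))
```

Because selection acts on whole edit sequences rather than single edits, combinations that only pay off together (the epistasis noted above) can survive even when each individual edit is neutral on its own.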

Artificial Intelligence for the Electron Ion Collider (AI4EIC)

(2024)

The Electron-Ion Collider (EIC), a state-of-the-art facility for studying the strong force, is expected to begin commissioning its first experiments in 2028. This is an opportune time for artificial intelligence (AI) to be included from the start at this facility and in all phases leading up to the experiments. The second annual workshop organized by the AI4EIC working group, which recently took place, centered on exploring all current and prospective application areas of AI for the EIC. The workshop not only benefited the EIC but also provided valuable insights for the newly established ePIC collaboration at the EIC. This paper summarizes the activities and R&D projects covered across the sessions of the workshop and provides an overview of the goals, approaches, and strategies regarding AI/ML in the EIC community, as well as cutting-edge techniques currently being studied in other experiments.

Streaming Large-Scale Microscopy Data to a Supercomputing Facility

(2024)

Data management is a critical component of modern experimental workflows. As data generation rates increase, transferring data from acquisition servers to processing servers via conventional file-based methods is becoming increasingly impractical. The 4D Camera at the National Center for Electron Microscopy generates data at a nominal rate of 480 Gbit s⁻¹ (87,000 frames s⁻¹), producing a 700 GB dataset in 15 s. To address the challenges associated with storing and processing such quantities of data, we developed a streaming workflow that utilizes a high-speed network to connect the 4D Camera's data acquisition system to supercomputing nodes at the National Energy Research Scientific Computing Center, bypassing intermediate file storage entirely. In this work, we demonstrate the effectiveness of our streaming pipeline in a production setting through an hour-long experiment that generated over 10 TB of raw data, yielding high-quality datasets suitable for advanced analyses. Additionally, we compare the efficacy of this streaming workflow against the conventional file-transfer workflow by conducting a postmortem analysis on historical data from experiments performed by real users. Our findings show that the streaming workflow significantly improves data turnaround time, enables real-time decision-making, and minimizes the potential for human error by eliminating manual user interactions.
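
The essential pattern, receiving frames directly from a socket and processing them in memory with no intermediate files, can be sketched in a few lines of Python. The endpoint, the wire format (a 64-bit length prefix per frame), and the frame geometry below are illustrative assumptions, not the 4D Camera's actual protocol.

```python
# Minimal sketch of a file-less streaming consumer: frames arrive over
# TCP and are processed in memory, never touching intermediate storage.
import socket
import struct
import numpy as np

HOST, PORT = "acquisition.example.org", 9000   # hypothetical endpoint
FRAME_SHAPE = (576, 576)                        # hypothetical frame size

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf.extend(chunk)
    return bytes(buf)

def stream_frames():
    """Yield frames as they arrive; usage: for frame in stream_frames()."""
    with socket.create_connection((HOST, PORT)) as sock:
        while True:
            # Each frame is prefixed with its byte length as a uint64;
            # a real implementation would validate this against the
            # expected frame size before reshaping.
            (nbytes,) = struct.unpack("<Q", recv_exact(sock, 8))
            raw = recv_exact(sock, nbytes)
            yield np.frombuffer(raw, dtype=np.uint16).reshape(FRAME_SHAPE)
```

In practice the consumer side would run on compute nodes, often over many parallel sockets, and feed frames straight into reduction or reconstruction routines; eliminating the file-transfer step is what shortens the data turnaround described above.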

Quantum-centric supercomputing for materials science: A perspective on challenges and future directions

(2024)

Computational models are an essential tool for the design, characterization, and discovery of novel materials. Computationally hard tasks in materials science stretch the limits of existing high-performance supercomputing centers, consuming much of their resources for simulation, analysis, and data processing. Quantum computing, on the other hand, is an emerging technology with the potential to accelerate many of the computational tasks needed for materials science. To do so, quantum technology must interact with conventional high-performance computing in several ways: validation of approximate results, identification of hard problems, and synergies in quantum-centric supercomputing. In this paper, we provide a perspective on how quantum-centric supercomputing can help address critical computational problems in materials science, the challenges that must be faced to solve representative use cases, and suggested new directions.
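
One concrete form of that interaction is the hybrid variational loop, in which a classical optimizer running on conventional HPC proposes circuit parameters and a quantum processor returns energy estimates. The sketch below is a minimal stand-in, not drawn from the paper: the quantum evaluation is replaced by exact linear algebra on a toy two-qubit Hamiltonian.

```python
# Minimal sketch of a hybrid quantum-classical variational loop with a
# toy two-qubit Hamiltonian H = Z(x)Z + 0.5 X(x)I as a dense matrix.
import numpy as np
from scipy.optimize import minimize

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
H = np.kron(Z, Z) + 0.5 * np.kron(X, I2)

def ansatz_state(theta):
    """Stand-in for a parameterized circuit run on quantum hardware:
    |psi> = cos(t/2)|00> + sin(t/2)|10>."""
    psi = np.zeros(4)
    psi[0] = np.cos(theta[0] / 2)
    psi[2] = np.sin(theta[0] / 2)
    return psi

def energy(theta):
    """Expectation value <psi|H|psi>; on hardware this would be
    estimated from measurement samples rather than computed exactly."""
    psi = ansatz_state(theta)
    return float(psi @ H @ psi)

# The classical optimizer (running on conventional HPC) drives the loop.
result = minimize(energy, x0=[0.1], method="COBYLA")
print("variational energy:", result.fun)   # exact ground energy: -1.1180
```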

False vacuum decay and nucleation dynamics in neutral atom systems

(2024)

Metastable states of quantum many-body systems with confinement offer a means to simulate false vacuum phenomenology, including nonequilibrium dynamical processes like decay by nucleation, in truncated limits. Recent work has examined the decay process in one-dimensional (1D) ferromagnetic Ising spins and superfluids. In this paper, we study nucleation dynamics in 1D antiferromagnetic neutral atom chains with Rydberg interactions, using both numerical simulations and analytic modeling. We apply a staggered local detuning field to generate the metastable and ground states. Our efforts focus on two dynamical regimes: decay and annealing. In the first, we corroborate the phenomenological decay rate scaling and determine the associated parameter range for the decay process; in the second, we uncover and elucidate a procedure to anneal the metastable state from the initial to the final system, with intermediate nucleation events. We further propose experimental protocols to prepare the required states and perform quenches on near-term neutral atom quantum simulators, examining the experimental feasibility of our proposed setup and parameter regime.
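
For readers who want to experiment, the model class described here, a 1D chain with van der Waals interactions and a staggered local detuning, is small enough to simulate by exact diagonalization. The following sketch uses illustrative parameter values, not those of the paper.

```python
# Minimal exact-diagonalization sketch of a 1D Rydberg chain with a
# staggered local detuning, quenched from a Neel-like metastable state.
import numpy as np
from scipy.linalg import expm

N = 8          # chain length, kept small for dense linear algebra
OMEGA = 1.0    # Rabi frequency (sets the unit of energy)
DELTA = 1.2    # uniform detuning
STAGGER = 0.3  # staggered local detuning that generates confinement
C6 = 10.0      # van der Waals coefficient in nearest-neighbor units
DT, STEPS = 0.1, 100

X = np.array([[0.0, 1.0], [1.0, 0.0]])
n_op = np.diag([0.0, 1.0])     # Rydberg-state projector |r><r|

def site_op(op, i):
    """Embed a single-site operator at site i of an N-site chain."""
    out = np.eye(1)
    for j in range(N):
        out = np.kron(out, op if j == i else np.eye(2))
    return out

H = np.zeros((2 ** N, 2 ** N))
for i in range(N):
    delta_i = DELTA + (-1) ** i * STAGGER          # staggered detuning
    H += 0.5 * OMEGA * site_op(X, i) - delta_i * site_op(n_op, i)
for i in range(N):
    for j in range(i + 1, N):
        H += C6 / (j - i) ** 6 * site_op(n_op, i) @ site_op(n_op, j)

# Quench from the Neel-like product state |rgrgrgrg> and track the
# staggered Rydberg density, an order parameter for the decay.
psi = np.zeros(2 ** N, dtype=complex)
psi[int("10" * (N // 2), 2)] = 1.0
stagger_mag = sum((-1) ** i * site_op(n_op, i) for i in range(N)) / N
U = expm(-1j * DT * H)

for step in range(1, STEPS + 1):
    psi = U @ psi
    if step % 20 == 0:
        m = np.real(psi.conj() @ stagger_mag @ psi)
        print(f"t = {step * DT:5.1f}   staggered density = {m:+.3f}")
```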

Long-lived oscillations of metastable states in neutral atom systems

(2024)

Metastable states arise in a range of quantum systems and can be observed in various dynamical scenarios, including decay, bubble nucleation, and long-lived oscillations. The phenomenology of metastable states has been examined in quantum many-body systems, notably in one-dimensional (1D) ferromagnetic Ising spin systems and superfluids. In this paper, we study long-lived oscillations of metastable and ground states in 1D antiferromagnetic neutral atom chains with long-range Rydberg interactions. We use a staggered local detuning field to achieve confinement. Using theoretical and numerical models, we identify novel spectral signatures of quasiparticle oscillations specific to antiferromagnetic neutral atom systems and interpret them using a classical energy model of short-range meson repulsion. Finally, we evaluate the experimental accessibility of our proposed setup on current neutral-atom platforms and discuss experimental feasibility and constraints.
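
The spectral analysis itself is simple to reproduce: Fourier-transform the time trace of an order parameter and read off the quasiparticle peaks. In the sketch below, a synthetic two-frequency signal stands in for a simulated staggered-magnetization trace; the frequencies and amplitudes are assumptions chosen purely for illustration.

```python
# Minimal sketch of extracting oscillation frequencies from a time trace
# via the discrete Fourier transform.
import numpy as np

dt, T = 0.05, 200.0                      # time step and total time (a.u.)
t = np.arange(0.0, T, dt)
# Stand-in signal: two long-lived modes plus weak noise (assumed values).
signal = 0.6 * np.cos(2.1 * t) + 0.3 * np.cos(3.4 * t)
signal += 0.01 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = 2 * np.pi * np.fft.rfftfreq(t.size, d=dt)   # angular frequencies

# Report the two strongest spectral peaks; they land near the injected
# frequencies 2.1 and 3.4.
peak_order = np.argsort(spectrum)[::-1]
print("dominant angular frequencies:", np.sort(freqs[peak_order[:2]]))
```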

The Early Data Release of the Dark Energy Spectroscopic Instrument

(2024)

The Dark Energy Spectroscopic Instrument (DESI) completed its five-month Survey Validation in 2021 May. Spectra of stellar and extragalactic targets from Survey Validation constitute the first major data sample from the DESI survey. This paper describes the public release of those spectra, the catalogs of derived properties, and the intermediate data products. In total, the public release includes good-quality spectral information from 466,447 objects targeted as part of the Milky Way Survey, 428,758 as part of the Bright Galaxy Survey, 227,318 as part of the Luminous Red Galaxy sample, 437,664 as part of the Emission Line Galaxy sample, and 76,079 as part of the Quasar sample. In addition, the release includes spectral information from 137,148 objects that expand the scope beyond the primary samples as part of a series of secondary programs. Here, we describe the spectral data, data quality, data products, Large-Scale Structure science catalogs, access to the data, and references that provide relevant background for using these spectra.
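
Accessing the released spectra typically amounts to reading coadded FITS files. The sketch below assumes a locally downloaded coadd file; the file name is hypothetical, and the HDU layout (a FIBERMAP table plus per-camera WAVELENGTH/FLUX/IVAR arrays for the B, R, and Z arms) should be checked against the data release documentation.

```python
# Minimal sketch of reading one DESI coadd spectrum file with astropy.
from astropy.io import fits
import numpy as np

path = "coadd-sv1-bright-10378.fits"   # hypothetical file from the release

with fits.open(path) as hdus:
    fibermap = hdus["FIBERMAP"].data   # per-target metadata table
    wave = hdus["B_WAVELENGTH"].data   # blue-arm wavelength grid (Angstrom)
    flux = hdus["B_FLUX"].data         # flux array, shape (ntarget, nwave)
    ivar = hdus["B_IVAR"].data         # inverse variance of the flux

# Example: median per-pixel signal-to-noise of the first target's blue arm.
snr = flux[0] * np.sqrt(ivar[0])
print("TARGETID:", fibermap["TARGETID"][0], " median S/N:", np.median(snr))
```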

New constraints on ultraheavy dark matter from the LZ experiment

(2024)

Searches for dark matter with liquid xenon time projection chamber experiments have traditionally focused on the region of the parameter space that is characteristic of weakly interacting massive particles, ranging from a few GeV/c² to a few TeV/c². Models of dark matter with a mass much heavier than this are well motivated by early production mechanisms different from the standard thermal freeze-out, but they have generally been less explored experimentally. In this work, we present a reanalysis of the first science run of the LZ experiment, with an exposure of 0.9 tonne×yr, to search for ultraheavy particle dark matter. The signal topology consists of multiple energy deposits in the active region of the detector forming a straight line, from which the velocity of the incoming particle can be reconstructed on an event-by-event basis. Zero events with this topology were observed after applying the data selection calibrated on a simulated sample of signal-like events. New experimental constraints are derived, which rule out previously unexplored regions of the dark matter parameter space of spin-independent interactions beyond a mass of 10¹⁷ GeV/c².
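
The reconstruction idea can be illustrated compactly: fit a straight line through the (t, x, y, z) coordinates of the deposits and take the slope as the particle velocity. The deposit values below are fabricated purely for illustration and are not taken from the analysis.

```python
# Minimal sketch of event-by-event velocity reconstruction from a
# track-like topology of multiple energy deposits.
import numpy as np

# Hypothetical reconstructed deposits: times (microseconds), positions (cm).
t = np.array([0.0, 2.0, 4.0, 6.0])
pos = np.array([[0.0, 0.0, 140.0],
                [10.1, 4.9, 105.2],
                [19.8, 10.2, 70.1],
                [30.0, 15.0, 35.0]])

# Least-squares fit pos(t) = p0 + v * t for all three coordinates at once.
A = np.vstack([np.ones_like(t), t]).T          # design matrix [1, t]
coef, *_ = np.linalg.lstsq(A, pos, rcond=None)
v = coef[1]                                     # velocity vector (cm/us)

speed = np.linalg.norm(v)
print(f"speed = {speed:.1f} cm/us = {speed * 1e6 / 3e10:.2e} c")
```

A galactic halo particle is expected to move at roughly 10⁻³ c (a few hundred km/s), which is the scale the toy numbers above are chosen to resemble.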