eScholarship
Open Access Publications from the University of California

LBL Publications

Lawrence Berkeley National Laboratory (Berkeley Lab) has been a leader in science and engineering research for more than 70 years. Located on a 200-acre site in the hills above the Berkeley campus of the University of California, overlooking the San Francisco Bay, Berkeley Lab is a U.S. Department of Energy (DOE) National Laboratory managed by the University of California. It has an annual budget of nearly $480 million (FY2002) and employs a staff of about 4,300, including more than a thousand students.

Berkeley Lab conducts unclassified research across a wide range of scientific disciplines with key efforts in fundamental studies of the universe; quantitative biology; nanoscience; new energy systems and environmental solutions; and the use of integrated computing as a tool for discovery. It is organized into 17 scientific divisions and hosts four DOE national user facilities. Details on Berkeley Lab's divisions and user facilities are available on the Berkeley Lab website.

Deep Generative Models for Fast Photon Shower Simulation in ATLAS

(2024)

Abstract: The need for large-scale production of highly accurate simulated event samples for the extensive physics programme of the ATLAS experiment at the Large Hadron Collider motivates the development of new simulation techniques. Building on the recent success of deep learning algorithms, variational autoencoders and generative adversarial networks are investigated for modelling the response of the central region of the ATLAS electromagnetic calorimeter to photons of various energies. The properties of synthesised showers are compared with showers from a full detector simulation using Geant4. Both variational autoencoders and generative adversarial networks are capable of quickly simulating electromagnetic showers with correct total energies and stochasticity, though the modelling of some shower shape distributions requires more refinement. This feasibility study demonstrates the potential of using such algorithms for ATLAS fast calorimeter simulation in the future and shows a possible way to complement current simulation techniques.

Artificial Intelligence for the Electron Ion Collider (AI4EIC)

(2024)

The Electron-Ion Collider (EIC), a state-of-the-art facility for studying the strong force, is expected to begin commissioning its first experiments in 2028. This is an opportune time for artificial intelligence (AI) to be included from the start at this facility and in all phases that lead up to the experiments. The second annual workshop organized by the AI4EIC working group, which recently took place, centered on exploring all current and prospective application areas of AI for the EIC. This workshop is not only beneficial for the EIC, but also provides valuable insights for the newly established ePIC collaboration at EIC. This paper summarizes the different activities and R&D projects covered across the sessions of the workshop and provides an overview of the goals, approaches and strategies regarding AI/ML in the EIC community, as well as cutting-edge techniques currently studied in other experiments.

Software Performance of the ATLAS Track Reconstruction for LHC Run 3

(2024)

Charged particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions at the LHC is a challenging task for the ATLAS experiment’s reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) promptly using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.

Estimating geographic variation of infection fatality ratios during epidemics.

(2024)

OBJECTIVES: We aim to estimate geographic variability in total numbers of infections and infection fatality ratios (IFR; the number of deaths caused by an infection per 1,000 infected people) when the availability and quality of data on disease burden are limited during an epidemic. METHODS: We develop a noncentral hypergeometric framework that accounts for differential probabilities of positive tests and reflects the fact that symptomatic people are more likely to seek testing. We demonstrate the robustness, accuracy, and precision of this framework, and apply it to the United States (U.S.) COVID-19 pandemic to estimate county-level SARS-CoV-2 IFRs. RESULTS: The estimators for the numbers of infections and IFRs showed high accuracy and precision; for instance, when applied to simulated validation data sets, across counties, Pearson correlation coefficients between estimator means and true values were 0.996 and 0.928, respectively, and they showed strong robustness to model misspecification. Applying the county-level estimators to the real, unsimulated COVID-19 data spanning April 1, 2020 to September 30, 2020 from across the U.S., we found that IFRs varied from 0 to 44.69, with a standard deviation of 3.55 and a median of 2.14. CONCLUSIONS: The proposed estimation framework can be used to identify geographic variation in IFRs across settings.
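As a rough illustration of the idea behind this framework, the sketch below implements the mean of Fisher's noncentral hypergeometric distribution in pure Python and inverts it to back out the number of infections from observed test positives, then converts deaths to an IFR per 1,000 infections. All county numbers and the 5x testing-odds ratio are invented for the example, and the paper's actual estimators are substantially more elaborate than this sketch.

```python
from math import exp, lgamma, log

def logcomb(n, k):
    """Log of the binomial coefficient C(n, k), computed via lgamma."""
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def fnh_mean(N, m1, n, omega):
    """Mean of Fisher's noncentral hypergeometric distribution:
    expected positives among n tests when m1 of N people are infected
    and an infected person has omega times the odds of being tested."""
    lo, hi = max(0, n - (N - m1)), min(n, m1)
    ks = list(range(lo, hi + 1))
    logw = [logcomb(m1, k) + logcomb(N - m1, n - k) + k * log(omega) for k in ks]
    top = max(logw)
    w = [exp(v - top) for v in logw]  # normalize in log space to avoid overflow
    return sum(k * wk for k, wk in zip(ks, w)) / sum(w)

def estimate_infections(N, n, positives, omega):
    """Smallest m1 whose expected positive count reaches the observed one.
    The mean is monotone increasing in m1, so bisection applies."""
    lo, hi = positives, N
    while lo < hi:
        mid = (lo + hi) // 2
        if fnh_mean(N, mid, n, omega) < positives:
            lo = mid + 1
        else:
            hi = mid
    return lo

# Hypothetical county: 2,000 residents, 200 tests, 60 positives, 5 deaths,
# infected people assumed 5x as likely to be tested (all numbers made up).
m1_hat = estimate_infections(N=2_000, n=200, positives=60, omega=5.0)
ifr = 1_000 * 5 / m1_hat  # deaths per 1,000 estimated infections
```

With omega = 1 the distribution reduces to the central hypergeometric (mean n·m1/N), so the odds ratio is exactly what encodes the fact that symptomatic people are more likely to seek testing.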

Helping Faculty Teach Software Performance Engineering

(2024)

Over the academic year 2022–23, we discussed the teaching of software performance engineering with more than a dozen faculty across North America and beyond. Our outreach was centered on research-focused faculty with an existing interest in this course material. These discussions revealed an enthusiasm for making software performance engineering a more prominent part of a curriculum for computer scientists and engineers. Here, we discuss how MIT’s longstanding efforts in this area may serve as a launching point for community development of a software performance engineering curriculum, challenges in and solutions for providing the necessary infrastructure to universities, and future directions.

Managing changes in peak demand from building and transportation electrification with energy efficiency

(2024)

The Department of Energy funded Berkeley Lab to provide technical assistance to two municipal utilities on how energy efficiency and demand flexibility can mitigate the peak demand impacts of building and transportation electrification. Berkeley Lab worked with these utilities, Sacramento Municipal Utility District (SMUD) and Fort Collins Utilities, to identify research questions that supported their planning needs. For both utilities, Berkeley Lab developed scenario-based load forecasts that considered baseline and high-efficiency building electrification. For SMUD, the forecast also explored the sensitivity of peak demand to extreme weather (a winter cold snap) at the system level. For Fort Collins Utilities, the forecast addressed the impacts of low, medium, and high levels of building and transportation technology adoption on select distribution feeders. Berkeley Lab is also developing a guidance document for utilities that will draw on lessons learned from the technical assistance and provide a framework for conducting similar analyses.

One Year In: Tracking the Impacts of NEM 3.0 on California’s Residential Solar Market

(2024)

On December 15, 2022, the California Public Utilities Commission passed an overhaul of the net metering program for the state’s investor-owned utilities. The changes replaced the long-standing net energy metering (NEM) tariffs with a net billing tariff (NBT) structure—colloquially known as “NEM 3.0”—which significantly reduces the compensation for behind-the-meter solar photovoltaic (PV) systems. The NEM tariffs remained open for new interconnection applications until April 15, 2023, but after that date, all new interconnection applications were submitted under NBT. Now, one year later, we have an opportunity to evaluate how the California solar market has evolved under this new compensation regime. As a precursor to its annual Tracking the Sun report, Berkeley Lab has released a short technical brief describing key trends in the California residential solar market since the roll-out of the new NBT structure. The purpose of this analysis is to provide empirical insights into how the market has evolved over the past year, confirming some expectations while also revealing several striking surprises.

Commercial, industrial, and institutional discount rate estimation for efficiency standards analysis: Sector-level data 1998–2023

(2024)

Underlying each of the U.S. Department of Energy’s (DOE’s) federal appliance and equipment energy conservation standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). DOE determines economic justification based on whether the benefits exceed the burdens, considering a variety of factors, including the economic impact of the standard on consumers of the product and the savings in lifetime operating cost compared to any increase in price or maintenance expenses (42 U.S.C. 6295(o)(2)(B)). As part of this determination, DOE conducts a life-cycle cost (LCC) analysis, which models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the commercial discount rate value(s) used to calculate the present value of energy cost savings within the LCC model implicitly play a role in estimating the economic impact of potential standard levels. This report provides an in-depth discussion of the commercial discount rate estimation process, which relies on the Capital Asset Pricing Model (CAPM) to estimate a business's cost of equity and adds a risk adjustment factor to the risk-free rate associated with long-term U.S. Treasury bonds to estimate its cost of debt. It is an update to previous reports on estimating commercial discount rates from firm-level and sector-level financial data (e.g., Fujita, 2021, 2016).
Major topics covered in this report include the following:
• Discount rate estimation methods and rationale
• Data sources used and data limitations
• Discount rate distributions for use in standards analysis
• Discount rate estimation methods and distributions specific to the small business subgroup analysis
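A minimal sketch of the estimation logic described above, with illustrative inputs that are not taken from the report: CAPM for the cost of equity, a risk adjustment over the long-term Treasury rate for the cost of debt, and a simple weighted combination used to discount a stream of energy cost savings. The function names, the weighted-average form, and all numbers are hypothetical.

```python
def capm_cost_of_equity(risk_free, beta, market_return):
    """CAPM: r_e = r_f + beta * (r_m - r_f)."""
    return risk_free + beta * (market_return - risk_free)

def cost_of_debt(risk_free, risk_adjustment):
    """Cost of debt as the long-term Treasury rate plus a risk adjustment."""
    return risk_free + risk_adjustment

def weighted_discount_rate(r_equity, r_debt, equity_share, tax_rate=0.0):
    """Weighted average of the two financing costs, with after-tax debt."""
    return equity_share * r_equity + (1.0 - equity_share) * r_debt * (1.0 - tax_rate)

# Illustrative sector inputs (not from the report):
r_e = capm_cost_of_equity(risk_free=0.02, beta=1.1, market_return=0.08)
r_d = cost_of_debt(risk_free=0.02, risk_adjustment=0.015)
rate = weighted_discount_rate(r_e, r_d, equity_share=0.6, tax_rate=0.21)

# Present value of $100/year energy cost savings over a 15-year lifetime,
# the quantity the discount rate feeds into within an LCC analysis:
pv_savings = sum(100.0 / (1.0 + rate) ** t for t in range(1, 16))
```

A higher discount rate shrinks the present value of future energy savings, which is why the choice of rate matters when weighing first-cost increases against lifetime operating-cost reductions.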

Exploring Wholesale Energy Price Trends: The Renewables and Wholesale Electricity Prices (ReWEP) tool, Version 2024.1

(2024)

The Renewables and Wholesale Electricity Prices (ReWEP) visualization tool from Berkeley Lab has been updated with nodal electricity pricing and wind and solar generation data through the end of 2023. ReWEP users can explore trends in wholesale electricity prices and their relationship to wind and solar generation. ReWEP includes nodal pricing trends across locations, regions, and different timeframes. The tool consists of maps, time series, and other interactive figures that provide: (1) a general overview of how average pricing, negative price frequency, and extreme high prices vary over time, and (2) a summary of how pricing patterns are related to wind and solar generation. Interactive functionality allows investigation by year, season, time of day, and region, where regions are defined by Independent System Operator (ISO) or Regional Transmission Organization (RTO) boundaries. ReWEP also contains prices throughout much of the western United States from the Western Energy Imbalance Market and the Western Energy Imbalance Service Market.

A call to action for building energy system modelling in the age of decarbonization

(2024)

As urban energy systems become decarbonized and digitalized, buildings are increasingly interconnected with one another and with the industrial and transportation sectors. Transformation strategies to cost-effectively integrate distributed energy sources, and to increase load flexibility and efficiency, generally increase complexity. This complexity causes challenges that the industry is unprepared to deal with. Today's simulation programs, and the processes in which they are used, have not been developed to meet the challenges of decarbonization. Nor have they been designed for, or do they keep pace with, the digitalization of the energy system. Modeling, simulation, and optimization tools, and the processes in which they are used, need to undergo an innovation jump. We show a path to more holistic tools and workflows that address the new requirements brought forward by the increased complexity. Without concerted actions, the building simulation community will fall short of supporting the 2050 decarbonization targets declared by many governments.