The U.S. Department of Energy’s (DOE) Argonne National Laboratory has received nearly $3 million in funding for two interdisciplinary projects that will further develop artificial intelligence (AI) and machine learning technology.
The two grants were presented by the DOE’s Office of Advanced Scientific Computing Research (ASCR). They will support Argonne scientists and their collaborators in applying AI and machine learning to develop approaches that can handle enormous data sets or deliver better outcomes where only minimal data exists.
One project is an alliance with partners from the DOE’s Los Alamos National Laboratory, Johns Hopkins University, and the Illinois Institute of Technology in Chicago. For this project, Argonne scientists will formulate techniques and methods for working with huge dynamical systems.
By integrating mathematics and scientific principles, they will construct robust and accurate surrogate models. These types of models can greatly reduce the time and cost of running complex simulations, such as those used to forecast the climate or weather.
The models we build with this award will allow us to obtain dramatic reductions in time-to-solution and cost. Now we can ramp up the work we have been doing and test it on scientific use-cases right here at Argonne. For example, instead of using a massive machine to simulate the climate, we could run many smaller, cheaper simulations.
– Romit Maulik, Project 1 Lead and Computational Scientist, Argonne National Laboratory
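To give a feel for the surrogate-model idea described above, the sketch below trains a cheap statistical model on a handful of runs of an expensive solver and then queries it at many new inputs at negligible cost. It is a minimal illustration only: the toy simulation function and the choice of a Gaussian-process model are assumptions made for demonstration, not the project's actual methods.

    # Illustrative sketch only: a toy surrogate model, not Argonne's actual approach.
    # "expensive_simulation" and the Gaussian-process choice are assumptions for demonstration.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def expensive_simulation(x):
        # Stand-in for a costly dynamical-system solver (hypothetical).
        return np.sin(3 * x) + 0.5 * x**2

    # Run the expensive model at only a few input points.
    x_train = np.linspace(0, 2, 8).reshape(-1, 1)
    y_train = expensive_simulation(x_train).ravel()

    # Fit a cheap surrogate to those runs.
    surrogate = GaussianProcessRegressor().fit(x_train, y_train)

    # Query the surrogate at many new points almost instantly.
    x_new = np.linspace(0, 2, 200).reshape(-1, 1)
    y_pred, y_std = surrogate.predict(x_new, return_std=True)

In practice, replacing even a fraction of the full simulations with surrogate evaluations of this kind is what makes the time and cost savings possible.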
A second award went to an Argonne mathematician, who will be collaborating with a researcher at the University of Chicago. Working with an interdisciplinary team of researchers from applied mathematics, statistics and computer science, they will use machine learning-accelerated simulations to improve forecasting, data assimilation and prediction of the frequency of extreme events.
Extreme events — such as a cold snap in Texas, a cascading blackout on the East Coast or a heatwave in Portland — can have critical consequences for people, the power grid and infrastructure. But existing modelling technology is not good enough to accurately estimate their frequency.
Because such events are rare, there is little historical data to learn from, and a standard machine learning approach does not work in these scenarios. With this project, researchers are figuring out how to work around this lack of data. Doing so could significantly enhance the ability to estimate the probability of extreme weather events and their related impacts on the power grid.
These two projects are among five the DOE recently awarded for interdisciplinary work using AI to advance the science conducted in the national labs. All five are focused on developing reliable and efficient AI and machine learning methods to address a broad range of science needs.
As reported by OpenGov Asia, experts are now using machine learning to help solve one of humanity’s biggest problems: climate change. With machine learning, researchers can use the abundance of historical climate data and observations to improve predictions of Earth’s future climate. These predictions will have a major role in lessening our climate impact in the years ahead.
Machine learning algorithms use available data sets to develop a model. That model can then make predictions from new data that were not part of the original data set. Regarding climate change, there are two main avenues through which machine learning can help further the understanding of climate: observations and modelling. In recent years, the amount of data available from observations and climate models has grown exponentially, and machine learning offers a practical way to analyse all of it.
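As a rough illustration of that train-then-predict workflow, the snippet below fits a simple model to one portion of a data set and then makes predictions on held-out data it has never seen. The synthetic data and the linear model are assumptions chosen purely for demonstration.

    # Minimal sketch of the train-then-predict workflow; the synthetic data and
    # linear model are assumptions for illustration only.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))  # stand-in for historical observations
    y = X @ np.array([1.5, -2.0, 0.7]) + rng.normal(scale=0.1, size=500)

    # Develop a model from the available data set...
    X_train, X_new, y_train, y_new = train_test_split(X, y, test_size=0.2, random_state=0)
    model = LinearRegression().fit(X_train, y_train)

    # ...then make predictions on data that were not part of the original set.
    predictions = model.predict(X_new)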
As the computational capacity grows — along with the climate data — researchers will be able to engage increasingly sophisticated machine learning algorithms to sift through this information and deliver improved climate models and projections.