During a snowstorm in Marquette, Michigan, a NASA-developed Precipitation Imaging Package uses a weatherproof, high-speed camera and a bright light to record precipitation falling through the camera's view. The data collected here and at other locations in the United States, Canada and Europe aim to improve weather and climate predictions as precipitation variability and intensity stray from past patterns. Credit: Julia Shates, NASA Jet Propulsion Laboratory

In research that could improve weather forecasting and winter driving safety, a University of Michigan-led study distinguished nine distinct precipitation types—varieties of rain, snow and mixed-phase (e.g., sleet)—using unsupervised machine learning and nearly a decade's worth of high-speed camera data. The research is published in Science Advances.

Rain and snow are equally likely at temperatures just above and below freezing (between -3°C and 5°C), which makes accurate predictions difficult. This study is part of a NASA effort to enable ground-based observations to inform and improve weather models.

"In the short term, better forecasting can help people adjust their daily commute or prepare for big events like floods or an ice storm. On longer time scales, it can help predict how snowpack or runoff timing will change fresh water availability for a region," said Claire Pettersen, an assistant professor of climate and space sciences and engineering at U-M.

The researchers used unsupervised machine learning to home in on the underlying factors that distinguish precipitation types and to sort the data based on them. Their classification system includes two types of rain, two of snow and five unique mixes:

  • Drizzle—light, steady rainfall
  • Heavy rainfall—intense rainfall with numerous small drops
  • Light rain-to-mix transition—light sleet with dense ice pellets
  • Heavy rain-to-mix transition—intense sleet with dense ice pellets
  • Light mixed-phase—a low volume of slushy, partially frozen particles
  • Heavy mixed-phase—a high volume of slushy, partially frozen particles
  • Heavy snow-to-mix transition—large snowflakes and aggregate particles
  • Light snowfall—light, fluffy snowfall
  • Heavy snowfall—an intense, heavy snowstorm

Zeroing in on rain-to-snow phase transitions

Predicting the phase—as in solid or liquid—of precipitation is challenging today for several reasons. Temperatures between -3°C (26.6°F) and 5°C (41°F) can produce snow or rain with equal likelihood. Weather models approximate the complex microphysics happening inside of clouds, which introduces errors. And satellite retrievals, which track weather systems from space, are based on observations of precipitation from past field campaigns that might not reflect current conditions.

To expand and update the data weather models are built upon, NASA developed a specialized camera system called the Precipitation Imaging Package, or PIP for short, and deployed it at seven sites in the United States, Canada and Europe. The instrument, a video disdrometer (a disdrometer measures the speed and size distribution of falling precipitation particles), uses a weatherproof, high-speed camera and a bright light to record precipitation falling through the camera's view. Its measurements of particle size, size distribution, density, concentration and fall speed help delineate snow and rain.
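The fall-speed-versus-size relationship PIP records is one of the clearest separators between rain and snow: raindrops are dense and fall fast, while a fluffy snowflake of the same diameter falls far more slowly. A minimal sketch of that idea—the threshold heuristic, power-law curve and particle values here are illustrative assumptions, not the study's actual retrieval:

```python
# Illustrative terminal-velocity curve for raindrops (m/s as a function
# of diameter in mm); snowflakes of the same size fall much slower.
def rain_speed(diameter_mm):
    return 3.78 * diameter_mm ** 0.67  # Atlas-Ulbrich-style power law

def classify_particle(diameter_mm, fall_speed_ms):
    """Label a particle rain or snow by how close its fall speed is
    to the expected raindrop speed for its size (toy heuristic)."""
    expected = rain_speed(diameter_mm)
    return "rain" if fall_speed_ms > 0.5 * expected else "snow"

# Hypothetical PIP-style observations: (diameter in mm, fall speed in m/s)
particles = [(1.0, 4.1), (2.0, 1.0), (0.5, 2.4), (3.0, 1.2)]
labels = [classify_particle(d, v) for d, v in particles]
print(labels)  # ['rain', 'snow', 'rain', 'snow']
```

Mixed-phase particles are what make the real problem hard: slushy, partially melted flakes fall at speeds between the two regimes, which is where a simple threshold like this breaks down.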

A Precipitation Imaging Package—a weatherproof, high-speed camera—captures falling snowflakes during a heavy storm in Storrs, Connecticut on January 7, 2022. The video provides data on the particle size, size distribution, density, concentration and fall rate, all of which contribute to the long-term data set that was used to classify precipitation into nine distinct categories. Credit: Larry Bliven, NASA Goddard Space Flight Center

In this study, the researchers leveraged PIP data collected continuously over nine years at those seven sites, amounting to 1.5 million minute-scale particle measurements. They complemented it with surface weather station measurements for a complete picture of environmental conditions that included temperature, relative humidity, dewpoint, pressure and wind speed.

Uncovering data set structure with UMAP dimensionality reduction

To make sense of this enormous data set, the research team used dimensionality reduction—a statistical technique that simplifies data with many variables to understand underlying patterns. Of the two models tested, a nonlinear method, which allows indirect or conditional relationships between variables, outperformed a conventional linear method that requires direct relationships.

When compared to independent weather radar data from Marquette, Michigan, the nonlinear method tracked precipitation transitions that matched radar observations while reducing ambiguous cases by 36% compared to the linear method.

"Precipitation processes are very nonlinear. Many things influence precipitation as it falls that affect what we experience at the surface," said Pettersen.

Called UMAP for Uniform Manifold Approximation and Projection, the nonlinear method reduced dimensionality by 75%, identifying three main factors that distinguish precipitation: phase, intensity and particle characteristics. These factors helped identify and define the nine distinct precipitation types. Importantly, UMAP also highlights pathways between types, providing deeper understanding of transitions like heavy rain to sleet.

In an effort to share their results widely, the research team provides an interface to view the data and a lookup table for anyone to try their classification system. Without prior computer programming knowledge, users can feed in climate variables they have readily available and receive a probability distribution over the precipitation classes.

"We're excited to see how other people will use this and hope that it provides some benefit to the modeling community through the interface and the lookup table that we've built out here," said Fraser King, a research fellow of climate and space sciences and engineering at U-M and the lead author of the study.
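A minimal sketch of what such a lookup-table interface might do: bin a readily available weather variable and return a probability distribution over precipitation classes. The bins and probabilities below are invented for illustration and are not the study's published table, which conditions on more variables than temperature alone.

```python
# Hypothetical lookup table keyed by coarse temperature bins; a real table
# would also condition on humidity, dewpoint, pressure and wind speed.
LOOKUP = {
    "below_-3C": {"light snowfall": 0.6, "heavy snowfall": 0.3,
                  "light mixed-phase": 0.1},
    "-3C_to_5C": {"light mixed-phase": 0.4,
                  "light rain-to-mix transition": 0.3, "drizzle": 0.3},
    "above_5C":  {"drizzle": 0.5, "heavy rainfall": 0.5},
}

def temperature_bin(temp_c: float) -> str:
    if temp_c < -3:
        return "below_-3C"
    return "-3C_to_5C" if temp_c <= 5 else "above_5C"

def classify(temp_c: float) -> dict:
    """Return a probability distribution over precipitation classes."""
    return LOOKUP[temperature_bin(temp_c)]

dist = classify(1.0)
print(max(dist, key=dist.get))  # most likely class in this invented table
```

The key design point is that the output is a distribution rather than a single label, which reflects the genuine ambiguity of near-freezing conditions the study set out to quantify.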

The data is publicly available.

More information: Fraser King et al, Decoding global precipitation processes and particle evolution using unsupervised learning, Science Advances (2025).

Journal information: Science Advances