AI Endorsed by Expert Meteorologists: DeepMind's Weather Forecast Model


DeepMind released a generative model able to outperform widely used nowcasting methods in 89% of situations, as judged for accuracy and usefulness by more than 50 expert meteorologists. The model focuses on predicting precipitation over the next 2 hours and does so surprisingly well.
Louis Bouchard

I explain Artificial Intelligence terms and news to non-experts.

DeepMind just released a generative model able to outperform widely used nowcasting methods in 89% of situations, as judged for accuracy and usefulness by more than 50 expert meteorologists! Their model focuses on predicting precipitation in the next 2 hours and achieves that surprisingly well. It is a generative model, which means that it will generate the forecasts instead of simply predicting them. It basically takes radar data from the past to create future radar data. So, using both temporal and spatial components from the past, they can generate what the radar will look like in the near future.

You can think of this like a Snapchat filter: it takes your face and generates a new face with modifications on it. To train such a generative model, you need plenty of data from both domains: real human faces and the kind of face you want to generate. Then, after training a very similar model for many hours, you will have a powerful generative model.

This kind of model often uses a GAN architecture for training purposes and then uses the generator model independently. If you are not familiar with generative models or GANs, I invite you to watch one of the many videos I made covering them, like this one about Toonify. If this sounds either interesting or too good to be true, watch the video!
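The train-then-deploy pattern described above can be sketched in a few lines. This is a hedged toy, not DeepMind's model or a real GAN: the "generator" here is a simple affine map whose two parameters are fitted by moment matching rather than adversarial training, purely to show that once training is done, the generator alone turns random noise into new samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data the generator should imitate: samples from N(3, 0.5).
real_data = rng.normal(loc=3.0, scale=0.5, size=10_000)

# Toy generator: an affine map from noise z ~ N(0, 1) to a sample.
# In a real GAN these would be neural-network weights trained
# adversarially; here we fit them by simple moment matching instead.
scale = real_data.std()   # stands in for learned weights
shift = real_data.mean()  # stands in for learned bias

def generator(z):
    """Map noise to a synthetic sample (the part kept after training)."""
    return scale * z + shift

# Deployment: the generator is used on its own, no real data needed.
fake = generator(rng.normal(size=10_000))
print(round(fake.mean(), 1), round(fake.std(), 1))  # close to 3.0 and 0.5
```

The point of the sketch is the last two lines: after training, generating new data requires only noise and the generator, which is exactly how the nowcasting model produces new radar sequences.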

References

►Read the full article: https://www.louisbouchard.ai/deepmind-rain-nowcasting/
►Reference: Ravuri, S., Lenc, K., Willson, M., Kangin, D., Lam, R., Mirowski, P., Fitzsimons, M., Athanassiadou, M., Kashem, S., Madge, S. and Prudden, R., 2021. Skilful Precipitation Nowcasting using Deep Generative Models of Radar. Nature. https://www.nature.com/articles/s41586-021-03854-z


►Colab Notebook: https://github.com/deepmind/deepmind-research/tree/master/nowcasting
►My Newsletter (A new AI application explained weekly to your emails!): https://www.louisbouchard.ai/newsletter/

Video Transcript

00:00

You've most certainly planned a trip to the beach for the day, checked the weather before

00:04

going, which said it was sunny, and just when you arrived, it started raining.

00:08

This or a similar situation happened to all of us.

00:11

We always talk about the weather.

00:13

There are two big reasons for that: it has a big impact on our lives and activities,

00:18

and we sometimes have nothing better to talk about.

00:20

A common agreement is that it seems like weather forecasts in the next few hours are completely

00:26

random.

00:27

Especially when it comes to rain prediction.

00:29

Well, there's a reason for that.

00:31

It's actually highly complex.

00:33

These short-term weather predictions are called Precipitation Nowcasting and are made using

00:38

various methods to predict what will happen in the next two hours.

00:42

These methods are driven by powerful numerical weather prediction systems predicting the

00:46

weather by solving physical equations.

00:49

They are quite powerful for long-term predictions but struggle to produce fine-grained forecasts

00:55

above your head at a specific time of the day.

00:57

It's just like statistics.

00:58

It is easy to predict what an average human will do in a situation but quite impossible

01:03

to predict what a particular individual will do.

01:06

If you'd like to get deeper into these mathematical models, I already explained how they work

01:10

in more detail in my other video about global weather prediction using deep learning.

01:14

Even though we have a lot of radar data to predict what will happen, the mathematical

01:19

and probabilistic-based methods fail to be precise.

01:22

You can see where this is going.

01:24

When there's data, there's AI.

01:25

Indeed, this lack of precision may change in the future, and in part because of DeepMind.

01:29

DeepMind just released a Generative model able to outperform widely-used nowcasting

01:35

methods in 89% of situations, as judged for accuracy and usefulness by more than 50 expert

01:42

meteorologists!

01:43

Their model focuses on predicting precipitation in the next 2 hours and achieves that surprisingly

01:49

well.

01:50

As I just said, it is a generative model, which means that it will generate the forecasts

01:54

instead of simply predicting them.

01:56

It basically takes radar data from the past to create future radar data.

02:00

So using both time and spatial components from the past, they can generate what it will

02:05

look like in the near future.

02:07

You can see this as the same as Snapchat filters, taking your face and generating a new face

02:12

with modifications on it.

02:14

To train such a generative model, you need a bunch of data from both the human faces

02:19

and the kind of face you want to generate.

02:21

Then, using a very similar model trained for many hours, you will have a powerful generative

02:27

model.

02:28

This kind of model often uses GAN architectures for training purposes and then uses the generator

02:34

model independently.

02:35

If you are not familiar with generative models or GANs, I invite you to watch one of the

02:39

many videos I made covering them, like the one appearing on the top right corner right

02:44

now.

02:45

One of the most basic architectures to achieve image generation is called a UNet.

02:49

It basically takes an image, or past radar data, in this case, encodes it using trained

02:54

parameters, and takes this encoded information to generate a new version of the same image,

03:00

which in our case would be the same radar data of the next few minutes.
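The encode-then-decode idea of a UNet can be illustrated with a minimal sketch. This is a hedged toy, not DeepMind's network or a real UNet with learned convolutions: it only shows the structural pattern of downsampling an input image to a coarse representation, upsampling it back, and merging it with a skip connection so fine detail survives.

```python
import numpy as np

def avg_pool2(x):
    """Encode: halve resolution with 2x2 average pooling."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample2(x):
    """Decode: double resolution by nearest-neighbour repetition."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def tiny_unet(image):
    encoded = avg_pool2(image)      # coarse, compressed representation
    decoded = upsample2(encoded)    # back to the input resolution
    return 0.5 * (decoded + image)  # skip connection keeps fine detail

radar = np.arange(16.0).reshape(4, 4)  # stand-in for one radar frame
out = tiny_unet(radar)
assert out.shape == radar.shape        # same spatial size in and out
```

In the real model, the encoder and decoder are stacks of learned convolutions, and the output is a future radar frame rather than a filtered copy of the input; the averaging in this toy also hints at why plain UNets tend to produce the blurry predictions mentioned next.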

03:05

Here's what it looks like when feeding a typical UNet with forecast data compared to what it

03:09

should look like, the target.

03:12

You can see that it is relatively good, but not really precise and surely not enough to

03:16

be used in our daily lives.

03:18

Here's a comparison with a widely used probabilistic nowcasting approach called

03:22

PySTEPS.

03:23

It's a bit better, but you can see that it is not perfect either.

03:27

We cannot really further improve the probabilistic methods using math equations, so trying different

03:32

approaches becomes interesting.

03:34

Also, the fact that we have a lot of radar data to train our models is quite encouraging

03:38

for the deep learning approaches.

03:40

This is why DeepMind successfully created a GAN-like architecture made explicitly for

03:45

this task.

03:46

And here are the results.

03:47

You can see how much closer it is to reality with more fine-grain details.

03:52

Really impressive!

03:53

They achieved that by using both time and spatial components from the past radar data

03:58

to generate what the radar data could look like in the near future.

04:02

By the way, if you find this interesting, I invite you to subscribe and like the video

04:06

and share the knowledge by sending this video to a friend.

04:09

I'm sure they will love it, and they will be grateful to learn something new because

04:13

of you!

04:14

And if you don't, no worries, thank you for watching!

04:16

So more precisely, the previous 20 minutes of radar observations are sent to the model to

04:21

generate 90 minutes of possible future predictions.
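The input/output shapes implied here are easy to pin down. A hedged sketch, assuming one radar frame every 5 minutes (a common cadence for these radar composites; the grid size below is a made-up toy, not the real one), with a trivial persistence baseline standing in for the generator:

```python
import numpy as np

FRAME_MINUTES = 5                     # assumed radar cadence
context = 20 // FRAME_MINUTES         # 4 past frames fed to the model
horizon = 90 // FRAME_MINUTES         # 18 future frames to generate

H, W = 64, 64                         # toy grid size, not the real one
past = np.random.rand(context, H, W)  # stand-in radar observations

def persistence_nowcast(frames, steps):
    """Trivial baseline, not DeepMind's model: repeat the last frame."""
    return np.repeat(frames[-1:], steps, axis=0)

future = persistence_nowcast(past, horizon)
assert past.shape == (4, H, W) and future.shape == (18, H, W)
```

So the model's job is to map a short stack of observed frames to a much longer stack of plausible future frames, which is why temporal consistency between generated frames matters so much.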

04:25

It is trained just like any GAN architecture guiding the learning process by penalizing

04:29

the difference between the generated radar predictions and the real radar data, which

04:34

we have in our dataset for training.

04:37

As you can see here, there are two losses and a regularization term, which are the penalties

04:41

that will lead our model during training.

04:44

The first one is a temporal loss.

04:46

This temporal loss will force the model to be consistent in its generation over multiple

04:50

frames by comparing them with the real data over a specific amount of time (or frames)

04:56

to be temporally realistic.

04:58

This will remove weird jumps or inconsistencies over time that couldn't happen in the real

05:04

world.
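One simple way to penalize such jumps can be sketched as follows. This is a hedged stand-in, not DeepMind's actual temporal loss (theirs uses a learned temporal discriminator): here we just compare the frame-to-frame *changes* of the prediction against those of the real sequence, so sudden jumps that the real data never shows get penalized.

```python
import numpy as np

def temporal_penalty(pred, real):
    """Mean absolute gap between consecutive-frame changes."""
    pred_motion = np.diff(pred, axis=0)  # change between frames
    real_motion = np.diff(real, axis=0)
    return np.abs(pred_motion - real_motion).mean()

real = np.zeros((6, 8, 8))    # calm "real" radar sequence
smooth = np.zeros((6, 8, 8))  # temporally consistent prediction
jumpy = np.zeros((6, 8, 8))
jumpy[3] = 1.0                # one impossible sudden jump mid-sequence

# The jumpy sequence is penalized more than the smooth one.
assert temporal_penalty(smooth, real) < temporal_penalty(jumpy, real)
```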

05:05

The second loss is the same thing but for spatial purposes.

05:08

It ensures spatial consistency by comparing the actual radar data versus our generated

05:13

prediction at a specific frame.

05:15

In short, it will force the model to be "spatially intelligent" and produce confident predictions

05:20

instead of large blurry predictions like we saw for UNet.
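The per-frame spatial comparison can be sketched the same way. Again a hedged stand-in, not DeepMind's actual spatial loss (theirs uses a learned spatial discriminator on individual frames): scoring each predicted frame against the real frame at the same time step rewards sharp, well-placed rain cells over a blurred average.

```python
import numpy as np

def spatial_penalty(pred_frame, real_frame):
    """Mean absolute error between one predicted and one real frame."""
    return np.abs(pred_frame - real_frame).mean()

real = np.zeros((16, 16))
real[4:8, 4:8] = 1.0                    # a sharp, localized rain cell

sharp = real.copy()                     # confident, well-placed guess
blurry = np.full((16, 16), real.mean()) # UNet-style blurred average

# The sharp, well-placed prediction scores better than the blur.
assert spatial_penalty(sharp, real) < spatial_penalty(blurry, real)
```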

05:24

Last but not least is the regularization term.

05:27

It will penalize differences in the grid cell resolution between the real radar sequences

05:33

and our predictions using many examples at a time instead of comparing the predictions

05:38

one by one like the two losses.

05:40

This will improve performance and produce more location-accurate predictions.
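A regularizer in this spirit can be sketched as follows. This is a hedged simplification, not DeepMind's exact formula: several generated samples for the same input are averaged per grid cell, and the cell-wise gap between that average and the real radar field is penalized, so errors shared by all samples (like a misplaced rain cell) cost more than errors that cancel out across samples.

```python
import numpy as np

def grid_cell_regularizer(samples, real):
    """Penalize the per-cell gap between the sample mean and reality.

    samples: (num_samples, H, W) predictions for the same input.
    real:    (H, W) observed radar field.
    """
    mean_pred = samples.mean(axis=0)        # average over many samples
    return np.abs(mean_pred - real).mean()  # cell-by-cell comparison

real = np.ones((8, 8))
unbiased = np.stack([real + 0.1, real - 0.1])  # errors cancel on average
biased = np.stack([real + 0.5, real + 0.5])    # shared location error

# A shared, systematic error is penalized; scattered errors cancel.
assert grid_cell_regularizer(unbiased, real) < grid_cell_regularizer(biased, real)
```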

05:45

So you will send some observations, get their predictions, compare them with the real radar

05:50

data you have with these three measurements we just covered, and update your model based

05:55

on the differences.

05:56

Then, you repeat this process numerous times with all your training data to end up with

06:01

a powerful model that learns how the weather is changing and can ideally accurately generalize

06:06

this behavior to most new data it will receive afterward.
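The shape of that loop, with all three penalties combined into one training signal, can be sketched like this. Everything here is a hedged stub, not DeepMind's implementation: the losses are the simple stand-ins from the descriptions above, the "regularizer" averages over frames for brevity, and the "update step" is faked by shrinking the prediction toward the target.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_penalty(pred, real, reg_weight=1.0):
    """Combine temporal, spatial, and regularization stand-ins."""
    temporal = np.abs(np.diff(pred, axis=0) - np.diff(real, axis=0)).mean()
    spatial = np.abs(pred - real).mean()
    regularizer = np.abs(pred.mean(axis=0) - real.mean(axis=0)).mean()
    return temporal + spatial + reg_weight * regularizer

real = np.zeros((18, 16, 16))    # observed future radar frames (toy)
pred = rng.random((18, 16, 16))  # an untrained model's guess

before = total_penalty(pred, real)
pred = pred * 0.5                # stand-in for one model update step
after = total_penalty(pred, real)
assert after < before            # the combined penalty should shrink
```

In the real training loop, the update step is of course gradient descent on the model's parameters, repeated over the whole radar dataset; the point here is only that a single scalar built from the three terms is what guides each update.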

06:10

You can see how accurate the results are compared to UNet, which is accurate but blurry, and the numerical

06:15

method overestimates the rainfall intensity over time.

06:19

Of course, as they say, no method is without limitations, and theirs struggles with long-term

06:24

predictions, and just like most deep learning applications, they also struggle with rare

06:29

events that do not frequently appear in the training datasets, which they will work on

06:33

improving.

06:34

Of course, this was just an overview of this new paper attacking the super interesting

06:38

and useful task of precipitation nowcasting.

06:42

I invite you to read their excellent paper for more technical detail about their implementation,

06:46

training data, evaluation metrics, and the expert meteorologist study.

06:50

They also made a Colab Notebook you can play with to generate predictions.

06:54

Both are linked in the description below.

06:57

Thank you very much for watching for those of you who are still here, and I will see

07:00

you next week!
