Results for Droughts

Recap

Recall that we are investigating how a newer variable, Vegetation Optical Depth (VOD), compares to previously used variables: Land Surface Temperature (LST), Soil Moisture (SM), and the Normalized Difference Vegetation Index (NDVI). These variables have often been used to detect droughts, but studies have shown that VOD better characterizes drought conditions. So the objective is to see how VOD compares to the other variables via information theory and other similarity measures. In addition, the data representation could make a difference in how similar VOD is to the other variables, for example using more spatial dimensions or more temporal dimensions. So we are going to look at different representations and different combinations of VOD, LST, SM, and NDVI and do comparisons.


Data

For this first set of results, we are looking at the CONUS dataset, restricted to a region over California. We have a 6-year period that contains both drought years and non-drought years:

  • Droughts - 2012, 2014, 2015
  • No Drought - 2010, 2011, 2013

We have a spatial resolution of ... and a temporal resolution of 14 days.


Methods

Entropy

For this first experiment, we only want to quantify the expected uncertainty that each variable has under different temporal representations. For example, we can do self-comparisons: what is the expected uncertainty of VOD with 1 temporal dimension versus 3 temporal dimensions? We will do this for all 4 variables individually, using RBIG to measure the expected uncertainty (a.k.a. entropy):

H(\mathbf{X}) = \mathbb{E}_\mathbf{x} \left[ -\log p(\mathbf{x}) \right]
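As a rough, self-contained illustration of this step, here is a minimal Python sketch. The study itself uses RBIG for the entropy estimate; below, a Kozachenko-Leonenko k-NN estimator stands in for it, and the `stack_lags` helper, the `knn_entropy` function, and the synthetic `vod` series are illustrative assumptions rather than the actual pipeline.

```python
import numpy as np
from scipy.special import digamma, gammaln
from sklearn.neighbors import NearestNeighbors


def stack_lags(x, n_lags):
    """Stack n_lags consecutive time steps of a 1D series into feature vectors."""
    return np.stack([x[i:len(x) - n_lags + 1 + i] for i in range(n_lags)], axis=1)


def knn_entropy(X, k=5):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) for samples X of shape (n, d)."""
    n, d = X.shape
    dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    eps = 2.0 * dist[:, k]                                   # twice the distance to the k-th neighbour
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)   # log volume of the unit d-ball
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps + 1e-12))


# Self-comparison: entropy of a synthetic "VOD" anomaly series with 1 vs 3 temporal dims
rng = np.random.default_rng(0)
vod = np.convolve(rng.standard_normal(1000), np.ones(5) / 5, mode="valid")
for t_dims in (1, 3):
    print(t_dims, knn_entropy(stack_lags(vod, t_dims)))
```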

Similarity

For this second experiment, we want to see how similar VOD is to the other variables under different spatial-temporal representations, e.g. VOD vs. SM, VOD vs. NDVI, etc. There are a few ways to do this. We want to show that we can do it with Mutual Information (MI), because it is built on probability distributions:

I(\mathbf{X,Y}) = H(\mathbf{X}) + H(\mathbf{Y}) - H(\mathbf{X,Y})
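A minimal sketch of computing MI from these entropy terms, reusing the illustrative `stack_lags`, `knn_entropy`, and synthetic `vod` from the entropy sketch above; `sm` here is another synthetic placeholder, not the real soil-moisture data.

```python
import numpy as np


def mutual_information(X, Y, k=5):
    """I(X, Y) = H(X) + H(Y) - H(X, Y), with each term estimated from samples."""
    return knn_entropy(X, k) + knn_entropy(Y, k) - knn_entropy(np.hstack([X, Y]), k)


# Example: MI between synthetic VOD and SM anomalies with 3 temporal dimensions each
rng = np.random.default_rng(1)
sm = 0.6 * vod + 0.8 * rng.standard_normal(len(vod))
print(mutual_information(stack_lags(vod, 3), stack_lags(sm, 3)))
```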

We also include some of the standard similarity measures (a code sketch of these follows the list):

  • Pearson correlation coefficient
  • Spearman correlation coefficient
  • RV coefficient (a multivariate extension of Pearson correlation)
  • Centered Kernel Alignment (CKA)
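Here is a small sketch of these measures, again applied to the synthetic `vod`/`sm` series and the `stack_lags` helper from above. The RV and linear-CKA implementations follow their standard definitions and are not the project's actual code.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr


def rv_coefficient(X, Y):
    """RV coefficient: a multivariate extension of the (squared) Pearson correlation."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxy, Sxx, Syy = Xc.T @ Yc, Xc.T @ Xc, Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))


def linear_cka(X, Y):
    """Linear centered kernel alignment between feature matrices X and Y."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    return (np.linalg.norm(Yc.T @ Xc, "fro") ** 2
            / (np.linalg.norm(Xc.T @ Xc, "fro") * np.linalg.norm(Yc.T @ Yc, "fro")))


# Univariate measures on the raw series, multivariate ones on lag-stacked features
print(pearsonr(vod, sm)[0], spearmanr(vod, sm)[0])
Xv, Xs = stack_lags(vod, 3), stack_lags(sm, 3)
print(rv_coefficient(Xv, Xs), linear_cka(Xv, Xs))
```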

Hypothesis

Adding temporal features should increase the entropy of a variable. We also expect to see some differences in the entropy values obtained for the different variables. However, there should be a point where adding more temporal dimensions does not add much more information.

We expect to see a trend that hints at a 'sweet spot' for the number of temporal dimensions to use.
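As a purely synthetic toy check of this intuition (not the CONUS data): for a smooth, temporally correlated series, a Gaussian-approximation entropy of lag-stacked features keeps growing as temporal dimensions are added, but the gain per added dimension shrinks.

```python
import numpy as np

# Smooth synthetic series (moving average of white noise) as a stand-in for an anomaly series
rng = np.random.default_rng(2)
x = np.convolve(rng.standard_normal(5000), np.ones(8) / 8, mode="valid")


def gaussian_entropy(X):
    """Entropy (nats) of a multivariate Gaussian fitted to X: 0.5 * log((2*pi*e)^d * det(Sigma))."""
    d = X.shape[1]
    _, logdet = np.linalg.slogdet(np.cov(X, rowvar=False).reshape(d, d))
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)


H = [gaussian_entropy(np.stack([x[i:len(x) - t + 1 + i] for i in range(t)], axis=1))
     for t in range(1, 9)]
print(np.round(H, 2))
print(np.round(np.diff(H), 2))   # the entropy gain per added temporal dimension shrinks
```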


Preprocessing

  1. Climatology

I remove the climatology because we want to characterize the anomalies outside of the climate patterns.

  2. Similar Data

I ensure that the lat-lon-time locations are consistent across variables.

Note: We will have fewer samples as we increase the number of features because of the boundaries.
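A minimal xarray sketch of these two steps, assuming the gridded variables sit in a single dataset with the hypothetical variable names VOD, LST, SM, NDVI and a hypothetical file name; the real preprocessing may differ in its details.

```python
import xarray as xr

# Hypothetical dataset holding all variables on a shared (time, lat, lon) grid
ds = xr.open_dataset("conus_california.nc")   # file name is an assumption

# 1. Remove the climatology: subtract the mean seasonal cycle so only anomalies remain
clim = ds.groupby("time.dayofyear").mean("time")
anom = ds.groupby("time.dayofyear") - clim

# 2. Keep only lat-lon-time locations that are valid for every variable,
#    so all variables share exactly the same samples
df = anom[["VOD", "LST", "SM", "NDVI"]].to_dataframe().dropna()
print(len(df), "co-located samples")
```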


Entropy

[figures]

Mutual Information

VOD

[figures]

NDVI

[figures]

SM

[figures]

LST

[figures]

Taylor Diagram

Temporal Dims: 1

[figures]

Temporal Dims: 2

[figures]

Temporal Dims: 3

[figures]

Temporal Dims: 4

[figures]

Temporal Dims: 5

[figures]

Temporal Dims: 6

[figures]

Temporal Dims: 7

[figures]

Temporal Dims: 8

[figures]

Temporal Dims: 9

[figures]

Temporal Dims: 10

[figures]

Temporal Dims: 11

[figures]

Temporal Dims: 12

[figures]