Masking layer is not working with MLPs: how to add a custom layer with masking?

I’m using an MLP to forecast a time series. I implemented code that contains a masking layer to let the model skip the masked values.

For instance, the time series in my data has a lot of NaN values, which I fill with the sentinel value -999. I don’t want to remove these entries, but I want Keras masking to skip them in a gentle way.
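
I fill the NaNs with something like this (a rough sketch; raw_series is just a placeholder name for my array):

import numpy as np

series = np.asarray(raw_series, dtype='float32')
series = np.where(np.isnan(series), -999.0, series)  # NaN -> sentinel -999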

My code is as follows:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, Dense

model = Sequential()
model.add(Masking(mask_value=-999, input_shape=(n_steps_in,)))
model.add(Dense(1024, activation='relu'))
model.add(Dense(n_steps_out))

I read an answer that said it is impossible to make masking work with MLPs.

How can I add a masking layer for MLPs, or a custom masking layer, to solve this problem?


Answer

What you’re doing has a fundamental flaw.

The most important rule of masking is that:

the dimension(s) you apply the mask to must stay unchanged until the model’s final prediction

If the dimension(s) the mask is applied to change, there is no way to propagate the mask forward.
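
To see this concretely, here is a small tf.keras sketch (the shapes are made up for illustration) showing where the mask lives after a Masking layer:

import numpy as np
import tensorflow as tf

# Toy batch: 2 samples, 4 time steps, 3 features (illustrative sizes).
x = np.ones((2, 4, 3), dtype='float32')
x[0, 1, :] = -999.0  # mark one whole time step of sample 0 as missing

masked = tf.keras.layers.Masking(mask_value=-999.0)(x)
print(masked._keras_mask)
# tf.Tensor(
# [[ True False  True  True]
#  [ True  True  True  True]], shape=(2, 4), dtype=bool)

The mask is a [batch size, time steps] tensor living on the time step dimension. Any downstream layer that keeps that dimension can honour it; once a layer reshapes or collapses it, the mask cannot follow.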

Some scenarios where masking will work:

  • If you want to mask specific batch items in an MLP. For example, you can mask a whole feature vector, but not a single feature inside that vector. Here the mask would be a [batch size] tensor, and the mask_value would cover the entire feature vector (see the sample-weight sketch further below).

  • If you want to mask a specific (batch item, time step) combination in an LSTM model. Here the mask would be a [batch size, time steps] tensor, where the mask_value covers the whole feature vector of that time step (a working sketch follows this list).
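
For the sequence case, a minimal working sketch (layer sizes and shapes are illustrative, not from the question):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM, Dense

n_steps_in, n_features, n_steps_out = 4, 1, 2  # illustrative sizes

model = Sequential()
# A time step is masked when its whole feature vector equals -999.
model.add(Masking(mask_value=-999.0, input_shape=(n_steps_in, n_features)))
# The LSTM keeps the time step dimension while reading the sequence,
# so it can skip the masked (batch item, time step) combinations.
model.add(LSTM(64))
model.add(Dense(n_steps_out))
model.compile(optimizer='adam', loss='mse')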

So in summary, you can only mask whole items, not specific features.
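
And for the MLP case, if what you really need is the first scenario (skipping whole samples), one workaround is to give fully-missing samples zero weight in the loss instead of using a Masking layer. A rough sketch with made-up data and shapes:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

n_features, n_outputs = 8, 2  # illustrative sizes

x = np.random.rand(100, n_features).astype('float32')
y = np.random.rand(100, n_outputs).astype('float32')
x[::10] = -999.0  # every 10th sample is entirely the sentinel value

# Per-sample weights: 0.0 for fully-missing samples, 1.0 otherwise.
sample_weight = np.where(np.all(x == -999.0, axis=-1), 0.0, 1.0)

model = Sequential()
model.add(Dense(1024, activation='relu', input_shape=(n_features,)))
model.add(Dense(n_outputs))
model.compile(optimizer='adam', loss='mse')

# Zero-weighted samples contribute nothing to the loss, emulating a
# [batch size] mask over whole feature vectors.
model.fit(x, y, sample_weight=sample_weight, epochs=2, verbose=0)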
