Tag: predict

How to increase the number of decimals when predicting an attribute using LSTM in Python?

I have an LSTM model in Python (Keras) which predicts floats with long decimals. However, the predicted value has fewer decimals than expected. For instance:

Input value: 41.39011366661221
Predicted value: 41.396626

Should I use something on my model? I've also tried normalizing the input. Unfortunately, I get the same number of output decimals. Any clue?

Answer

Your input
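A minimal sketch of the likely cause, assuming the model runs in Keras's default float32 dtype (this sketch is not part of the original answer): a float32 value only carries about 7 significant decimal digits, so a prediction such as 41.396626 is already at the precision limit regardless of how it is printed.

```python
import numpy as np

# Keras computes in float32 by default, which keeps roughly 7 significant
# decimal digits. Storing the long input value in float32 already drops the
# trailing digits before the model ever sees them.
x = np.float32(41.39011366661221)
print(repr(x))             # ~41.390114 -- extra digits are lost in float32
print(f"{float(x):.15f}")  # printing more decimals does not restore them

# Casting the prediction to float64 afterwards only changes the display, not
# the accuracy. To genuinely carry more digits, the whole model would have to
# run in float64, e.g. (hypothetically) via:
# tf.keras.backend.set_floatx("float64")
```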