By clipping, I mean thresholding the output, e.g., via something like
min/max(some_constant, actual_output)
or, as in a leaky ReLU:
min/max(some_constant * 0.001, actual_output)
Alternatively, you could use a sigmoidal function (something like tanh but
with a larger co-domain) as the output unit.
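Roughly, in NumPy (y_pred, lower, and upper below are placeholder names I
made up for illustration, not anything from scikit-learn):

    import numpy as np

    y_pred = np.array([-820.0, -7.5, -803.2])   # made-up predictions
    lower, upper = -9.0, -5.0                   # made-up clipping bounds

    # hard clipping: threshold the outputs at the bounds
    y_clipped = np.clip(y_pred, lower, upper)

    # "leaky" variant: values beyond a bound pass through with a small
    # slope, analogous to a leaky ReLU
    slope = 0.001
    y_leaky = np.where(y_pred < lower,
                       lower + slope * (y_pred - lower), y_pred)
    y_leaky = np.where(y_leaky > upper,
                       upper + slope * (y_leaky - upper), y_leaky)

    # sigmoidal alternative: squash into (lower, upper) with a scaled tanh
    center, halfwidth = (upper + lower) / 2.0, (upper - lower) / 2.0
    y_tanh = center + halfwidth * np.tanh((y_pred - center) / halfwidth)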
On 10 September 2017 at 22:03, Sebastian Raschka wrote:
> You could normalize the outputs (e.g., via min-max scaling). However, I
> think the more intuitive way would be to clip the predictions. E.g., if
> you are predicting house prices, it probably makes no sense to have a
> negative prediction, so you would clip the output at some value > 0.
You could normalize the outputs (e.g., via min-max scaling). However, I think
the more intuitive way would be to clip the predictions. E.g., if you are
predicting house prices, it probably makes no sense to have a negative
prediction, so you would clip the output at some value > 0.
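As a minimal sketch of what I mean (the data and model settings here are
made up; the clipping happens outside the estimator, since MLPRegressor has
no built-in option for it):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.RandomState(0)
    X = rng.rand(100, 3)
    y = -9.0 + 4.0 * X[:, 0]          # targets lie in [-9, -5]

    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000,
                         random_state=0)
    model.fit(X, y)

    # clip predictions to the target range seen during training
    y_pred = np.clip(model.predict(X), y.min(), y.max())

    # for the house-price example, you would clip at zero instead:
    # y_pred = np.clip(model.predict(X), 0, None)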
PS: -820 an
Greetings,
Is there any way to force the MLPRegressor to make predictions in the same
value range as the training data? For example, if the training data range
between -5 and -9, I don't want the predictions to range between -820 and
-800. In fact, sometimes I get anti-correlated predictions, for