2012/9/4 Alexandre Gramfort :
>> You can just subtract the minimum and divide by the max-min to get
>> [0, 1] features and then go from there.
>> We could actually add something to do this.
>
> that would be useful for pipelines so scaling is done on training fold
> and not on full data.
Yes, I wo
> You can just subtract the minimum and divide by the max-min to get
> [0, 1] features and then go from there.
> We could actually add something to do this.
that would be useful for pipelines so scaling is done on training fold
and not on full data.
Alex
-
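A minimal sketch of that idea, assuming a hand-rolled transformer (the name
RangeScaler is purely illustrative, not an existing scikit-learn class, and the
import paths, iris data and SVC are just there to make the example run):
putting the min/max statistics behind fit/transform lets Pipeline re-estimate
them on each training fold during cross-validation, so the scaling never sees
the held-out data.

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score

class RangeScaler(BaseEstimator, TransformerMixin):
    """Learn per-feature min/max on fit, map features to [low, high] on transform."""

    def __init__(self, low=0.0, high=1.0):
        self.low = low
        self.high = high

    def fit(self, X, y=None):
        X = np.asarray(X, dtype=float)
        self.min_ = X.min(axis=0)
        self.range_ = X.max(axis=0) - self.min_
        self.range_[self.range_ == 0.0] = 1.0  # constant features: avoid division by zero
        return self

    def transform(self, X):
        X = np.asarray(X, dtype=float)
        X01 = (X - self.min_) / self.range_          # training-fold min/max -> [0, 1]
        return X01 * (self.high - self.low) + self.low

X, y = load_iris(return_X_y=True)
pipe = Pipeline([("scale", RangeScaler(low=-1.0, high=1.0)),
                 ("svm", SVC())])

# min/max are computed on each training fold only and applied to the held-out fold
print(cross_val_score(pipe, X, y, cv=5))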
Hi Sheila.
I think we don't have a function for this because it seems pretty easy
to do yourself.
That might also be true for "Scaler" and "Normalizer", though.
You can just subtract the minimum and divide by the max-min to get
[0, 1] features and then go from there.
We could actually add something to do this.
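That recipe as a minimal NumPy sketch (the array and variable names are just
illustrative); the last two lines are one way of "going from there", stretching
the [0, 1] result to [-1, +1]:

import numpy as np

X = np.array([[1.0, 20.0],
              [2.0, 35.0],
              [4.0, 50.0]])

# subtract the per-feature minimum and divide by (max - min) -> [0, 1]
X01 = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# "go from there": shift/stretch to any target range, e.g. [-1, +1]
low, high = -1.0, 1.0
X_scaled = X01 * (high - low) + low

(A constant feature would make the denominator zero, so that case needs to be
handled separately.)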
Hello All,
Another short question,
How do I scale data features to a given range?
For example, I want to scale my data features to [-1, +1] or [0, 100].
sklearn.preprocessing doesn't have any direct function for this!
Thanks
--
Sheila