My bad, I looked at your question in the context of your 2nd e-mail in this
thread, where you talked about custom loss functions and SVR.
On Wed, 13 Sep 2017 at 15:20 Thomas Evangelidis wrote:
> I said that I want to make a Support Vector Regressor using the RBF kernel
> to minimize my own loss function. I never mentioned classification or
> hinge loss.
I said that I want to make a Support Vector Regressor using the RBF kernel
to minimize my own loss function. I never mentioned classification or
hinge loss.
On 13 September 2017 at 23:51, federico vaggi wrote:
> You are confusing the kernel with the loss function. SVMs minimize a
> well-defined hinge loss on a space that's implicitly defined by a kernel
> mapping (or in feature space, if you use a linear kernel).
You are confusing the kernel with the loss function. SVMs minimize a
well-defined hinge loss on a space that's implicitly defined by a kernel
mapping (or in feature space, if you use a linear kernel).
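For reference, the textbook soft-margin objective being described here, with
\phi the feature map implicitly defined by the kernel
k(x, x') = \phi(x)^\top \phi(x'):

    \min_{w,b} \; \frac{1}{2}\|w\|^2
        + C \sum_{i=1}^{n} \max\bigl(0,\; 1 - y_i (w^\top \phi(x_i) + b)\bigr)

Choosing the RBF kernel changes \phi (the space the margin lives in), not the
hinge-loss term itself.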
On Wed, 13 Sep 2017 at 14:31 Thomas Evangelidis wrote:
> What about the SVM? I use an SVR at the end to combine multiple
> MLPRegressor predictions using the RBF kernel (a linear kernel is not good
> for this problem).
What about the SVM? I use an SVR at the end to combine multiple
MLPRegressor predictions using the RBF kernel (a linear kernel is not good
for this problem). Can I also implement an SVR with an RBF kernel in
TensorFlow using my own loss function? So far I have only found an example
of an SVC with a linear kernel in ...
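A rough sketch of the idea being asked about (TF2-style API shown, although
this 2017 thread predates it): kernel regression with a precomputed RBF Gram
matrix and an arbitrary differentiable loss, fitted by gradient descent. Note
this is not libsvm's SVR (no epsilon-tube, no constrained dual); the gamma,
learning rate, and ridge penalty are placeholder values.

    import numpy as np
    import tensorflow as tf

    def rbf_kernel(X, Y, gamma=0.1):
        # k(x, x') = exp(-gamma * ||x - x'||^2), computed pairwise.
        d2 = tf.reduce_sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
        return tf.exp(-gamma * d2)

    def custom_loss(y_true, y_pred):
        # Placeholder loss -- swap in your own differentiable function.
        return tf.reduce_mean(tf.abs(y_true - y_pred))

    X = np.random.rand(100, 5).astype("float32")   # toy data
    y = np.random.rand(100).astype("float32")

    K = rbf_kernel(X, X)                 # (100, 100) Gram matrix
    alpha = tf.Variable(tf.zeros(100))   # dual-style coefficients
    b = tf.Variable(0.0)
    opt = tf.keras.optimizers.SGD(learning_rate=0.1)

    for _ in range(500):
        with tf.GradientTape() as tape:
            y_pred = tf.linalg.matvec(K, alpha) + b
            # Custom loss plus a small ridge penalty on the coefficients.
            loss = custom_loss(y, y_pred) + 1e-3 * tf.reduce_sum(alpha ** 2)
        grads = tape.gradient(loss, [alpha, b])
        opt.apply_gradients(zip(grads, [alpha, b]))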
It's pretty easy to implement this by creating your own Pipeline subclass,
isn't it?
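Something like this minimal sketch (the class name and the choice of
kneighbors are illustrative only): push X through every earlier step, then
delegate to the final estimator's kneighbors.

    from sklearn.pipeline import Pipeline

    class KNeighborsPipeline(Pipeline):
        def kneighbors(self, X, **kwargs):
            # Apply all transforms except the final estimator.
            Xt = X
            for name, transform in self.steps[:-1]:
                if transform is not None:
                    Xt = transform.transform(Xt)
            # Delegate to the final estimator's non-standard method.
            return self.steps[-1][1].kneighbors(Xt, **kwargs)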
On 14 Sep 2017 4:55 am, "Gael Varoquaux" wrote:
> On Wed, Sep 13, 2017 at 02:45:41PM -0400, Andreas Mueller wrote:
> > We could add a way to call non-standard methods, but I'm not sure that
> > is the right way to go.
On Wed, Sep 13, 2017 at 02:45:41PM -0400, Andreas Mueller wrote:
> We could add a way to call non-standard methods, but I'm not sure that is the
> right way to go.
> (like pipeline.custom_method(X, method="kneighbors")). But that assumes that
> the method signature is X or (X, y).
> So I'm not sure ...
On 09/13/2017 01:18 PM, Thomas Evangelidis wrote:
Thanks again for the clarifications Sebastian!
Keras has a scikit-learn API with the KerasRegressor, which implements
the scikit-learn MLPRegressor interface:
https://keras.io/scikit-learn-api/
Is it possible to change the loss function in KerasRegressor?
Hi Ryan.
I don't think there's a good solution. Feel free to open an issue in the
issue tracker (I'm not aware of one for this).
You can access the pipeline steps, so you can access the kneighbors
method via the "steps" attribute, but that wouldn't
take any of the previous steps into account ...
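A hedged sketch of the manual workaround implied here (the step names are
made up for the example): transform X through all but the last step yourself,
then call kneighbors on the final estimator.

    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neighbors import NearestNeighbors

    X = np.random.rand(50, 3)
    pipe = Pipeline([("scale", StandardScaler()),
                     ("knn", NearestNeighbors())])
    pipe.fit(X)

    # Push X through every step except the final estimator by hand.
    Xt = X
    for name, step in pipe.steps[:-1]:
        Xt = step.transform(Xt)

    # Now kneighbors sees the transformed data, not the raw X.
    dist, ind = pipe.steps[-1][1].kneighbors(Xt)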
> Is it possible to change the loss function in KerasRegressor? I don't have
> time right now to experiment with hyperparameters of new ANN architectures.
> I am in urgent need to reproduce in Keras the results obtained with
> MLPRegressor and the set of hyperparameters that I have optimized for ...
Thanks again for the clarifications Sebastian!
Keras has a scikit-learn API with the KerasRegressor, which implements the
scikit-learn MLPRegressor interface:
https://keras.io/scikit-learn-api/
Is it possible to change the loss function in KerasRegressor? I don't have
time right now to experiment with hyperparameters of new ANN architectures.
I am in urgent need to reproduce in Keras the results obtained with
MLPRegressor and the set of hyperparameters that I have optimized for ...
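For what it's worth, with the keras.wrappers.scikit_learn API of that era the
loss is simply whatever build_fn passes to model.compile(), so a custom
differentiable loss plugs in directly. A minimal sketch (the layer sizes and
the loss itself are placeholders, not the poster's actual setup):

    from keras.models import Sequential
    from keras.layers import Dense
    from keras.wrappers.scikit_learn import KerasRegressor
    from keras import backend as K

    def my_loss(y_true, y_pred):
        # Hypothetical custom loss -- replace with your own backend ops.
        return K.mean(K.square(y_true - y_pred) / (K.abs(y_true) + 1.0))

    def build_model(n_features=10):
        model = Sequential()
        model.add(Dense(20, activation="tanh", input_dim=n_features))
        model.add(Dense(1))
        # The custom loss goes here, exactly like a built-in one.
        model.compile(optimizer="adam", loss=my_loss)
        return model

    reg = KerasRegressor(build_fn=build_model, epochs=100, verbose=0)
    # reg now behaves like a scikit-learn regressor: reg.fit(X, y),
    # reg.predict(X), and it works inside Pipeline or GridSearchCV.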
> What about the SVR? Is it possible to change the loss function there?
Here you would have the same problem; SVR is a constrained optimization
problem, and you would then have to change the calculation of the loss
gradient. Since SVR is a "1-layer" neural net, if you change the cost
function to ...
Thanks Sebastian. Exploring TensorFlow capabilities was on my TODO list,
but now it's in my immediate plans.
What about the SVR? Is it possible to change the loss function there? Could
you please clarify what the "x" and "x'" parameters in the default kernel
functions mean? Is "x" an NxM array, where ...
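For reference, in the standard definition of the RBF kernel,

    k(x, x') = \exp\bigl(-\gamma \, \|x - x'\|^2\bigr),

x and x' are two individual samples, i.e. two rows of the NxM feature matrix
(N samples, M features). Evaluating k on every pair of rows yields the NxN
Gram matrix that the SVR actually operates on.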
Dear Roman,
I tried searching the web, but I didn't find any information or examples.
Could you give me an example of using _CFNode.centroids_?
I would appreciate it if you would help me.
On Wed, Aug 23, 2017 at 2:28 PM, Roman Yurchak wrote:
> > what are the data samples in this cluster ...
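A hedged sketch of getting at Birch's CF-tree centroids: the public,
documented route is subcluster_centers_; root_ and _CFNode.centroids_ are
private internals and may change between scikit-learn versions.

    import numpy as np
    from sklearn.cluster import Birch

    X = np.random.rand(200, 2)
    brc = Birch(n_clusters=None).fit(X)

    # Public API: centroids of all leaf subclusters.
    print(brc.subcluster_centers_)

    # Private internals, use at your own risk:
    root = brc.root_          # root _CFNode of the CF-tree
    print(root.centroids_)    # centroids stored at the root node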