Hi Aris,

A simple approach to gaining some of the benefits of an RBF kernel is
to add synthetic features to your training set. For example, if your
original data consists of 3-dimensional vectors [x, y, z], you could
compute a new 9-dimensional feature vector containing [x, y, z, x^2,
y^2, z^2, xy, xz, yz].
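
If it helps, here is a rough sketch of that expansion in Scala against the
Spark 1.x MLlib API (polyExpand and trainingData are placeholder names I
made up, not anything in MLlib):

    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.mllib.regression.LabeledPoint

    // Expand [x, y, z] into the 9 features described above:
    // the originals, their squares, and the pairwise products.
    def polyExpand(p: LabeledPoint): LabeledPoint = {
      val Array(x, y, z) = p.features.toArray
      val expanded = Array(x, y, z, x*x, y*y, z*z, x*y, x*z, y*z)
      LabeledPoint(p.label, Vectors.dense(expanded))
    }

    // val expandedData = trainingData.map(polyExpand)   // RDD[LabeledPoint]

You can then train SVMWithSGD on the expanded vectors exactly as you would
on the originals.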

This basic idea can be taken much further (there is a rough
random-features sketch after these links):
  1. http://www.eecs.berkeley.edu/~brecht/papers/07.rah.rec.nips.pdf
  2. http://arxiv.org/pdf/1109.4603.pdf
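
The first paper (Rahimi and Recht's random Fourier features) amounts to
pushing each point through a fixed random projection followed by a cosine,
and then training an ordinary linear SVM on the result. A very rough sketch
of that idea in Scala (D, gamma, and rffExpand are names I made up for
illustration, not MLlib API):

    import scala.util.Random
    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.mllib.regression.LabeledPoint

    val d = 3         // input dimensionality
    val D = 100       // number of random features
    val gamma = 1.0   // width of the RBF kernel exp(-gamma * ||x - y||^2)

    val rng = new Random(42)
    // Rows of W are sampled from N(0, 2*gamma); b is uniform on [0, 2*pi].
    val W = Array.fill(D, d)(rng.nextGaussian() * math.sqrt(2 * gamma))
    val b = Array.fill(D)(rng.nextDouble() * 2 * math.Pi)

    def rffExpand(p: LabeledPoint): LabeledPoint = {
      val x = p.features.toArray
      val z = Array.tabulate(D) { i =>
        val dot = (0 until d).map(j => W(i)(j) * x(j)).sum
        math.sqrt(2.0 / D) * math.cos(dot + b(i))
      }
      LabeledPoint(p.label, Vectors.dense(z))
    }

A linear SVM trained on the rffExpand-ed data then approximates an
RBF-kernel SVM, and the approximation improves as D grows.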

Hope that helps,
-Jey

On Thu, Sep 18, 2014 at 11:10 AM, Aris <arisofala...@gmail.com> wrote:
> Sorry to bother you guys, but does anybody have any ideas about the status
> of MLlib with a Radial Basis Function kernel for SVM?
>
> Thank you!
>
> On Tue, Sep 16, 2014 at 3:27 PM, Aris wrote:
>
>> Hello Spark Community -
>>
>> I am using the support vector machine / SVM implementation in MLlib with
>> the standard linear kernel; however, I noticed that the Spark documentation
>> for StandardScaler *specifically* mentions that SVMs which use the RBF
>> kernel work really well when you have standardized data...
>>
>> which raises the question: is there some kind of support for RBF kernels
>> rather than linear kernels? In small data tests using R, the RBF kernel
>> worked really well and the linear kernel never converged... so I would really
>> like to use RBF.
>>
>> Thank you folks for any help!
>>
>> Aris
>
>
