Hello,

*Question 1: *I am new to Spark. I am trying to train a classification model
on a Spark DataFrame using PySpark. I have created a Spark DataFrame object
in df:

from pyspark.sql.types import *

query = """select * from table"""

df = sqlContext.sql(query)

My question is how to extend this code to train models (e.g., a
classification model) on the object df. I have checked many online
resources and haven't seen any approach like the following:

from pyspark.ml.classification import LogisticRegression

lr = LogisticRegression(maxIter=10, regParam=0.3, elasticNetParam=0.8)
# Fit the model
lrModel = lr.fit(df)

Is this a feasible way to train the model? If yes, where could I find
reference code for it?
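
For reference, this is the kind of end-to-end sketch I have in mind,
assuming hypothetical feature columns c1 and c2 and a label column
"label" in df (the actual column names would differ):

from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

# Assemble the (hypothetical) feature columns into the single "features"
# vector column that the DataFrame-based ML estimators expect.
assembler = VectorAssembler(inputCols=["c1", "c2"], outputCol="features")
train_df = assembler.transform(df)

# Train logistic regression on the "features"/"label" columns.
lr = LogisticRegression(maxIter=10, regParam=0.3, elasticNetParam=0.8,
                        featuresCol="features", labelCol="label")
lrModel = lr.fit(train_df)

# Inspect the fitted model.
print(lrModel.coefficients)
print(lrModel.intercept)

Is this the intended usage pattern?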

*Question 2: *Why does the MLlib DataFrame-based API have no SVM model
support, while the RDD-based API did have an SVM model?
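
This is the RDD-based SVM usage I am referring to (a minimal sketch,
assuming the same hypothetical c1, c2, and "label" columns as above):

from pyspark.mllib.classification import SVMWithSGD
from pyspark.mllib.regression import LabeledPoint

# Convert DataFrame rows into LabeledPoint records for the RDD-based API.
labeled_rdd = df.rdd.map(
    lambda row: LabeledPoint(row["label"], [row["c1"], row["c2"]]))

# Train a linear SVM with stochastic gradient descent on the RDD.
svmModel = SVMWithSGD.train(labeled_rdd, iterations=100)

I could not find an equivalent estimator in pyspark.ml.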

Thanks a lot!


Best,


Shi
