Yes, see https://dzone.com/articles/predictive-analytics-with-spark-ml
Although the example uses two labels, the same approach supports multiple
labels.
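A minimal sketch of that in Spark ML, assuming a DataFrame `training` that
already has a numeric `label` column (any number of classes) and a `features`
vector column:

import org.apache.spark.ml.classification.RandomForestClassifier

// RandomForestClassifier handles binary and multiclass labels alike;
// the number of classes is taken from the label column's metadata or data.
val rf = new RandomForestClassifier()
  .setLabelCol("label")
  .setFeaturesCol("features")
  .setNumTrees(20)

val model = rf.fit(training)
val predictions = model.transform(training)  // or a held-out test DataFrame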
> On Nov 7, 2017, at 6:30 AM, HARSH TAKKAR wrote:
>
> Hi
>
> Does Random Forest in Spark ML support multi-label classification in Scala?
In the doc you refer to:

// The Avro records get converted to Spark types, filtered, and
// then written back out as Avro records
import com.databricks.spark.avro._  // needed for the .avro read/write shortcut

val df = spark.read.avro("/tmp/episodes.avro")
df.filter("doctor > 5").write.avro("/tmp/output")
Alternatively you can specify the format to use instead:
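A sketch of that alternative, spelling out the data source by its full name
(same pipeline as above, paths unchanged):

val df = spark.read
  .format("com.databricks.spark.avro")
  .load("/tmp/episodes.avro")
df.filter("doctor > 5")
  .write
  .format("com.databricks.spark.avro")
  .save("/tmp/output")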
Hello.
I am running Spark 2.1 with Scala 2.11. We're running several Spark streaming
jobs, which we occasionally restart. We have code that looks like the
following:
logger.info("Starting the streaming context!")
ssc.start()
logger.info("Waiting for termination!")
ssc.awaitTermination()  // presumably follows, given the log message above
Is there a way to flush the API?
I execute http://localhost:18080/api/v1/applications?status=running
and in the results I get a list of applications, but not all of them are still
running.
This is causing an issue with monitoring what is actually running.
To compound the problem these are
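For what it's worth, a minimal sketch of hitting that endpoint from Scala
(the host, port, and printing the raw JSON are just for illustration):

import scala.io.Source

// Fetch the list the History Server currently reports as running.
val url = "http://localhost:18080/api/v1/applications?status=running"
val json = Source.fromURL(url).mkString
println(json)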
Hi, I’m obviously new to Spark Structured Streaming, and I want to
1.) Open one (a single) connection to an MQTT broker / topic spewing JSON objects
2.) Transform the JSON into a wide table
3.) Run several different queries on the wide table
What I do:
val lines = session.readStream
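A minimal sketch of how that pipeline might continue, assuming the Apache
Bahir MQTT source (the provider class, broker URL, topic, payload column name,
and JSON schema below are all assumptions):

import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types._
import session.implicits._

// 1.) one connection to the broker / topic
val lines = session.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", "sensors")        // assumed topic name
  .load("tcp://localhost:1883")      // assumed broker URL

// 2.) flatten the JSON payload into a wide table
val jsonSchema = new StructType()
  .add("deviceId", StringType)
  .add("reading", DoubleType)

val wide = lines
  .select(from_json($"value", jsonSchema).as("data"))  // payload column name is an assumption
  .select("data.*")

Each of the queries in 3.) would then be its own writeStream on wide.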
Hi
Does Random Forest in Spark ML support multi-label classification in Scala?
I found that sklearn provides sklearn.ensemble.RandomForestClassifier in
Python; is there similar functionality in Scala?
Hello Behroz,
You can use a SparkListener to get updates from the underlying process (cf.
https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/scheduler/SparkListener.html).
You first need to create your own SparkAppListener class that extends it:
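A sketch of what that class might look like (which callbacks you override and
what you do in them is up to you):

import org.apache.spark.scheduler._

class SparkAppListener extends SparkListener {
  override def onApplicationStart(event: SparkListenerApplicationStart): Unit =
    println(s"App ${event.appName} started at ${event.time}")

  override def onApplicationEnd(event: SparkListenerApplicationEnd): Unit =
    println(s"App ended at ${event.time}")

  override def onJobEnd(event: SparkListenerJobEnd): Unit =
    println(s"Job ${event.jobId} finished with ${event.jobResult}")
}

Then register it on the SparkContext, e.g. sc.addSparkListener(new SparkAppListener).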
Anyone?