I am trying to do the same, but so far no luck.
I have everything running inside Docker containers, including the Mesos
master, Mesos slave, Marathon, and the Spark Mesos cluster dispatcher.
But when I try to submit the job using spark-submit from a Docker container,
it fails.
By the way, this setup is on
I need to pass the filter value dynamically, e.g. where id = someVal, where
someVal exists in another RDD.
How can I do this across a JavaRDD and a DataFrame?
On Jul 2, 2015, at 12:49 AM, ayan guha guha.a...@gmail.com wrote:
You can directly use filter on a DataFrame.
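One way to do this, sketched below under the assumption that the RDD of filter values is small enough to collect to the driver (the names `df` and `idsRdd` are hypothetical, and this targets the Spark 1.4-era Java API from this thread's timeframe):

```java
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.DataFrame;
import static org.apache.spark.sql.functions.col;

public class DynamicFilter {
    // Keep only the rows of df whose "id" column matches a value in idsRdd.
    public static DataFrame filterByIds(DataFrame df, JavaRDD<Long> idsRdd) {
        // Bring the candidate values back to the driver. Only safe when
        // the RDD is small; for a large RDD, join the two datasets instead.
        List<Long> ids = idsRdd.collect();

        // Build an IN-style predicate on the DataFrame column.
        // (On Spark releases before 1.5 the method is Column.in(...)
        // rather than Column.isin(...).)
        return df.filter(col("id").isin(ids.toArray()));
    }
}
```

For a single value, `df.filter(col("id").equalTo(someVal))` works the same way; if the value RDD is large, converting it to a DataFrame and doing an inner join on `id` avoids the collect entirely.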