Re: Kill Spark Streaming JOB from Spark UI or Yarn

2017-08-27 Thread Matei Zaharia
The batches should all have the same application ID, so use that one. You can also find the application in the YARN UI and terminate it from there.

Matei

> On Aug 27, 2017, at 10:27 AM, KhajaAsmath Mohammed wrote:
>
> Hi,
>
> I am new to spark streaming and not
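For reference, terminating the job by its application ID can also be done from the command line with the YARN CLI. A minimal sketch (the application ID shown is a placeholder, not from this thread):

```shell
# List running YARN applications to find the streaming job's application ID
yarn application -list -appStates RUNNING

# Kill the application by its ID (placeholder ID shown)
yarn application -kill application_XXXXXXXXXXXXX_XXXX
```

Note that killing the YARN application stops the job immediately; for a graceful stop, a streaming application can call `StreamingContext.stop(stopGracefully = true)` from within the driver instead.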

Re: different behaviour linux/Unix vs windows when load spark context in scala method called from R function using rscala package

2017-08-27 Thread Georg Heiler
Why don't you simply use sparklyr for a more R-native integration of Spark?

Simone Pallotta wrote on Sun., 27 Aug. 2017 at 09:47:

> In my R code, I am using the rscala package to bridge to a Scala method. In
> the Scala method I have initialized a Spark context to be
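For context, a minimal sparklyr session looks roughly like the following (a sketch assuming a local Spark installation; `spark_connect`, `copy_to`, and `spark_disconnect` are the standard sparklyr entry points):

```r
library(sparklyr)
library(dplyr)

# Connect to a local Spark instance
sc <- spark_connect(master = "local")

# Copy an R data frame into Spark and query it with dplyr verbs
iris_tbl <- copy_to(sc, iris, "iris_spark")
iris_tbl %>% count()

spark_disconnect(sc)
```

sparklyr manages the Spark connection and JVM for you, which avoids the manual classpath and heap configuration that rscala requires.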

different behaviour linux/Unix vs windows when load spark context in scala method called from R function using rscala package

2017-08-27 Thread Simone Pallotta
In my R code, I am using the rscala package to bridge to a Scala method. In the Scala method I have initialized a Spark context to be used later.

R code:

s <- scala(classpath = "", heap.maximum = "4g")
assign("WrappeR", s$.it.wrapper.r.Wrapper)
WrappeR$init()

where init is a Scala function and
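For what it's worth, the Scala side's init might create the context along these lines. This is a sketch, not the poster's actual Wrapper code; the object layout, app name, and master URL are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Wrapper {
  // Kept as a field so later calls from R can reuse the same context
  @transient private var sc: SparkContext = _

  def init(): Unit = {
    val conf = new SparkConf()
      .setAppName("rscala-bridge") // placeholder app name
      .setMaster("local[*]")       // placeholder master URL
    // getOrCreate avoids the "only one SparkContext per JVM" error
    // if init() is called more than once from R
    sc = SparkContext.getOrCreate(conf)
  }
}
```

Differences between Linux and Windows here often come down to environment rather than code: an empty `classpath` resolving differently, or Windows-specific requirements such as `HADOOP_HOME`/winutils.exe for Spark's Hadoop file-system layer.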