Thanks, guys. It was a ClassLoader issue. Rather than linking to
SPARK_HOME/assembly/target/scala-2.11/jars/, I was linking the individual
jars. Linking to the folder instead solved the issue for me.
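For reference, if the jars are managed through sbt rather than hand-linked in the IDE, pointing the build at the whole directory could look something like this sketch (it assumes the SPARK_HOME environment variable is set and that unmanagedBase is how you want to pull in local jars):

// build.sbt (sketch): put every jar under the Spark assembly output
// directory on the classpath, instead of listing individual jars.
// Assumes SPARK_HOME points at your local Spark build.
unmanagedBase := file(sys.env("SPARK_HOME")) / "assembly" / "target" / "scala-2.11" / "jars"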
Regards
Sumit Chawla
On Wed, Sep 21, 2016 at 2:51 PM, Jakob Odersky wrote:
Your app is fine, I think the error has to do with the way inttelij
launches applications. Is your app forked in a new jvm when you run it?
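If the build is sbt-based, forking the run into its own JVM is a one-line setting; a sketch, assuming sbt 0.13-style syntax (IntelliJ exposes an equivalent option in its run configuration):

// build.sbt (sketch): launch the application in a separate JVM so it does
// not share the IDE/build-tool process and its classloader.
fork in run := true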
On Wed, Sep 21, 2016 at 2:28 PM, Gokula Krishnan D wrote:
Hello Sumit -
I notice that a SparkConf() specification is not mentioned in your
program, but the rest looks good.
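For illustration, the SparkConf-based construction would look roughly like this (a sketch; the app name and master value are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: set the application name and master explicitly on a SparkConf
// and hand it to the SparkContext, rather than passing strings directly.
val conf = new SparkConf().setAppName("Simple App").setMaster("local")
val sc = new SparkContext(conf)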
Output:
By the way, I have used the README.md template
https://gist.github.com/jxson/1784669
Thanks & Regards,
Gokula Krishnan (Gokul)
On Tue, Sep 20, 2016 at 2:15 AM, Chawla Sumit wrote:
Hi All,
I am trying to test a simple Spark app using Scala.
import org.apache.spark.SparkContext

object SparkDemo {
  def main(args: Array[String]): Unit = {
    val logFile = "README.md" // should be some file on your system
    // "local" master runs the job in local mode
    val sc = new SparkContext("local", "Simple App")
    val numLines = sc.textFile(logFile).count() // a simple action so the job runs
    println(s"$logFile has $numLines lines")
    sc.stop()
  }
}