Just add spark_1.4.1_yarn_shuffle.jar to your classpath, or create a new Maven
project using the dependencies below:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.4.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>1.4.1</version>
</dependency>
Normally people would set up a Maven project with the Spark dependencies, or
use sbt.
Can you go with either approach?
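For the sbt route, a minimal build.sbt sketch might look like the following (the project name and Scala patch version are assumptions; the Spark version matches the 1.4.1 jars mentioned above):

```scala
// build.sbt -- minimal sketch for a Spark 1.4.1 project
name := "spark-test"

version := "0.1"

scalaVersion := "2.11.7"  // assumed 2.11.x to match the _2.11 artifacts

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1",
  "org.apache.spark" %% "spark-sql"  % "1.4.1"
)
```

Running `sbt compile` (or `sbt console` for a REPL with the dependencies on the classpath) should then pull everything from Maven Central.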
Cheers
On Tue, Aug 18, 2015 at 10:28 AM, Jerry jerry.c...@gmail.com wrote:
Hello,
So I set up Spark to run on my local machine to see if I can reproduce the
issue I'm having with data frames, but I'm running into issues with the
compiler.
Here's what I got:
$ echo $CLASSPATH