It looks like you've marked the Spark jars as provided. In that case they
will only be on the classpath if you run your application with spark-submit
or otherwise put Spark's JARs on your classpath yourself. How are you
launching your application?
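If you're launching it from your IDE or with plain java, either drop the
provided scope so Spark lands on your classpath, or package the app and
launch it through spark-submit. A rough sketch of the latter (the jar name
and path here are hypothetical — substitute your actual artifact, and note
this assumes the scala-maven-plugin is configured so mvn actually compiles
src/main/scala):

```shell
# Package the application jar
mvn package

# Launch through spark-submit so Spark's own JARs provide the
# spark-core / spark-sql classes marked as provided in the pom.
# --class names the object with the main method.
spark-submit \
  --class App \
  --master local \
  target/my-app-1.0.jar
```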

On Fri, Nov 4, 2016 at 2:00 PM, shyla deshpande <deshpandesh...@gmail.com>
wrote:

> object App {
>
>   import org.apache.spark.sql.functions._
>   import org.apache.spark.sql.SparkSession
>
>   def main(args: Array[String]) {
>     println("Hello World!")
>     val sparkSession = SparkSession.builder
>       .master("local")
>       .appName("spark session example")
>       .getOrCreate()
>   }
> }
>
>
> <properties>
>   <maven.compiler.source>1.8</maven.compiler.source>
>   <maven.compiler.target>1.8</maven.compiler.target>
>   <encoding>UTF-8</encoding>
>   <scala.version>2.11.8</scala.version>
>   <scala.compat.version>2.11</scala.compat.version>
> </properties>
>
> <dependencies>
>   <dependency>
>     <groupId>org.scala-lang</groupId>
>     <artifactId>scala-library</artifactId>
>     <version>${scala.version}</version>
>   </dependency>
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.11</artifactId>
>     <version>2.0.1</version>
>     <scope>provided</scope>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-sql_2.11</artifactId>
>     <version>2.0.1</version>
>     <scope>provided</scope>
>   </dependency>
>
>   <dependency>
>     <groupId>org.specs2</groupId>
>     <artifactId>specs2-core_${scala.compat.version}</artifactId>
>     <version>2.4.16</version>
>     <scope>test</scope>
>   </dependency>
> </dependencies>
>
> <build>
>   <sourceDirectory>src/main/scala</sourceDirectory>
> </build>
>
>


-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
