I feel so good that Holden replied.
Yes, that was the problem. I was running from IntelliJ; I removed the
provided scope and it works great.
Thanks a lot.
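
(For anyone else hitting this from an IDE: the fix above amounts to dropping the
provided scope on the Spark dependencies so their JARs end up on the IDE's
runtime classpath. A sketch of what the changed dependency looks like --
artifact coordinates taken from the pom below, everything else standard Maven:

```xml
<!-- Spark dependency usable when launching from the IDE:
     the <scope>provided</scope> line has been removed, so Maven
     puts spark-core on the runtime classpath instead of assuming
     spark-submit will supply it. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.1</version>
</dependency>
```

If you later deploy to a real cluster with spark-submit, restoring
provided scope keeps Spark's JARs out of your application artifact.)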
On Fri, Nov 4, 2016 at 2:05 PM, Holden Karau wrote:
> It seems like you've marked the Spark jars as provided; in that case they
> would only be provided if you run your application with spark-submit or
> otherwise have Spark's JARs on your classpath. How are you launching your
> application?
>
> On Fri, Nov 4, 2016 at 2:00 PM, shyla deshpande
> wrote:
>
>> object App {
>>
>>   import org.apache.spark.sql.SparkSession
>>
>>   def main(args: Array[String]) {
>>     println("Hello World!")
>>     val sparkSession = SparkSession.builder
>>       .master("local")
>>       .appName("spark session example")
>>       .getOrCreate()
>>   }
>> }
>>
>>
>>
>> 1.8
>> 1.8
>> UTF-8
>> 2.11.8
>> 2.11
>>
>>
>>
>>
>> org.scala-lang
>> scala-library
>> ${scala.version}
>>
>>
>>
>> org.apache.spark
>> spark-core_2.11
>> 2.0.1
>> provided
>>
>>
>> org.apache.spark
>> spark-sql_2.11
>> 2.0.1
>> provided
>>
>>
>>
>> org.specs2
>> specs2-core_${scala.compat.version}
>> 2.4.16
>> test
>>
>>
>>
>>
>> src/main/scala
>>
>>
>>
>
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau
>