I have found the easiest way to set up a development environment is to use
the Databricks sbt-spark-package plugin
<https://github.com/databricks/sbt-spark-package>  (assuming you are using
Scala + sbt). You simply add the plugin to your <project>/project/plugins.sbt
file and set sparkVersion in your build.sbt; the plugin then pulls in the
Spark dependencies needed to build your application. A rough sketch of both
files follows.
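
For example, the two files look roughly like this (the plugin and Spark
versions below are illustrative -- check the project page for current
values; the project name is hypothetical):

    // <project>/project/plugins.sbt
    // Plugin version is illustrative; see the GitHub page for the latest.
    resolvers += "Spark Packages repo" at
      "https://dl.bintray.com/spark-packages/maven/"
    addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

    // <project>/build.sbt
    name := "my-spark-app"            // hypothetical project name
    scalaVersion := "2.11.8"
    sparkVersion := "2.0.0"           // illustrative Spark version
    sparkComponents ++= Seq("sql")    // add spark-sql on top of spark-core

With sparkVersion set, you don't list the individual spark-core/spark-sql
artifacts in libraryDependencies yourself; the plugin resolves them for you.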

It also provides the sbt console command, which starts a local Spark REPL
you can prototype code against.
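
A quick sketch of that workflow (this assumes, per the plugin's README, that
the console pre-creates a SparkContext named sc; the sample values are
illustrative):

    $ sbt console
    scala> val rdd = sc.parallelize(1 to 100)   // small local test RDD
    scala> rdd.filter(_ % 2 == 0).count()       // count the even numbers
    res0: Long = 50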


