I am facing the same issue as listed here:
http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
A solution is mentioned here:
https://gist.github.com/prb/d776a47bd164f704eecb
However, there are a few things I don't understand:
1) Why are the jars being split?
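(For context, the linked gist merges the `reference.conf` files that Akka and Spark each ship inside their own jars; a fat jar must concatenate them rather than keep only one, or Typesafe Config settings go missing. In an SBT project, the equivalent would be a merge strategy like the sketch below, which assumes the sbt-assembly plugin since the build file isn't shown:)

```scala
// build.sbt — hypothetical sketch, assuming the sbt-assembly plugin is installed.
// Concatenate all reference.conf files instead of picking one of them,
// so Typesafe Config can still resolve every library's default settings.
assembly / assemblyMergeStrategy := {
  case "reference.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(x)
}
```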
I have an SBT Spark project compiling fine in Intellij.
However when I try to create a SparkContext from a worksheet:
import org.apache.spark.SparkContext
val sc1 = new SparkContext("local[8]", "sc1")
I get this error:
com.typesafe.config.ConfigException$Missing: No configuration setting found for
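(For reference, the equivalent construction via an explicit `SparkConf` — a sketch that assumes the Spark jars are on the worksheet classpath, with `"sc1"` used purely as an illustrative app name — would be:)

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: both the master URL and the app name are plain strings.
val conf = new SparkConf()
  .setMaster("local[8]") // run locally with 8 worker threads
  .setAppName("sc1")
val sc = new SparkContext(conf)
```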