Use mvn dependency:tree (Maven) or the dependencyTree task from the
sbt-dependency-graph plugin (sbt) to print the full dependency tree. You
are most likely pulling in the servlet API classes from some other
dependency, and the tree will show you which one.
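
If you are on sbt, the tree task is not built in; here is a minimal sketch
of enabling it via the sbt-dependency-graph plugin (the 0.7.5 version
number is an assumption, use whatever is current):

  // project/plugins.sbt -- adds the dependencyTree / dependencyGraph tasks
  // (plugin version 0.7.5 is an assumption; check the plugin's README)
  addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")

Then run "sbt dependencyTree" and look for servlet-api or javax.servlet
entries to see which artifact drags them in.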
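
Once you know where it comes from, the usual fix for this particular
SecurityException is to exclude the old servlet API from the Hadoop/HBase
artifacts rather than from spark-core. A sketch against the build quoted
below (the exact organizations to exclude are an assumption; match them to
your dependencyTree output):

  // Exclude the conflicting servlet API from hbase-server; repeat for any
  // other artifact the tree shows pulling in javax.servlet classes.
  libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2" excludeAll(
    ExclusionRule(organization = "org.mortbay.jetty"),
    ExclusionRule(organization = "javax.servlet")
  )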

On Fri, Jan 23, 2015 at 10:57 AM, Marco <marco....@gmail.com> wrote:
> Hi,
>
> I have exactly the same issue. I tried marking the libraries as 'provided',
> but then the IntelliJ IDE seems to have deleted the libraries locally, so I
> am no longer able to build or run the project in the IDE.
>
> Has the issue been resolved? I'm not very experienced with SBT. I've tried
> to exclude the libraries:
>
> name := "SparkDemo"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" exclude("org.apache.hadoop", "hadoop-client")
>
> libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.2.0"
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
>
> libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
>
> libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
>
> libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
>
> mainClass in Compile := Some("demo.TruckEvents")
>
> but this does not work either.
>
> Thanks,
> Marco
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Got-java-lang-SecurityException-class-javax-servlet-FilterRegistration-s-when-running-job-from-intela-tp18035p21332.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
