Hi,

I use Spark for its newAPIHadoopRDD method and for map/reduce etc. tasks.
When I include it, I see that it pulls in many transitive dependencies.

Which of them should I exclude? I've included the dependency tree of
spark-core below. Is there any documentation that explains why they are
needed (or maybe all of them are necessary)?
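In case it helps frame the question: if some of these turn out to be unnecessary for my use case, I assume the standard way to drop them is a Maven exclusion on the spark-core dependency, something like this (logging bridges chosen purely as an example, not a recommendation):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
    <!-- Example only: exclude the SLF4J bridge jars if another
         logging setup is already on the classpath -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>jul-to-slf4j</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>jcl-over-slf4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

So my question is really which artifacts (if any) are safe to list there.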

Kind Regards,
Furkan KAMACI

PS: Dependency Tree:

[INFO] +- org.apache.spark:spark-core_2.10:jar:1.4.1:compile
[INFO] |  +- com.twitter:chill_2.10:jar:0.5.0:compile
[INFO] |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:compile
[INFO] |  |     +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:compile
[INFO] |  |     +- com.esotericsoftware.minlog:minlog:jar:1.2:compile
[INFO] |  |     \- org.objenesis:objenesis:jar:1.2:compile
[INFO] |  +- com.twitter:chill-java:jar:0.5.0:compile
[INFO] |  +- org.apache.spark:spark-launcher_2.10:jar:1.4.1:compile
[INFO] |  +- org.apache.spark:spark-network-common_2.10:jar:1.4.1:compile
[INFO] |  +- org.apache.spark:spark-network-shuffle_2.10:jar:1.4.1:compile
[INFO] |  +- org.apache.spark:spark-unsafe_2.10:jar:1.4.1:compile
[INFO] |  +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
[INFO] |  +- org.apache.curator:curator-recipes:jar:2.4.0:compile
[INFO] |  |  \- org.apache.curator:curator-framework:jar:2.4.0:compile
[INFO] |  |     \- org.apache.curator:curator-client:jar:2.4.0:compile
[INFO] |  +- org.apache.commons:commons-lang3:jar:3.3.2:compile
[INFO] |  +- org.apache.commons:commons-math3:jar:3.4.1:compile
[INFO] |  +- com.google.code.findbugs:jsr305:jar:1.3.9:compile
[INFO] |  +- org.slf4j:jul-to-slf4j:jar:1.6.6:compile
[INFO] |  +- org.slf4j:jcl-over-slf4j:jar:1.6.6:compile
[INFO] |  +- com.ning:compress-lzf:jar:1.0.3:compile
[INFO] |  +- net.jpountz.lz4:lz4:jar:1.2.0:compile
[INFO] |  +- org.roaringbitmap:RoaringBitmap:jar:0.4.5:compile
[INFO] |  +- commons-net:commons-net:jar:2.2:compile
[INFO] |  +- org.spark-project.akka:akka-remote_2.10:jar:2.3.4-spark:compile
[INFO] |  |  +- org.spark-project.akka:akka-actor_2.10:jar:2.3.4-spark:compile
[INFO] |  |  |  \- com.typesafe:config:jar:1.2.1:compile
[INFO] |  |  +- org.spark-project.protobuf:protobuf-java:jar:2.5.0-spark:compile
[INFO] |  |  \- org.uncommons.maths:uncommons-maths:jar:1.2.2a:compile
[INFO] |  +- org.spark-project.akka:akka-slf4j_2.10:jar:2.3.4-spark:compile
[INFO] |  +- org.scala-lang:scala-library:jar:2.10.4:compile
[INFO] |  +- org.json4s:json4s-jackson_2.10:jar:3.2.10:compile
[INFO] |  |  \- org.json4s:json4s-core_2.10:jar:3.2.10:compile
[INFO] |  |     +- org.json4s:json4s-ast_2.10:jar:3.2.10:compile
[INFO] |  |     \- org.scala-lang:scalap:jar:2.10.0:compile
[INFO] |  |        \- org.scala-lang:scala-compiler:jar:2.10.0:compile
[INFO] |  +- com.sun.jersey:jersey-server:jar:1.9:compile
[INFO] |  |  \- asm:asm:jar:3.1:compile
[INFO] |  +- com.sun.jersey:jersey-core:jar:1.9:compile
[INFO] |  +- org.apache.mesos:mesos:jar:shaded-protobuf:0.21.1:compile
[INFO] |  +- io.netty:netty-all:jar:4.0.23.Final:compile
[INFO] |  +- com.clearspring.analytics:stream:jar:2.7.0:compile
[INFO] |  +- io.dropwizard.metrics:metrics-core:jar:3.1.0:compile
[INFO] |  +- io.dropwizard.metrics:metrics-jvm:jar:3.1.0:compile
[INFO] |  +- io.dropwizard.metrics:metrics-json:jar:3.1.0:compile
[INFO] |  +- io.dropwizard.metrics:metrics-graphite:jar:3.1.0:compile
[INFO] |  +- com.fasterxml.jackson.core:jackson-databind:jar:2.4.4:compile
[INFO] |  |  +- com.fasterxml.jackson.core:jackson-annotations:jar:2.4.0:compile
[INFO] |  |  \- com.fasterxml.jackson.core:jackson-core:jar:2.4.4:compile
[INFO] |  +- com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.4.4:compile
[INFO] |  |  \- org.scala-lang:scala-reflect:jar:2.10.4:compile
[INFO] |  +- org.apache.ivy:ivy:jar:2.4.0:compile
[INFO] |  +- oro:oro:jar:2.0.8:compile
[INFO] |  +- org.tachyonproject:tachyon-client:jar:0.6.4:compile
[INFO] |  |  \- org.tachyonproject:tachyon:jar:0.6.4:compile
[INFO] |  +- net.razorvine:pyrolite:jar:4.4:compile
[INFO] |  +- net.sf.py4j:py4j:jar:0.8.2.1:compile
[INFO] |  \- org.spark-project.spark:unused:jar:1.0.0:compile
