The following command succeeded (on Linux) against the Spark master branch checked
out this morning:

mvn -Pyarn -Phive -Phadoop-2.4 -DskipTests install

FYI
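
One extra observation from the failing log: that build goes through a running
zinc/nailgun server ("Using zinc server for incremental compilation", and
com.typesafe.zinc.Nailgun appears in the stack trace), so a stale zinc daemon is
another thing worth ruling out besides "mvn clean". A rough sequence to try,
assuming zinc was installed standalone (e.g. via Homebrew) and its -shutdown
option is available:

    zinc -shutdown   # stop the long-running zinc/nailgun daemon (assumes standalone zinc on PATH)
    mvn clean        # clear previous Maven build output
    sbt/sbt clean    # clear previous sbt build output
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package

This is just a guess at the environment difference between the Mac and Linux
machines, not a confirmed fix.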


On Thu, Jul 31, 2014 at 1:36 PM, yao <yaosheng...@gmail.com> wrote:

> Hi TD,
>
> I've asked my colleagues to do the same thing, but the compile still fails.
> However, the Maven build succeeded when I ran it on my personal MacBook (with
> the latest Mac OS Yosemite), so I suspect there is something wrong with my
> build environment. Has anyone tried to compile Spark with Maven under
> Mavericks? If so, please let me know your result.
>
> Thanks
> Shengzhe
>
>
> On Thu, Jul 31, 2014 at 1:25 AM, Tathagata Das <
> tathagata.das1...@gmail.com>
> wrote:
>
> > Does a "mvn clean" or "sbt/sbt clean" help?
> >
> > TD
> >
> > On Wed, Jul 30, 2014 at 9:25 PM, yao <yaosheng...@gmail.com> wrote:
> > > Hi Folks,
> > >
> > > Today I am trying to build Spark using Maven; however, the following
> > > command failed consistently for both 1.0.1 and the latest master. (BTW, it
> > > seems sbt works fine: sbt/sbt -Dhadoop.version=2.4.0 -Pyarn clean assembly.)
> > >
> > > Environment: Mac OS Mavericks
> > > Maven: 3.2.2 (installed via Homebrew)
> > >
> > > export M2_HOME=/usr/local/Cellar/maven/3.2.2/libexec/
> > > export PATH=$M2_HOME/bin:$PATH
> > > export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
> > > mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
> > >
> > > Build outputs:
> > >
> > > [INFO] Scanning for projects...
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] Reactor Build Order:
> > > [INFO]
> > > [INFO] Spark Project Parent POM
> > > [INFO] Spark Project Core
> > > [INFO] Spark Project Bagel
> > > [INFO] Spark Project GraphX
> > > [INFO] Spark Project ML Library
> > > [INFO] Spark Project Streaming
> > > [INFO] Spark Project Tools
> > > [INFO] Spark Project Catalyst
> > > [INFO] Spark Project SQL
> > > [INFO] Spark Project Hive
> > > [INFO] Spark Project REPL
> > > [INFO] Spark Project YARN Parent POM
> > > [INFO] Spark Project YARN Stable API
> > > [INFO] Spark Project Assembly
> > > [INFO] Spark Project External Twitter
> > > [INFO] Spark Project External Kafka
> > > [INFO] Spark Project External Flume
> > > [INFO] Spark Project External ZeroMQ
> > > [INFO] Spark Project External MQTT
> > > [INFO] Spark Project Examples
> > > [INFO]
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] Building Spark Project Parent POM 1.0.1
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO]
> > > [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-parent
> > ---
> > > [INFO]
> > > [INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @
> > > spark-parent ---
> > > [INFO]
> > > [INFO] --- build-helper-maven-plugin:1.8:add-source
> (add-scala-sources) @
> > > spark-parent ---
> > > [INFO] Source directory:
> > > /Users/syao/git/grid/thirdparty/spark/src/main/scala added.
> > > [INFO]
> > > [INFO] --- maven-remote-resources-plugin:1.5:process (default) @
> > > spark-parent ---
> > > [INFO]
> > > [INFO] --- scala-maven-plugin:3.1.6:add-source (scala-compile-first) @
> > > spark-parent ---
> > > [INFO] Add Test Source directory:
> > > /Users/syao/git/grid/thirdparty/spark/src/test/scala
> > > [INFO]
> > > [INFO] --- scala-maven-plugin:3.1.6:compile (scala-compile-first) @
> > > spark-parent ---
> > > [INFO] No sources to compile
> > > [INFO]
> > > [INFO] --- build-helper-maven-plugin:1.8:add-test-source
> > > (add-scala-test-sources) @ spark-parent ---
> > > [INFO] Test Source directory:
> > > /Users/syao/git/grid/thirdparty/spark/src/test/scala added.
> > > [INFO]
> > > [INFO] --- scala-maven-plugin:3.1.6:testCompile
> > (scala-test-compile-first)
> > > @ spark-parent ---
> > > [INFO] No sources to compile
> > > [INFO]
> > > [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor)
> @
> > > spark-parent ---
> > > [INFO]
> > > [INFO] --- maven-source-plugin:2.2.1:jar-no-fork (create-source-jar) @
> > > spark-parent ---
> > > [INFO]
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] Building Spark Project Core 1.0.1
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO]
> > > [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @
> spark-core_2.10
> > > ---
> > > [INFO]
> > > [INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @
> > > spark-core_2.10 ---
> > > [INFO]
> > > [INFO] --- build-helper-maven-plugin:1.8:add-source
> (add-scala-sources) @
> > > spark-core_2.10 ---
> > > [INFO] Source directory:
> > > /Users/syao/git/grid/thirdparty/spark/core/src/main/scala added.
> > > [INFO]
> > > [INFO] --- maven-remote-resources-plugin:1.5:process (default) @
> > > spark-core_2.10 ---
> > > [INFO]
> > > [INFO] --- exec-maven-plugin:1.2.1:exec (default) @ spark-core_2.10 ---
> > > Archive:  lib/py4j-0.8.1-src.zip
> > >   inflating: build/py4j/tests/java_map_test.py
> > >  extracting: build/py4j/tests/__init__.py
> > >   inflating: build/py4j/tests/java_gateway_test.py
> > >   inflating: build/py4j/tests/java_callback_test.py
> > >   inflating: build/py4j/tests/java_list_test.py
> > >   inflating: build/py4j/tests/byte_string_test.py
> > >   inflating: build/py4j/tests/multithreadtest.py
> > >   inflating: build/py4j/tests/java_array_test.py
> > >   inflating: build/py4j/tests/py4j_callback_example2.py
> > >   inflating: build/py4j/tests/py4j_example.py
> > >   inflating: build/py4j/tests/py4j_callback_example.py
> > >   inflating: build/py4j/tests/finalizer_test.py
> > >   inflating: build/py4j/tests/java_set_test.py
> > >   inflating: build/py4j/finalizer.py
> > >  extracting: build/py4j/__init__.py
> > >   inflating: build/py4j/java_gateway.py
> > >   inflating: build/py4j/protocol.py
> > >   inflating: build/py4j/java_collections.py
> > >  extracting: build/py4j/version.py
> > >   inflating: build/py4j/compat.py
> > > [INFO]
> > > [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @
> > > spark-core_2.10 ---
> > > [INFO] Using 'UTF-8' encoding to copy filtered resources.
> > > [INFO] Copying 6 resources
> > > [INFO] Copying 20 resources
> > > [INFO] Copying 7 resources
> > > [INFO] Copying 3 resources
> > > [INFO]
> > > [INFO] --- scala-maven-plugin:3.1.6:add-source (scala-compile-first) @
> > > spark-core_2.10 ---
> > > [INFO] Add Test Source directory:
> > > /Users/syao/git/grid/thirdparty/spark/core/src/test/scala
> > > [INFO]
> > > [INFO] --- scala-maven-plugin:3.1.6:compile (scala-compile-first) @
> > > spark-core_2.10 ---
> > > [INFO] Using zinc server for incremental compilation
> > > [info] Compiling 342 Scala sources and 34 Java sources to
> > > /Users/syao/git/grid/thirdparty/spark/core/target/scala-2.10/classes...
> > > [warn] Class javax.servlet.ServletException not found - continuing
> with a
> > > stub.
> > > [error]
> > > [error]      while compiling:
> > >
> >
> /Users/syao/git/grid/thirdparty/spark/core/src/main/scala/org/apache/spark/HttpServer.scala
> > > [error]         during phase: typer
> > > [error]      library version: version 2.10.4
> > > [error]     compiler version: version 2.10.4
> > > [error]   reconstructed args: -classpath
> > >
> >
> /Users/syao/git/grid/thirdparty/spark/core/target/scala-2.10/classes:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-client/2.4.0/hadoop-client-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-common/2.4.0/hadoop-common-2.4.0.jar:/Users/syao/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/syao/.m2/repository/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar:/Users/syao/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/syao/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/syao/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/syao/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/syao/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/Users/syao/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/syao/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/Users/syao/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/Users/syao/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/Users/syao/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/Users/syao/.m2/repository/org/apache/avro/avro/1.7.6/avro-1.7.6.jar:/Users/syao/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-auth/2.4.0/hadoop-auth-2.4.0.jar:/Users/syao/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/syao/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.4.0/hadoop-hdfs-2.4.0.jar:/Users/syao/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.4.0/hadoop-mapreduce-client-app-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.4.0/hadoop-mapreduce-client-common-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.4.0/hadoop-yarn-client-2.4.0.jar:/Users/syao/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.4.0/hadoop-yarn-server-common-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.4.0/hadoop-mapreduce-client-shuffle-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.4.0/hadoop-yarn-api-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.4.0/hadoop-mapreduce-client-core-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.4.0/hadoop-yarn-common-2.4.0.jar:/Users/syao/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/syao/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/syao/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/syao/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.4.0/hadoop-mapreduce-client-jobclient-2.4.0.jar:/Users/syao/.m2/repository/org/apache/hadoop/hadoop-annotations/2.4.0/hadoop-annotations-2.4.0.jar:/Users/syao/.m2/repository/commons-codec/commons-codec/1.5/commons-codec-1.5.jar:/Users/syao/.m2/repository/org/apache/httpcomp
onents/httpclient/4.1.2/httpclient-4.1.2.jar:/Users/syao/.m2/repository/org/apache/httpcomponents/httpcore/4.1.2/httpcore-4.1.2.jar:/Users/syao/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/Users/syao/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/Users/syao/.m2/repository/org/apache/curator/curator-client/2.4.0/curator-client-2.4.0.jar:/Users/syao/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-plus/8.1.14.v20131031/jetty-plus-8.1.14.v20131031.jar:/Users/syao/.m2/repository/org/eclipse/jetty/orbit/javax.transaction/1.1.1.v201105210645/javax.transaction-1.1.1.v201105210645.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-webapp/8.1.14.v20131031/jetty-webapp-8.1.14.v20131031.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-jndi/8.1.14.v20131031/jetty-jndi-8.1.14.v20131031.jar:/Users/syao/.m2/repository/org/eclipse/jetty/orbit/javax.mail.glassfish/1.4.1.v201005082020/javax.mail.glassfish-1.4.1.v201005082020.jar:/Users/syao/.m2/repository/org/eclipse/jetty/orbit/javax.activation/1.1.0.v201105071233/javax.activation-1.1.0.v201105071233.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-security/8.1.14.v20131031/jetty-security-8.1.14.v20131031.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-util/8.1.14.v20131031/jetty-util-8.1.14.v20131031.jar:/Users/syao/.m2/repository/org/eclipse/jetty/jetty-server/8.1.14.v20131031/jetty-server-8.1.14.v20131031.jar:/Users/syao/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/Users/syao/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/syao/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/syao/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/Users/syao/.m2/repository/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar:/Users/syao/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.5/jcl-over-slf4j-1.7.5.jar:/Users/syao/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/syao/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar:/Users/syao/.m2/repository/com/ning/compress-lzf/1.0.0/compress-lzf-1.0.0.jar:/Users/syao/.m2/repository/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar:/Users/syao/.m2/repository/com/twitter/chill_2.10/0.3.6/chill_2.10-0.3.6.jar:/Users/syao/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/syao/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/syao/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/syao/.m2/repository/org/objenesis/objenesis/1.2/objenesis-1.2.jar:/Users/syao/.m2/repository/com/twitter/chill-java/0.3.6/chill-java-0.3.6.jar:/Users/syao/.m2/repository/commons-net/commons-net/2.2/commons-net-2.2.jar:/Users/syao/.m2/repository/org/spark-project/akka/akka-remote_2.10/2.2.3-shaded-protobuf/akka-remote_2.10-2.2.3-shaded-protobuf.jar:/Users/syao/.m2/repository/org/spark-project/akka/akka-actor_2.10/2.2.3-shaded-protobuf/akka-actor_2.10-2.2.3-shaded-protobuf.jar:/Users/syao/.m2/repository/com/typesafe/config/1.0.2/config-1.0.2.jar:/Users/syao/.m2/repository/io/netty/netty/3.6.6.Final/netty-3.6.6.Final.jar:/Users/syao/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar:/Users/syao/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/Users/syao/.m2/repository/org/spark-project/
akka/akka-slf4j_2.10/2.2.3-shaded-protobuf/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar:/Users/syao/.m2/repository/org/json4s/json4s-jackson_2.10/3.2.6/json4s-jackson_2.10-3.2.6.jar:/Users/syao/.m2/repository/org/json4s/json4s-core_2.10/3.2.6/json4s-core_2.10-3.2.6.jar:/Users/syao/.m2/repository/org/json4s/json4s-ast_2.10/3.2.6/json4s-ast_2.10-3.2.6.jar:/Users/syao/.m2/repository/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar:/Users/syao/.m2/repository/org/scala-lang/scalap/2.10.4/scalap-2.10.4.jar:/Users/syao/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar:/Users/syao/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.0/jackson-databind-2.3.0.jar:/Users/syao/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.3.0/jackson-annotations-2.3.0.jar:/Users/syao/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.3.0/jackson-core-2.3.0.jar:/Users/syao/.m2/repository/colt/colt/1.2.0/colt-1.2.0.jar:/Users/syao/.m2/repository/concurrent/concurrent/1.3.4/concurrent-1.3.4.jar:/Users/syao/.m2/repository/org/apache/mesos/mesos/0.18.1/mesos-0.18.1-shaded-protobuf.jar:/Users/syao/.m2/repository/io/netty/netty-all/4.0.17.Final/netty-all-4.0.17.Final.jar:/Users/syao/.m2/repository/com/clearspring/analytics/stream/2.5.1/stream-2.5.1.jar:/Users/syao/.m2/repository/com/codahale/metrics/metrics-core/3.0.0/metrics-core-3.0.0.jar:/Users/syao/.m2/repository/com/codahale/metrics/metrics-jvm/3.0.0/metrics-jvm-3.0.0.jar:/Users/syao/.m2/repository/com/codahale/metrics/metrics-json/3.0.0/metrics-json-3.0.0.jar:/Users/syao/.m2/repository/com/codahale/metrics/metrics-graphite/3.0.0/metrics-graphite-3.0.0.jar:/Users/syao/.m2/repository/org/tachyonproject/tachyon/0.4.1-thrift/tachyon-0.4.1-thrift.jar:/Users/syao/.m2/repository/org/apache/ant/ant/1.9.0/ant-1.9.0.jar:/Users/syao/.m2/repository/org/apache/ant/ant-launcher/1.9.0/ant-launcher-1.9.0.jar:/Users/syao/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/syao/.m2/repository/org/scala-lang/scala-reflect/2.10.4/scala-reflect-2.10.4.jar:/Users/syao/.m2/repository/org/spark-project/pyrolite/2.0.1/pyrolite-2.0.1.jar:/Users/syao/.m2/repository/net/sf/py4j/py4j/0.8.1/py4j-0.8.1.jar
> > > -deprecation -feature -bootclasspath
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/lib/JObjC.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home/jre/classes:/Users/syao/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
> > > -unchecked -language:postfixOps
> > > [error]
> > > [error]   last tree to typer: Ident(Server)
> > > [error]               symbol: <none> (flags: )
> > > [error]    symbol definition: <none>
> > > [error]        symbol owners:
> > > [error]       context owners: variable server -> class HttpServer ->
> > > package spark
> > > [error]
> > > [error] == Enclosing template or block ==
> > > [error]
> > > [error] Template( // val <local HttpServer>: <notype> in class
> HttpServer
> > > [error]   "org.apache.spark.Logging" // parents
> > > [error]   ValDef(
> > > [error]     private
> > > [error]     "_"
> > > [error]     <tpt>
> > > [error]     <empty>
> > > [error]   )
> > > [error]   // 9 statements
> > > [error]   ValDef( // private[this] val resourceBase: <?> in class
> > HttpServer
> > > [error]     private <local> <paramaccessor>
> > > [error]     "resourceBase"
> > > [error]     "File"
> > > [error]     <empty>
> > > [error]   )
> > > [error]   ValDef( // private[this] val securityManager: <?> in class
> > > HttpServer
> > > [error]     private <local> <paramaccessor>
> > > [error]     "securityManager"
> > > [error]     "SecurityManager"
> > > [error]     <empty>
> > > [error]   )
> > > [error]   DefDef( // def <init>(resourceBase:
> > java.io.File,securityManager:
> > > org.apache.spark.SecurityManager): org.apache.spark.HttpServer in class
> > > HttpServer
> > > [error]     <method> <triedcooking>
> > > [error]     "<init>"
> > > [error]     []
> > > [error]     // 1 parameter list
> > > [error]     ValDef( // resourceBase: java.io.File
> > > [error]       <param> <paramaccessor>
> > > [error]       "resourceBase"
> > > [error]       "File"
> > > [error]       <empty>
> > > [error]     )
> > > [error]     ValDef( // securityManager:
> org.apache.spark.SecurityManager
> > > [error]       <param> <paramaccessor>
> > > [error]       "securityManager"
> > > [error]       "SecurityManager" // private[package spark] class
> > > SecurityManager extends Logging in package spark,
> > > tree.tpe=org.apache.spark.SecurityManager
> > > [error]       <empty>
> > > [error]     )
> > > [error]     <tpt> // tree.tpe=org.apache.spark.HttpServer
> > > [error]     Block(
> > > [error]       Apply(
> > > [error]         super."<init>"
> > > [error]         Nil
> > > [error]       )
> > > [error]       ()
> > > [error]     )
> > > [error]   )
> > > [error]   ValDef( // private[this] var server: <?> in class HttpServer
> > > [error]     private <mutable> <local>
> > > [error]     "server"
> > > [error]     "Server"
> > > [error]     null
> > > [error]   )
> > > [error]   ValDef( // private[this] var port: <?> in class HttpServer
> > > [error]     private <mutable> <local>
> > > [error]     "port"
> > > [error]     "Int"
> > > [error]     -1
> > > [error]   )
> > > [error]   DefDef( // def start(): Unit in class HttpServer
> > > [error]     <method> <triedcooking>
> > > [error]     "start"
> > > [error]     []
> > > [error]     List(Nil)
> > > [error]     "scala"."Unit" // final abstract class Unit extends AnyVal
> in
> > > package scala, tree.tpe=Unit
> > > [error]     If(
> > > [error]       Apply(
> > > [error]         "server"."$bang$eq"
> > > [error]         null
> > > [error]       )
> > > [error]       Throw(
> > > [error]         Apply(
> > > [error]           new ServerStateException."<init>"
> > > [error]           "Server is already started"
> > > [error]         )
> > > [error]       )
> > > [error]       Block(
> > > [error]         // 16 statements
> > > [error]         Apply(
> > > [error]           "logInfo"
> > > [error]           "Starting HTTP Server"
> > > [error]         )
> > > [error]         Assign(
> > > [error]           "server"
> > > [error]           Apply(
> > > [error]             new Server."<init>"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]         ValDef(
> > > [error]           0
> > > [error]           "connector"
> > > [error]           <tpt>
> > > [error]           Apply(
> > > [error]             new SocketConnector."<init>"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "connector"."setMaxIdleTime"
> > > [error]           Apply(
> > > [error]             60."$times"
> > > [error]             1000
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "connector"."setSoLingerTime"
> > > [error]           -1
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "connector"."setPort"
> > > [error]           0
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "server"."addConnector"
> > > [error]           "connector"
> > > [error]         )
> > > [error]         ValDef(
> > > [error]           0
> > > [error]           "threadPool"
> > > [error]           <tpt>
> > > [error]           Apply(
> > > [error]             new QueuedThreadPool."<init>"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "threadPool"."setDaemon"
> > > [error]           true
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "server"."setThreadPool"
> > > [error]           "threadPool"
> > > [error]         )
> > > [error]         ValDef(
> > > [error]           0
> > > [error]           "resHandler"
> > > [error]           <tpt>
> > > [error]           Apply(
> > > [error]             new ResourceHandler."<init>"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "resHandler"."setResourceBase"
> > > [error]           "resourceBase"."getAbsolutePath"
> > > [error]         )
> > > [error]         ValDef(
> > > [error]           0
> > > [error]           "handlerList"
> > > [error]           <tpt>
> > > [error]           Apply(
> > > [error]             new HandlerList."<init>"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "handlerList"."setHandlers"
> > > [error]           Apply(
> > > [error]             "Array"
> > > [error]             // 2 arguments
> > > [error]             "resHandler"
> > > [error]             Apply(
> > > [error]               new DefaultHandler."<init>"
> > > [error]               Nil
> > > [error]             )
> > > [error]           )
> > > [error]         )
> > > [error]         If(
> > > [error]           Apply(
> > > [error]             "securityManager"."isAuthenticationEnabled"
> > > [error]             Nil
> > > [error]           )
> > > [error]           Block(
> > > [error]             // 3 statements
> > > [error]             Apply(
> > > [error]               "logDebug"
> > > [error]               "HttpServer is using security"
> > > [error]             )
> > > [error]             ValDef(
> > > [error]               0
> > > [error]               "sh"
> > > [error]               <tpt>
> > > [error]               Apply(
> > > [error]                 "setupSecurityHandler"
> > > [error]                 "securityManager"
> > > [error]               )
> > > [error]             )
> > > [error]             Apply(
> > > [error]               "sh"."setHandler"
> > > [error]               "handlerList"
> > > [error]             )
> > > [error]             Apply(
> > > [error]               "server"."setHandler"
> > > [error]               "sh"
> > > [error]             )
> > > [error]           )
> > > [error]           Block(
> > > [error]             Apply(
> > > [error]               "logDebug"
> > > [error]               "HttpServer is not using security"
> > > [error]             )
> > > [error]             Apply(
> > > [error]               "server"."setHandler"
> > > [error]               "handlerList"
> > > [error]             )
> > > [error]           )
> > > [error]         )
> > > [error]         Apply(
> > > [error]           "server"."start"
> > > [error]           Nil
> > > [error]         )
> > > [error]         Assign(
> > > [error]           "port"
> > > [error]           Apply(
> > > [error]             server.getConnectors()(0)."getLocalPort"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]       )
> > > [error]     )
> > > [error]   )
> > > [error]   DefDef( // private def setupSecurityHandler: <?> in class
> > > HttpServer
> > > [error]     <method> private
> > > [error]     "setupSecurityHandler"
> > > [error]     []
> > > [error]     // 1 parameter list
> > > [error]     ValDef(
> > > [error]       <param>
> > > [error]       "securityMgr"
> > > [error]       "SecurityManager"
> > > [error]       <empty>
> > > [error]     )
> > > [error]     "ConstraintSecurityHandler"
> > > [error]     Block(
> > > [error]       // 16 statements
> > > [error]       ValDef(
> > > [error]         0
> > > [error]         "constraint"
> > > [error]         <tpt>
> > > [error]         Apply(
> > > [error]           new Constraint."<init>"
> > > [error]           Nil
> > > [error]         )
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "constraint"."setName"
> > > [error]         "Constraint"."__DIGEST_AUTH"
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "constraint"."setRoles"
> > > [error]         Apply(
> > > [error]           "Array"
> > > [error]           "user"
> > > [error]         )
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "constraint"."setAuthenticate"
> > > [error]         true
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "constraint"."setDataConstraint"
> > > [error]         "Constraint"."DC_NONE"
> > > [error]       )
> > > [error]       ValDef(
> > > [error]         0
> > > [error]         "cm"
> > > [error]         <tpt>
> > > [error]         Apply(
> > > [error]           new ConstraintMapping."<init>"
> > > [error]           Nil
> > > [error]         )
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "cm"."setConstraint"
> > > [error]         "constraint"
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "cm"."setPathSpec"
> > > [error]         "/*"
> > > [error]       )
> > > [error]       ValDef(
> > > [error]         0
> > > [error]         "sh"
> > > [error]         <tpt>
> > > [error]         Apply(
> > > [error]           new ConstraintSecurityHandler."<init>"
> > > [error]           Nil
> > > [error]         )
> > > [error]       )
> > > [error]       ValDef(
> > > [error]         0
> > > [error]         "hashLogin"
> > > [error]         <tpt>
> > > [error]         Apply(
> > > [error]           new HashLoginService."<init>"
> > > [error]           Nil
> > > [error]         )
> > > [error]       )
> > > [error]       ValDef(
> > > [error]         0
> > > [error]         "userCred"
> > > [error]         <tpt>
> > > [error]         Apply(
> > > [error]           new Password."<init>"
> > > [error]           Apply(
> > > [error]             "securityMgr"."getSecretKey"
> > > [error]             Nil
> > > [error]           )
> > > [error]         )
> > > [error]       )
> > > [error]       If(
> > > [error]         Apply(
> > > [error]           "userCred"."$eq$eq"
> > > [error]           null
> > > [error]         )
> > > [error]         Throw(
> > > [error]           Apply(
> > > [error]             new Exception."<init>"
> > > [error]             "Error: secret key is null with authentication on"
> > > [error]           )
> > > [error]         )
> > > [error]         ()
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "hashLogin"."putUser"
> > > [error]         // 3 arguments
> > > [error]         Apply(
> > > [error]           "securityMgr"."getHttpUser"
> > > [error]           Nil
> > > [error]         )
> > > [error]         "userCred"
> > > [error]         Apply(
> > > [error]           "Array"
> > > [error]           "user"
> > > [error]         )
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "sh"."setLoginService"
> > > [error]         "hashLogin"
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "sh"."setAuthenticator"
> > > [error]         Apply(
> > > [error]           new DigestAuthenticator."<init>"
> > > [error]           Nil
> > > [error]         )
> > > [error]       )
> > > [error]       Apply(
> > > [error]         "sh"."setConstraintMappings"
> > > [error]         Apply(
> > > [error]           "Array"
> > > [error]           "cm"
> > > [error]         )
> > > [error]       )
> > > [error]       "sh"
> > > [error]     )
> > > [error]   )
> > > [error]   DefDef( // def stop(): Unit in class HttpServer
> > > [error]     <method> <triedcooking>
> > > [error]     "stop"
> > > [error]     []
> > > [error]     List(Nil)
> > > [error]     "scala"."Unit" // final abstract class Unit extends AnyVal
> in
> > > package scala, tree.tpe=Unit
> > > [error]     If(
> > > [error]       Apply(
> > > [error]         "server"."$eq$eq"
> > > [error]         null
> > > [error]       )
> > > [error]       Throw(
> > > [error]         Apply(
> > > [error]           new ServerStateException."<init>"
> > > [error]           "Server is already stopped"
> > > [error]         )
> > > [error]       )
> > > [error]       Block(
> > > [error]         // 2 statements
> > > [error]         Apply(
> > > [error]           "server"."stop"
> > > [error]           Nil
> > > [error]         )
> > > [error]         Assign(
> > > [error]           "port"
> > > [error]           -1
> > > [error]         )
> > > [error]         Assign(
> > > [error]           "server"
> > > [error]           null
> > > [error]         )
> > > [error]       )
> > > [error]     )
> > > [error]   )
> > > [error]   DefDef( // def uri: String in class HttpServer
> > > [error]     <method> <triedcooking>
> > > [error]     "uri"
> > > [error]     []
> > > [error]     Nil
> > > [error]     "String"
> > > [error]     If(
> > > [error]       Apply(
> > > [error]         "server"."$eq$eq"
> > > [error]         null
> > > [error]       )
> > > [error]       Throw(
> > > [error]         Apply(
> > > [error]           new ServerStateException."<init>"
> > > [error]           "Server is not started"
> > > [error]         )
> > > [error]       )
> > > [error]       Return(
> > > [error]         Apply(
> > > [error]           "http://
> > ".$plus(Utils.localIpAddress).$plus(":")."$plus"
> > > [error]           "port"
> > > [error]         )
> > > [error]       )
> > > [error]     )
> > > [error]   )
> > > [error] )
> > > [error]
> > > [error] uncaught exception during compilation: java.lang.AssertionError
> > > java.lang.AssertionError: assertion failed:
> > javax.servlet.ServletException
> > >     at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1212)
> > >     at
> > scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1374)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.parseExceptions$1(ClassfileParser.scala:1051)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.scala$tools$nsc$symtab$classfile$ClassfileParser$$parseAttribute$1(ClassfileParser.scala:920)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.parseAttributes(ClassfileParser.scala:1080)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.parseMethod(ClassfileParser.scala:666)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.scala$tools$nsc$symtab$classfile$ClassfileParser$$queueLoad$1(ClassfileParser.scala:557)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser$$anonfun$parseClass$1.apply$mcV$sp(ClassfileParser.scala:567)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.parseClass(ClassfileParser.scala:572)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.classfile.ClassfileParser.parse(ClassfileParser.scala:88)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.SymbolLoaders$ClassfileLoader.doComplete(SymbolLoaders.scala:261)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:194)
> > >     at
> > >
> >
> scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.load(SymbolLoaders.scala:210)
> > >     at scala.reflect.internal.Symbols$Symbol.exists(Symbols.scala:893)
> > >     at
> > >
> scala.tools.nsc.typechecker.Typers$Typer.typedIdent$2(Typers.scala:5064)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer.typedIdentOrWildcard$1(Typers.scala:5218)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5561)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5642)
> > >     at
> > scala.tools.nsc.typechecker.Typers$Typer.typedType(Typers.scala:5769)
> > >     at
> > scala.tools.nsc.typechecker.Typers$Typer.typedType(Typers.scala:5772)
> > >     at
> > scala.tools.nsc.typechecker.Namers$Namer.valDefSig(Namers.scala:1317)
> > >     at
> > scala.tools.nsc.typechecker.Namers$Namer.getSig$1(Namers.scala:1457)
> > >     at
> > scala.tools.nsc.typechecker.Namers$Namer.typeSig(Namers.scala:1466)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply$mcV$sp(Namers.scala:731)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:730)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:730)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer.scala$tools$nsc$typechecker$Namers$Namer$$logAndValidate(Namers.scala:1499)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:730)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:729)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$$anon$1.completeImpl(Namers.scala:1614)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Namers$LockingTypeCompleter$class.complete(Namers.scala:1622)
> > >     at
> > > scala.tools.nsc.typechecker.Namers$$anon$1.complete(Namers.scala:1612)
> > >     at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
> > >     at
> > scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1374)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.MethodSynthesis$MethodSynth$class.addDerivedTrees(MethodSynthesis.scala:225)
> > >     at
> > >
> scala.tools.nsc.typechecker.Namers$Namer.addDerivedTrees(Namers.scala:55)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$32.apply(Typers.scala:1917)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$32.apply(Typers.scala:1917)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$rewrappingWrapperTrees$1.apply(Typers.scala:1856)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$rewrappingWrapperTrees$1.apply(Typers.scala:1853)
> > >     at
> > >
> >
> scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
> > >     at
> > >
> >
> scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
> > >     at scala.collection.immutable.List.foreach(List.scala:318)
> > >     at
> > >
> scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
> > >     at
> > scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
> > >     at
> > >
> scala.tools.nsc.typechecker.Typers$Typer.typedTemplate(Typers.scala:1917)
> > >     at
> > >
> scala.tools.nsc.typechecker.Typers$Typer.typedClassDef(Typers.scala:1759)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5583)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5642)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:2928)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$61.apply(Typers.scala:3032)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer$$anonfun$61.apply(Typers.scala:3032)
> > >     at scala.collection.immutable.List.loop$1(List.scala:170)
> > >     at scala.collection.immutable.List.mapConserve(List.scala:186)
> > >     at
> > > scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3032)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5301)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5587)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5642)
> > >     at
> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5704)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:99)
> > >     at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:464)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:91)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:91)
> > >     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> > >     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> > >     at
> > >
> >
> scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:91)
> > >     at
> scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
> > >     at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
> > >     at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
> > >     at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
> > >     at xsbt.CachedCompiler0.run(CompilerInterface.scala:123)
> > >     at xsbt.CachedCompiler0.run(CompilerInterface.scala:99)
> > >     at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >     at
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >     at java.lang.reflect.Method.invoke(Method.java:606)
> > >     at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
> > >     at
> sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> > >     at
> sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
> > >     at
> > >
> >
> sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
> > >     at
> > > sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
> > >     at
> > > sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
> > >     at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
> > >     at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
> > >     at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
> > >     at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
> > >     at sbt.inc.Incremental$.compile(Incremental.scala:37)
> > >     at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
> > >     at
> > sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
> > >     at
> > sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
> > >     at com.typesafe.zinc.Compiler.compile(Compiler.scala:184)
> > >     at com.typesafe.zinc.Main$.run(Main.scala:98)
> > >     at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:93)
> > >     at com.typesafe.zinc.Nailgun$.nailMain(Nailgun.scala:82)
> > >     at com.typesafe.zinc.Nailgun.nailMain(Nailgun.scala)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >     at
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >     at java.lang.reflect.Method.invoke(Method.java:606)
> > >     at com.martiansoftware.nailgun.NGSession.run(NGSession.java:280)
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] Reactor Summary:
> > > [INFO]
> > > [INFO] Spark Project Parent POM ........................... SUCCESS [
> > > 1.760 s]
> > > [INFO] Spark Project Core ................................. FAILURE [
> > > 5.312 s]
> > > [INFO] Spark Project Bagel ................................ SKIPPED
> > > [INFO] Spark Project GraphX ............................... SKIPPED
> > > [INFO] Spark Project ML Library ........................... SKIPPED
> > > [INFO] Spark Project Streaming ............................ SKIPPED
> > > [INFO] Spark Project Tools ................................ SKIPPED
> > > [INFO] Spark Project Catalyst ............................. SKIPPED
> > > [INFO] Spark Project SQL .................................. SKIPPED
> > > [INFO] Spark Project Hive ................................. SKIPPED
> > > [INFO] Spark Project REPL ................................. SKIPPED
> > > [INFO] Spark Project YARN Parent POM ...................... SKIPPED
> > > [INFO] Spark Project YARN Stable API ...................... SKIPPED
> > > [INFO] Spark Project Assembly ............................. SKIPPED
> > > [INFO] Spark Project External Twitter ..................... SKIPPED
> > > [INFO] Spark Project External Kafka ....................... SKIPPED
> > > [INFO] Spark Project External Flume ....................... SKIPPED
> > > [INFO] Spark Project External ZeroMQ ...................... SKIPPED
> > > [INFO] Spark Project External MQTT ........................ SKIPPED
> > > [INFO] Spark Project Examples ............................. SKIPPED
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] BUILD FAILURE
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [INFO] Total time: 7.562 s
> > > [INFO] Finished at: 2014-07-30T21:18:41-07:00
> > > [INFO] Final Memory: 39M/713M
> > > [INFO]
> > >
> ------------------------------------------------------------------------
> > > [ERROR] Failed to execute goal
> > > net.alchim31.maven:scala-maven-plugin:3.1.6:compile
> (scala-compile-first)
> > > on project spark-core_2.10: Execution scala-compile-first of goal
> > > net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed.
> CompileFailed
> > > -> [Help 1]
> > >
> > > Thanks
> > > Shengzhe
> >
>
