By the way - we can report issues to the Scala/Typesafe team if we
have a way to reproduce this. I just haven't found a reliable
reproduction yet.
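
In the meantime, the clean-rebuild workaround discussed below boils down to something like this (a sketch only - it assumes an sbt launcher on the PATH and a Spark checkout; the SPARK_HOME location is an assumption, not something from this thread):

```shell
# Clean-rebuild workaround for the spark-sql compiler crash described
# in this thread. SPARK_HOME below is a hypothetical default, not a
# path anyone in the thread actually used.
SPARK_HOME="${SPARK_HOME:-$HOME/spark}"
if [ -d "$SPARK_HOME" ]; then
  cd "$SPARK_HOME"
  # Wipe stale build products (suspected trigger: mixing SBT and IDEA
  # builds), then rebuild only the spark-sql project from scratch.
  sbt clean sql/compile
fi
```

A crash that still reproduces after a clean like this would be the one worth handing to the Scala team.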

- Patrick

On Sun, Nov 2, 2014 at 7:48 PM, Stephen Boesch <java...@gmail.com> wrote:
> Yes, I have seen this same error - and for team members as well - repeatedly
> since June.  As Patrick and Cheng mentioned, the next step is to do an sbt
> clean.
>
> 2014-11-02 19:37 GMT-08:00 Cheng Lian <lian.cs....@gmail.com>:
>
>> I often see this when I first build the whole Spark project with SBT, then
>> modify some code and try to build and debug within IDEA, or vice versa.  A
>> clean rebuild can always solve this.
>>
>> On Mon, Nov 3, 2014 at 11:28 AM, Patrick Wendell <pwend...@gmail.com>
>> wrote:
>>
>> > Does this happen if you clean and recompile? I've seen failures on and
>> > off, but haven't been able to find one that I could reproduce from a
>> > clean build such that we could hand it to the scala team.
>> >
>> > - Patrick
>> >
>> > On Sun, Nov 2, 2014 at 7:25 PM, Imran Rashid <im...@therashids.com>
>> > wrote:
>> > > I'm finding the Scala compiler crashes when I compile the spark-sql
>> > > project in sbt.  This happens in both the 1.1 branch and master (full
>> > > error below).  The other projects build fine in sbt, and everything
>> > > builds fine in maven.  Is there some sbt option I'm forgetting?  Anyone
>> > > else experiencing this?
>> > >
>> > > Also, are there up-to-date instructions on how to do common dev tasks
>> > > in both sbt & maven?  I have only found these instructions on building
>> > > with maven:
>> > >
>> > > http://spark.apache.org/docs/latest/building-with-maven.html
>> > >
>> > > and some general info here:
>> > >
>> > > https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
>> > >
>> > > but I think this doesn't walk through a lot of the steps of a typical
>> > > dev cycle, e.g., continuous compilation, running one test, running one
>> > > main class, etc.  (especially since it seems like people still favor
>> > > sbt for dev.)  If it doesn't already exist somewhere, I could try to
>> > > put together a brief doc for how to do the basics.  (I'm returning to
>> > > spark dev after a little hiatus myself, and I'm hitting some stumbling
>> > > blocks that are probably common knowledge to everyone still dealing
>> > > with it all the time.)
>> > >
>> > > thanks,
>> > > Imran
>> > >
>> > > ------------------------------
>> > > full crash info from sbt:
>> > >
>> > >> project sql
>> > > [info] Set current project to spark-sql (in build
>> > > file:/Users/imran/spark/spark/)
>> > >> compile
>> > > [info] Compiling 62 Scala sources to
>> > > /Users/imran/spark/spark/sql/catalyst/target/scala-2.10/classes...
>> > > [info] Compiling 45 Scala sources and 39 Java sources to
>> > > /Users/imran/spark/spark/sql/core/target/scala-2.10/classes...
>> > > [error]
>> > > [error]      while compiling:
>> > > /Users/imran/spark/spark/sql/core/src/main/scala/org/apache/spark/sql/types/util/DataTypeConversions.scala
>> > > [error]         during phase: jvm
>> > > [error]      library version: version 2.10.4
>> > > [error]     compiler version: version 2.10.4
>> > > [error]   reconstructed args: -classpath
>> > > /Users/imran/spark/spark/sql/core/target/scala-2.10/classes:/Users/imran/spark/spark/core/target/scala-2.10/classes:/Users/imran/spark/spark/sql/catalyst/target/scala-2.10/classes:/Users/imran/spark/spark/lib_managed/jars/hadoop-client-1.0.4.jar:/Users/imran/spark/spark/lib_managed/jars/hadoop-core-1.0.4.jar:/Users/imran/spark/spark/lib_managed/jars/xmlenc-0.52.jar:/Users/imran/spark/spark/lib_managed/jars/commons-math-2.1.jar:/Users/imran/spark/spark/lib_managed/jars/commons-configuration-1.6.jar:/Users/imran/spark/spark/lib_managed/jars/commons-collections-3.2.1.jar:/Users/imran/spark/spark/lib_managed/jars/commons-lang-2.4.jar:/Users/imran/spark/spark/lib_managed/jars/commons-logging-1.1.1.jar:/Users/imran/spark/spark/lib_managed/jars/commons-digester-1.8.jar:/Users/imran/spark/spark/lib_managed/jars/commons-beanutils-1.7.0.jar:/Users/imran/spark/spark/lib_managed/jars/commons-beanutils-core-1.8.0.jar:/Users/imran/spark/spark/lib_managed/jars/commons-net-2.2.jar:/Users/imran/spark/spark/lib_managed/jars/commons-el-1.0.jar:/Users/imran/spark/spark/lib_managed/jars/hsqldb-1.8.0.10.jar:/Users/imran/spark/spark/lib_managed/jars/oro-2.0.8.jar:/Users/imran/spark/spark/lib_managed/jars/jets3t-0.7.1.jar:/Users/imran/spark/spark/lib_managed/jars/commons-httpclient-3.1.jar:/Users/imran/spark/spark/lib_managed/bundles/curator-recipes-2.4.0.jar:/Users/imran/spark/spark/lib_managed/bundles/curator-framework-2.4.0.jar:/Users/imran/spark/spark/lib_managed/bundles/curator-client-2.4.0.jar:/Users/imran/spark/spark/lib_managed/jars/zookeeper-3.4.5.jar:/Users/imran/spark/spark/lib_managed/jars/slf4j-log4j12-1.7.5.jar:/Users/imran/spark/spark/lib_managed/bundles/log4j-1.2.17.jar:/Users/imran/spark/spark/lib_managed/jars/jline-0.9.94.jar:/Users/imran/spark/spark/lib_managed/bundles/guava-14.0.1.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-plus-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/orbits/javax.transaction-1.1.1.v201105210645.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-webapp-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-xml-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-util-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-servlet-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-security-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-server-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/orbits/javax.servlet-3.0.0.v201112011016.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-continuation-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-http-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-io-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/jars/jetty-jndi-8.1.14.v20131031.jar:/Users/imran/spark/spark/lib_managed/orbits/javax.mail.glassfish-1.4.1.v201005082020.jar:/Users/imran/spark/spark/lib_managed/orbits/javax.activation-1.1.0.v201105071233.jar:/Users/imran/spark/spark/lib_managed/jars/commons-lang3-3.3.2.jar:/Users/imran/spark/spark/lib_managed/jars/jsr305-1.3.9.jar:/Users/imran/spark/spark/lib_managed/jars/slf4j-api-1.7.5.jar:/Users/imran/spark/spark/lib_managed/jars/jul-to-slf4j-1.7.5.jar:/Users/imran/spark/spark/lib_managed/jars/jcl-over-slf4j-1.7.5.jar:/Users/imran/spark/spark/lib_managed/bundles/compress-lzf-1.0.0.jar:/Users/imran/spark/spark/lib_managed/bundles/snappy-java-1.0.5.3.jar:/Users/imran/spark/spark/lib_managed/jars/lz4-1.2.0.jar:/Users/imran/spark/spark/lib_managed/jars/chill_2.10-0.3.6.jar:/Users/imran/spark/spark/lib_managed/jars/chill-java-0.3.6.jar:/Users/imran/spark/spark/lib_managed/bundles/kryo-2.21.jar:/Users/imran/spark/spark/lib_managed/jars/reflectasm-1.07-shaded.jar:/Users/imran/spark/spark/lib_managed/jars/minlog-1.2.jar:/Users/imran/spark/spark/lib_managed/jars/objenesis-1.2.jar:/Users/imran/spark/spark/lib_managed/bundles/akka-remote_2.10-2.2.3-shaded-protobuf.jar:/Users/imran/spark/spark/lib_managed/jars/akka-actor_2.10-2.2.3-shaded-protobuf.jar:/Users/imran/spark/spark/lib_managed/bundles/config-1.0.2.jar:/Users/imran/spark/spark/lib_managed/bundles/netty-3.6.6.Final.jar:/Users/imran/spark/spark/lib_managed/jars/protobuf-java-2.4.1-shaded.jar:/Users/imran/spark/spark/lib_managed/jars/uncommons-maths-1.2.2a.jar:/Users/imran/spark/spark/lib_managed/bundles/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar:/Users/imran/spark/spark/lib_managed/jars/json4s-jackson_2.10-3.2.10.jar:/Users/imran/spark/spark/lib_managed/jars/json4s-core_2.10-3.2.10.jar:/Users/imran/spark/spark/lib_managed/jars/json4s-ast_2.10-3.2.10.jar:/Users/imran/spark/spark/lib_managed/jars/paranamer-2.6.jar:/Users/imran/spark/spark/lib_managed/jars/scalap-2.10.0.jar:/Users/imran/spark/spark/lib_managed/bundles/jackson-databind-2.3.1.jar:/Users/imran/spark/spark/lib_managed/bundles/jackson-annotations-2.3.0.jar:/Users/imran/spark/spark/lib_managed/bundles/jackson-core-2.3.1.jar:/Users/imran/spark/spark/lib_managed/jars/colt-1.2.0.jar:/Users/imran/spark/spark/lib_managed/jars/concurrent-1.3.4.jar:/Users/imran/spark/spark/lib_managed/jars/mesos-0.18.1-shaded-protobuf.jar:/Users/imran/spark/spark/lib_managed/jars/netty-all-4.0.23.Final.jar:/Users/imran/spark/spark/lib_managed/jars/stream-2.7.0.jar:/Users/imran/spark/spark/lib_managed/bundles/metrics-core-3.0.0.jar:/Users/imran/spark/spark/lib_managed/bundles/metrics-jvm-3.0.0.jar:/Users/imran/spark/spark/lib_managed/bundles/metrics-json-3.0.0.jar:/Users/imran/spark/spark/lib_managed/bundles/metrics-graphite-3.0.0.jar:/Users/imran/spark/spark/lib_managed/jars/tachyon-client-0.5.0.jar:/Users/imran/spark/spark/lib_managed/jars/tachyon-0.5.0.jar:/Users/imran/spark/spark/lib_managed/jars/commons-io-2.4.jar:/Users/imran/spark/spark/lib_managed/jars/pyrolite-2.0.1.jar:/Users/imran/spark/spark/lib_managed/jars/py4j-0.8.2.1.jar:/Users/imran/.sbt/boot/scala-2.10.4/lib/scala-compiler.jar:/Users/imran/.sbt/boot/scala-2.10.4/lib/scala-reflect.jar:/Users/imran/spark/spark/lib_managed/jars/quasiquotes_2.10-2.0.1.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-column-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-common-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-encoding-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-generator-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/commons-codec-1.5.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-hadoop-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-format-2.0.0.jar:/Users/imran/spark/spark/lib_managed/jars/parquet-jackson-1.4.3.jar:/Users/imran/spark/spark/lib_managed/jars/jackson-mapper-asl-1.9.11.jar:/Users/imran/spark/spark/lib_managed/jars/jackson-core-asl-1.9.11.jar
>> > > -deprecation -feature
>> > > -P:genjavadoc:out=/Users/imran/spark/spark/sql/core/target/java
>> > > -Xplugin:/Users/imran/spark/spark/lib_managed/jars/genjavadoc-plugin_2.10.4-0.7.jar
>> > > -bootclasspath
>> > > /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/classes:/Users/imran/.sbt/boot/scala-2.10.4/lib/scala-library.jar
>> > > -unchecked -language:postfixOps
>> > > [error]
>> > > [error]   last tree to typer:
>> > > Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
>> > > [error]               symbol: null
>> > > [error]    symbol definition: null
>> > > [error]                  tpe:
>> > > Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
>> > > [error]        symbol owners:
>> > > [error]       context owners: anonymous class
>> > > anonfun$asScalaDataType$1 -> package util
>> > > [error]
>> > > [error] == Enclosing template or block ==
>> > > [error]
>> > > [error] Template( // val <local $anonfun>: <notype>,
>> > > tree.tpe=org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
>> > > [error]   "scala.runtime.AbstractFunction1", "scala.Serializable" // parents
>> > > [error]   ValDef(
>> > > [error]     private
>> > > [error]     "_"
>> > > [error]     <tpt>
>> > > [error]     <empty>
>> > > [error]   )
>> > > [error]   // 3 statements
>> > > [error]   DefDef( // final def apply(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField):
>> > > org.apache.spark.sql.catalyst.types.StructField
>> > > [error]     <method> final <triedcooking>
>> > > [error]     "apply"
>> > > [error]     []
>> > > [error]     // 1 parameter list
>> > > [error]     ValDef( // javaStructField:
>> > > org.apache.spark.sql.api.java.StructField
>> > > [error]       <param> <synthetic> <triedcooking>
>> > > [error]       "javaStructField"
>> > > [error]       <tpt> //
>> > > tree.tpe=org.apache.spark.sql.api.java.StructField
>> > > [error]       <empty>
>> > > [error]     )
>> > > [error]     <tpt> //
>> > > tree.tpe=org.apache.spark.sql.catalyst.types.StructField
>> > > [error]     Apply( // def asScalaStructField(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField):
>> > > org.apache.spark.sql.catalyst.types.StructField in object
>> > > DataTypeConversions,
>> > > tree.tpe=org.apache.spark.sql.catalyst.types.StructField
>> > > [error]       DataTypeConversions.this."asScalaStructField" // def
>> > > asScalaStructField(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField):
>> > > org.apache.spark.sql.catalyst.types.StructField in object
>> > > DataTypeConversions, tree.tpe=(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField)org.apache.spark.sql.catalyst.types.StructField
>> > > [error]       "javaStructField" // javaStructField:
>> > > org.apache.spark.sql.api.java.StructField,
>> > > tree.tpe=org.apache.spark.sql.api.java.StructField
>> > > [error]     )
>> > > [error]   )
>> > > [error]   DefDef( // final def apply(v1: Object): Object
>> > > [error]     <method> final <bridge>
>> > > [error]     "apply"
>> > > [error]     []
>> > > [error]     // 1 parameter list
>> > > [error]     ValDef( // v1: Object
>> > > [error]       <param> <triedcooking>
>> > > [error]       "v1"
>> > > [error]       <tpt> // tree.tpe=Object
>> > > [error]       <empty>
>> > > [error]     )
>> > > [error]     <tpt> // tree.tpe=Object
>> > > [error]     Apply( // final def apply(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField):
>> > > org.apache.spark.sql.catalyst.types.StructField,
>> > > tree.tpe=org.apache.spark.sql.catalyst.types.StructField
>> > > [error]
>> > > DataTypeConversions$$anonfun$asScalaDataType$1.this."apply"
>> > > // final def apply(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField):
>> > > org.apache.spark.sql.catalyst.types.StructField,
>> > > tree.tpe=(javaStructField:
>> > > org.apache.spark.sql.api.java.StructField)org.apache.spark.sql.catalyst.types.StructField
>> > > [error]       Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in
>> > > class Object, tree.tpe=org.apache.spark.sql.api.java.StructField
>> > > [error]         TypeApply( // final def $asInstanceOf[T0 >: ? <: ?]():
>> > > T0
>> > > in class Object, tree.tpe=()org.apache.spark.sql.api.java.StructField
>> > > [error]           "v1"."$asInstanceOf" // final def $asInstanceOf[T0
>> > > >: ?
>> > > <: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
>> > > [error]           <tpt> //
>> > > tree.tpe=org.apache.spark.sql.api.java.StructField
>> > > [error]         )
>> > > [error]         Nil
>> > > [error]       )
>> > > [error]     )
>> > > [error]   )
>> > > [error]   DefDef( // def <init>():
>> > > org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
>> > > [error]     <method> <triedcooking>
>> > > [error]     "<init>"
>> > > [error]     []
>> > > [error]     List(Nil)
>> > > [error]     <tpt> //
>> > > tree.tpe=org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
>> > > [error]     Block( // tree.tpe=Unit
>> > > [error]       Apply( // def <init>(): scala.runtime.AbstractFunction1
>> > > in
>> > > class AbstractFunction1, tree.tpe=scala.runtime.AbstractFunction1
>> > > [error]
>> > > DataTypeConversions$$anonfun$asScalaDataType$1.super."<init>" // def
>> > > <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1,
>> > > tree.tpe=()scala.runtime.AbstractFunction1
>> > > [error]         Nil
>> > > [error]       )
>> > > [error]       ()
>> > > [error]     )
>> > > [error]   )
>> > > [error] )
>> > > [error]
>> > > [error] == Expanded type of tree ==
>> > > [error]
>> > > [error] ConstantType(
>> > > [error]   value =
>> > > Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
>> > > [error] )
>> > > [error]
>> > > [error] uncaught exception during compilation:
>> > > java.lang.AssertionError
>> > > [trace] Stack trace suppressed: run last sql/compile:compile for the
>> > > full
>> > > output.
>> > > [error] (sql/compile:compile) java.lang.AssertionError: assertion
>> > > failed:
>> > > List(object package$DebugNode, object package$DebugNode)
>> > > [error] Total time: 23 s, completed Nov 2, 2014 1:00:37 PM
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: dev-h...@spark.apache.org
>> >
>> >
>
>

