Re: scalac crash when compiling DataTypeConversions.scala

2014-10-27 Thread guoxu1231
Hi Stephen,
I tried it again.
To avoid any profile impact, I executed mvn -DskipTests clean package with
the default Hadoop 1.0.4, then opened IntelliJ IDEA and imported it as a
Maven project, without choosing any profile in the import wizard.
Then I ran Make Project (and Rebuild Project) in IDEA; unfortunately,
DataTypeConversions.scala failed to compile again.

Is there any updated guide for working with IntelliJ IDEA? I'm following the
"Building Spark with Maven" page on the website.
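
Concretely, my steps are roughly the following (the checkout path is just a
placeholder, and the IDEA menu names may vary slightly by version):

    cd ~/workspace/spark                  # local Spark checkout
    mvn -DskipTests clean package         # default profiles, i.e. Hadoop 1.0.4
    # then in IDEA: File > Import Project..., select the top-level pom.xml,
    # leave every profile unchecked, and run Build > Make Project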






Re: scalac crash when compiling DataTypeConversions.scala

2014-10-27 Thread Yana Kadiyska
guoxu1231, I struggled with the IDEA problem for a full week. Same thing --
clean builds under Maven/sbt, but no luck with IDEA. What worked for me was
the solution posted higher up in this thread -- it's an SO post that
basically says to delete all .iml files anywhere under the project
directory.
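
If it helps, the clean-up that SO post describes amounts to something like
this (assuming GNU or BSD find, run from the top of the Spark checkout):

    find . -name '*.iml' -delete    # remove all stale IDEA module files
    # then re-import the project in IDEA from the top-level pom.xml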

Let me know if you can't see this mail and I'll locate the exact SO post.




Re: scalac crash when compiling DataTypeConversions.scala

2014-10-27 Thread Soumya Simanta
You need to change the Scala compiler in IntelliJ from the default to the
"sbt incremental compiler". You can access this setting under Preferences >
Scala.

NOTE: This is supported only for certain versions of the IntelliJ Scala
plugin. See this link for details:

http://blog.jetbrains.com/scala/2014/01/30/try-faster-scala-compiler-in-intellij-idea-13-0-2/



Re: scalac crash when compiling DataTypeConversions.scala

2014-10-26 Thread guoxu1231
Any update?

I encountered the same issue in my environment.

Here are my steps, as usual:
git clone https://github.com/apache/spark
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean
package

The build succeeds with Maven.

Then I import it into IDEA as a Maven project and click Build > Make Project.

Two compile errors are reported; one of them is in DataTypeConversions.scala:

Error:scala:
     while compiling: /opt/Development/spark/sql/core/src/main/scala/org/apache/spark/sql/types/util/DataTypeConversions.scala
        during phase: jvm
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
-Xplugin:/home/shawguo/host/maven_repo/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar
-unchecked -classpath

Re: scalac crash when compiling DataTypeConversions.scala

2014-10-26 Thread Stephen Boesch
Yes, it is necessary to do a mvn clean when encountering this issue.
Typically you will have changed one or more of the profiles/options, which
is what leads to this occurring.




Re: scalac crash when compiling DataTypeConversions.scala

2014-10-26 Thread Ryan Williams
I heard from one person offline who regularly builds Spark on OSX and Linux
and they felt like they only ever saw this error on OSX; if anyone can
confirm whether they've seen it on Linux, that would be good to know.

Stephen: good to know re: profiles/options. I don't think changing them is
a necessary condition as I believe I've run into it without doing that, but
any set of steps to reproduce this would be welcome so that we could
escalate to Typesafe as appropriate.





Re: scalac crash when compiling DataTypeConversions.scala

2014-10-26 Thread Stephen Boesch
I see the errors regularly on Linux, under the condition of having changed
profiles.



Re: scalac crash when compiling DataTypeConversions.scala

2014-10-23 Thread Patrick Wendell
Hey Ryan,

I've found that filing issues with the Scala/Typesafe JIRA is pretty
helpful if the issue can be fully reproduced, and even sometimes
helpful if it can't. You can file bugs here:

https://issues.scala-lang.org/secure/Dashboard.jspa

The Spark SQL code in particular is typically the source of these, as
we use fancier Scala features there. In a pinch it is also possible to
recompile and test code without building SQL if you just run the tests for
a specific module (e.g. streaming). In sbt this sort of just works:

sbt/sbt streaming/test-only a.b.c*

In Maven it's more clunky, but if you do a mvn install first then (I
think) you can test sub-modules independently:

mvn test -pl streaming ...
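
For example, something along these lines (the test pattern and exact flags
are just illustrative):

    ./sbt/sbt "streaming/test-only org.apache.spark.streaming.*"
    # or, with Maven:
    mvn -DskipTests install       # once, to install all modules locally
    mvn -pl streaming test        # then run only the streaming module's tests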

- Patrick




scalac crash when compiling DataTypeConversions.scala

2014-10-22 Thread Ryan Williams
I started building Spark / running Spark tests this weekend and on maybe
5-10 occasions have run into a compiler crash while compiling
DataTypeConversions.scala.

Here is a full gist of an innocuous test command (mvn test
-Dsuites='*KafkaStreamSuite') exhibiting this behavior:
https://gist.github.com/ryan-williams/7673d7da928570907f4d

The problem starts at L512
(https://gist.github.com/ryan-williams/7673d7da928570907f4d#file-stdout-L512)
and there's a final stack trace at the bottom
(https://gist.github.com/ryan-williams/7673d7da928570907f4d#file-stdout-L671).

mvn clean or ./sbt/sbt clean "fix" it (I believe I've observed the issue
while compiling with each tool), but both are annoying/time-consuming to do,
obvs, and it's happening pretty frequently for me when doing only small
numbers of incremental compiles punctuated by e.g. checking out different
git commits.

Have other people seen this? This post on this list is basically the same
error, but in TestSQLContext.scala:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-td10532.html
And this SO post claims to be hitting it when trying to build in IntelliJ:
http://stackoverflow.com/questions/25211071/compilation-errors-in-spark-datatypeconversions-scala-on-intellij-when-using-m

It seems likely to be a bug in scalac; would finding a consistent repro
case and filing it somewhere be useful?
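
For what it's worth, the kind of sequence that seems to trigger it for me
looks roughly like this (the branch name is just a placeholder):

    mvn -DskipTests clean package            # full build from a clean tree succeeds
    git checkout some-other-branch           # switch commits without cleaning
    mvn test -Dsuites='*KafkaStreamSuite'    # incremental compile; scalac crashes
                                             # in DataTypeConversions.scala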