Hi Stephen,
I tried it again. To avoid any profile impact, I execute mvn -DskipTests
clean package with Hadoop 1.0.4 by default, then open IDEA and import it as
a Maven project, without choosing any profile in the import wizard.
Then I run Make Project or Rebuild Project in IDEA; unfortunately, the same
issue appears.
guoxu1231, I struggled with the IDEA problem for a full week. Same thing --
clean builds under Maven/sbt, but no luck with IDEA. What worked for me was
the solution posted higher up in this thread -- it's a Stack Overflow post
that basically says to delete all .iml files anywhere under the project
directory.
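For reference, one way to do that deletion from the project root (a sketch assuming a POSIX shell with find available; the Stack Overflow post itself may phrase it differently):

```shell
# Delete every IntelliJ module file (*.iml) under the current directory.
# Run this from the Spark project root, then re-import the Maven project
# in IDEA so it regenerates fresh module files from the POMs.
find . -type f -name "*.iml" -delete
```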
You need to change the Scala compiler from IntelliJ to “sbt incremental
compiler” (see the screenshot below).
You can access this by going to “Preferences” > “Scala”.
NOTE: This is supported only for certain versions of the IntelliJ Scala
plugin. See this link for details.
Any update?
I encountered the same issue in my environment. Here are my usual steps:
git clone https://github.com/apache/spark
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean
package
It builds successfully with Maven. I then import it into IDEA as a Maven
project and click Build > Make.
Yes, it is necessary to do a mvn clean when encountering this issue.
Typically you would have changed one or more of the profiles/options, which
leads to this occurring.
2014-10-22 22:00 GMT-07:00 Ryan Williams ryan.blake.willi...@gmail.com:
I started building Spark / running Spark tests this
I heard from one person offline who regularly builds Spark on OSX and Linux
and they felt like they only ever saw this error on OSX; if anyone can
confirm whether they've seen it on Linux, that would be good to know.
Stephen: good to know re: profiles/options. I don't think changing them is
a
I see the errors regularly on Linux under the conditions of having changed
profiles.
2014-10-26 20:49 GMT-07:00 Ryan Williams ryan.blake.willi...@gmail.com:
Hey Ryan,
I've found that filing issues with the Scala/Typesafe JIRA is pretty
helpful if the issue can be fully reproduced, and even sometimes
helpful if it can't. You can file bugs here:
https://issues.scala-lang.org/secure/Dashboard.jspa
The Spark SQL code in particular is typically the
I started building Spark / running Spark tests this weekend and on maybe
5-10 occasions have run into a compiler crash while compiling
DataTypeConversions.scala.
Here https://gist.github.com/ryan-williams/7673d7da928570907f4d is a full
gist of an innocuous test command (mvn test