Re: Build spark with Intellij IDEA 13

2014-09-28 Thread Yi Tian
Hi

If you want IDEA to compile your Spark project (version 1.0.0 and above),
follow these steps:

1. Clone the Spark project.
2. Use mvn to compile the Spark project (you need the generated Avro
sources in the flume-sink module); see the example command below.
3. Open spark/pom.xml with IDEA.
4. Check the profiles you need in the “Maven Projects” window.
5. Modify the source paths of the flume-sink module: mark
“target/scala-2.10/src_managed/main/compiled_avro” as a source path.
6. If you checked the yarn profile, you need to
   remove the module “spark-yarn_2.10” and
   add “spark/yarn/common/src/main/scala” and
   “spark/yarn/stable/src/main/scala” to the source paths of the module
   “yarn-parent_2.10”.
7. Then you can run Build > Rebuild Project in IDEA.

PS: you should run “Rebuild” again after running any mvn or sbt command
against the Spark project.
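
For step 2, a typical compile command looks something like this (the
profile and Hadoop version flags are only examples; pick whatever matches
your environment):

    # run from the spark/ root; this generates the Avro sources under the
    # flume-sink module's target/scala-2.10/src_managed/main/compiled_avro
    mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests clean package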




Best Regards,

Yi Tian
tianyi.asiai...@gmail.com




On Sep 28, 2014, at 11:01, maddenpj madde...@gmail.com wrote:

 I actually got this same exact issue compiling an unrelated project (not
 using Spark). Maybe it's a protobuf issue?
 
 
 
 



Re: Build spark with Intellij IDEA 13

2014-09-27 Thread maddenpj
I actually got this same exact issue compiling an unrelated project (not
using Spark). Maybe it's a protobuf issue?






Build Spark in IntelliJ IDEA 13

2014-02-26 Thread Yanzhe Chen
Hi, all

I'm trying to build Spark in IntelliJ IDEA 13.

I cloned the latest repo and ran sbt/sbt gen-idea in the root folder,
then imported it into IntelliJ IDEA. The Scala plugin for IntelliJ IDEA is
installed.
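
In other words, roughly (a sketch; the repo URL is the Apache mirror on
GitHub):

    git clone https://github.com/apache/spark.git
    cd spark
    sbt/sbt gen-idea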

Everything seemed fine until I ran Build > Make Project:

Information: Using javac 1.7.0_51 to compile java sources
Information: java: Errors occurred while compiling module 'spark-core'
Information: Modules spark-streaming-flume-build, spark-repl-build,
spark-graphx-build, spark-tools-build, spark-streaming-kafka-build
and 9 others were fully rebuilt due to project configuration/dependencies
changes
Information: Compilation completed with 1 error and 1 warning in 23 sec
Information: 1 error
Information: 1 warning
Error: java: javacTask: source release 1.7 requires target release 1.7
Warning: scalac: there were 56 feature warning(s); re-run with -feature for
details

I have only JDK 1.7 installed, and the Java Compiler settings all target
1.7. So what does this error mean?

Besides, the project compiles correctly from the console, and the examples
also run smoothly there. The reason I want to build from IntelliJ IDEA is
that I want to do some debugging. Can anyone show me a better way to debug
Spark (one where I can step into/out of functions and inspect variables in
real time)?

Best,
Yanzhe


Re: Build Spark in IntelliJ IDEA 13

2014-02-26 Thread Sean Owen
I also use IntelliJ 13 on a Mac, with only Java 7, and have never seen this.

If you look at the Spark build, you will see that it specifies Java 6, not 7.
Even if you changed java.version in the build, you would not get this
error, since it specifies source and target to be the same value.
In fact it would be fine to specify source/target 7 too, if you wanted
to for your own purposes.

The error says the module's source level is set to 7, but the compiler is
being asked to output Java 6 bytecode.

My guess is you inadvertently set the source language level to 7 in
IntelliJ. Check that. If so, try telling IntelliJ to reimport the
Maven project from the top-level pom.xml and it should override any of
that.
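
You can reproduce the same failure outside the IDE with javac (a minimal
sketch; Hello.java stands for any trivial source file):

    # javac rejects a source level higher than the target level
    $ javac -source 1.7 -target 1.6 Hello.java
    javac: source release 1.7 requires target release 1.7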
--
Sean Owen | Director, Data Science | London

