Hi

If you want IDEA to compile your Spark project (version 1.0.0 and above), follow these steps:

1 clone the Spark project
2 use mvn to compile the project once (you need the Avro source files that Maven generates in the flume-sink module)
3 open spark/pom.xml with IDEA
4 check the profiles you need in the "Maven Projects" window
5 modify the source paths of the flume-sink module: mark "target/scala-2.10/src_managed/main/compiled_avro" as a source path
6 if you checked the yarn profile, you need to
     remove the module "spark-yarn_2.10"
     add "spark/yarn/common/src/main/scala" and "spark/yarn/stable/src/main/scala" as source paths of the module "yarn-parent_2.10"
7 then you can run "Build -> Rebuild Project" in IDEA
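As a rough sketch, steps 1 and 2 on the command line could look like the following (assuming the upstream Apache Spark repository on GitHub and a default Maven build; the exact profiles you pass to mvn should match whatever you check in step 4, and the profile names here are only examples):

```shell
# Step 1: clone the Spark project
git clone https://github.com/apache/spark.git
cd spark

# Step 2: compile once with Maven so the Avro sources in the
# flume-sink module get generated; skip tests to save time.
# Add the profiles you plan to enable in IDEA, e.g. -Pyarn.
mvn -DskipTests clean package
```

After this finishes, the generated directory target/scala-2.10/src_managed/main/compiled_avro should exist under the flume-sink module, which is what step 5 marks as a source path.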

PS: you should run "Rebuild Project" again whenever you run an mvn or sbt command against the Spark project.




Best Regards,

Yi Tian
tianyi.asiai...@gmail.com




On Sep 28, 2014, at 11:01, maddenpj <madde...@gmail.com> wrote:

> I actually got this same exact issue compiling an unrelated project (not using
> Spark). Maybe it's a protobuf issue?
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Build-spark-with-Intellij-IDEA-13-tp9904p15284.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
