[ https://issues.apache.org/jira/browse/SPARK-4128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14540727#comment-14540727 ]
Christian Kadner commented on SPARK-4128:
-----------------------------------------

Hi Sean,

based on what Patrick described, I would propose this text under "IDE Setup" > "IntelliJ" > "Other Tips":

<!-- start -->
Some of the modules have pluggable source directories based on Maven profiles (e.g. to support both Scala 2.11 and 2.10, or to allow cross-building against different versions of Hive). In some cases IntelliJ does not correctly detect our use of the maven-build-plugin to add source directories. In these cases, you may need to add source locations explicitly to compile the entire project:

- open "Project Settings" and select "Modules"
- based on your selected Maven profiles, you may need to add source folders to the following modules:
  - spark-hive: add v0.13.1/src/main/scala
  - spark-streaming-flume-sink: add target/scala-2.10/src_managed/main/compiled_avro
<!-- end -->

In addition, we could quote the compilation errors, so that other developers will find this solution when they search the web to troubleshoot these issues.

> Create instructions on fully building Spark in Intellij
> -------------------------------------------------------
>
>                 Key: SPARK-4128
>                 URL: https://issues.apache.org/jira/browse/SPARK-4128
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>            Priority: Blocker
>             Fix For: 1.2.0
>
>
> With some of our more complicated modules, I'm not sure whether Intellij
> correctly understands all source locations. Also, we might require specifying
> some profiles for the build to work directly. We should document clearly how
> to start with vanilla Spark master and get the entire thing building in
> Intellij.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
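
For context, the profile-based source wiring described in the comment can be sketched as a Maven fragment. This is a hypothetical illustration, not copied from Spark's actual pom.xml: the profile id, execution id, and source path are made up for the example, and it uses the build-helper-maven-plugin's `add-source` goal, a commonly used plugin for adding extra source directories (the comment refers to a "maven-build-plugin"; Spark's real plugin configuration may differ):

```xml
<!-- Hypothetical pom.xml fragment: a profile that adds a profile-specific
     Scala source directory during generate-sources. IDEs that import the
     Maven model (with this profile active) should pick up the extra folder;
     when they do not, the directory must be added manually as described above. -->
<profile>
  <id>hive-0.13.1</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>add-hive-sources</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>add-source</goal>
            </goals>
            <configuration>
              <sources>
                <!-- Version-specific sources compiled only under this profile -->
                <source>v0.13.1/src/main/scala</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```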