[ https://issues.apache.org/jira/browse/SPARK-4128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14540397#comment-14540397 ]

Christian Kadner edited comment on SPARK-4128 at 5/12/15 6:23 PM:
------------------------------------------------------------------

Not every user may care about each of the modules, and yes, these instructions 
may need to be revised.

Yet I strongly think there should be some general text, maybe under "Other 
Tips", that explains the need to manually update the Module settings to mark 
additional folders as Source Folders (after selecting the right combination of 
Profiles and doing a "Generate Sources ...").

For spark-hive this seems to still be true.

Patrick wrote the following in one of his emails; it helps explain why this 
needs to be done.

>> In some cases in the maven build we now have pluggable source
>> directories based on profiles using the maven build helper plug-in.
>> This is necessary to support cross building against different Hive
>> versions, and there will be additional instances of this due to
>> supporting scala 2.11 and 2.10.

>> In these cases, you may need to add source locations explicitly to
>> intellij if you want the entire project to compile there.

>> Unfortunately as long as we support cross-building like this, it will
>> be an issue. Intellij's maven support does not correctly detect our
>> use of the maven-build-plugin to add source directories.
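
For reference, the pattern Patrick describes uses the build-helper-maven-plugin's `add-source` goal inside a profile. A minimal sketch of what that looks like in a pom.xml (the profile id and source directory below are illustrative only, not copied from Spark's actual build):

```xml
<!-- Hypothetical sketch of a profile-based pluggable source directory.
     The profile id and directory are illustrative, not Spark's real pom. -->
<profile>
  <id>hive-0.13.1</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>add-profile-sources</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>add-source</goal>
            </goals>
            <configuration>
              <sources>
                <!-- extra source root compiled only under this profile -->
                <source>v0.13.1/src/main/scala</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```

IntelliJ's Maven import does not execute this plugin, so the directories it adds have to be marked as Source Folders by hand in the Module settings.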

Besides fixing the module settings for spark-hive, I also had to change the 
flume-sink module settings to mark the 
target\scala-2.10\src_managed\main\compiled_avro folder as an additional Source 
Folder.







> Create instructions on fully building Spark in Intellij
> -------------------------------------------------------
>
>                 Key: SPARK-4128
>                 URL: https://issues.apache.org/jira/browse/SPARK-4128
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>            Priority: Blocker
>             Fix For: 1.2.0
>
>
> With some of our more complicated modules, I'm not sure whether Intellij 
> correctly understands all source locations. Also, we might require specifying 
> some profiles for the build to work directly. We should document clearly how 
> to start with vanilla Spark master and get the entire thing building in 
> Intellij.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
