This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new b4e0afa  [SPARK-27458][DOC] remind developers to reset maven home in IntelliJ
b4e0afa is described below

commit b4e0afa7574cf4ec4682dec63a3f67196ae350a2
Author: williamwong <william1...@gmail.com>
AuthorDate: Wed Apr 17 08:35:00 2019 -0500

    [SPARK-27458][DOC] remind developers to reset maven home in IntelliJ
    
    I tried to follow the guide at 'http://spark.apache.org/developer-tools.html' to set up an
    IntelliJ project for Spark. However, the project failed to build. The failure was due to
    missing classes that should have been generated via ANTLR in the sql/catalyst project, even
    though I had clicked the 'Generate Sources and Update Folders For All Projects' button in
    IntelliJ as suggested.
    
    It turned out that I had forgotten to reset the Maven home in IntelliJ, and IntelliJ failed the
    'Generate Sources and Update Folders For All Projects' action silently. That was why the ANTLR4
    sources were not generated as expected.
    
    To help other developers, I would like to enhance 'http://spark.apache.org/developer-tools.html'
    with a note reminding developers to check whether the 'Generate Sources and Update Folders For
    All Projects' action failed silently due to an incorrect Maven version. If so, they should
    update the Maven home in IntelliJ accordingly.
    
    Author: williamwong <william1...@gmail.com>
    Author: William Wong <william1...@gmail.com>
    
    Closes #195 from William1104/feature/SPARK-27458.
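
As an illustration, a minimal sketch of how the version mismatch described above can be checked from
the root of a Spark checkout, assuming the minimum required Maven version is recorded in the
<maven.version> property of the root pom.xml (the Maven installation path below is a placeholder):

    # Version reported by the Maven installation IntelliJ is (or will be) pointed at.
    # Replace the placeholder path with the actual installation.
    /path/to/maven/bin/mvn -version

    # Minimum Maven version Spark's build expects (assumed <maven.version> property).
    grep '<maven.version>' pom.xml
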
---
 developer-tools.md        | 7 +++++++
 site/developer-tools.html | 7 +++++++
 2 files changed, 14 insertions(+)

diff --git a/developer-tools.md b/developer-tools.md
index 29a9f92..e5a66d8 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -397,6 +397,13 @@ Other tips:
 - "Rebuild Project" can fail the first time the project is compiled, because 
generate source files 
 are not automatically generated. Try clicking the "Generate Sources and Update 
Folders For All 
 Projects" button in the "Maven Projects" tool window to manually generate 
these sources.
+- The version of Maven bundled with IntelliJ may not be new enough for Spark. 
If that happens,
+the action "Generate Sources and Update Folders For All Projects" could fail 
silently. 
+Please remember to reset the Maven home directory 
+(`Preference -> Build, Execution, Deployment -> Maven -> Maven home directory`) of your project to
+point to a newer installation of Maven. You may also build Spark with the script `build/mvn` first.
+If the script cannot locate a new enough Maven installation, it will download and install a recent
+version of Maven to folder `build/apache-maven-<version>/`.
 - Some of the modules have pluggable source directories based on Maven profiles (i.e. to support
 both Scala 2.11 and 2.10 or to allow cross building against different versions of Hive). In some
 cases IntelliJ's does not correctly detect use of the maven-build-plugin to add source directories.
diff --git a/site/developer-tools.html b/site/developer-tools.html
index 7d5b35c..ef20828 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -570,6 +570,13 @@ enable profiles <code>yarn</code> and <code>hadoop-2.7</code>. These selections
   <li>&#8220;Rebuild Project&#8221; can fail the first time the project is compiled, because generate source files
 are not automatically generated. Try clicking the &#8220;Generate Sources and Update Folders For All
 Projects&#8221; button in the &#8220;Maven Projects&#8221; tool window to manually generate these sources.</li>
+  <li>The version of Maven bundled with IntelliJ may not be new enough for Spark. If that happens,
+the action &#8220;Generate Sources and Update Folders For All Projects&#8221; could fail silently.
+Please remember to reset the Maven home directory 
+(<code>Preference -&gt; Build, Execution, Deployment -&gt; Maven -&gt; Maven home directory</code>) of your project to
+point to a newer installation of Maven. You may also build Spark with the script <code>build/mvn</code> first.
+If the script cannot locate a new enough Maven installation, it will download and install a recent
+version of Maven to folder <code>build/apache-maven-&lt;version&gt;/</code>.</li>
   <li>Some of the modules have pluggable source directories based on Maven profiles (i.e. to support
 both Scala 2.11 and 2.10 or to allow cross building against different versions of Hive). In some
 cases IntelliJ&#8217;s does not correctly detect use of the maven-build-plugin to add source directories.
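
For illustration, a minimal sketch of the build/mvn workaround mentioned in the added note, assuming
it is run from the root of a Spark checkout (the exact apache-maven directory name depends on the
Maven version the script downloads):

    # Build once with the bundled Maven wrapper; if no suitable Maven is found on
    # the PATH, the script downloads one into build/apache-maven-<version>/.
    ./build/mvn -DskipTests clean package

    # Locate the downloaded installation so its path can be set as the Maven
    # home directory in IntelliJ's preferences.
    ls -d build/apache-maven-*/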


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
