One question, then, is what to use to debug Spark if IntelliJ can only be used 
for code browsing because of the unresolved symbols Ron mentioned. 
More specifically, if one builds from the command line but would like to debug 
a running Spark application from an IDE such as IntelliJ, what should one do?
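One common approach (a sketch, not from this thread) is to keep building with sbt on the command line and attach IntelliJ as a remote debugger over JDWP. The environment variable, port 5005, example class, and jar path below are illustrative assumptions for a Spark 1.0-era layout; adjust them to your build:

```shell
# Start the driver JVM with a JDWP agent so a debugger can attach.
# suspend=y makes the JVM wait on startup until the debugger connects;
# in IntelliJ, create a "Remote" run configuration pointing at localhost:5005.
export SPARK_SUBMIT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"

# Hypothetical submission of an example job built with ./sbt/sbt assembly;
# the class name and assembly-jar path are placeholders.
./bin/spark-submit --class org.apache.spark.examples.SparkPi \
  ./examples/target/scala-2.10/spark-examples-assembly-1.0.0.jar
```

Once attached, breakpoints set in IntelliJ are hit normally, even though the code was compiled outside the IDE.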

Another note is that the problem seems to appear starting with Spark 1.0.0; it 
does not occur with Spark 0.8.0, at least. Any light to shed on this 
difference between the versions?

Thanks,

Yan

-----Original Message-----
From: Reynold Xin [mailto:r...@databricks.com] 
Sent: Thursday, June 26, 2014 8:57 PM
To: dev@spark.apache.org
Subject: Re: IntelliJ IDEA cannot compile TreeNode.scala

IntelliJ's parser/analyzer/compiler behaves differently from the Scala compiler, 
and this sometimes leads to inconsistent behavior. This is one of those cases.

In general, while we use IntelliJ, we don't use it to build. I personally 
always build from the command line with sbt or Maven.



On Thu, Jun 26, 2014 at 7:43 PM, Ron Chung Hu (Ron Hu, ARC) < 
ron...@huawei.com> wrote:

> Hi,
>
> I am a Spark newbie.  I just downloaded Spark 1.0.0 and the latest IntelliJ 
> version 13.1 with the Scala plug-in.  At the spark-1.0.0 top level, I executed 
> the following SBT commands, and they ran successfully.
>
>
> -          ./sbt/sbt assembly
>
> -          ./sbt/sbt update gen-idea
>
> After opening IntelliJ IDEA, I tried to compile 
> ...../sql/catalyst/trees/TreeNode.scala inside IntelliJ.  I got many 
> compile errors such as "cannot resolve symbol children" and "cannot resolve
> symbol id".  Both symbols are actually defined in the same file.   As Spark
> was built successfully with the "sbt/sbt assembly" command, I wondered 
> what went wrong in compiling TreeNode.scala.  Any pointers would be appreciated.
>
> Thanks.
>
> Best,
> Ron Hu
>
>
