Re: Unable to compile and test Spark in IntelliJ

2016-01-26 Thread Iulian Dragoș
On Tue, Jan 19, 2016 at 6:06 AM, Hyukjin Kwon  wrote:

> Hi all,
>
> I usually have been working with Spark in IntelliJ.
>
> Before this PR,
> https://github.com/apache/spark/commit/7cd7f2202547224593517b392f56e49e4c94cabc,
> for `[SPARK-12575][SQL] Grammar parity with existing SQL parser`, I was able
> to just open the project and then run tests with the IntelliJ Run button.
>
> However, it looks like that PR adds some ANTLR files for parsing, and I
> cannot run the tests as I did. So I ended up running `mvn compile` first and
> then running the tests in IntelliJ.
>
> I can still run the tests with sbt or Maven on the command line, but this is
> a bit inconvenient. I just want to run tests from IntelliJ as I did before.
>
> I followed this
> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools
> several times but it still emits some exceptions such as
>
> Error:(779, 34) not found: value SparkSqlParser
> case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>
>  ^
>
> and I still have to run `mvn compile` or `mvn test` first.
>
> Is there any good way to run some Spark tests within IntelliJ as I did
> before?
>

I'm using Eclipse, but all I had to do in order to build in the IDE was to
add `target/generated-sources/antlr3` to the project sources, after
building once in sbt. You probably have the sources there already.
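The one-time generation step can be sketched from the command line; the sbt project name `catalyst` and the output path below are assumptions based on the Spark layout at the time, not confirmed in this thread:

```shell
# Generate the ANTLR parser sources once; "catalyst" as the sbt
# project name is an assumption based on Spark's build at the time.
./build/sbt catalyst/compile

# The generated sources should then be under (path is an assumption):
#   sql/catalyst/target/generated-sources/antlr3
# In IntelliJ, mark that directory as a generated-sources root
# (right-click the folder -> Mark Directory As -> Generated Sources Root)
# so that references like SparkSqlParser resolve in the editor.
```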

iulian


>
> Thanks!
>



-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


RE: Unable to compile and test Spark in IntelliJ

2016-01-26 Thread Mao, Wei
I used to hit the same compile error in IntelliJ, and resolved it by clicking:

View --> Tool Windows --> Maven Projects --> Spark Project Catalyst --> Plugins 
--> antlr3, then remake project
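The same plugin execution can also be triggered from the command line instead of the Maven tool window; a sketch, where the module path and the `antlr3:antlr` goal (the standard goal prefix of `org.antlr:antlr3-maven-plugin`) are assumptions:

```shell
# Run only the ANTLR code-generation goal for the catalyst module,
# then let the IDE pick up target/generated-sources/antlr3.
build/mvn -pl sql/catalyst antlr3:antlr
```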

Thanks,
William Mao

From: Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
Sent: Wednesday, January 27, 2016 12:12 AM
To: Hyukjin Kwon
Cc: dev@spark.apache.org
Subject: Re: Unable to compile and test Spark in IntelliJ






RE: Unable to compile and test Spark in IntelliJ

2016-01-26 Thread Hyukjin Kwon
Thanks, guys! That works well.
On 27 Jan 2016 12:14, "Mao, Wei"  wrote:
