Hi all,

I have usually been working on Spark in IntelliJ.

Before this commit,
https://github.com/apache/spark/commit/7cd7f2202547224593517b392f56e49e4c94cabc
for `[SPARK-12575][SQL] Grammar parity with existing SQL parser`, I was able
to just open the project and run tests with IntelliJ's Run button.

However, it looks like that commit adds some ANTLR grammar files for
parsing, and I can no longer run the tests that way. So I ended up running
mvn compile first and then running the tests in IntelliJ.

I can still run the tests with sbt or Maven on the command line, but this
is a bit inconvenient. I just want to run tests in IntelliJ as I did before.

I have followed
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools
several times, but compilation in IntelliJ still fails with errors such as:

Error:(779, 34) not found: value SparkSqlParser
    case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>
                                 ^

and I still have to run mvn compile or mvn test first to get past them.
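For reference, this is roughly the workaround I am using at the moment (a
sketch only; it assumes the bundled build/mvn wrapper and that generating
sources is what produces the missing SparkSqlParser class on my machine):

```shell
# Generate the ANTLR parser sources and compile once from the command line,
# so that IntelliJ can resolve classes like SparkSqlParser afterwards.
build/mvn -DskipTests compile

# Then re-open IntelliJ and run the tests with the Run button as before.
```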

Is there a good way to run Spark tests within IntelliJ as I did before?

Thanks!
