I was having the same issue and that helped. But now I get the following
compilation error when trying to run a test from within IntelliJ (v14)

/Users/bbejeck/dev/github_clones/bbejeck-spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl/package.scala
Error:(308, 109) polymorphic expression cannot be instantiated to expected type;
 found   : [T(in method apply)]org.apache.spark.sql.catalyst.dsl.ScalaUdfBuilder[T(in method apply)]
 required: org.apache.spark.sql.catalyst.dsl.package.ScalaUdfBuilder[T(in method functionToUdfBuilder)]
  implicit def functionToUdfBuilder[T: TypeTag](func: Function1[_, T]): ScalaUdfBuilder[T] = ScalaUdfBuilder(func)
                                                                                                            ^

Any thoughts?
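
The line the compiler flags is an implicit conversion from a one-argument function to a builder wrapper. As context, here is a self-contained sketch of the same pattern (simplified: no TypeTag, and these names are illustrative stand-ins, not Spark's actual API):

```scala
import scala.language.implicitConversions

// Simplified stand-in for Spark's ScalaUdfBuilder (names are illustrative).
case class UdfBuilder[T](func: Any => T) {
  // Apply the wrapped function to a single argument.
  def call(arg: Any): T = func(arg)
}

object UdfDsl {
  // Mirrors the shape of functionToUdfBuilder in the catalyst dsl package:
  // any Function1 is implicitly wrapped in a builder.
  implicit def functionToUdfBuilder[T](func: Function1[_, T]): UdfBuilder[T] =
    UdfBuilder(func.asInstanceOf[Any => T])
}

object Demo extends App {
  import UdfDsl._
  // The implicit conversion kicks in because the expected type is UdfBuilder[Int].
  val lengthUdf: UdfBuilder[Int] = (s: String) => s.length
  println(lengthUdf.call("spark")) // prints 5
}
```

This pattern compiles fine under scalac itself, which suggests the error above is the IDE's presentation compiler failing to unify the two `T`s rather than a real bug in the source.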

On Thu, Jan 8, 2015 at 6:33 AM, Jakub Dubovsky <
spark.dubovsky.ja...@seznam.cz> wrote:

> Thanks that helped.
>
> I vote for wiki as well. More fine-grained documentation should be on the
> wiki and linked.
>
> Jakub
>
>
> ---------- Original message ----------
> From: Sean Owen <so...@cloudera.com>
> To: Jakub Dubovsky <spark.dubovsky.ja...@seznam.cz>
> Date: 8 Jan 2015 11:29:22
> Subject: Re: Spark development with IntelliJ
>
> "Yeah, I hit this too. IntelliJ picks this up from the build but then
> it can't run its own scalac with this plugin added.
>
> Go to Preferences > Build, Execution, Deployment > Scala Compiler and
> clear the "Additional compiler options" field. It will work then,
> although the option will come back when the project reimports.
>
> Right now I don't know of a better fix.
>
> There's another recent open question about updating IntelliJ docs:
> https://issues.apache.org/jira/browse/SPARK-5136 Should this stuff go
> in the site docs, or wiki? I vote for wiki I suppose and make the site
> docs point to the wiki. I'd be happy to make wiki edits if I can get
> permission, or propose this text along with other new text on the
> JIRA.
>
> On Thu, Jan 8, 2015 at 10:00 AM, Jakub Dubovsky
> <spark.dubovsky.ja...@seznam.cz> wrote:
> > Hi devs,
> >
> > I'd like to ask if anybody has experience with using IntelliJ 14 to step
> > into Spark code. Whatever I try, I get a compilation error:
> >
> > Error:scalac: bad option: -P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar
> >
> > The project is set up per Patrick's instructions [1] and packaged with
> > mvn -DskipTests clean install. Compilation works fine. Then I just set a
> > breakpoint in test code and ran debug, which produced the error.
> >
> > Thanks for any hints
> >
> > Jakub
> >
> > [1] https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-BuildingSparkinIntelliJIDEA
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org"
>
