The examples aren't meant to be run directly like this. They are
intended to be submitted to a cluster with spark-submit, which among
other things provides Spark at runtime.
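
For example, a submission might look something like this (the master
URL and jar path here are illustrative; adjust them to your cluster
and build):

  bin/spark-submit \
    --class org.apache.spark.examples.SparkPi \
    --master spark://your-master:7077 \
    /path/to/spark-examples.jar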

I think you could get them to run this way if you set the master to
"local[*]" and created a run profile that also includes Spark on the
classpath.
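
Concretely, that would mean something like this in the example's
driver code (a minimal sketch; the app name is arbitrary):

  import org.apache.spark.{SparkConf, SparkContext}

  // "local[*]" runs Spark in-process using all available cores,
  // so no external cluster master is needed.
  val conf = new SparkConf().setAppName("ExampleApp").setMaster("local[*]")
  val sc = new SparkContext(conf)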

You shouldn't modify the .iml files in any case, since they are
generated from the Maven pom.xml files. If you need to change a
dependency's scope, change it in the pom.xml files instead.
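
As a sketch, flipping the scope for the Spark core dependency in a
pom.xml would look something like this (the artifact ID and version
here are illustrative):

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${project.version}</version>
    <!-- "provided" keeps Spark off the runtime classpath;
         "compile" includes it -->
    <scope>compile</scope>
  </dependency>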

On Mon, Aug 25, 2014 at 12:12 AM, Ron Gonzalez
<zlgonza...@yahoo.com.invalid> wrote:
> Hi,
>   After getting the code base to compile, I tried running some of the Scala
> examples.
>   They all fail since they can't find classes like SparkConf.
>   If I change the .iml files to convert the dependency scope from PROVIDED
> to COMPILE, I am able to run them. This is simple to do with the following
> command in the root directory of the Spark code base:
>   find . -name "*.iml" | xargs sed -i.bak 's/PROVIDED/COMPILE/g'
>   Is this expected? I'd really rather not modify the .iml files since they
> were generated from the pom.xml files, so if you have some tips on a better
> way to do this, that would be great.
>
> Thanks,
> Ron
>
