Hi Pat,

Couple of points:

1) I must have done something naive like:
git clone git://github.com/apache/spark.git -b branch-1.2.0

because "git branch" is telling me I'm on the "master" branch, and I see
that branch-1.2.0 doesn't exist (https://github.com/apache/spark).
Nevertheless, when I compiled this master branch spark shell tells me I
have 1.2.0. So I guess the master is currently 1.2.0...
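
For reference, the branches that do exist can be listed rather than guessed,
e.g. with:

git branch -r
git ls-remote --heads git://github.com/apache/spark.git

The first lists remote-tracking branches from inside a clone; the second asks
the remote directly. The maintenance branches follow a branch-X.Y naming
scheme (branch-1.0, branch-1.1, and so on), so there is no branch-1.2.0;
presumably a branch-1.2 will appear once the 1.2 line is cut.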

2) The README on the master branch only has build instructions for Maven. I
built Spark successfully with
mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package

and it looks like the Maven equivalent of publish-local is:
mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean install
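
Assuming that works, mvn install should drop the SNAPSHOT artifacts into the
local ~/.m2/repository, so I'm guessing my application's build.sbt then just
needs a local-Maven resolver plus the SNAPSHOT version, something like
(untested):

    // tell sbt to also look in the local Maven repo (~/.m2/repository),
    // which is where `mvn install` publishes
    resolvers += Resolver.mavenLocal

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",
      "org.apache.spark" %% "spark-sql"  % "1.2.0-SNAPSHOT"
    )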

I will report back with the result.

On Wed, Oct 8, 2014 at 5:50 PM, Pat McDonough <pat.mcdono...@databricks.com>
wrote:

> Hey Arun,
>
> Since this build depends on unpublished builds of Spark (1.2.0-SNAPSHOT),
> you'll need to first build Spark and "publish-local" so your application
> build can find those SNAPSHOTs in your local repo.
>
> Just append "publish-local" to your sbt command where you build Spark.
>
> -Pat
>
>
>
> On Wed, Oct 8, 2014 at 5:35 PM, Arun Luthra <arun.lut...@gmail.com> wrote:
>
>> I built Spark 1.2.0 successfully, but was unable to build my Spark program
>> against 1.2.0 with sbt assembly and my build.sbt file. In it, I tried:
>>
>>     "org.apache.spark" %% "spark-sql" % "1.2.0",
>>     "org.apache.spark" %% "spark-core" % "1.2.0",
>>
>> and
>>
>>     "org.apache.spark" %% "spark-sql" % "1.2.0-SNAPSHOT",
>>     "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",
>>
>> but I get errors like:
>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn] ::          UNRESOLVED DEPENDENCIES         ::
>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn] :: org.apache.spark#spark-sql_2.10;1.2.0: not found
>> [warn] :: org.apache.spark#spark-core_2.10;1.2.0: not found
>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>>
>> sbt.ResolveException: unresolved dependency:
>> org.apache.spark#spark-sql_2.10;1.2.0: not found
>> unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0: not found
>> ...
>> [error] (*:update) sbt.ResolveException: unresolved dependency:
>> org.apache.spark#spark-sql_2.10;1.2.0: not found
>> [error] unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0:
>> not found
>>
>>
>>
>> Do I need to configure my build.sbt to point to my local Spark 1.2.0
>> repository? How?
>>
>> Thanks,
>> - Arun
>>
>
>
