Re: spark 1.0 standalone application

2014-05-19 Thread Shivaram Venkataraman
On a related note, there is also a staging Apache repository where the
latest RC gets pushed:
https://repository.apache.org/content/repositories/staging/org/apache/spark/spark-core_2.10/

The artifact there is just named "1.0.0" (similar to the rc-specific
repository that Patrick mentioned). So if you just want to build your app
against the latest staging RC, you can add
https://repository.apache.org/content/repositories/staging to your
resolvers in SBT / Maven.
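
For reference, here is a minimal build.sbt sketch of that setup (assuming
sbt 0.13-style syntax and a Scala 2.10 project; the repository name string
is arbitrary):

    // build.sbt -- sketch: resolve the staged "1.0.0" artifacts while testing the RC
    scalaVersion := "2.10.4"

    resolvers += "Apache Staging" at
      "https://repository.apache.org/content/repositories/staging"

    // %% appends the Scala binary version, i.e. this resolves spark-core_2.10
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"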

Thanks
Shivaram


Re: spark 1.0 standalone application

2014-05-19 Thread Nan Zhu
First time I'm hearing that there is a temporary Maven repository…

--  
Nan Zhu


Re: spark 1.0 standalone application

2014-05-19 Thread nit
Thanks everyone. I followed Patrick's suggestion and it worked like a charm.





Re: spark 1.0 standalone application

2014-05-19 Thread Sujeet Varakhedi
Threads like these are great candidates to be part of the "Contributors
guide". I will create a JIRA to update the guide with data from past
threads like these.

Sujeet


Re: spark 1.0 standalone application

2014-05-19 Thread Patrick Wendell
Whenever we publish a release candidate, we create a temporary Maven
repository that hosts the artifacts. We do this precisely for the case
you are running into (where a user wants to build an application
against it to test).

You can build against the release candidate by just adding that
repository to your sbt build and then linking against "spark-core"
version "1.0.0". For rc9 the repository is in the vote e-mail:

http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-0-0-rc9-td6629.html
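
For illustration, a build.sbt sketch of that (the repository name is
arbitrary, and the URL below is only a placeholder for the actual rc9
repository URL from the vote e-mail):

    // build.sbt -- sketch; substitute the real rc9 staging repository URL
    resolvers += "Spark 1.0.0 rc9 staging" at
      "https://<rc9-repository-url-from-the-vote-e-mail>"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"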



Re: spark 1.0 standalone application

2014-05-19 Thread Mark Hamstra
That's the crude way to do it.  If you run `sbt/sbt publishLocal`, then you
can resolve the artifact from your local cache in the same way that you
would resolve it if it were deployed to a remote cache.  That's just the
build step.  Actually running the application will require the necessary
jars to be accessible by the cluster nodes.
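
As a sketch of that flow (assuming a Spark source checkout and sbt's
default local Ivy resolver):

    // 1) In the Spark checkout, publish the RC artifacts to ~/.ivy2/local:
    //      sbt/sbt publishLocal
    // 2) In the application's build.sbt, no extra resolver is needed,
    //    since sbt consults the local Ivy repository by default:
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"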


Re: spark 1.0 standalone application

2014-05-19 Thread Nan Zhu
Well, you have to put spark-assembly-*.jar into the lib directory of your
application.
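
(On the sbt side, no build.sbt change should be needed once the jar is
copied in, since lib/ is sbt's default unmanaged dependency directory;
the setting below only spells out that default. The assembly jar itself
lands somewhere under assembly/target/ after `sbt/sbt assembly`, with the
exact name depending on the Scala and Hadoop versions used.)

    // build.sbt -- optional; lib/ is already the default unmanagedBase
    unmanagedBase := baseDirectory.value / "lib"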

Best, 

-- 
Nan Zhu



spark 1.0 standalone application

2014-05-19 Thread nit
I am not very comfortable with sbt. I want to build a standalone application
using Spark 1.0 RC9. I can build an sbt assembly for my application with Spark
0.9.1, and I think in that case Spark is pulled from the Akka repository?

Now, if I want to use 1.0 RC9 for my application, what is the process?
(FYI, I was able to build spark-1.0 via sbt/sbt assembly, and I can see the
assembly jar; I think I will have to copy my jar somewhere and update
build.sbt?)

PS: I am not sure if this is the right place for this question, but since
1.0 is still an RC, I felt this may be the appropriate forum.

Thanks!


