Re: Build issues with apache-spark-on-k8s.

2018-03-29 Thread Yinan Li
For 2.3, the Dockerfiles are under kubernetes/ in the distribution tarball, not
under the directory where you started the build. Once the build succeeds, copy
the tarball out, untar it, and you should see the kubernetes/ directory inside
it.
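
For example, a minimal sketch assuming a 2.3.0 distribution tarball built with
--name custom (adjust the file name to whatever your build actually produced):

  tar -xzf spark-2.3.0-bin-custom.tgz
  ls spark-2.3.0-bin-custom/kubernetes/dockerfiles/
  # the Dockerfiles referenced by the image build script live here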

On Thu, Mar 29, 2018 at 3:00 AM, Atul Sowani wrote:

> Thanks all for responding and helping me with the build issue. I tried
> building the code at git://github.com/apache/spark.git (master branch) in
> my ppc64le Ubuntu 16.04 VM and it failed. I tried building a specific
> branch (branch-2.2) using the following command:
>
> build/mvn -DskipTests -Pkubernetes clean package install
>
> This builds successfully, but again I do not see the "dockerfiles" and
> "jars" directories anywhere. This behaviour is exactly the same as what I
> observed with the source code at https://github.com/apache-spark-on-k8s/spark
>
> Any advice on how to proceed with this? If at all possible, I need to build
> v2.2.
>
> Thanks,
> Atul.
>
>
>
> On Wed, Mar 28, 2018 at 8:06 PM, Anirudh Ramanathan <ramanath...@google.com> wrote:
>
>> As Lucas said, those directories are generated and copied when you run a
>> full Maven build with the -Pkubernetes flag specified (or follow the instructions
>> at https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution).
>>
>> Also, using the Kubernetes integration in the main Apache Spark project
>> is recommended. The fork https://github.com/apache-spark-on-k8s/spark/
>> will be retired once we finish upstreaming all those features in Spark 2.4.
>>
>>
>> On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher wrote:
>>
>>> Are you building on the fork or on the official release now? I built
>>> v2.3.0 from source without issue. One thing I noticed is that I needed to
>>> run the build-image command from the bin/ directory placed in dist/, as
>>> opposed to the one in the repo (since that's where the necessary targets
>>> get copied).
>>>
>>> (Failed to reply-all to the list).
>>>
>>> On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani wrote:
>>>
 Hi,

 I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built
 without errors. Next, I wanted to create Docker images, so, as explained at
 https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html,
 I used sbin/build-push-docker-images.sh to create them. While using this
 script I came across 2 issues:

 1. It references "dockerfiles" directory which should be in "spark",
 however this directory is missing. I created "dockerfiles" directory and
 copied Dockerfiles from resource-managers/kuberne
 tes/docker-minimal-bundle

 2. The spark-base Dockerfile expects some JAR files to be present in a
 directory called "jars" - this directory is also missing. I tried rebuilding
 the code, but this directory is not generated, if it is indeed supposed to be.

 Is this a genuine/known issue, or am I missing some build steps?

 Thanks,
 Atul.


>>>
>>>
>>> --
>>>
>>> *Lucas Kacher*
>>> Senior Engineer
>>> -
>>> vsco.co
>>> New York, NY
>>> 818.512.5239
>>>
>>
>


Re: Build issues with apache-spark-on-k8s.

2018-03-29 Thread Rob Vesse
Kubernetes support was only added as an experimental feature in Spark 2.3.0.

 

It does not exist in the Apache Spark branch-2.2.

 

If you really must build for Spark 2.2, you will need to use
branch-2.2-kubernetes from the apache-spark-on-k8s fork on GitHub.
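
A rough sketch of that route, assuming the default build options (see the
fork's documentation for the full set of profiles):

  git clone https://github.com/apache-spark-on-k8s/spark.git
  cd spark
  git checkout branch-2.2-kubernetes
  build/mvn -DskipTests -Pkubernetes clean package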

 

Note that there are various functional and implementation differences between
the fork and what is currently integrated into Spark, so please make sure you
refer to the official or fork documentation as appropriate.

 

Rob

 

From: Atul Sowani <sow...@gmail.com>
Date: Thursday, 29 March 2018 at 11:27
To: Anirudh Ramanathan <ramanath...@google.com>
Cc: Lucas Kacher <lu...@vsco.co>, dev <dev@spark.apache.org>
Subject: Re: Build issues with apache-spark-on-k8s.

 

Thanks all for responding and helping me with the build issue. I tried building 
the code at git://github.com/apache/spark.git (master branch) in my ppc64le 
Ubuntu 16.04 VM and it failed. I tried building a specific branch (branch-2.2) 
using the following command:

 

build/mvn -DskipTests -Pkubernetes clean package install

 

This builds successfully, but again I do not see the "dockerfiles" and "jars"
directories anywhere. This behaviour is exactly the same as what I observed
with the source code at https://github.com/apache-spark-on-k8s/spark

 

Any advice on how to proceed with this? If at all possible, I need to build v2.2.

 

Thanks,

Atul.

 

 

 

On Wed, Mar 28, 2018 at 8:06 PM, Anirudh Ramanathan <ramanath...@google.com> 
wrote:

As Lucas said, those directories are generated and copied when you run a full 
Maven build with the -Pkubernetes flag specified (or follow the instructions at
https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution).

 

Also, using the Kubernetes integration in the main Apache Spark project is
recommended. The fork https://github.com/apache-spark-on-k8s/spark/ will be 
retired once we finish upstreaming all those features in Spark 2.4. 

 

 

On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher <lu...@vsco.co> wrote:

Are you building on the fork or on the official release now? I built v2.3.0
from source without issue. One thing I noticed is that I needed to run the
build-image command from the bin/ directory placed in dist/, as opposed to the
one in the repo (since that's where the necessary targets get copied).

(Failed to reply-all to the list).

 

On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani <sow...@gmail.com> wrote:

Hi,

 

I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built without
errors. Next, I wanted to create Docker images, so, as explained at
https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html, I
used sbin/build-push-docker-images.sh to create them. While using this script
I came across 2 issues:

 

1. It references "dockerfiles" directory which should be in "spark", however 
this directory is missing. I created "dockerfiles" directory and copied 
Dockerfiles from resource-managers/kubernetes/docker-minimal-bundle

 

2. The spark-base Dockerfile expects some JAR files to be present in a
directory called "jars" - this directory is also missing. I tried rebuilding
the code, but this directory is not generated, if it is indeed supposed to be.

 

Is this a genuine/known issue, or am I missing some build steps?

 

Thanks,

Atul.

 



 

-- 

Lucas Kacher
Senior Engineer
-
vsco.co

New York, NY

818.512.5239

 



Re: Build issues with apache-spark-on-k8s.

2018-03-29 Thread Atul Sowani
Thanks all for responding and helping me with the build issue. I tried
building the code at git://github.com/apache/spark.git (master branch) in
my ppc64le Ubuntu 16.04 VM and it failed. I tried building a specific
branch (branch-2.2) using the following command:

build/mvn -DskipTests -Pkubernetes clean package install

This builds successfully, but again I do not see the "dockerfiles" and
"jars" directories anywhere. This behaviour is exactly the same as what I
observed with the source code at https://github.com/apache-spark-on-k8s/spark

Any advice on how to proceed with this? If at all possible, I need to build
v2.2.

Thanks,
Atul.



On Wed, Mar 28, 2018 at 8:06 PM, Anirudh Ramanathan wrote:

> As Lucas said, those directories are generated and copied when you run a
> full Maven build with the -Pkubernetes flag specified (or follow the instructions
> at https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution).
>
> Also, using the Kubernetes integration in the main Apache Spark project
> is recommended. The fork https://github.com/apache-spark-on-k8s/spark/
> will be retired once we finish upstreaming all those features in Spark 2.4.
>
>
> On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher wrote:
>
>> Are you building on the fork or on the official release now? I built
>> v2.3.0 from source without issue. One thing I noticed is that I needed to
>> run the build-image command from the bin/ directory placed in dist/, as
>> opposed to the one in the repo (since that's where the necessary targets
>> get copied).
>>
>> (Failed to reply-all to the list).
>>
>> On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani wrote:
>>
>>> Hi,
>>>
>>> I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built
>>> without errors. Next, I wanted to create Docker images, so, as explained at
>>> https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html,
>>> I used sbin/build-push-docker-images.sh to create them. While using this
>>> script I came across 2 issues:
>>>
>>> 1. It references "dockerfiles" directory which should be in "spark",
>>> however this directory is missing. I created "dockerfiles" directory and
>>> copied Dockerfiles from resource-managers/kubernetes/docker-minimal-
>>> bundle
>>>
>>> 2. The spark-base Dockerfile expects some JAR files to be present in a
>>> directory called "jars" - this directory is also missing. I tried
>>> rebuilding the code, but this directory is not generated, if it is indeed
>>> supposed to be.
>>>
>>> Is this a genuine/known issue, or am I missing some build steps?
>>>
>>> Thanks,
>>> Atul.
>>>
>>>
>>
>>
>> --
>>
>> *Lucas Kacher*
>> Senior Engineer
>> -
>> vsco.co
>> New York, NY
>> 818.512.5239
>>
>


Re: Build issues with apache-spark-on-k8s.

2018-03-28 Thread Anirudh Ramanathan
As Lucas said, those directories are generated and copied when you run a
full Maven build with the -Pkubernetes flag specified (or follow the instructions at
https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution).

Also, using the Kubernetes integration in the main Apache Spark project is
recommended. The fork https://github.com/apache-spark-on-k8s/spark/ will be
retired once we finish upstreaming all those features in Spark 2.4.
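
For reference, a runnable distribution can be produced roughly as follows (a
sketch based on the linked page; pick whatever additional profiles match your
environment):

  ./dev/make-distribution.sh --name custom-spark --tgz -Pkubernetes
  # the resulting dist/ directory (and tarball) should contain jars/ and
  # kubernetes/dockerfiles/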


On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher wrote:

> Are you building on the fork or on the official release now? I built
> v2.3.0 from source without issue. One thing I noticed is that I needed to
> run the build-image command from the bin/ directory placed in dist/, as
> opposed to the one in the repo (since that's where the necessary targets
> get copied).
>
> (Failed to reply-all to the list).
>
> On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani wrote:
>
>> Hi,
>>
>> I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built
>> without errors. Next, I wanted to create Docker images, so, as explained at
>> https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html,
>> I used sbin/build-push-docker-images.sh to create them. While using this
>> script I came across 2 issues:
>>
>> 1. It references "dockerfiles" directory which should be in "spark",
>> however this directory is missing. I created "dockerfiles" directory and
>> copied Dockerfiles from resource-managers/kubernetes/docker-minimal-bundle
>>
>> 2. The spark-base Dockerfile expects some JAR files to be present in a
>> directory called "jars" - this directory is also missing. I tried
>> rebuilding the code, but this directory is not generated, if it is indeed
>> supposed to be.
>>
>> Is this a genuine/known issue, or am I missing some build steps?
>>
>> Thanks,
>> Atul.
>>
>>
>
>
> --
>
> *Lucas Kacher*
> Senior Engineer
> -
> vsco.co
> New York, NY
> 818.512.5239
>


Re: Build issues with apache-spark-on-k8s.

2018-03-28 Thread Lucas Kacher
Are you building on the fork or on the official release now? I built v2.3.0
from source without issue. One thing I noticed is that I needed to run the
build-image command from the bin/ directory placed in dist/, as opposed to
the one in the repo (since that's where the necessary targets get copied).
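
For an official 2.3.0 distribution the script in question is
bin/docker-image-tool.sh; a quick sketch, with the repository and tag as
placeholders:

  cd dist
  ./bin/docker-image-tool.sh -r docker.io/myrepo -t v2.3.0 build
  ./bin/docker-image-tool.sh -r docker.io/myrepo -t v2.3.0 push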

(Failed to reply-all to the list).

On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani wrote:

> Hi,
>
> I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built
> without errors. Next, I wanted to create Docker images, so, as explained at
> https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html,
> I used sbin/build-push-docker-images.sh to create them. While using this
> script I came across 2 issues:
>
> 1. It references "dockerfiles" directory which should be in "spark",
> however this directory is missing. I created "dockerfiles" directory and
> copied Dockerfiles from resource-managers/kubernetes/docker-minimal-bundle
>
> 2. The spark-base Dockerfile expects some JAR files to be present in a
> directory called "jars" - this directory is also missing. I tried rebuilding
> the code, but this directory is not generated, if it is indeed supposed to
> be.
>
> Is this a genuine/known issue, or am I missing some build steps?
>
> Thanks,
> Atul.
>
>


-- 

*Lucas Kacher*
Senior Engineer
-
vsco.co
New York, NY
818.512.5239


Build issues with apache-spark-on-k8s.

2018-03-28 Thread Atul Sowani
 Hi,

I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built without
errors. Next, I wanted to create Docker images, so, as explained at
https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html, I
used sbin/build-push-docker-images.sh to create them (the invocation is
sketched after the list below). While using this script I came across 2 issues:

1. It references "dockerfiles" directory which should be in "spark",
however this directory is missing. I created "dockerfiles" directory and
copied Dockerfiles from resource-managers/kubernetes/docker-minimal-bundle

2. The spark-base Dockerfile expects some JAR files to be present in a
directory called "jars" - this directory is also missing. I tried rebuilding
the code, but this directory is not generated, if it is indeed supposed to be.
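
For reference, the invocation, roughly as shown in the fork's user docs (the
repository and tag here are placeholders):

  ./sbin/build-push-docker-images.sh -r docker.io/myrepo -t mytag build
  ./sbin/build-push-docker-images.sh -r docker.io/myrepo -t mytag push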

Is this a genuine/known issue, or am I missing some build steps?

Thanks,
Atul.