As Lucas said, those directories are generated and copied when you run a
full Maven build with the -Pkubernetes profile enabled (or follow the
instructions at
https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution
).
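For reference, a minimal sketch of such a build (assuming a Spark 2.3-era source tree; the Hadoop profile shown is only an example and may differ for your environment):

```shell
# Build a runnable distribution with Kubernetes support enabled.
# Run from the root of the Spark source tree.
./dev/make-distribution.sh --name k8s --tgz -Pkubernetes -Phadoop-2.7
```

This should leave a dist/ directory (and a tarball) containing the jars/ and Dockerfile directories that the image-build scripts expect.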

Also, we recommend using the Kubernetes integration in the main Apache
Spark project. The fork https://github.com/apache-spark-on-k8s/spark/ will
be retired once we finish upstreaming all of those features in Spark 2.4.
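With the upstream Spark 2.3.0 release, images are built with bin/docker-image-tool.sh from the distribution directory; a minimal sketch (the registry/repo name and tag below are placeholders):

```shell
# From the dist/ directory produced by the distribution build:
cd dist
./bin/docker-image-tool.sh -r docker.io/myrepo -t v2.3.0 build
./bin/docker-image-tool.sh -r docker.io/myrepo -t v2.3.0 push
```

Running the copy of the script that lives under dist/ (rather than the one in the repo root) ensures it picks up the jars and Dockerfiles laid out by the distribution build.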


On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher <lu...@vsco.co> wrote:

> Are you building on the fork or on the official release now? I built
> v2.3.0 from source without issue. One thing I noticed is that I needed to
> run the build-image command from the bin/ directory placed in dist/, as
> opposed to the one in the repo (that copy picks up the necessary targets).
>
> (Failed to reply-all to the list).
>
> On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani <sow...@gmail.com> wrote:
>
>> Hi,
>>
>> I built apache-spark-on-k8s from source on Ubuntu 16.04 and it got built
>> without errors. Next, I wanted to create docker images, so as explained at
>> https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html
>>  I used sbin/build-push-docker-images.sh to create those. While using
>> this script I came across 2 issues:
>>
>> 1. It references a "dockerfiles" directory that should be under "spark";
>> however, this directory is missing. I created the "dockerfiles" directory
>> and copied the Dockerfiles from resource-managers/kubernetes/docker-minimal-bundle
>>
>> 2. The spark-base Dockerfile expects some JAR files to be present in a
>> directory called "jars" - this directory is missing. I tried rebuilding
>> the code, but the directory does not get generated, if it is supposed to be.
>>
>> Is this a genuine/known issue, or am I missing some build steps?
>>
>> Thanks,
>> Atul.
>>
>>
>
>
> --
>
> *Lucas Kacher*, Senior Engineer
> -
> vsco.co <https://www.vsco.co/>
> New York, NY
> 818.512.5239
>