[jira] [Created] (SPARK-25957) Skip building spark-r docker image if spark distribution does not have R support

2018-11-06 Thread Nagaram Prasad Addepally (JIRA)
Nagaram Prasad Addepally created SPARK-25957:


 Summary: Skip building spark-r docker image if spark distribution does not have R support
 Key: SPARK-25957
 URL: https://issues.apache.org/jira/browse/SPARK-25957
 Project: Spark
 Issue Type: Improvement
 Components: Kubernetes
 Affects Versions: 2.4.0
 Reporter: Nagaram Prasad Addepally


The [docker-image-tool.sh|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh] script tries to build the spark-r image by default. We may not always build a Spark distribution with R support, so it would be good to skip building and publishing the spark-r image when R support is not available in the distribution.






[jira] [Updated] (SPARK-25957) Skip building spark-r docker image if spark distribution does not have R support

2018-11-06 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25957?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nagaram Prasad Addepally updated SPARK-25957:
-
Description: 
The [docker-image-tool.sh|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh] script tries to build the spark-r image by default. We may not always build a Spark distribution with R support, so it would be good to skip building and publishing the spark-r image when R support is not available in the distribution.

(was: [docker-image-tool.sh|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh] script by default tries to build spark-r image by default. We may not always build spark distribution with R support. It would be good to skip building and publishing spark-r images if R support is not available in the spark distribution.)

> Skip building spark-r docker image if spark distribution does not have R support
> ---------------------------------------------------------------------------------
>
> Key: SPARK-25957
> URL: https://issues.apache.org/jira/browse/SPARK-25957
> Project: Spark
> Issue Type: Improvement
> Components: Kubernetes
> Affects Versions: 2.4.0
> Reporter: Nagaram Prasad Addepally
> Priority: Major
>
> The [docker-image-tool.sh|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh]
> script tries to build the spark-r image by default. We may not always build a
> Spark distribution with R support, so it would be good to skip building and
> publishing the spark-r image when R support is not available in the distribution.






[jira] [Commented] (SPARK-25957) Skip building spark-r docker image if spark distribution does not have R support

2018-11-14 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16687276#comment-16687276 ]

Nagaram Prasad Addepally commented on SPARK-25957:
--

I think we can parameterize which images to build and publish through [docker-image-tool.sh|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh]. By default, we can build and publish all images (to keep the existing behavior intact) and provide an override option to explicitly specify which images to build. Note that we will always build the base Spark (JVM) docker image.

For example,
{noformat}
# Builds/publishes all docker images
./docker-image-tool.sh -r  -t  build|publish

# Builds/publishes only the docker images given in the select param;
# the Spark base (JVM) docker image is always built
./docker-image-tool.sh -r  -t  --select [p,R] build|publish{noformat}
Does this approach sound reasonable, or does anyone have a better suggestion?
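
For illustration, a minimal bash sketch of how such option parsing might look inside docker-image-tool.sh; the --select flag and the SELECTED variable are hypothetical names proposed here, not part of the current script:
{noformat}
# Hypothetical sketch: parse a --select flag listing the optional images to
# build. The base (JVM) image is always built; "p" selects the PySpark image
# and "R" selects the SparkR image.
SELECTED="p,R"   # default: build everything, preserving current behavior
while [ $# -gt 0 ]; do
  case "$1" in
    --select) SELECTED="$2"; shift 2 ;;
    *) break ;;
  esac
done
build_pyspark=false; build_sparkr=false
case ",$SELECTED," in *",p,"*) build_pyspark=true ;; esac
case ",$SELECTED," in *",R,"*) build_sparkr=true ;; esac{noformat}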

 







[jira] [Commented] (SPARK-25957) Skip building spark-r docker image if spark distribution does not have R support

2018-11-14 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16687337#comment-16687337 ]

Nagaram Prasad Addepally commented on SPARK-25957:
--

Thanks [~vanzin]... we can go with skip flags instead. I think we can also auto-detect the R installation by checking for the presence of the "$SPARK_HOME/R/lib" folder. Correct me if I am wrong.
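
For illustration, a minimal sketch of that check, assuming a distribution built with R support ships its R libraries under $SPARK_HOME/R/lib; the $TAG and $RDOCKERFILE variables below stand in for whatever the script actually uses:
{noformat}
# Hypothetical sketch: only build the spark-r image when the distribution
# ships R support, i.e. when the R libraries directory is present.
if [ -d "$SPARK_HOME/R/lib" ]; then
  docker build -t "spark-r:$TAG" -f "$RDOCKERFILE" .
else
  echo "Skipping spark-r image: Spark distribution was built without R support."
fi{noformat}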

I will work on this change and post a PR. Could you assign this Jira to me? I do not seem to have permission to assign it to myself.







[jira] [Commented] (SPARK-25957) Skip building spark-r docker image if spark distribution does not have R support

2018-11-15 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16688869#comment-16688869 ]

Nagaram Prasad Addepally commented on SPARK-25957:
--

Posted PR [https://github.com/apache/spark/pull/23053] for this ticket.







[jira] [Created] (SPARK-26585) [K8S] Add additional integration tests for K8s Scheduler Backend

2019-01-09 Thread Nagaram Prasad Addepally (JIRA)
Nagaram Prasad Addepally created SPARK-26585:


 Summary: [K8S] Add additional integration tests for K8s Scheduler Backend
 Key: SPARK-26585
 URL: https://issues.apache.org/jira/browse/SPARK-26585
 Project: Spark
 Issue Type: Test
 Components: Kubernetes
 Affects Versions: 3.0.0
 Reporter: Nagaram Prasad Addepally


I have reviewed the Kubernetes integration tests and found that the following cases covering scheduler backend functionality are missing:
 * Run an application with the driver and executor images specified independently
 * Request pods with custom CPU requests and limits
 * Request pods with custom memory and a memory overhead factor
 * Request pods with custom memory and memory overhead
 * Pods are relaunched on failure (as per spark.kubernetes.executor.lostCheck.maxAttempts)

Logging this Jira to add these tests; an example submission exercising these settings is sketched after this list.
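
For illustration, a minimal spark-submit sketch touching the settings above; the master URL, image names, resource values, and jar path are placeholders, while the conf keys are the standard Spark-on-Kubernetes settings these cases refer to:
{noformat}
# Hypothetical example: independent driver/executor images, custom CPU
# requests/limits, custom memory and overhead factor, and the executor
# lost-check setting. All concrete values are placeholders.
./bin/spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.driver.container.image=example/spark-driver:latest \
  --conf spark.kubernetes.executor.container.image=example/spark-executor:latest \
  --conf spark.kubernetes.executor.request.cores=0.5 \
  --conf spark.kubernetes.executor.limit.cores=1 \
  --conf spark.executor.memory=1g \
  --conf spark.kubernetes.memoryOverheadFactor=0.2 \
  --conf spark.kubernetes.executor.lostCheck.maxAttempts=5 \
  --class org.apache.spark.examples.SparkPi \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar
# (To exercise a fixed overhead instead of a factor, set
#  spark.executor.memoryOverhead=512m in place of memoryOverheadFactor.)
{noformat}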






[jira] [Commented] (SPARK-26585) [K8S] Add additional integration tests for K8s Scheduler Backend

2019-01-09 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-26585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16738944#comment-16738944 ]

Nagaram Prasad Addepally commented on SPARK-26585:
--

I am working on adding these tests.

> [K8S] Add additional integration tests for K8s Scheduler Backend
> -----------------------------------------------------------------
>
> Key: SPARK-26585
> URL: https://issues.apache.org/jira/browse/SPARK-26585
> Project: Spark
> Issue Type: Test
> Components: Kubernetes
> Affects Versions: 3.0.0
> Reporter: Nagaram Prasad Addepally
> Priority: Major
>
> I have reviewed the Kubernetes integration tests and found that the following
> cases covering scheduler backend functionality are missing:
>  * Run an application with the driver and executor images specified independently
>  * Request pods with custom CPU requests and limits
>  * Request pods with custom memory and a memory overhead factor
>  * Request pods with custom memory and memory overhead
>  * Pods are relaunched on failure (as per spark.kubernetes.executor.lostCheck.maxAttempts)
> Logging this Jira to add these tests.






[jira] [Commented] (SPARK-26585) [K8S] Add additional integration tests for K8s Scheduler Backend

2019-01-09 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-26585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16738956#comment-16738956 ]

Nagaram Prasad Addepally commented on SPARK-26585:
--

Posted PR [https://github.com/apache/spark/pull/23504] for this ticket.







[jira] [Commented] (SPARK-26585) [K8S] Add additional integration tests for K8s Scheduler Backend

2019-01-11 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-26585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16740718#comment-16740718 ]

Nagaram Prasad Addepally commented on SPARK-26585:
--

Closing this Jira as per the comments on the PR. We do not want to add integration tests; instead, we should add unit tests to cover these cases.







[jira] [Resolved] (SPARK-26585) [K8S] Add additional integration tests for K8s Scheduler Backend

2019-01-11 Thread Nagaram Prasad Addepally (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-26585?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nagaram Prasad Addepally resolved SPARK-26585.
--
Resolution: Won't Fix



