[jira] [Commented] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2023-12-05 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17793420#comment-17793420
 ] 

Shane Knapp commented on SPARK-37571:
-

a blast from the past!  XD

sounds good to me...  there are still some vestiges of amplab and jenkins in 
some bits of the repo but nothing really to write home about:

[https://github.com/search?q=repo%3Aapache%2Fspark+amplab&type=code]

[https://github.com/search?q=repo%3Aapache%2Fspark+jenkins&type=code]

the 'jenkins' entries *mostly* look to be setting bits for tests' setup and my 
quick perusal didn't raise any flags.  it might be a decent idea to audit this 
stuff at some point and pull it out.  :shrug:
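The audit suggested above can be sketched as a small script. This is a minimal, hypothetical example (not anything from the Spark repo): it walks a local checkout and lists files that still mention "amplab" or "jenkins", so they can be reviewed and pulled out.

```python
import os
import re

# Case-insensitive search for leftover references.
PATTERN = re.compile(r"amplab|jenkins", re.IGNORECASE)

def audit(repo_root="."):
    """Return paths under repo_root whose contents mention amplab/jenkins."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(repo_root):
        # Skip VCS metadata.
        dirnames[:] = [d for d in dirnames if d != ".git"]
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if any(PATTERN.search(line) for line in f):
                        hits.append(path)
            except OSError:
                continue  # unreadable file; ignore
    return hits
```

Pointing `audit()` at a Spark clone would produce roughly the same hit list as the GitHub code searches above.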

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.3.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
> Attachments: audit.txt, spark-repo-to-be-audited.txt
>
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17454925#comment-17454925
 ] 

Shane Knapp commented on SPARK-37571:
-

this is gonna take a while...  nearly a decade later, jenkins' reach is pretty 
deep:  [^audit.txt]

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
> Attachments: audit.txt, spark-repo-to-be-audited.txt
>
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Attachment: audit.txt

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
> Attachments: audit.txt, spark-repo-to-be-audited.txt
>
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Attachment: spark-repo-to-be-audited.txt

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
> Attachments: spark-repo-to-be-audited.txt
>
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Attachment: spark-repo-to-be-audited.txt

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Attachment: (was: spark-repo-to-be-audited.txt)

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Summary: decouple amplab jenkins from spark website, builds and tests  
(was: decouple jenkins from spark builds and tests)

> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific scripts and 
> infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Updated] (SPARK-37571) decouple amplab jenkins from spark website, builds and tests

2021-12-07 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-37571:

Description: 
we will be turning off jenkins on dec 23rd, and we need to decouple the build 
infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
website, scripts and infra setup.

i'll be creating > 1 PRs for this.

  was:
we will be turning off jenkins on dec 23rd, and we need to decouple the build 
infra from jenkins, as well as remove any amplab jenkins-specific scripts and 
infra setup.

i'll be creating > 1 PRs for this.


> decouple amplab jenkins from spark website, builds and tests
> 
>
> Key: SPARK-37571
> URL: https://issues.apache.org/jira/browse/SPARK-37571
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Shane Knapp
>Assignee: Shane Knapp
>Priority: Major
>
> we will be turning off jenkins on dec 23rd, and we need to decouple the build 
> infra from jenkins, as well as remove any amplab jenkins-specific docs on the 
> website, scripts and infra setup.
> i'll be creating > 1 PRs for this.






[jira] [Created] (SPARK-37571) decouple jenkins from spark builds and tests

2021-12-07 Thread Shane Knapp (Jira)
Shane Knapp created SPARK-37571:
---

 Summary: decouple jenkins from spark builds and tests
 Key: SPARK-37571
 URL: https://issues.apache.org/jira/browse/SPARK-37571
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 3.2.0
Reporter: Shane Knapp
Assignee: Shane Knapp


we will be turning off jenkins on dec 23rd, and we need to decouple the build 
infra from jenkins, as well as remove any amplab jenkins-specific scripts and 
infra setup.

i'll be creating > 1 PRs for this.






[jira] [Commented] (SPARK-37109) Install Java 17 on all of the Jenkins workers

2021-10-26 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37109?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17434581#comment-17434581
 ] 

Shane Knapp commented on SPARK-37109:
-

yep, jenkins is going away at the end of this year...  all support is currently 
'best effort'. 

> Install Java 17 on all of the Jenkins workers
> -
>
> Key: SPARK-37109
> URL: https://issues.apache.org/jira/browse/SPARK-37109
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.3.0
>Reporter: Yuming Wang
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-37011) Upgrade flake8 to 3.9.0 or above in Jenkins

2021-10-25 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-37011.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 34384
[https://github.com/apache/spark/pull/34384]

> Upgrade flake8 to 3.9.0 or above in Jenkins
> ---
>
> Key: SPARK-37011
> URL: https://issues.apache.org/jira/browse/SPARK-37011
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.3.0
>Reporter: Takuya Ueshin
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> In flake8 < 3.9.0, an F401 error occurs for imports when the imported identifiers 
> are used in a {{bound}} argument in {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
> from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.8.0 to 3.9.0.
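The false positive described above can be reproduced in miniature. This sketch uses `decimal.Decimal` as a stand-in for pyspark's `IndexOpsMixin` so it is self-contained; the pattern is the same as the {code} block in the issue.

```python
from typing import TYPE_CHECKING, TypeVar

if TYPE_CHECKING:
    # Only evaluated by static type checkers; never imported at runtime.
    # flake8 < 3.9.0 wrongly reported this import as unused (F401) even
    # though the name is used in the string bound below.
    from decimal import Decimal  # stand-in for pyspark's IndexOpsMixin

# The quoted bound is a forward reference, resolved lazily by checkers.
NumberLike = TypeVar("NumberLike", bound="Decimal")
```

At runtime `TYPE_CHECKING` is `False`, so the guarded import never executes; flake8 >= 3.9.0 understands that the string bound counts as a use.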






[jira] [Assigned] (SPARK-37011) Upgrade flake8 to 3.9.0 or above in Jenkins

2021-10-25 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reassigned SPARK-37011:
---

Assignee: Shane Knapp

> Upgrade flake8 to 3.9.0 or above in Jenkins
> ---
>
> Key: SPARK-37011
> URL: https://issues.apache.org/jira/browse/SPARK-37011
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.3.0
>Reporter: Takuya Ueshin
>Assignee: Shane Knapp
>Priority: Major
>
> In flake8 < 3.9.0, an F401 error occurs for imports when the imported identifiers 
> are used in a {{bound}} argument in {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
> from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.8.0 to 3.9.0.






[jira] [Commented] (SPARK-37011) Upgrade flake8 to 3.9.0 or above in Jenkins

2021-10-25 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17433954#comment-17433954
 ] 

Shane Knapp commented on SPARK-37011:
-

from the test build 
(https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/144597/consoleFull):

 
{noformat}
starting python compilation test...
python compilation succeeded.

The python3 -m black command was not found. Skipping black checks for now.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}
regardless of the overall build result, the flake8 version tested against passed.  
i'm going to close this and merge the PR.

> Upgrade flake8 to 3.9.0 or above in Jenkins
> ---
>
> Key: SPARK-37011
> URL: https://issues.apache.org/jira/browse/SPARK-37011
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.3.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> In flake8 < 3.9.0, an F401 error occurs for imports when the imported identifiers 
> are used in a {{bound}} argument in {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
> from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.8.0 to 3.9.0.






[jira] [Commented] (SPARK-37011) Upgrade flake8 to 3.9.0 or above in Jenkins

2021-10-25 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17433923#comment-17433923
 ] 

Shane Knapp commented on SPARK-37011:
-

PR w/the updated conda config:

[https://github.com/apache/spark/pull/34384]

i will watch the build results for passing python style tests.

> Upgrade flake8 to 3.9.0 or above in Jenkins
> ---
>
> Key: SPARK-37011
> URL: https://issues.apache.org/jira/browse/SPARK-37011
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.3.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> In flake8 < 3.9.0, an F401 error occurs for imports when the imported identifiers 
> are used in a {{bound}} argument in {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
> from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.8.0 to 3.9.0.






[jira] [Commented] (SPARK-37011) Upgrade flake8 to 3.9.0 or above in Jenkins

2021-10-25 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-37011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17433916#comment-17433916
 ] 

Shane Knapp commented on SPARK-37011:
-

done.

 
{noformat}
(py36) jenkins@research-jenkins-worker-01:~/sknapp/spark/dev$ python
Python 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import flake8
>>> flake8.__version__
'3.9.0'
>>>
(py36) jenkins@research-jenkins-worker-01:~/sknapp/spark/dev$ ./lint-python
starting python compilation test...
python compilation succeeded.

The python3 -m black command was not found. Skipping black checks for now.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}

> Upgrade flake8 to 3.9.0 or above in Jenkins
> ---
>
> Key: SPARK-37011
> URL: https://issues.apache.org/jira/browse/SPARK-37011
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.3.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> In flake8 < 3.9.0, an F401 error occurs for imports when the imported identifiers 
> are used in a {{bound}} argument in {{TypeVar(..., bound="XXX")}}.
> For example:
> {code:python}
> if TYPE_CHECKING:
> from pyspark.pandas.base import IndexOpsMixin
> IndexOpsLike = TypeVar("IndexOpsLike", bound="IndexOpsMixin")
> {code}
> Since this behavior is fixed in flake8 >= 3.9.0, we should upgrade the flake8 
> installed in Jenkins to 3.9.0 or above.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.8.0 to 3.9.0.






[jira] [Comment Edited] (SPARK-34943) Upgrade flake8 to 3.8.0 or above in Jenkins

2021-09-13 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17414616#comment-17414616
 ] 

Shane Knapp edited comment on SPARK-34943 at 9/13/21, 10:15 PM:


flake8 tests passing w/3.8.0!

from [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/143223]
{noformat}

Running Python style checks

starting python compilation test...
python compilation succeeded.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}
checking on the jenkins worker directly:
{noformat}
(py36) jenkins@research-jenkins-worker-08:~/workspace$ grep MINIMUM_FLAKE8
SparkPullRequestBuilder/dev/lint-python MINIMUM_FLAKE8="3.8.0"
{noformat}


was (Author: shaneknapp):
flake8 tests passing w/3.8.0!

from [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/143223]
{noformat}

Running Python style checks

starting python compilation test...
python compilation succeeded.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}
checking on the jenkins worker directly:
{noformat}
(py36) jenkins@research-jenkins-worker-08:~/workspace$ grep MINIMUM_FLAKE8 
SparkPullRequestBuilder/dev/lint-python MINIMUM_FLAKE8="3.8.0"
{noformat}

> Upgrade flake8 to 3.8.0 or above in Jenkins
> ---
>
> Key: SPARK-34943
> URL: https://issues.apache.org/jira/browse/SPARK-34943
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.2.0
>Reporter: Haejoon Lee
>Assignee: Shane Knapp
>Priority: Major
>
> In flake8 < 3.8.0, F401 error occurs for imports in *if* statements when 
> TYPE_CHECKING is True. However, TYPE_CHECKING is always False at runtime, so 
> there is no need to treat it as an error in static analysis.
> Since this behavior is fixed in flake8 >= 3.8.0, we should upgrade the flake8 
> installed in Jenkins to 3.8.0 or above. Otherwise, F401 errors occur on 
> several lines in pandas-on-PySpark that use TYPE_CHECKING.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.5.0 to 3.8.0.






[jira] [Commented] (SPARK-34943) Upgrade flake8 to 3.8.0 or above in Jenkins

2021-09-13 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17414616#comment-17414616
 ] 

Shane Knapp commented on SPARK-34943:
-

flake8 tests passing w/3.8.0!

from [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/143223]
{noformat}

Running Python style checks

starting python compilation test...
python compilation succeeded.

downloading pycodestyle from 
https://raw.githubusercontent.com/PyCQA/pycodestyle/2.7.0/pycodestyle.py...
starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.

all lint-python tests passed!{noformat}
checking on the jenkins worker directly:
{noformat}
(py36) jenkins@research-jenkins-worker-08:~/workspace$ grep MINIMUM_FLAKE8 
SparkPullRequestBuilder/dev/lint-python MINIMUM_FLAKE8="3.8.0"
{noformat}
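The `MINIMUM_FLAKE8` check confirmed above can be sketched as a simple version gate. This is a simplified reimplementation for illustration, not the actual code of `dev/lint-python`:

```python
def version_tuple(version: str) -> tuple:
    """Turn '3.8.0' into (3, 8, 0) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def meets_minimum(installed: str, minimum: str) -> bool:
    """True when the installed version satisfies the required minimum."""
    return version_tuple(installed) >= version_tuple(minimum)
```

Comparing tuples rather than raw strings avoids lexicographic traps such as "3.10.0" sorting below "3.8.0".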

> Upgrade flake8 to 3.8.0 or above in Jenkins
> ---
>
> Key: SPARK-34943
> URL: https://issues.apache.org/jira/browse/SPARK-34943
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.2.0
>Reporter: Haejoon Lee
>Assignee: Shane Knapp
>Priority: Major
>
> In flake8 < 3.8.0, an F401 error occurs for imports in *if* statements when 
> TYPE_CHECKING is True. However, TYPE_CHECKING is always False at runtime, so 
> there is no need to treat it as an error in static analysis.
> Since this behavior is fixed in flake8 >= 3.8.0, we should upgrade the flake8 
> installed in Jenkins to 3.8.0 or above. Otherwise, F401 errors occur on 
> several lines in pandas-on-PySpark that use TYPE_CHECKING.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.5.0 to 3.8.0.






[jira] [Commented] (SPARK-34943) Upgrade flake8 to 3.8.0 or above in Jenkins

2021-09-13 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17414585#comment-17414585
 ] 

Shane Knapp commented on SPARK-34943:
-

done:

 
{noformat}
parallel-ssh -h ubuntu_workers.txt -i 
'/home/jenkins/anaconda2/envs/py36/bin/python -c "import flake8; 
print(flake8.__version__)"'
[1] 13:58:53 [SUCCESS] research-jenkins-worker-03
3.8.0
[2] 13:58:53 [SUCCESS] research-jenkins-worker-02
3.8.0
[3] 13:58:53 [SUCCESS] research-jenkins-worker-06
3.8.0
[4] 13:58:53 [SUCCESS] research-jenkins-worker-07
3.8.0
[5] 13:58:53 [SUCCESS] research-jenkins-worker-05
3.8.0
[6] 13:58:53 [SUCCESS] research-jenkins-worker-04
3.8.0
[7] 13:58:53 [SUCCESS] research-jenkins-worker-01
3.8.0
[8] 13:58:54 [SUCCESS] research-jenkins-worker-08
3.8.0{noformat}

> Upgrade flake8 to 3.8.0 or above in Jenkins
> ---
>
> Key: SPARK-34943
> URL: https://issues.apache.org/jira/browse/SPARK-34943
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.2.0
>Reporter: Haejoon Lee
>Assignee: Shane Knapp
>Priority: Major
>
> In flake8 < 3.8.0, F401 error occurs for imports in *if* statements when 
> TYPE_CHECKING is True. However, TYPE_CHECKING is always False at runtime, so 
> there is no need to treat it as an error in static analysis.
> Since this behavior is fixed in flake8 >= 3.8.0, we should upgrade the flake8 
> installed in Jenkins to 3.8.0 or above. Otherwise, F401 errors occur on 
> several lines in pandas-on-PySpark that use TYPE_CHECKING.
> And also we might update the {{MINIMUM_FLAKE8}} in the {{lint-python}} from 
> 3.5.0 to 3.8.0.






[jira] [Resolved] (SPARK-35430) Investigate the failure of "PVs with local storage" integration test on Docker driver

2021-08-02 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-35430?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-35430.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 32793
[https://github.com/apache/spark/pull/32793]

> Investigate the failure of "PVs with local storage" integration test on 
> Docker driver
> -
>
> Key: SPARK-35430
> URL: https://issues.apache.org/jira/browse/SPARK-35430
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Attila Zsolt Piros
>Priority: Major
> Fix For: 3.3.0
>
>
> With https://issues.apache.org/jira/browse/SPARK-34738 the integration tests 
> were migrated to Docker, but "PVs with local storage" was failing, so in 
> https://github.com/apache/spark/pull/31829 we created a separate test tag 
> called "persistentVolume", which is not used by dev-run-integration-tests.sh; 
> this way those tests are skipped.
> Here we should revert "persistentVolume" and investigate the error.






[jira] [Assigned] (SPARK-35430) Investigate the failure of "PVs with local storage" integration test on Docker driver

2021-08-02 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-35430?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reassigned SPARK-35430:
---

Assignee: Attila Zsolt Piros

> Investigate the failure of "PVs with local storage" integration test on 
> Docker driver
> -
>
> Key: SPARK-35430
> URL: https://issues.apache.org/jira/browse/SPARK-35430
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Attila Zsolt Piros
>Priority: Major
>
> With https://issues.apache.org/jira/browse/SPARK-34738 the integration tests 
> were migrated to Docker, but "PVs with local storage" was failing, so in 
> https://github.com/apache/spark/pull/31829 we created a separate test tag 
> called "persistentVolume", which is not used by dev-run-integration-tests.sh; 
> this way those tests are skipped.
> Here we should revert "persistentVolume" and investigate the error.






[jira] [Resolved] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-26 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-32797.
-
Resolution: Won't Fix

re comments from [https://github.com/apache/spark/pull/33503]

since we test mypy w/github actions, we don't need this to be installed on the 
jenkins workers.

> Install mypy on the Jenkins CI workers
> --
>
> Key: SPARK-32797
> URL: https://issues.apache.org/jira/browse/SPARK-32797
> Project: Spark
>  Issue Type: Improvement
>  Components: jenkins, PySpark
>Affects Versions: 3.1.0
>Reporter: Fokko Driesprong
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> We want to check the types of the PySpark code. This requires mypy to be 
> installed on the CI. Can you do this [~shaneknapp]? 
> Related PR: [https://github.com/apache/spark/pull/29180]
> You can install this using pip: [https://pypi.org/project/mypy/] Should be 
> similar to flake8 and sphinx. The latest version is ok! Thanks!






[jira] [Comment Edited] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-23 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17386522#comment-17386522
 ] 

Shane Knapp edited comment on SPARK-32797 at 7/24/21, 1:05 AM:
---

[~hyukjin.kwon] [~fokko] this change breaks the current build (properly, i 
assume, but fails nonetheless):

[https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/141578/console]


was (Author: shaneknapp):
[~hyukjin.kwon] this is now causing all builds to fail (properly, i assume, but 
fail nonetheless):

https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/141578/console

> Install mypy on the Jenkins CI workers
> --
>
> Key: SPARK-32797
> URL: https://issues.apache.org/jira/browse/SPARK-32797
> Project: Spark
>  Issue Type: Improvement
>  Components: jenkins, PySpark
>Affects Versions: 3.1.0
>Reporter: Fokko Driesprong
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> We want to check the types of the PySpark code. This requires mypy to be 
> installed on the CI. Can you do this [~shaneknapp]? 
> Related PR: [https://github.com/apache/spark/pull/29180]
> You can install this using pip: [https://pypi.org/project/mypy/] Should be 
> similar to flake8 and sphinx. The latest version is ok! Thanks!






[jira] [Commented] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-23 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17386524#comment-17386524
 ] 

Shane Knapp commented on SPARK-32797:
-

i've temporarily disabled mypy on the workers:

{noformat}
for x in `cat ~/txt/jenkins-workers.txt`; do
  ssh jenkins@${x} "mv /home/jenkins/anaconda2/envs/py36/bin/mypy /home/jenkins/anaconda2/envs/py36/bin/mypy.bak"
done
{noformat}

the files that need fixing are here:
{noformat}
starting mypy test...
mypy checks failed:
python/pyspark/pandas/data_type_ops/categorical_ops.py:128: error: Function is missing a type annotation
python/pyspark/mllib/tree.pyi:29: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/tree.pyi:38: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:34: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:42: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:48: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:54: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:76: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:124: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/feature.pyi:165: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/clustering.pyi:45: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/clustering.pyi:72: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/classification.pyi:39: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
python/pyspark/mllib/classification.pyi:52: error: Overloaded function signatures 1 and 2 overlap with incompatible return types
{noformat}
could someone create a PR that addresses these, and once that's done i'll 
re-enable mypy on the workers and we'll coordinate a quick test.
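for context, here's a hypothetical illustration (not taken from the actual Spark stubs) of the kind of overload pair mypy rejects with "Overloaded function signatures 1 and 2 overlap with incompatible return types": mypy promotes int to float, so an int argument matches *both* signatures, yet the two promise incompatible return types.

```python
from typing import Union, overload

# Hypothetical example, not from the Spark codebase: because mypy treats
# int as compatible with float, an int argument matches BOTH overloads
# below, but signature 1 promises str while signature 2 promises float,
# so mypy reports "Overloaded function signatures 1 and 2 overlap with
# incompatible return types".
@overload
def describe(x: int) -> str: ...
@overload
def describe(x: float) -> float: ...
def describe(x: Union[int, float]) -> Union[str, float]:
    # Runtime dispatch matching the stub signatures above.
    if isinstance(x, int):
        return "int:" + str(x)
    return x * 1.0
```

a typical fix is to narrow or reorder the signatures so no single argument can match two overloads with conflicting return types.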

 

 







[jira] [Commented] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-23 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17386522#comment-17386522
 ] 

Shane Knapp commented on SPARK-32797:
-

[~hyukjin.kwon] this is now causing all builds to fail (properly, i assume, but 
they fail nonetheless):

https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/141578/console







[jira] [Reopened] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-23 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reopened SPARK-32797:
-

the version of mypy i installed wasn't current enough for the tests to run...  
fixing that now.







[jira] [Resolved] (SPARK-33242) Install numpydoc in Jenkins machines

2021-07-21 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-33242.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 33469
[https://github.com/apache/spark/pull/33469]

> Install numpydoc in Jenkins machines
> 
>
> Key: SPARK-33242
> URL: https://issues.apache.org/jira/browse/SPARK-33242
> Project: Spark
>  Issue Type: Test
>  Components: Project Infra, PySpark
>Affects Versions: 3.1.0
>Reporter: Hyukjin Kwon
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> To switch from reST style to numpydoc style, we should install numpydoc as 
> well. This is being used in Sphinx. See the parent JIRA as well.






[jira] [Resolved] (SPARK-32391) Install pydata_sphinx_theme in Jenkins machines

2021-07-21 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-32391?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-32391.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 33469
[https://github.com/apache/spark/pull/33469]

> Install pydata_sphinx_theme in Jenkins machines
> ---
>
> Key: SPARK-32391
> URL: https://issues.apache.org/jira/browse/SPARK-32391
> Project: Spark
>  Issue Type: Test
>  Components: Project Infra, PySpark
>Affects Versions: 3.0.1
>Reporter: Hyukjin Kwon
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> After SPARK-32179, {{pydata_sphinx_theme}} 
> https://pypi.org/project/pydata-sphinx-theme/ is needed as a new Python 
> dependency for PySpark documentation build.
> We should install it in Jenkins to test PySpark documentation build in Python 
> 3.






[jira] [Resolved] (SPARK-32666) Install ipython and nbsphinx in Jenkins for Binder integration

2021-07-21 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-32666?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-32666.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 33469
[https://github.com/apache/spark/pull/33469]

> Install ipython and nbsphinx in Jenkins for Binder integration
> --
>
> Key: SPARK-32666
> URL: https://issues.apache.org/jira/browse/SPARK-32666
> Project: Spark
>  Issue Type: Test
>  Components: Project Infra
>Affects Versions: 3.1.0
>Reporter: Hyukjin Kwon
>Assignee: Shane Knapp
>Priority: Major
> Fix For: 3.3.0
>
>
> Binder integration requires IPython and nbsphinx to use the notebook file as 
> the documentation in PySpark.
> See SPARK-32204 and its PR for more details.






[jira] [Resolved] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-21 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-32797.
-
Fix Version/s: 3.3.0
   Resolution: Fixed

Issue resolved by pull request 33469
[https://github.com/apache/spark/pull/33469]







[jira] [Commented] (SPARK-32391) Install pydata_sphinx_theme in Jenkins machines

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385044#comment-17385044
 ] 

Shane Knapp commented on SPARK-32391:
-

anyways, i installed this via conda and will roll out to all workers later this 
week.  :)







[jira] [Commented] (SPARK-34930) Install PyArrow and pandas on Jenkins

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385040#comment-17385040
 ] 

Shane Knapp commented on SPARK-34930:
-

oh yeah, a LOT of those skipped tests are for pypy3, not python3.6

> Install PyArrow and pandas on Jenkins
> -
>
> Key: SPARK-34930
> URL: https://issues.apache.org/jira/browse/SPARK-34930
> Project: Spark
>  Issue Type: Test
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Hyukjin Kwon
>Assignee: Shane Knapp
>Priority: Critical
>
> Looks like Jenkins machines don't have pandas and PyArrow (ever since it got 
> upgraded?), which results in skipping related tests in PySpark; see also 
> https://github.com/apache/spark/pull/31470#issuecomment-811618571
> It would be great if we can install both in Python 3.6 on Jenkins.






[jira] [Commented] (SPARK-32797) Install mypy on the Jenkins CI workers

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385038#comment-17385038
 ] 

Shane Knapp commented on SPARK-32797:
-

ill roll this out (and other python package updates) later today/this week.







[jira] [Resolved] (SPARK-29183) Upgrade JDK 11 Installation to 11.0.6

2021-07-21 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-29183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-29183.
-
Resolution: Fixed

this is done and all java11 installs are at 11.0.10

> Upgrade JDK 11 Installation to 11.0.6
> -
>
> Key: SPARK-29183
> URL: https://issues.apache.org/jira/browse/SPARK-29183
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 3.1.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> Every JDK 11.0.x releases have many fixes including performance regression 
> fix. We had better upgrade it to the latest 11.0.4.
> - https://bugs.java.com/bugdatabase/view_bug.do?bug_id=JDK-8221760






[jira] [Commented] (SPARK-34930) Install PyArrow and pandas on Jenkins

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385030#comment-17385030
 ] 

Shane Knapp commented on SPARK-34930:
-

pandas is installed, so i'm a little curious as to why the tests aren't running:
{noformat}
jenkins@research-jenkins-worker-01:~$ python
Python 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pandas
>>> pandas.__version__
'0.24.2'
>>>{noformat}
pyarrow is a much more complex install than just adding the package, and 
requires manual compilation.  i'll revisit pyarrow in the next couple of 
weeks...
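for anyone wondering how the skips happen: here's a minimal sketch of the usual guard pattern (an assumed pattern for illustration, not the actual pyspark test-helper code) where the suite probes for optional packages and skips dependent tests when they're missing:

```python
import importlib.util

# Assumed illustration, not the real pyspark.testing helper: probe whether
# an optional dependency is importable without actually importing it, and
# keep boolean guards that test decorators can check.
def have_package(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

have_pandas = have_package("pandas")
have_pyarrow = have_package("pyarrow")
```

with guards like these, a missing pyarrow silently turns whole test classes into skips, which is exactly the symptom reported above.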







[jira] [Commented] (SPARK-32391) Install pydata_sphinx_theme in Jenkins machines

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385026#comment-17385026
 ] 

Shane Knapp commented on SPARK-32391:
-

[~hyukjin.kwon] i am able to install this via conda...  any particular reason 
why you're requesting this through pip?

 

pydata-sphinx-theme-0.6.3 | pyhd8ed1ab_0 1.3 MB conda-forge
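fwiw the conda invocation is roughly the following (a sketch based on the package line above; the exact channel/env flags used on the workers may have differed):

```shell
# install the theme from conda-forge, pinned to the version conda resolved
# above (assumed invocation; adjust the env/channel flags as needed)
conda install -c conda-forge pydata-sphinx-theme=0.6.3
```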







[jira] [Commented] (SPARK-32666) Install ipython and nbsphinx in Jenkins for Binder integration

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-32666?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385021#comment-17385021
 ] 

Shane Knapp commented on SPARK-32666:
-

ill roll this out (and other python package updates) later today/this week.







[jira] [Commented] (SPARK-33242) Install numpydoc in Jenkins machines

2021-07-21 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33242?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17385022#comment-17385022
 ] 

Shane Knapp commented on SPARK-33242:
-

ill roll this out (and other python package updates) later today/this week.







[jira] [Resolved] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-05-17 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-34738.
-
Resolution: Fixed

k8s tests are passing!  marking this as 'resolved'.

> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance for using it for 
> testing on Jenkins after the Minikube version is updated.
>  
> Added by Shane:
> we also need to move from the kvm2 virtualization layer to docker.  docker is 
> a recommended driver w/the latest versions of minikube, and this will allow 
> devs to more easily recreate the minikube/k8s env on their local workstations 
> and run the integration tests in an environment identical to jenkins.
> the TL;DR is that upgrading to docker works, except that the PV integration 
> tests are failing due to a couple of possible reasons:
> 1) the 'spark-kubernetes-driver' isn't properly being loaded 
> (https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)
> 2) during the PV test run, the error message 'Given path 
> (/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist' shows up in 
> the logs.  however, the mk cluster *does* mount successfully to the local 
> bare-metal filesystem *and* if i 'minikube ssh' in to it, i can see the mount 
> and read/write successfully to it 
> (https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312548=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312548)
> i could really use some help, and if it's useful, i can create some local 
> accounts manually and allow ssh access for a couple of people to assist me.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-05-17 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17346437#comment-17346437
 ] 

Shane Knapp commented on SPARK-34738:
-

alrighty, i've updated the remaining 6 jenkins workers w/mk 1.18.1 and set k8s 
to 1.17.3 and updated all of the k8s builds to use the docker driver instead of 
kvm2.

i triggered a few builds, so let's wait to see if things go green before we 
close this.
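for the record, the per-worker change amounts to roughly this (a sketch reconstructed from the versions discussed in this ticket; the exact invocation on the workers may have differed):

```shell
# pin the kubernetes version and switch minikube from the kvm2 driver to
# docker (versions taken from this ticket; adjust as needed)
minikube config set kubernetes-version v1.17.3
minikube config set driver docker
minikube delete   # drop any existing kvm2-based cluster
minikube start    # recreate it with the docker driver
```

since the docker driver is what minikube recommends on recent versions, devs can run the same commands locally to reproduce the jenkins k8s environment.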







[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-05-10 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17342024#comment-17342024
 ] 

Shane Knapp commented on SPARK-34738:
-

sorry i dropped off the radar...  i've been dealing w/a serious health issue 
these past few weeks (which is sorted), and i will update the remaining workers 
this week.







[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-16 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17324131#comment-17324131
 ] 

Shane Knapp commented on SPARK-34738:
-

alright, my canary build w/skipping the PV integration test passed w/the docker 
driver: https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone/20/

i'll put together a PR for this over the weekend (it's a one-liner) and once we 
merge i can get the remaining workers upgraded early next week.







[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Description: 
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that upgrading to docker works, except that the PV integration 
tests are failing due to a couple of possible reasons:

1) the 'spark-kubernetes-driver' isn't properly being loaded 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)

2) during the PV test run, the error message 'Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist' shows up in 
the logs.  however, the mk cluster *does* mount successfully to the local 
bare-metal filesystem *and* if i 'minikube ssh' in to it, i can see the mount 
and read/write successfully to it 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312548&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312548)

i could really use some help, and if it's useful, i can create some local 
accounts manually and allow ssh access for a couple of people to assist me.

  was:
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that upgrading to docker works, except that the PV integration 
tests are failing due to a couple of possible reasons:

1) the 'spark-kubernetes-driver' isn't properly being loaded 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)

2) during the PV test run, the error message 'Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist' shows up in 
the logs.  however, the mk cluster *does* mount successfully to the local 
bare-metal filesystem *and* if i 'minikube ssh' in to it, i can see the mount 
and read/write successfully to it (


> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.
>  
> Added by Shane:
> we also need to move from the kvm2 virtualization layer to docker.  docker is 
> a recommended driver w/the latest versions of minikube, and this will allow 
> devs to more easily recreate the minikube/k8s env on 

[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Description: 
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that upgrading to docker works, except that the PV integration 
tests are failing due to a couple of possible reasons:

1) the 'spark-kubernetes-driver' isn't properly being loaded 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)

2) during the PV test run, the error message 'Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist' shows up in 
the logs.  however, the mk cluster *does* mount successfully to the local 
bare-metal filesystem *and* if i 'minikube ssh' in to it, i can see the mount 
and read/write successfully to it (

  was:
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that upgrading to docker works, except that the PV integration 
tests are failing due to a couple of possible reasons:

1) the 'spark-kubernetes-driver' isn't properly being loaded 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)

2) /


> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.
>  
> Added by Shane:
> we also need to move from the kvm2 virtualization layer to docker.  docker is 
> a recommended driver w/the latest versions of minikube, and this will allow 
> devs to more easily recreate the minikube/k8s env on their local workstations 
> and run the integration tests in an identical environment as jenkins.
> the TL;DR is that upgrading to docker works, except that the PV integration 
> tests are failing due to a couple of possible reasons:
> 1) the 'spark-kubernetes-driver' isn't properly being loaded 
> (https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)
> 2) during the PV test run, the error message 'Given path 
> (/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist' shows up in 
> the logs.  however, the mk 

[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Description: 
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that upgrading to docker works, except that the PV integration 
tests are failing due to a couple of possible reasons:

1) the 'spark-kubernetes-driver' isn't properly being loaded 
(https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)

2) /

  was:
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that the PV integration tests are failing due to 


> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.
>  
> Added by Shane:
> we also need to move from the kvm2 virtualization layer to docker.  docker is 
> a recommended driver w/the latest versions of minikube, and this will allow 
> devs to more easily recreate the minikube/k8s env on their local workstations 
> and run the integration tests in an identical environment as jenkins.
> the TL;DR is that upgrading to docker works, except that the PV integration 
> tests are failing due to a couple of possible reasons:
> 1) the 'spark-kubernetes-driver' isn't properly being loaded 
> (https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312517&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312517)
> 2) /






[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Description: 
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.

 

Added by Shane:

we also need to move from the kvm2 virtualization layer to docker.  docker is a 
recommended driver w/the latest versions of minikube, and this will allow devs 
to more easily recreate the minikube/k8s env on their local workstations and 
run the integration tests in an identical environment as jenkins.

the TL;DR is that the PV integration tests are failing due to 

  was:
[~shaneknapp] as we discussed [on the mailing 
list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
 Minikube can be upgraded to the latest (v1.18.1) and kubernetes version should 
be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).

[Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
method to configure the kubernetes client. Thanks in advance to use it for 
testing on the Jenkins after the Minikube version is updated.




> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.
>  
> Added by Shane:
> we also need to move from the kvm2 virtualization layer to docker.  docker is 
> a recommended driver w/the latest versions of minikube, and this will allow 
> devs to more easily recreate the minikube/k8s env on their local workstations 
> and run the integration tests in an identical environment as jenkins.
> the TL;DR is that the PV integration tests are failing due to 






[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes and move to docker 'virtualization' layer

2021-04-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Summary: Upgrade Minikube and kubernetes and move to docker 
'virtualization' layer  (was: Upgrade Minikube and kubernetes cluster version 
on Jenkins)

> Upgrade Minikube and kubernetes and move to docker 'virtualization' layer
> -
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17318132#comment-17318132
 ] 

Shane Knapp commented on SPARK-34738:
-

i would LOVE to just use the docker driver for minikube.  this wasn't an option 
when we first deployed it for the k8s integration tests, but i think it'll go a 
long way towards helping folks run test infra locally on laptops, rather than 
needing to track down an ubuntu box...  :)
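A sketch of what that local workflow could look like with the docker driver. The test-runner path below comes from the apache/spark repo's kubernetes integration-tests module; verify it (and its flags) in your own checkout:

```shell
# Start a local cluster that mirrors the proposed Jenkins setup
minikube start --driver=docker --kubernetes-version=v1.17.3

# Run Spark's k8s integration tests against it
./resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh \
  --deploy-mode minikube
```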

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317436#comment-17317436
 ] 

Shane Knapp commented on SPARK-34738:
-

no, you're right.  i just noticed that 'minikube' was already in the list.  

still, that bit of code just doesn't smell right...  we can't always rely on 
the name of the cluster being 'minikube' and adding any permutation to this 
list seems fragile.  is that withValues necessary?

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Comment Edited] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317418#comment-17317418
 ] 

Shane Knapp edited comment on SPARK-34738 at 4/8/21, 6:40 PM:
--

{noformat}
Given path (/opt/spark/pv-tests/tmp5813668424419880732.txt) does not exist
{noformat}
this is supposed to be mounted in the minikube cluster, not on the bare metal.

> But sth is mounted to there:
{noformat}
 Mounts:
  /opt/spark/conf from spark-conf-volume-driver (rw)
  /opt/spark/pv-tests from data (rw)
...
{noformat}
> But my guess this is not the one which connects the host path 
> "PVC_TESTS_HOST_PATH" with the internal minikube mounted path (the one which 
> goes further to the driver/executor). This is why the locally created file is 
> missing.

 

it's properly mounting the local (bare metal) filesystem, as is able to create 
the file.  see:  
https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312548&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312548
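A quick way to reproduce that check by hand. PVC_TESTS_HOST_PATH is the env var mentioned above; the probe filename is made up for illustration:

```shell
# Expose the host (bare metal) directory inside the minikube VM/container
minikube mount "${PVC_TESTS_HOST_PATH}:/opt/spark/pv-tests" &

# From inside the cluster node, confirm the mount is visible and writable
minikube ssh -- "ls -l /opt/spark/pv-tests && touch /opt/spark/pv-tests/probe.txt"
```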


was (Author: shaneknapp):
{noformat}
Given path (/opt/spark/pv-tests/tmp5813668424419880732.txt) does not exist
{noformat}
this is supposed to be mounted in the minikube cluster, not on the bare metal.

> But sth is mounted to there:
{noformat}
 Mounts:
  /opt/spark/conf from spark-conf-volume-driver (rw)
  /opt/spark/pv-tests from data (rw)
...
{noformat}
> But my guess this is not the one which connects the host path 
> "PVC_TESTS_HOST_PATH" with the internal minikube mounted path (the one which 
> goes further to the driver/executor). This is why the locally created file is 
> missing.

 

it's properly mounting the local (bare metal) filesystem.  see:  
https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312548&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312548

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317418#comment-17317418
 ] 

Shane Knapp commented on SPARK-34738:
-

{noformat}
Given path (/opt/spark/pv-tests/tmp5813668424419880732.txt) does not exist
{noformat}
this is supposed to be mounted in the minikube cluster, not on the bare metal.

> But sth is mounted to there:
{noformat}
 Mounts:
  /opt/spark/conf from spark-conf-volume-driver (rw)
  /opt/spark/pv-tests from data (rw)
...
{noformat}
> But my guess this is not the one which connects the host path 
> "PVC_TESTS_HOST_PATH" with the internal minikube mounted path (the one which 
> goes further to the driver/executor). This is why the locally created file is 
> missing.

 

it's properly mounting the local (bare metal) filesystem.  see:  
https://issues.apache.org/jira/browse/SPARK-34738?focusedCommentId=17312548&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17312548
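If the mount looks right on the host, the next place to check is the pod side. A hypothetical debugging sequence; the pod name and namespace below are placeholders, not the real test identifiers:

```shell
# Check the PersistentVolumes/Claims the PV test created
kubectl get pv,pvc --all-namespaces

# Inspect the driver pod's view of the mount (substitute the real pod name)
kubectl -n default describe pod spark-pv-test-driver
kubectl -n default exec spark-pv-test-driver -- ls -l /opt/spark/pv-tests
```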

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317409#comment-17317409
 ] 

Shane Knapp commented on SPARK-34738:
-

i think it's more than suspicious!  :)

https://github.com/apache/spark/commit/dba525c997b0033ac1b6fd24236cd72938f94bbf

https://issues.apache.org/jira/browse/SPARK-31313

[~dongjoon] could you s/m01/minikube/g?

 

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Comment Edited] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317334#comment-17317334
 ] 

Shane Knapp edited comment on SPARK-34738 at 4/8/21, 4:45 PM:
--

done (attached to the issue)

also, it's been so long since i've had to debug this stuff that i'd forgotten 
about those logs...  :facepalm:  :)


was (Author: shaneknapp):
done (attached to the issue)

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR which uses a new 
> method to configure the kubernetes client. Thanks in advance to use it for 
> testing on the Jenkins after the Minikube version is updated.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17317334#comment-17317334
 ] 

Shane Knapp commented on SPARK-34738:
-

done (attached to the issue)

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Updated] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-34738:

Attachment: integration-tests.log

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
> Attachments: integration-tests.log
>
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-04-01 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17313345#comment-17313345
 ] 

Shane Knapp commented on SPARK-34738:
-

a few things:

1) we are using ubuntu 20 + the kvm2 driver for the current minikube/k8s 
testing environment.

2) the current mk/k8s deployments seem to have completely stopped working, 
which is odd as those workers' configs haven't changed one bit.  i am also 
investigating this, and stumped as to wtf is going on (looks to be an ssh auth 
problem when spinning up the cluster, but i'm able to manually ssh in to it 
using the command that minikube start is running...  serious wtf)

3) i am unable to mount a persistent volume w/the latest minikube + k8s and the 
docker-machine-kvm2 driver

4) i AM able to mount a PV w/the latest minikube + k8s and docker, but the 
integration test fails

also:  i won't be able to work on this until tuesday, next week.

 

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Comment Edited] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312696#comment-17312696
 ] 

Shane Knapp edited comment on SPARK-34738 at 3/31/21, 8:38 PM:
---

managed to snag the logs from the pod when it errored out:
{code:java}
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z ']'
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z x ']'
+ SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf 
"spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf 
spark.driver.bindAddress=172.17.0.3 --deploy-mode client --properties-file 
/opt/spark/conf/spark.properties --class 
org.apache.spark.examples.DFSReadWriteTest 
local:///opt/spark/examples/jars/spark-examples_2.12-3.2.0-SNAPSHOT.jar 
/opt/spark/pv-tests/tmp4595937990978494271.txt /opt/spark/pv-tests
21/03/31 20:26:24 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
Given path (/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist
DFS Read-Write Test
Usage: localFile dfsDir
localFile - (string) local file to use in test
dfsDir - (string) DFS directory for read/write tests
log4j:WARN No appenders could be found for logger 
(org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.{code}
this def caught my eye:  Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist

i sshed in to the cluster, was able (again) to confirm that mk was able to 
mount the PVC test dir on that worker in /tmp, and that the file 
tmp4595937990978494271.txt was visible and readable from within mk...  however 
/opt/spark/pv-tests/ wasn't visible within the mk cluster.
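For context, the failing "PVs with local storage" test relies on a hostPath-backed persistent volume, so the path has to exist inside the minikube node itself, not just on the bare-metal host. A sketch of the general shape of such a PV/PVC pair follows; the names, size, and `/tmp/pv-tests` path are illustrative assumptions, not Spark's actual PVTestsSuite spec (written to a temp file here, just for inspection):

```shell
# Illustrative only: a hostPath PV/PVC pair of the general shape a
# local-storage test uses. Names, size, and the /tmp path are assumptions,
# not Spark's actual test spec.
cat > /tmp/pv-sketch.yaml <<'EOF'
apiVersion: v1
kind: PersistentVolume
metadata:
  name: test-local-pv
spec:
  capacity:
    storage: 1Gi
  accessModes: ["ReadWriteOnce"]
  hostPath:
    path: /tmp/pv-tests   # must be visible inside the minikube node, not only on the host
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: test-local-pvc
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 1Gi
EOF
grep 'hostPath' /tmp/pv-sketch.yaml
```

The symptom described above (file readable inside mk, but `/opt/spark/pv-tests/` invisible to the pod) is consistent with the hostPath not being bridged into the node, which is exactly what `minikube mount` is supposed to provide.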


was (Author: shaneknapp):
managed to snag the logs from the pod when it errored out:
{code:java}
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z ']'
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z x ']'
+ SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf 
"spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf 
spark.driver.bindAddress=172.17.0.3 --deploy-mode client --properties-file 
/opt/spark/conf/spark.properties --class 
org.apache.spark.examples.DFSReadWriteTest 
local:///opt/spark/examples/jars/spark-examples_2.12-3.2.0-SNAPSHOT.jar 
/opt/spark/pv-tests/tmp4595937990978494271.txt /opt/spark/pv-tests
21/03/31 20:26:24 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
Given path (/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist
DFS Read-Write Test
Usage: localFile dfsDir
localFile - (string) local file to use in test
dfsDir - (string) DFS directory for read/write tests
log4j:WARN No appenders could be found for logger 
(org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.{code}
this def caught my eye:  Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist

i sshed in to the cluster, was able (again) to confirm that mk was able to 
mount the PVC test dir on that worker, and that the file 
tmp4595937990978494271.txt was visible and readable from within mk...

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> 

[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312696#comment-17312696
 ] 

Shane Knapp commented on SPARK-34738:
-

managed to snag the logs from the pod when it errored out:
{code:java}
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z ']'
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z x ']'
+ SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf 
"spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf 
spark.driver.bindAddress=172.17.0.3 --deploy-mode client --properties-file 
/opt/spark/conf/spark.properties --class 
org.apache.spark.examples.DFSReadWriteTest 
local:///opt/spark/examples/jars/spark-examples_2.12-3.2.0-SNAPSHOT.jar 
/opt/spark/pv-tests/tmp4595937990978494271.txt /opt/spark/pv-tests
21/03/31 20:26:24 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
Given path (/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist
DFS Read-Write Test
Usage: localFile dfsDir
localFile - (string) local file to use in test
dfsDir - (string) DFS directory for read/write tests
log4j:WARN No appenders could be found for logger 
(org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.{code}
this def caught my eye:  Given path 
(/opt/spark/pv-tests/tmp4595937990978494271.txt) does not exist

i sshed in to the cluster, was able (again) to confirm that mk was able to 
mount the PVC test dir on that worker, and that the file 
tmp4595937990978494271.txt was visible and readable from within mk...

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312548#comment-17312548
 ] 

Shane Knapp commented on SPARK-34738:
-

fwiw, the pod is mounting the local FS correctly:
{code:java}
jenkins@research-jenkins-worker-08:~$ minikube ssh
docker@minikube:~$ cd /tmp
docker@minikube:/tmp$ ls
gvisor h.829 h.912 hostpath-provisioner hostpath_pv tmp.iSJp3I8otl
docker@minikube:/tmp$ cd tmp.iSJp3I8otl/
docker@minikube:/tmp/tmp.iSJp3I8otl$ touch ASDF
docker@minikube:/tmp/tmp.iSJp3I8otl$ logout
jenkins@research-jenkins-worker-08:~$ cd /tmp/tmp.iSJp3I8otl/
jenkins@research-jenkins-worker-08:/tmp/tmp.iSJp3I8otl$ ls
ASDF tmp3533667297442374713.txt
jenkins@research-jenkins-worker-08:/tmp/tmp.iSJp3I8otl$ cat 
tmp3533667297442374713.txt
test PVs{code}

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Comment Edited] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312517#comment-17312517
 ] 

Shane Knapp edited comment on SPARK-34738 at 3/31/21, 4:43 PM:
---

alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made and 
the error reported was that the minikube pod was unable to connect to the 
localhost and mount (Connection refused).

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone:]
{code:java}
- PVs with local storage *** FAILED ***
 The code passed to eventually never returned normally. Attempted 179 times 
over 3.00242447046 minutes. Last failure message: container not found 
("spark-kubernetes-driver"). (PVTestsSuite.scala:117){code}
i've never seen this error before, and apparently there aren't many things 

here's how we launch minikube and create the mount:
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} 
--9p-version=9p2000.L &
{code}
we're using ZFS on the bare metal, and minikube is complaining:
{code:java}
! docker is currently using the zfs storage driver, consider switching to 
overlay2 for better performance{code}
i'll continue to dig in to this today, but i'm currently blocked...
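For reference, the switch that minikube's warning suggests is a one-line docker daemon.json change. A sketch follows, written against a temp file; on a real host the file is /etc/docker/daemon.json and takes effect only after a docker daemon restart, and whether overlay2 is even usable on a ZFS-backed /var/lib/docker is a separate question, so treat this as a sketch rather than a fix:

```shell
# Sketch only: the daemon.json change behind minikube's zfs -> overlay2
# warning. Written to a temp file here; on a real host this would be
# /etc/docker/daemon.json, applied with a docker daemon restart.
cat > /tmp/daemon.json <<'EOF'
{
  "storage-driver": "overlay2"
}
EOF
grep '"storage-driver"' /tmp/daemon.json
```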


was (Author: shaneknapp):
alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made.

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone:]
{code:java}
- PVs with local storage *** FAILED ***
 The code passed to eventually never returned normally. Attempted 179 times 
over 3.00242447046 minutes. Last failure message: container not found 
("spark-kubernetes-driver"). (PVTestsSuite.scala:117){code}
i've never seen this error before, and apparently there aren't many things 

here's how we launch minikube and create the mount:
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} 
--9p-version=9p2000.L &
{code}
we're using ZFS on the bare metal, and minikube is complaining:
{code:java}
! docker is currently using the zfs storage driver, consider switching to 
overlay2 for better performance{code}
i'll continue to dig in to this today, but i'm currently blocked...

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Comment Edited] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312517#comment-17312517
 ] 

Shane Knapp edited comment on SPARK-34738 at 3/31/21, 4:20 PM:
---

alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made.

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone:]
{code:java}
- PVs with local storage *** FAILED ***
 The code passed to eventually never returned normally. Attempted 179 times 
over 3.00242447046 minutes. Last failure message: container not found 
("spark-kubernetes-driver"). (PVTestsSuite.scala:117){code}
i've never seen this error before, and apparently there aren't many things 

here's how we launch minikube and create the mount:
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} 
--9p-version=9p2000.L &
{code}
we're using ZFS on the bare metal, and minikube is complaining:
{code:java}
! docker is currently using the zfs storage driver, consider switching to 
overlay2 for better performance{code}
i'll continue to dig in to this today, but i'm currently blocked...


was (Author: shaneknapp):
alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made.

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone:
{code:java}
- PVs with local storage *** FAILED ***
 The code passed to eventually never returned normally. Attempted 179 times 
over 3.00242447046 minutes. Last failure message: container not found 
("spark-kubernetes-driver"). (PVTestsSuite.scala:117){code}
i've never seen this error before, and apparently there aren't many things 

here's how we launch minikube and create the mount:

 
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} 
--9p-version=9p2000.L &
{code}
i'll continue to dig in to this today, but i'm currently blocked...

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312518#comment-17312518
 ] 

Shane Knapp commented on SPARK-34738:
-

[~skonto] any insights?

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-31 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17312517#comment-17312517
 ] 

Shane Knapp commented on SPARK-34738:
-

alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made.

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone:
{code:java}
- PVs with local storage *** FAILED ***
 The code passed to eventually never returned normally. Attempted 179 times 
over 3.00242447046 minutes. Last failure message: container not found 
("spark-kubernetes-driver"). (PVTestsSuite.scala:117){code}
i've never seen this error before, and apparently there aren't many things 

here's how we launch minikube and create the mount:

 
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} 
--9p-version=9p2000.L &
{code}
i'll continue to dig in to this today, but i'm currently blocked...

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-16 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17302777#comment-17302777
 ] 

Shane Knapp commented on SPARK-34738:
-

i'll be doing this next tuesday (3/23) and teaching one of my sysadmins to help 
out.

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Assigned] (SPARK-34738) Upgrade Minikube and kubernetes cluster version on Jenkins

2021-03-15 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reassigned SPARK-34738:
---

Assignee: Shane Knapp

> Upgrade Minikube and kubernetes cluster version on Jenkins
> --
>
> Key: SPARK-34738
> URL: https://issues.apache.org/jira/browse/SPARK-34738
> Project: Spark
>  Issue Type: Task
>  Components: jenkins, Kubernetes
>Affects Versions: 3.2.0
>Reporter: Attila Zsolt Piros
>Assignee: Shane Knapp
>Priority: Major
>
> [~shaneknapp] as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html]
>  Minikube can be upgraded to the latest (v1.18.1) and kubernetes version 
> should be v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. Please use it for testing on 
> Jenkins after the Minikube version is updated. Thanks in advance.






[jira] [Commented] (SPARK-34641) ARM CI failed due to download hive-exec failed

2021-03-05 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17296239#comment-17296239
 ] 

Shane Knapp commented on SPARK-34641:
-

looks like hive-exec downloaded fine in my recent build:

[https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-arm/565/console]
{code:java}
Downloading from gcs-maven-central-mirror: 
https://maven-central.storage-download.googleapis.com/maven2/org/apache/hive/hive-exec/2.3.8/hive-exec-2.3.8.pom
Progress (1): 3.1/31 kB
Progress (1): 5.8/31 kB
Progress (1): 8.6/31 kB
Progress (1): 11/31 kB 
Progress (1): 14/31 kB
Progress (1): 16/31 kB
Progress (1): 19/31 kB
Progress (1): 22/31 kB
Progress (1): 25/31 kB
Progress (1): 28/31 kB
Progress (1): 30/31 kB
Progress (1): 31 kB   
   
Downloaded from gcs-maven-central-mirror: 
https://maven-central.storage-download.googleapis.com/maven2/org/apache/hive/hive-exec/2.3.8/hive-exec-2.3.8.pom
 (31 kB at 22 kB/s)
{code}
we'll wait and see how this build turns out.
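For reference, the "gcs-maven-central-mirror" named in the log corresponds to a Maven settings.xml mirror entry of roughly this shape. The id and URL are taken from the log output above; its placement inside an existing `<settings><mirrors>` block is assumed (written to a temp file here, just as a sketch):

```shell
# Sketch: the settings.xml <mirrors> entry behind "gcs-maven-central-mirror"
# in the build log. id and URL come from the log output; this only writes a
# fragment to a temp file for inspection.
cat > /tmp/mirror-sketch.xml <<'EOF'
<mirror>
  <id>gcs-maven-central-mirror</id>
  <mirrorOf>central</mirrorOf>
  <url>https://maven-central.storage-download.googleapis.com/maven2</url>
</mirror>
EOF
grep '<mirrorOf>central</mirrorOf>' /tmp/mirror-sketch.xml
```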

> ARM CI failed due to download hive-exec failed
> --
>
> Key: SPARK-34641
> URL: https://issues.apache.org/jira/browse/SPARK-34641
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 3.1.2
>Reporter: Yikun Jiang
>Priority: Major
>
> The download failure (org.apache.hive#hive-exec;3.0.0!hive-exec.jar) happened 
> in recent spark-master-test-maven-arm build tests.
> But it's not reproducible, and all hive tests passed in my ARM and x86 local envs.
> The jenkins x86 test like [2] is also unstable, for other reasons, so it 
> looks like we can't find a valid reference.
> [1] [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-arm/]
> [2] 
> [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/]
>  






[jira] [Commented] (SPARK-34641) ARM CI failed due to download hive-exec failed

2021-03-05 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17296235#comment-17296235
 ] 

Shane Knapp commented on SPARK-34641:
-

no clue, tbh.  i'll try and take a closer look at the build logs later, but for 
now i wiped the ivy and maven caches on the VM and triggered a fresh build.
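A cache wipe like the one described usually amounts to removing the local Ivy and Maven repositories so the next build re-resolves everything from scratch. The paths below are the conventional defaults, not necessarily what the Jenkins VM actually uses:

```shell
# Sketch: clear the Ivy cache and the cached hive artifacts so the next
# build re-downloads them. Paths are the conventional defaults (assumed),
# and rm -rf succeeds even if the directories are absent.
IVY_CACHE="${HOME}/.ivy2/cache"
M2_REPO="${HOME}/.m2/repository"
rm -rf "${IVY_CACHE}" "${M2_REPO}/org/apache/hive"
```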

> ARM CI failed due to download hive-exec failed
> --
>
> Key: SPARK-34641
> URL: https://issues.apache.org/jira/browse/SPARK-34641
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 3.1.2
>Reporter: Yikun Jiang
>Priority: Major
>
> The download failure (org.apache.hive#hive-exec;3.0.0!hive-exec.jar) has 
> happened in recent spark-master-test-maven-arm build tests.
> But it's not reproducible, and all hive tests passed in my ARM and x86 local envs.
> The jenkins x86 tests like [2] are also unstable for other reasons, so it 
> looks like we can't find a valid reference.
> [1] [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-arm/]
> [2] 
> [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/]
>  






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2021-02-23 Thread shane knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17289542#comment-17289542
 ] 

shane knapp commented on SPARK-33044:
-

hey!  sorry, i've been pretty slammed these past few weeks.  i should be
able to get this done by EOW.

On Mon, Oct 19, 2020 at 1:36 PM Dongjoon Hyun (Jira) 



-- 
Shane Knapp
Computer Guy / Voice of Reason
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> The {{master}} branch seems to be almost ready for Scala 2.13 now, so we need a 
> Jenkins test job to verify the current work and CI.






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2021-02-23 Thread shane knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17289527#comment-17289527
 ] 

shane knapp commented on SPARK-33044:
-

1) logins are temporarily disabled due to new campus network security
standards.  i need to find a non-manual way of dealing with this asap.

2) i will get to this tomorrow.




-- 
Shane Knapp
Computer Guy / Voice of Reason
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> The {{master}} branch seems to be almost ready for Scala 2.13 now, so we need a 
> Jenkins test job to verify the current work and CI.






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2021-02-23 Thread shane knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17289470#comment-17289470
 ] 

shane knapp commented on SPARK-33044:
-

welp, i manually created a test job and it failed pretty early on:
https://amplab.cs.berkeley.edu/jenkins/view/All/job/spark-master-test-maven-hadoop-3.2-hive-2.3-scala-2.13/1/

On Tue, Dec 8, 2020 at 10:59 AM Dongjoon Hyun (Jira) 



-- 
Shane Knapp
Computer Guy / Voice of Reason
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> The {{master}} branch seems to be almost ready for Scala 2.13 now, so we need a 
> Jenkins test job to verify the current work and CI.






[jira] [Resolved] (SPARK-30747) Update roxygen2 to 7.0.1

2021-01-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30747?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-30747.
-
Resolution: Fixed

> Update roxygen2 to 7.0.1
> 
>
> Key: SPARK-30747
> URL: https://issues.apache.org/jira/browse/SPARK-30747
> Project: Spark
>  Issue Type: Improvement
>  Components: SparkR, Tests
>Affects Versions: 3.1.0
>Reporter: Maciej Szymkiewicz
>Assignee: Shane Knapp
>Priority: Minor
>
> Currently Spark uses {{roxygen2}} 5.0.1. It is already pretty old 
> (2015-11-11), so it could be a good idea to update it as part of the current 
> round of R updates.
> At a crude inspection:
> * SPARK-22430 was resolved a while ago.
> * SPARK-30737 and SPARK-27262, https://github.com/apache/spark/pull/27437 and 
> https://github.com/apache/spark/commit/b95ccb1d8b726b11435789cdb5882df6643430ed
>  resolved persistent warnings
> * Documentation builds and CRAN checks pass
> * Generated HTML docs are identical to 5.0.1
> Since {{roxygen2}} shares some potentially unstable dependencies with 
> {{devtools}} (primarily {{rlang}}) it might be a good idea to keep these in 
> sync (as a bonus we wouldn't have to worry about {{DESCRIPTION}} being 
> overwritten by local tests).






[jira] [Commented] (SPARK-30747) Update roxygen2 to 7.0.1

2021-01-14 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-30747?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17265142#comment-17265142
 ] 

Shane Knapp commented on SPARK-30747:
-

we're currently running 7.1.1 on all of the workers.

> Update roxygen2 to 7.0.1
> 
>
> Key: SPARK-30747
> URL: https://issues.apache.org/jira/browse/SPARK-30747
> Project: Spark
>  Issue Type: Improvement
>  Components: SparkR, Tests
>Affects Versions: 3.1.0
>Reporter: Maciej Szymkiewicz
>Assignee: Shane Knapp
>Priority: Minor
>
> Currently Spark uses {{roxygen2}} 5.0.1. It is already pretty old 
> (2015-11-11), so it could be a good idea to update it as part of the current 
> round of R updates.
> At a crude inspection:
> * SPARK-22430 was resolved a while ago.
> * SPARK-30737 and SPARK-27262, https://github.com/apache/spark/pull/27437 and 
> https://github.com/apache/spark/commit/b95ccb1d8b726b11435789cdb5882df6643430ed
>  resolved persistent warnings
> * Documentation builds and CRAN checks pass
> * Generated HTML docs are identical to 5.0.1
> Since {{roxygen2}} shares some potentially unstable dependencies with 
> {{devtools}} (primarily {{rlang}}) it might be a good idea to keep these in 
> sync (as a bonus we wouldn't have to worry about {{DESCRIPTION}} being 
> overwritten by local tests).






[jira] [Closed] (SPARK-31693) Investigate AmpLab Jenkins server network issue

2021-01-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-31693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp closed SPARK-31693.
---

> Investigate AmpLab Jenkins server network issue
> ---
>
> Key: SPARK-31693
> URL: https://issues.apache.org/jira/browse/SPARK-31693
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 2.4.6, 3.0.0, 3.1.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Critical
>
> Given the series of failures in the Spark packaging Jenkins job, it seems that 
> there is a network issue in the AmpLab Jenkins cluster.
> - 
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-master-maven-snapshots/
> - The node failed to talk to GitBox. (SPARK-31687) -> GitHub is okay.
> - The node failed to download the maven mirror. (SPARK-31691) -> The primary 
> host is okay.
> - The node failed to communicate repository.apache.org. (Current master 
> branch Jenkins job failure)
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-deploy-plugin:3.0.0-M1:deploy (default-deploy) 
> on project spark-parent_2.12: ArtifactDeployerException: Failed to retrieve 
> remote metadata 
> org.apache.spark:spark-parent_2.12:3.1.0-SNAPSHOT/maven-metadata.xml: Could 
> not transfer metadata 
> org.apache.spark:spark-parent_2.12:3.1.0-SNAPSHOT/maven-metadata.xml from/to 
> apache.snapshots.https 
> (https://repository.apache.org/content/repositories/snapshots): Transfer 
> failed for 
> https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.12/3.1.0-SNAPSHOT/maven-metadata.xml:
>  Connect to repository.apache.org:443 [repository.apache.org/207.244.88.140] 
> failed: Connection timed out (Connection timed out) -> [Help 1]
> {code}






[jira] [Commented] (SPARK-29183) Upgrade JDK 11 Installation to 11.0.6

2021-01-14 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17265139#comment-17265139
 ] 

Shane Knapp commented on SPARK-29183:
-

in the next ~month or so, we'll be reimaging the remaining ubuntu workers and 
java11 will be at 11.0.9+.  the new ubuntu 20 workers (r-j-w-01..06) all 
currently have 11.0.9 installed.

> Upgrade JDK 11 Installation to 11.0.6
> -
>
> Key: SPARK-29183
> URL: https://issues.apache.org/jira/browse/SPARK-29183
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 3.1.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> Every JDK 11.0.x release has many fixes, including a performance regression 
> fix. We had better upgrade it to the latest 11.0.4.
> - https://bugs.java.com/bugdatabase/view_bug.do?bug_id=JDK-8221760






[jira] [Resolved] (SPARK-31693) Investigate AmpLab Jenkins server network issue

2021-01-14 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-31693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-31693.
-
Resolution: Fixed

> Investigate AmpLab Jenkins server network issue
> ---
>
> Key: SPARK-31693
> URL: https://issues.apache.org/jira/browse/SPARK-31693
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 2.4.6, 3.0.0, 3.1.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Critical
>
> Given the series of failures in the Spark packaging Jenkins job, it seems that 
> there is a network issue in the AmpLab Jenkins cluster.
> - 
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-master-maven-snapshots/
> - The node failed to talk to GitBox. (SPARK-31687) -> GitHub is okay.
> - The node failed to download the maven mirror. (SPARK-31691) -> The primary 
> host is okay.
> - The node failed to communicate repository.apache.org. (Current master 
> branch Jenkins job failure)
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-deploy-plugin:3.0.0-M1:deploy (default-deploy) 
> on project spark-parent_2.12: ArtifactDeployerException: Failed to retrieve 
> remote metadata 
> org.apache.spark:spark-parent_2.12:3.1.0-SNAPSHOT/maven-metadata.xml: Could 
> not transfer metadata 
> org.apache.spark:spark-parent_2.12:3.1.0-SNAPSHOT/maven-metadata.xml from/to 
> apache.snapshots.https 
> (https://repository.apache.org/content/repositories/snapshots): Transfer 
> failed for 
> https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.12/3.1.0-SNAPSHOT/maven-metadata.xml:
>  Connect to repository.apache.org:443 [repository.apache.org/207.244.88.140] 
> failed: Connection timed out (Connection timed out) -> [Help 1]
> {code}






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-11 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17248013#comment-17248013
 ] 

Shane Knapp commented on SPARK-33044:
-

done

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> The {{master}} branch seems to be almost ready for Scala 2.13 now, so we need a 
> Jenkins test job to verify the current work and CI.






[jira] [Resolved] (SPARK-33746) Minikube is failing to start on research-jenkins-worker-05

2020-12-11 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-33746.
-
Resolution: Fixed

> Minikube is failing to start on research-jenkins-worker-05
> --
>
> Key: SPARK-33746
> URL: https://issues.apache.org/jira/browse/SPARK-33746
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes, Tests
>Affects Versions: 3.1.0, 3.2.0
>Reporter: Holden Karau
>Assignee: Shane Knapp
>Priority: Minor
>
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37190/console]
>  
> {code:java}
> + minikube --vm-driver=kvm2 start --memory 6000 --cpus 8
> * minikube v1.7.3 on Ubuntu 20.04
> * Using the kvm2 driver based on user configuration
> ! Unable to update kvm2 driver: unable to acquire lock for 
> {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms 
> Timeout:10m0s Cancel:}: unable to open 
> /tmp/juju-mk900956b073697a4aa6c80a27c6bb0742a99a53: permission denied
> * Kubernetes 1.17.3 is now available. If you would like to upgrade, specify: 
> --kubernetes-version=1.17.3
> * Reconfiguring existing host ...
> * Using the running kvm2 "minikube" VM ...
> * 
> X Unable to start VM. Please investigate and run 'minikube delete' if possible
> * Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
> dialing tcp via ssh client: ssh: handshake failed: ssh: unable to 
> authenticate, attempted methods [none publickey], no supported methods 
> remain{code}






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-11 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17248003#comment-17248003
 ] 

Shane Knapp commented on SPARK-33044:
-

ah, ok...  i thought i just needed to run `./dev/change-scala-version.sh 2.13`. 
 i'll update the build scripts now.
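for reference, the build-script update roughly pairs the POM rewrite with the
matching Maven profile. a minimal sketch -- the DRY_RUN toggle is illustrative
(so the sequence can be printed without a Spark checkout), and the exact mvn
flags in the real job scripts may differ:

```shell
# Sketch: switching a Spark checkout to Scala 2.13 takes two steps --
# rewrite the POMs, then build with the matching profile.
scala213_build() {
  # run: execute the command, or just print it when DRY_RUN=1.
  run() { if [ "${DRY_RUN:-0}" = "1" ]; then echo "$@"; else "$@"; fi; }
  run ./dev/change-scala-version.sh 2.13
  run ./build/mvn -Pscala-2.13 -DskipTests clean package
}
```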

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.






[jira] [Commented] (SPARK-33746) Minikube is failing to start on research-jenkins-worker-05

2020-12-11 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17247998#comment-17247998
 ] 

Shane Knapp commented on SPARK-33746:
-

k8s master build passed:

[https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s/709/]

re-enabling this worker now.

> Minikube is failing to start on research-jenkins-worker-05
> --
>
> Key: SPARK-33746
> URL: https://issues.apache.org/jira/browse/SPARK-33746
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes, Tests
>Affects Versions: 3.1.0, 3.2.0
>Reporter: Holden Karau
>Assignee: Shane Knapp
>Priority: Minor
>
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37190/console]
>  
> {code:java}
> + minikube --vm-driver=kvm2 start --memory 6000 --cpus 8
> * minikube v1.7.3 on Ubuntu 20.04
> * Using the kvm2 driver based on user configuration
> ! Unable to update kvm2 driver: unable to acquire lock for 
> {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms 
> Timeout:10m0s Cancel:}: unable to open 
> /tmp/juju-mk900956b073697a4aa6c80a27c6bb0742a99a53: permission denied
> * Kubernetes 1.17.3 is now available. If you would like to upgrade, specify: 
> --kubernetes-version=1.17.3
> * Reconfiguring existing host ...
> * Using the running kvm2 "minikube" VM ...
> * 
> X Unable to start VM. Please investigate and run 'minikube delete' if possible
> * Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
> dialing tcp via ssh client: ssh: handshake failed: ssh: unable to 
> authenticate, attempted methods [none publickey], no supported methods 
> remain{code}






[jira] [Commented] (SPARK-33746) Minikube is failing to start on research-jenkins-worker-05

2020-12-10 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17247557#comment-17247557
 ] 

Shane Knapp commented on SPARK-33746:
-

wow.  this was a rabbit hole of epic proportions: after reinstalling the 
kvm/libvirt packages and updating to the latest kvm2 and 
docker-machine-driver-kvm2 drivers, it STILL couldn't ssh to the cluster.

so, i decided to reboot the box and it magically started working...  i fired 
off the master k8s build and we'll see how that goes:

https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s/708/

> Minikube is failing to start on research-jenkins-worker-05
> --
>
> Key: SPARK-33746
> URL: https://issues.apache.org/jira/browse/SPARK-33746
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes, Tests
>Affects Versions: 3.1.0, 3.2.0
>Reporter: Holden Karau
>Assignee: Shane Knapp
>Priority: Minor
>
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37190/console]
>  
> {code:java}
> + minikube --vm-driver=kvm2 start --memory 6000 --cpus 8
> * minikube v1.7.3 on Ubuntu 20.04
> * Using the kvm2 driver based on user configuration
> ! Unable to update kvm2 driver: unable to acquire lock for 
> {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms 
> Timeout:10m0s Cancel:}: unable to open 
> /tmp/juju-mk900956b073697a4aa6c80a27c6bb0742a99a53: permission denied
> * Kubernetes 1.17.3 is now available. If you would like to upgrade, specify: 
> --kubernetes-version=1.17.3
> * Reconfiguring existing host ...
> * Using the running kvm2 "minikube" VM ...
> * 
> X Unable to start VM. Please investigate and run 'minikube delete' if possible
> * Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
> dialing tcp via ssh client: ssh: handshake failed: ssh: unable to 
> authenticate, attempted methods [none publickey], no supported methods 
> remain{code}






[jira] [Commented] (SPARK-33746) Minikube is failing to start on research-jenkins-worker-05

2020-12-10 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17247523#comment-17247523
 ] 

Shane Knapp commented on SPARK-33746:
-

...and here's what's actually going on:

 
{code:java}
  Unable to start VM. Please investigate and run 'minikube delete' if possible
❌  Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
dialing tcp via ssh client: ssh: handshake failed: ssh: unable to authenticate, 
attempted methods [none publickey], no supported methods remain
  Suggestion: Your host is failing to route packets to the minikube VM. If you 
have VPN software, try turning it off or configuring it so that it does not 
re-route traffic to the VM IP. If not, check your VM environment routing 
options.
  Documentation: https://minikube.sigs.k8s.io/docs/reference/networking/vpn/
⁉️   Related issues:
▪ https://github.com/kubernetes/minikube/issues/3930
{code}

> Minikube is failing to start on research-jenkins-worker-05
> --
>
> Key: SPARK-33746
> URL: https://issues.apache.org/jira/browse/SPARK-33746
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes, Tests
>Affects Versions: 3.1.0, 3.2.0
>Reporter: Holden Karau
>Assignee: Shane Knapp
>Priority: Minor
>
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37190/console]
>  
> {code:java}
> + minikube --vm-driver=kvm2 start --memory 6000 --cpus 8
> * minikube v1.7.3 on Ubuntu 20.04
> * Using the kvm2 driver based on user configuration
> ! Unable to update kvm2 driver: unable to acquire lock for 
> {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms 
> Timeout:10m0s Cancel:}: unable to open 
> /tmp/juju-mk900956b073697a4aa6c80a27c6bb0742a99a53: permission denied
> * Kubernetes 1.17.3 is now available. If you would like to upgrade, specify: 
> --kubernetes-version=1.17.3
> * Reconfiguring existing host ...
> * Using the running kvm2 "minikube" VM ...
> * 
> X Unable to start VM. Please investigate and run 'minikube delete' if possible
> * Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
> dialing tcp via ssh client: ssh: handshake failed: ssh: unable to 
> authenticate, attempted methods [none publickey], no supported methods 
> remain{code}






[jira] [Commented] (SPARK-33746) Minikube is failing to start on research-jenkins-worker-05

2020-12-10 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17247521#comment-17247521
 ] 

Shane Knapp commented on SPARK-33746:
-

yah, noticed this yesterday and took the worker offline...  and somehow it came 
back.  anyways, i'm investigating and hoping to get this sucker back in the 
worker queue asap.

> Minikube is failing to start on research-jenkins-worker-05
> --
>
> Key: SPARK-33746
> URL: https://issues.apache.org/jira/browse/SPARK-33746
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes, Tests
>Affects Versions: 3.1.0, 3.2.0
>Reporter: Holden Karau
>Assignee: Shane Knapp
>Priority: Minor
>
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37190/console]
>  
> {code:java}
> + minikube --vm-driver=kvm2 start --memory 6000 --cpus 8
> * minikube v1.7.3 on Ubuntu 20.04
> * Using the kvm2 driver based on user configuration
> ! Unable to update kvm2 driver: unable to acquire lock for 
> {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms 
> Timeout:10m0s Cancel:}: unable to open 
> /tmp/juju-mk900956b073697a4aa6c80a27c6bb0742a99a53: permission denied
> * Kubernetes 1.17.3 is now available. If you would like to upgrade, specify: 
> --kubernetes-version=1.17.3
> * Reconfiguring existing host ...
> * Using the running kvm2 "minikube" VM ...
> * 
> X Unable to start VM. Please investigate and run 'minikube delete' if possible
> * Error: [SSH_AUTH_FAILURE] post-start: command runner: ssh client: Error 
> dialing tcp via ssh client: ssh: handshake failed: ssh: unable to 
> authenticate, attempted methods [none publickey], no supported methods 
> remain{code}






[jira] [Commented] (SPARK-33727) `gpg: keyserver receive failed: No name` during K8s IT

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246928#comment-17246928
 ] 

Shane Knapp commented on SPARK-33727:
-

[~dongjoon] the build that i looked at 
([https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/37120/)]
 ran on one of the workers that's been set up for over a year, and hasn't had 
any network changes done to it since.
{noformat}
¯\_(ツ)_/¯{noformat}
i'd tend to agree w/holden's observation – sometimes keyservers are flaky.  
fallbacks are always a good thing.
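a keyserver fallback could be sketched like this. the server list and the
overridable KEY_FETCH_CMD hook are illustrative, not the actual Dockerfile
change; by default it runs the same apt-key invocation as the failing step.

```shell
# Hedged sketch of a keyserver fallback: try each server in order and stop
# at the first successful fetch. KEY_FETCH_CMD defaults to apt-key adv and
# is overridable only so the retry logic can be exercised without network.
fetch_key() {
  key="$1"; shift
  for ks in "$@"; do
    if ${KEY_FETCH_CMD:-apt-key adv} --keyserver "$ks" --recv-key "$key"; then
      return 0
    fi
  done
  return 1
}
# e.g.: fetch_key 'E19F5F87128899B192B1A2C2AD5F960A256A04AF' \
#         keys.gnupg.net keyserver.ubuntu.com
```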

> `gpg: keyserver receive failed: No name` during K8s IT
> --
>
> Key: SPARK-33727
> URL: https://issues.apache.org/jira/browse/SPARK-33727
> Project: Spark
>  Issue Type: Task
>  Components: Kubernetes, Project Infra, Tests
>Affects Versions: 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> K8s IT fails with gpg: keyserver receive failed: No name. This seems to be 
> consistent in the new Jenkins Server.
> {code}
> Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver 
> keys.gnupg.net --recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: keyserver receive failed: No name
> The command '/bin/sh -c echo "deb http://cloud.r-project.org/bin/linux/debian 
> buster-cran35/" >> /etc/apt/sources.list &&   apt install -y gnupg &&   
> apt-key adv --keyserver keys.gnupg.net --recv-key 
> 'E19F5F87128899B192B1A2C2AD5F960A256A04AF' &&   apt-get update &&   apt 
> install -y -t buster-cran35 r-base r-base-dev &&   rm -rf /var/cache/apt/*' 
> returned a non-zero code: 2
> {code}
> It locally works on Mac.
> {code}
> $ gpg1 --keyserver keys.gnupg.net --recv-key 
> E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: requesting key 256A04AF from hkp server keys.gnupg.net
> gpg: key 256A04AF: public key "Johannes Ranke (Wissenschaftlicher Berater) 
> " imported
> gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
> gpg: depth: 0  valid:   2  signed:   1  trust: 0-, 0q, 0n, 0m, 0f, 2u
> gpg: depth: 1  valid:   1  signed:   0  trust: 1-, 0q, 0n, 0m, 0f, 0u
> gpg: Total number processed: 1
> gpg:   imported: 1  (RSA: 1)
> {code}
> It happens multiple times.
> - https://github.com/apache/spark/pull/30693
> - https://github.com/apache/spark/pull/30694






[jira] [Comment Edited] (SPARK-33727) `gpg: keyserver receive failed: No name` during K8s IT

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246894#comment-17246894
 ] 

Shane Knapp edited comment on SPARK-33727 at 12/9/20, 11:30 PM:


ok, this is not a failure on the jenkins worker – it's happening inside the 
docker container that the build spins up.  just scroll back from the gpg error 
message and you'll see the docker STDOUT as it's trying to build the spark-r 
container.

in fact, from looking at the logs it appears that the command that's actually 
failing in the container setup is:
{noformat}
apt-key adv --keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF'{noformat}
that's causing the following error:
{code:java}
Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver keys.gnupg.net 
--recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
gpg: keyserver receive failed: No name
{code}
so, i took a peek at 
./resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings/R/Dockerfile,
 and git blame for that specific line points to...

drumroll please...
{code:java}
22baf05a9ec (Dongjoon Hyun 2020-11-12 15:36:31 +0900 32) apt-key adv 
--keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF' && {code}

 [~dongjoon]

 


was (Author: shaneknapp):
ok, this is not a failure on the jenkins worker – it's happening inside the 
docker container that the build spins up.  just scroll back from the gpg error 
message and you'll see the docker STDOUT as it's trying to build the spark-r 
container.

in fact, from looking at the logs it appears that the command that's actually 
failing in the container setup is:
{noformat}
apt-key adv --keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF'{noformat}
that's causing the following error:
{code:java}
Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver keys.gnupg.net 
--recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
gpg: keyserver receive failed: No name
{code}
so, i took a peek at 
./resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings/R/Dockerfile,
 and git blame for that specific line points to...

drumroll please...
{code:java}
22baf05a9ec (Dongjoon Hyun  2020-11-12 15:36:31 +0900 32)   apt-key adv 
--keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF' && \{code}
[~dongjoon]

 

> `gpg: keyserver receive failed: No name` during K8s IT
> --
>
> Key: SPARK-33727
> URL: https://issues.apache.org/jira/browse/SPARK-33727
> Project: Spark
>  Issue Type: Task
>  Components: Kubernetes, Project Infra, Tests
>Affects Versions: 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> K8s IT fails with gpg: keyserver receive failed: No name. This seems to be 
> consistent in the new Jenkins Server.
> {code}
> Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver 
> keys.gnupg.net --recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: keyserver receive failed: No name
> The command '/bin/sh -c echo "deb http://cloud.r-project.org/bin/linux/debian 
> buster-cran35/" >> /etc/apt/sources.list &&   apt install -y gnupg &&   
> apt-key adv --keyserver keys.gnupg.net --recv-key 
> 'E19F5F87128899B192B1A2C2AD5F960A256A04AF' &&   apt-get update &&   apt 
> install -y -t buster-cran35 r-base r-base-dev &&   rm -rf /var/cache/apt/*' 
> returned a non-zero code: 2
> {code}
> It locally works on Mac.
> {code}
> $ gpg1 --keyserver keys.gnupg.net --recv-key 
> E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: requesting key 256A04AF from hkp server keys.gnupg.net
> gpg: key 256A04AF: public key "Johannes Ranke (Wissenschaftlicher Berater) 
> " imported
> gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
> gpg: depth: 0  valid:   2  signed:   1  trust: 0-, 0q, 0n, 0m, 0f, 2u
> gpg: depth: 1  valid:   1  signed:   0  trust: 1-, 0q, 0n, 0m, 0f, 0u
> gpg: Total number processed: 1
> gpg:   imported: 1  (RSA: 1)
> {code}
> It happens multiple times.
> - https://github.com/apache/spark/pull/30693
> - https://github.com/apache/spark/pull/30694



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33727) `gpg: keyserver receive failed: No name` during K8s IT

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246894#comment-17246894
 ] 

Shane Knapp commented on SPARK-33727:
-

ok, this is not a failure on the jenkins worker – it's happening inside the 
docker container that the build spins up.  just scroll back from the gpg error 
message and you'll see the docker STDOUT as it's trying to build the spark-r 
container.

in fact, from looking at the logs it appears that the command that's actually 
failing in the container setup is:
{noformat}
apt-key adv --keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF'{noformat}
that's causing the following error:
{code:java}
Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver keys.gnupg.net 
--recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
gpg: keyserver receive failed: No name
{code}
so, i took a peek at 
./resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings/R/Dockerfile,
 and git blame for that specific line points to...

drumroll please...
{code:java}
22baf05a9ec (Dongjoon Hyun  2020-11-12 15:36:31 +0900 32)   apt-key adv 
--keyserver keys.gnupg.net --recv-key 
'E19F5F87128899B192B1A2C2AD5F960A256A04AF' && \{code}
[~dongjoon]
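The "No name" error is a keyserver DNS-resolution failure, so one possible fix is pointing the failing `RUN` step at a different keyserver. A sketch of what that Dockerfile line could look like (assumption: `keyserver.ubuntu.com` is reachable from the build network; this is not necessarily the fix Spark ultimately adopted):

```dockerfile
# Hypothetical variant of the failing step, swapping keys.gnupg.net for an
# alternative keyserver. Untested against the actual spark-r image build.
RUN echo "deb http://cloud.r-project.org/bin/linux/debian buster-cran35/" >> /etc/apt/sources.list && \
    apt install -y gnupg && \
    apt-key adv --keyserver hkps://keyserver.ubuntu.com \
        --recv-key 'E19F5F87128899B192B1A2C2AD5F960A256A04AF' && \
    apt-get update && \
    apt install -y -t buster-cran35 r-base r-base-dev && \
    rm -rf /var/cache/apt/*
```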

 

> `gpg: keyserver receive failed: No name` during K8s IT
> --
>
> Key: SPARK-33727
> URL: https://issues.apache.org/jira/browse/SPARK-33727
> Project: Spark
>  Issue Type: Task
>  Components: Kubernetes, Project Infra, Tests
>Affects Versions: 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> K8s IT fails with gpg: keyserver receive failed: No name. This seems to be 
> consistent in the new Jenkins Server.
> {code}
> Executing: /tmp/apt-key-gpghome.gGqC9RwptN/gpg.1.sh --keyserver 
> keys.gnupg.net --recv-key E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: keyserver receive failed: No name
> The command '/bin/sh -c echo "deb http://cloud.r-project.org/bin/linux/debian 
> buster-cran35/" >> /etc/apt/sources.list &&   apt install -y gnupg &&   
> apt-key adv --keyserver keys.gnupg.net --recv-key 
> 'E19F5F87128899B192B1A2C2AD5F960A256A04AF' &&   apt-get update &&   apt 
> install -y -t buster-cran35 r-base r-base-dev &&   rm -rf /var/cache/apt/*' 
> returned a non-zero code: 2
> {code}
> It locally works on Mac.
> {code}
> $ gpg1 --keyserver keys.gnupg.net --recv-key 
> E19F5F87128899B192B1A2C2AD5F960A256A04AF
> gpg: requesting key 256A04AF from hkp server keys.gnupg.net
> gpg: key 256A04AF: public key "Johannes Ranke (Wissenschaftlicher Berater) 
> " imported
> gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
> gpg: depth: 0  valid:   2  signed:   1  trust: 0-, 0q, 0n, 0m, 0f, 2u
> gpg: depth: 1  valid:   1  signed:   0  trust: 1-, 0q, 0n, 0m, 0f, 0u
> gpg: Total number processed: 1
> gpg:   imported: 1  (RSA: 1)
> {code}
> It happens multiple times.
> - https://github.com/apache/spark/pull/30693
> - https://github.com/apache/spark/pull/30694






[jira] [Commented] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246850#comment-17246850
 ] 

Shane Knapp commented on SPARK-33713:
-

got it.  gonna rejigger my rejiggering of the JJB configs.

> Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names
> -
>
> Key: SPARK-33713
> URL: https://issues.apache.org/jira/browse/SPARK-33713
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> We removed `hive-1.2` profile since branch-3.1. So, we can simplify the 
> Jenkins job title.






[jira] [Commented] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246816#comment-17246816
 ] 

Shane Knapp commented on SPARK-33713:
-

also, let me think about other sneaky ways of making this happen w/o needing to 
lose all the build logs...

> Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names
> -
>
> Key: SPARK-33713
> URL: https://issues.apache.org/jira/browse/SPARK-33713
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> We removed `hive-1.2` profile since branch-3.1. So, we can simplify the 
> Jenkins job title.






[jira] [Commented] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246810#comment-17246810
 ] 

Shane Knapp commented on SPARK-33713:
-

fwiw, here's what the new and sexy build names will be:
{noformat}
INFO:jenkins_jobs.builder:Number of jobs generated: 30 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.6' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.6' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-jdk-11-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-jdk-11-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-sbt-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-jdk-11-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-jdk-11-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-scala-2.13' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-sbt-hadoop-3.2'
{noformat}

> Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names
> -
>
> Key: SPARK-33713
> URL: https://issues.apache.org/jira/browse/SPARK-33713
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> We removed `hive-1.2` profile since branch-3.1. So, we can simplify the 
> Jenkins job title.






[jira] [Comment Edited] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246810#comment-17246810
 ] 

Shane Knapp edited comment on SPARK-33713 at 12/9/20, 8:42 PM:
---

fwiw, here's what the new and sexy build names will be:
{code:java}
INFO:jenkins_jobs.builder:Number of jobs generated:  30
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.6'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.6'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-3.2'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-jdk-11-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-jdk-11-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-3.2-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-sbt-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-sbt-hadoop-3.2'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-jdk-11-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-2.7-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-jdk-11'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-jdk-11-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-maven-hadoop-3.2-scala-2.13'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-sbt-hadoop-2.7'
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-master-test-sbt-hadoop-3.2'
{code}


was (Author: shaneknapp):
fwiw, here's what the new and sexy build names will be:
{noformat}
INFO:jenkins_jobs.builder:Number of jobs generated: 30 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.6' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.6' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-2.4-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-2.7-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-maven-hadoop-3.2-jdk-11' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.0-test-sbt-hadoop-3.2' 
DEBUG:jenkins_jobs.builder:Writing XML to 
'target/jenkins-xml/spark-branch-3.1-test-maven-hadoop-2.7' 
DEBUG:jenkins_jobs.builder:Writing XML to 

[jira] [Commented] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-09 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246808#comment-17246808
 ] 

Shane Knapp commented on SPARK-33713:
-

hmm.  so while i heartily endorse this, there will be a side-effect of renaming 
the jobs (particularly as we use Jenkins Job Builder – JJB – to deploy and 
manage the build configs):

once i redeploy the updated JJB configs w/the shortened build names, it will 
create new builds and not change the names of existing ones.  unless i'm unable 
to read docs anymore (which is entirely possible) there's no 'rename' ability.

and since, according to jenkins, these are new builds, all previous build 
history will be lost...  unless i manually copy things over on the jenkins 
primary filesystem (which i'd like to avoid if possible).  given that we're 
only storing 2 weeks of builds, if we time it properly (aka at 3.1 release) the 
impact of losing these logs will be pretty insignificant.

my suggestion:

i want to do this, but i propose moving everything over when 3.1 is officially 
released.   we'll lose the previous 2 weeks of build history across all 
branches, but since we're "starting fresh"-ish i think the impact will be 
minimized.

 

thoughts?  comments?  should we pull anyone else in for their opinions?  [~sowen] 
[~hyukjin.kwon] [~holden]
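The "manually copy things over on the jenkins primary filesystem" option could, in principle, look like the sketch below. Paths and job names are hypothetical; Jenkins keeps each job's build history under its job directory, so renaming that directory on disk (with Jenkins stopped) carries the history to the new job name:

```shell
# Sketch only: simulate renaming a Jenkins job directory so its build
# history follows the new job name. JENKINS_HOME and both job names are
# stand-ins; on a real primary you would stop Jenkins first and reload
# configuration afterwards.
JENKINS_HOME=$(mktemp -d)
OLD=spark-master-test-maven-hadoop-3.2-hive-2.3
NEW=spark-master-test-maven-hadoop-3.2

# Fake a job with one archived build.
mkdir -p "$JENKINS_HOME/jobs/$OLD/builds/1"
echo "SUCCESS" > "$JENKINS_HOME/jobs/$OLD/builds/1/log"

# Renaming the job directory carries the builds/ history along with it.
mv "$JENKINS_HOME/jobs/$OLD" "$JENKINS_HOME/jobs/$NEW"

cat "$JENKINS_HOME/jobs/$NEW/builds/1/log"
```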

> Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names
> -
>
> Key: SPARK-33713
> URL: https://issues.apache.org/jira/browse/SPARK-33713
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> We removed `hive-1.2` profile since branch-3.1. So, we can simplify the 
> Jenkins job title.






[jira] [Comment Edited] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246168#comment-17246168
 ] 

Shane Knapp edited comment on SPARK-33044 at 12/8/20, 9:59 PM:
---

branch-3.1 jobs + all scala 2.13 builds are done!

 

!Screen Shot 2020-12-08 at 1.56.59 PM.png|width=455,height=243!!Screen Shot 
2020-12-08 at 1.58.07 PM.png|width=498,height=187!


was (Author: shaneknapp):
branch-3.1 jobs + all scala 2.13 builds are done!

 

!Screen Shot 2020-12-08 at 1.56.59 PM.png!!Screen Shot 2020-12-08 at 1.58.07 
PM.png!

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.






[jira] [Updated] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-33044:

Attachment: Screen Shot 2020-12-08 at 1.58.07 PM.png

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246168#comment-17246168
 ] 

Shane Knapp commented on SPARK-33044:
-

branch-3.1 jobs + all scala 2.13 builds are done!

 

!Screen Shot 2020-12-08 at 1.56.59 PM.png!!Screen Shot 2020-12-08 at 1.58.07 
PM.png!

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png, Screen Shot 
> 2020-12-08 at 1.58.07 PM.png
>
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.






[jira] [Updated] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp updated SPARK-33044:

Attachment: Screen Shot 2020-12-08 at 1.56.59 PM.png

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
> Attachments: Screen Shot 2020-12-08 at 1.56.59 PM.png
>
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.






[jira] [Resolved] (SPARK-33712) Remove `Spark Packaging` jobs from Jenkins

2020-12-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33712?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp resolved SPARK-33712.
-
Resolution: Fixed

> Remove `Spark Packaging` jobs from Jenkins
> --
>
> Key: SPARK-33712
> URL: https://issues.apache.org/jira/browse/SPARK-33712
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 2.4.8, 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> SPARK-33675 migrated snapshot publishing to `GitHub Action`.
> Now, we can remove the jobs from Jenkins.
> - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/






[jira] [Assigned] (SPARK-33712) Remove `Spark Packaging` jobs from Jenkins

2020-12-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33712?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reassigned SPARK-33712:
---

Assignee: Shane Knapp

> Remove `Spark Packaging` jobs from Jenkins
> --
>
> Key: SPARK-33712
> URL: https://issues.apache.org/jira/browse/SPARK-33712
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 2.4.8, 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> SPARK-33675 migrated snapshot publishing to `GitHub Action`.
> Now, we can remove the jobs from Jenkins.
> - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/






[jira] [Commented] (SPARK-33712) Remove `Spark Packaging` jobs from Jenkins

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246112#comment-17246112
 ] 

Shane Knapp commented on SPARK-33712:
-

ok, this is done done.  :)

> Remove `Spark Packaging` jobs from Jenkins
> --
>
> Key: SPARK-33712
> URL: https://issues.apache.org/jira/browse/SPARK-33712
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 2.4.8, 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> SPARK-33675 migrated snapshot publishing to `GitHub Action`.
> Now, we can remove the jobs from Jenkins.
> - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/






[jira] [Assigned] (SPARK-33713) Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names

2020-12-08 Thread Shane Knapp (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33713?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shane Knapp reassigned SPARK-33713:
---

Assignee: Shane Knapp

> Remove `hive-2.3` post fix at master/branch-3.1 Jenkins job names
> -
>
> Key: SPARK-33713
> URL: https://issues.apache.org/jira/browse/SPARK-33713
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 3.2.0
>Reporter: Dongjoon Hyun
>Assignee: Shane Knapp
>Priority: Major
>
> We removed `hive-1.2` profile since branch-3.1. So, we can simplify the 
> Jenkins job title.






[jira] [Commented] (SPARK-33712) Remove `Spark Packaging` jobs from Jenkins

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246104#comment-17246104
 ] 

Shane Knapp commented on SPARK-33712:
-

done, pending my PR w/the jenkins job configs.

> Remove `Spark Packaging` jobs from Jenkins
> --
>
> Key: SPARK-33712
> URL: https://issues.apache.org/jira/browse/SPARK-33712
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 2.4.8, 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> SPARK-33675 migrated snapshot publishing to `GitHub Action`.
> Now, we can remove the jobs from Jenkins.
> - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/






[jira] [Commented] (SPARK-33712) Remove `Spark Packaging` jobs from Jenkins

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246103#comment-17246103
 ] 

Shane Knapp commented on SPARK-33712:
-

oh, awesome!  i'll take care of this now.  :)

> Remove `Spark Packaging` jobs from Jenkins
> --
>
> Key: SPARK-33712
> URL: https://issues.apache.org/jira/browse/SPARK-33712
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 2.4.8, 3.0.2, 3.1.0, 3.2.0
>Reporter: Dongjoon Hyun
>Priority: Major
>
> SPARK-33675 migrated snapshot publishing to `GitHub Action`.
> Now, we can remove the jobs from Jenkins.
> - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/






[jira] [Commented] (SPARK-33044) Add a Jenkins build and test job for Scala 2.13

2020-12-08 Thread Shane Knapp (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17246045#comment-17246045
 ] 

Shane Knapp commented on SPARK-33044:
-

ping

[~LuciferYang] [~dongjoon] [~srowen]

> Add a Jenkins build and test job for Scala 2.13
> ---
>
> Key: SPARK-33044
> URL: https://issues.apache.org/jira/browse/SPARK-33044
> Project: Spark
>  Issue Type: Sub-task
>  Components: jenkins
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Assignee: Shane Knapp
>Priority: Major
>
> {{Master}} branch seems to be almost ready for Scala 2.13 now, we need a 
> Jenkins test job to verify current work results and CI.





