Re: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2

2018-07-09 Thread Shivaram Venkataraman
I don't think we need to respin 2.2.2 -- given that 2.3.2 is on the way,
we can just submit that.

Shivaram
On Mon, Jul 9, 2018 at 6:19 PM Tom Graves  wrote:
>
> Is there any way to push it to CRAN without this fix? I don't really want to
> respin 2.2.2 just for the test fix.
>
> Tom
>
> On Monday, July 9, 2018, 4:50:18 PM CDT, Shivaram Venkataraman 
>  wrote:
>
>
> Yes. I think Felix checked in a fix to ignore tests run on Java
> versions that are not Java 8 (I think the fix was in
> https://github.com/apache/spark/pull/21666, which is in 2.3.2)
>
> Shivaram
> On Mon, Jul 9, 2018 at 5:39 PM Sean Owen  wrote:
> >
> > Yes, this flavor of error should only come up in Java 9. Spark doesn't 
> > support that. Is there any way to tell CRAN this should not be tested?
> >
> > On Mon, Jul 9, 2018, 4:17 PM Shivaram Venkataraman 
> >  wrote:
> >>
> >> The upcoming 2.2.2 release was submitted to CRAN. I think there are
> >> some known issues on Windows, but does anybody know what the following
> >> error with Netty is?
> >>
> >> >WARNING: Illegal reflective access by 
> >> > io.netty.util.internal.PlatformDependent0$1 
> >> > (file:/home/hornik/.cache/spark/spark-2.2.2-bin-hadoop2.7/jars/netty-all-4.0.43.Final.jar)
> >> >  to field java.nio.Buffer.address
> >>
> >> Thanks
> >> Shivaram

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2

2018-07-09 Thread Shivaram Venkataraman
Yes. I think Felix checked in a fix to ignore tests run on Java
versions that are not Java 8 (I think the fix was in
https://github.com/apache/spark/pull/21666, which is in 2.3.2)
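
Roughly, the idea is something like the following sketch -- the helper name
and the `java -version` parsing below are illustrative assumptions, not the
code that was actually merged in that PR:

  # Illustrative sketch only: run the SparkR tests only when the JVM on the
  # PATH is Java 8, which is what Spark 2.x supports.
  java_major_version <- function() {
    # `java -version` prints to stderr, e.g. 'java version "1.8.0_171"'
    # on Java 8 or 'openjdk version "9.0.4"' on Java 9.
    out <- tryCatch(system2("java", "-version", stdout = TRUE, stderr = TRUE),
                    error = function(e) character(0))
    line <- grep("version", out, value = TRUE)[1]
    if (is.na(line)) return(NA_integer_)
    ver <- gsub('.*version "([^"]+)".*', "\\1", line)
    parts <- strsplit(ver, "[._]")[[1]]
    # Java 8 and earlier report "1.x"; Java 9 and later report "9", "10", ...
    if (parts[1] == "1") as.integer(parts[2]) else as.integer(parts[1])
  }

  major <- java_major_version()
  if (identical(major, 8L)) {
    testthat::test_package("SparkR")
  } else {
    message("Skipping SparkR tests: Java 8 required, found major version ", major)
  }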

Shivaram
On Mon, Jul 9, 2018 at 5:39 PM Sean Owen  wrote:
>
> Yes, this flavor of error should only come up in Java 9. Spark doesn't 
> support that. Is there any way to tell CRAN this should not be tested?
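
On the question of telling CRAN not to run these tests: one common R packaging
pattern (not necessarily what SparkR adopted) is to gate the heavyweight suite
on the NOT_CRAN environment variable that devtools/testthat set for local and
CI runs, e.g.:

  # Illustrative pattern only: run the full suite for local/CI runs but skip
  # it on CRAN check machines, which leave NOT_CRAN unset.
  if (identical(tolower(Sys.getenv("NOT_CRAN")), "true")) {
    testthat::test_package("SparkR")
  } else {
    message("Skipping the full SparkR test suite (NOT_CRAN is not set to 'true')")
  }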

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2

2018-07-09 Thread Felix Cheung
I recall this might be a problem running Spark on Java 9.
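
One quick way to confirm that, assuming a local Spark install and that the
sparkR.callJStatic API is available in the SparkR build at hand, is to ask the
backend JVM which Java it is actually running -- a sketch, not something that
was run against the CRAN machine:

  # Sketch: check which Java version the JVM launched by SparkR reports.
  library(SparkR)
  sparkR.session(master = "local[1]")
  jvm_version <- sparkR.callJStatic("java.lang.System", "getProperty", "java.version")
  print(jvm_version)  # "1.8.0_..." on Java 8; "9" or higher is where the Netty warning appears
  sparkR.session.stop()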



From: Shivaram Venkataraman 
Sent: Monday, July 9, 2018 2:17 PM
To: dev; Felix Cheung; Tom Graves
Subject: Fwd: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2


Fwd: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2

2018-07-09 Thread Shivaram Venkataraman
The upcoming 2.2.2 release was submitted to CRAN. I think there are
some known issues on Windows, but does anybody know what the following
error with Netty is?

> WARNING: Illegal reflective access by 
> io.netty.util.internal.PlatformDependent0$1 
> (file:/home/hornik/.cache/spark/spark-2.2.2-bin-hadoop2.7/jars/netty-all-4.0.43.Final.jar)
>  to field java.nio.Buffer.address

Thanks
Shivaram


---------- Forwarded message ---------
From: 
Date: Mon, Jul 9, 2018 at 12:12 PM
Subject: [CRAN-pretest-archived] CRAN submission SparkR 2.2.2
To: 
Cc: 


Dear maintainer,

package SparkR_2.2.2.tar.gz does not pass the incoming checks
automatically, please see the following pre-tests:
Windows: 
<https://win-builder.r-project.org/incoming_pretest/SparkR_2.2.2_20180709_175630/Windows/00check.log>
Status: 1 ERROR, 1 WARNING
Debian: 
<https://win-builder.r-project.org/incoming_pretest/SparkR_2.2.2_20180709_175630/Debian/00check.log>
Status: 1 ERROR, 2 WARNINGs

Last released version's CRAN status: ERROR: 1, OK: 1
See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>

CRAN Web: <https://cran.r-project.org/package=SparkR>

Please fix all problems and resubmit a fixed version via the webform.
If you are not sure how to fix the problems shown, please ask for help
on the R-package-devel mailing list:
<https://stat.ethz.ch/mailman/listinfo/r-package-devel>
If you are fairly certain the rejection is a false positive, please
reply-all to this message and explain.

More details are given in the directory:
<https://win-builder.r-project.org/incoming_pretest/SparkR_2.2.2_20180709_175630/>
The files will be removed after roughly 7 days.

No strong reverse dependencies to be checked.

Best regards,
CRAN teams' auto-check service
Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-ix86+x86_64
Check: CRAN incoming feasibility, Result: WARNING
  Maintainer: 'Shivaram Venkataraman '

  New submission

  Package was archived on CRAN

  Insufficient package version (submitted: 2.2.2, existing: 2.3.0)

  Possibly mis-spelled words in DESCRIPTION:
Frontend (4:10, 5:28)

  CRAN repository db overrides:
X-CRAN-Comment: Archived on 2018-05-01 as check problems were not
  corrected despite reminders.

  Found the following (possibly) invalid URLs:
URL: http://spark.apache.org/docs/latest/api/R/mean.html
  From: inst/doc/sparkr-vignettes.html
  Status: 404
  Message: Not Found

Flavor: r-devel-windows-ix86+x86_64
Check: running tests for arch 'x64', Result: ERROR
Running 'run-all.R' [175s]
  Running the tests in 'tests/run-all.R' failed.
  Complete output:
> #
> # Licensed to the Apache Software Foundation (ASF) under one or more
> # contributor license agreements.  See the NOTICE file distributed with
> # this work for additional information regarding copyright ownership.
> # The ASF licenses this file to You under the Apache License, Version 2.0
> # (the "License"); you may not use this file except in compliance with
> # the License.  You may obtain a copy of the License at
> #
> #    http://www.apache.org/licenses/LICENSE-2.0
> #
> # Unless required by applicable law or agreed to in writing, software
> # distributed under the License is distributed on an "AS IS" BASIS,
> # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
> # See the License for the specific language governing permissions and
> # limitations under the License.
> #
>
> library(testthat)
> library(SparkR)

Attaching package: 'SparkR'

The following object is masked from 'package:testthat':

describe

The following objects are masked from 'package:stats':

cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
rank, rbind, sample, startsWith, subset, summary, transform, union

>
> # Turn all warnings into errors
> options("warn" = 2)
>
> if (.Platform$OS.type == "windows") {
+   Sys.setenv(TZ = "GMT")
+ }
>
> # Setup global test environment
> # Install Spark first to set SPARK_HOME
>
> # NOTE(shivaram): We set overwrite to handle any old tar.gz files or directories left behind on
> # CRAN machines. For Jenkins we should already have SPARK_HOME set.
> install.spark(overwrite = TRUE)
Overwrite = TRUE: download and overwrite the tar file and Spark
package directory if they exist.
Spark not found in the cache directory. Installation will start.
MirrorUrl not provided.
Looking for preferred site from apache website...
Preferred mirror site found: http://mirror.dkd.de/apache/spark
Downloading