Subject: Re: [VOTE] Spark 2.2.1 (RC2)
Hi Felix Cheung:
When will the new version 2.2.1 of the Spark docs be published to the
website? It's still showing version 2.2.0.
--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
Hi dev, the latest Spark doc is still version 2.2.0. When will the 2.2.1
doc be published?
-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
I think final publishing rights being PMC-only (with maybe the person who
set it up as an exception) is generally the case for each of the language
ecosystems (e.g. Maven requires a PMC member to do the final push, the dist
download requires a final svn mv by a PMC member, etc.).
On Thu, Dec 14, 2017 at 1:38 PM, Felix Cheung wrote:
;)
The credential used to publish to PyPI is PMC-only.
+Holden
We had discussed this in the other thread I sent to private@ last week.
On Thu, Dec 14, 2017 at 4:34 AM Sean Owen wrote:
On the various access questions here -- what do you need to have that
access? We definitely need to give you all necessary access if you're the
release manager!
On Thu, Dec 14, 2017 at 6:32 AM Felix Cheung wrote:
And I don’t have access to publish python.
On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
shiva...@eecs.berkeley.edu> wrote:
The R artifacts have some issue that Felix and I are debugging. Let's not
block the announcement on that.
Thanks
Shivaram
On Wed, Dec 13, 2017 at 5:59 AM, Sean Owen wrote:
Looks like Maven artifacts are up, site's up -- what about the Python and R
artifacts?
I can also move the spark.apache.org/docs/latest link to point to 2.2.1 if
it's about ready.
We should announce the release officially too then.
On Wed, Dec 6, 2017 at 5:00 PM Felix Cheung wrote:
I saw the svn move on Monday, so I'm working on the website updates.
I will look into Maven today, and will ask for help if I can't do it.
On Wed, Dec 6, 2017 at 10:49 AM Sean Owen wrote:
Pardon, did this release finish? I don't see it in Maven. I know there was
some question about getting a hand in finishing the release process,
including copying artifacts in svn. Was there anything else you're waiting
on someone to do?
On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung wrote:
[INFO] Spark Project Catalyst ......... SUCCESS [11:51 min]
[INFO] Spark Project SQL .............. SUCCESS [17:55 min]
[INFO] Spark Project ML Library ....... SUCCESS [17:05 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:04 h
[INFO] Finished at: 2017-11-30T01:48:15+09:00
[INFO] Final Memory: 128M/329M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hive" could not be activated because it
does not exist.

Kazuaki Ishizaki
From: Dongjoon Hyun <dongjoon.h...@gmail.com>
To: Hyukjin Kwon <gurwls...@gmail.com>
Cc: Spark dev list <dev@spark.apache.org>, Felix Cheung
<felixche...@apache.org>, Sean Owen <so...@cloudera.com>
Date: 2017/11/29 12:56
Subject: Re: [VOTE] Spark 2.2.1 (RC2)
+1 (non-binding)
RC2 is tested on CentOS, too.
Bests,
Dongjoon.
On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon wrote:
+1
2017-11-29 8:18 GMT+09:00 Henry Robinson :
(My vote is non-binding, of course).
On 28 November 2017 at 14:53, Henry Robinson wrote:
+1, tests all pass for me on Ubuntu 16.04.
On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:
+1
On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung
wrote:
+1
Thanks Sean. Please vote!
Tested various scenarios with the R package: Ubuntu, Debian, Windows r-devel
and release, and on r-hub. Verified CRAN checks are clean (only 1 NOTE!) and
no leaked files (.cache removed, /tmp clean).
On Sun, Nov 26, 2017 at 11:55 AM Sean Owen wrote:
Yes it downloads recent releases. The test worked for me on a second try,
so I suspect a bad mirror. If this comes up frequently we can just add
retry logic, as the closer.lua script will return different mirrors each
time.
The tests all pass for me on the latest Debian, so +1 for this release.
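The retry idea Sean mentions could look something like this minimal sketch
(Python for illustration only; the function names are hypothetical, not the
suite's actual code). Since closer.lua hands back a different mirror on each
call, retrying the whole resolve-and-fetch sequence routes around a single
bad mirror:

```python
def download_with_retries(fetch_from_mirror, pick_mirror, attempts=3):
    """Retry a download, picking a fresh mirror on every attempt.

    pick_mirror: returns a mirror base URL, e.g. the result of hitting
        closer.lua?preferred=true, which can vary between calls.
    fetch_from_mirror: downloads the artifact from the given mirror,
        raising OSError on failure.
    """
    last_err = None
    for _ in range(attempts):
        mirror = pick_mirror()  # may differ from the previous attempt
        try:
            return fetch_from_mirror(mirror)
        except OSError as err:
            last_err = err  # bad mirror; loop and try another one
    raise last_err
```

A bad mirror on the first attempt then succeeds on a later one, which
matches the "worked on a second try" behavior described above.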
Hmm, this isn't the first time we've hit mirror issues, if that's the case.
Maybe we should log the IP when this happens so we can report it to infra?
On Sat, Nov 25, 2017 at 7:47 PM Felix Cheung wrote:
Ah sorry, digging through the history it looks like this was changed
relatively recently and should only download previous releases.
Perhaps we are intermittently hitting a mirror that doesn't have the files?
https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae
Thanks Sean.
For the second one, it looks like the HiveExternalCatalogVersionsSuite is
trying to download the release tgz from the official Apache mirror, which
won't work unless the release is actually released?
val preferredMirror =
Seq("wget",
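The truncated snippet above shells out to wget against closer.lua to pick a
preferred mirror before fetching the tgz. Roughly, in Python (the closer.lua
endpoint is the real Apache mirror resolver, but the release path below is
illustrative, not necessarily the suite's exact layout):

```python
import urllib.request

CLOSER = "https://www.apache.org/dyn/closer.lua?preferred=true"

def preferred_mirror():
    # closer.lua?preferred=true returns a single mirror base URL as text
    with urllib.request.urlopen(CLOSER) as resp:
        return resp.read().decode("utf-8").strip()

def release_url(mirror, version):
    # Illustrative path layout; the tgz only exists on the mirrors once
    # the release is actually published, hence the suite failure when
    # run against a not-yet-released candidate.
    return "%sspark/spark-%s/spark-%s-bin-hadoop2.7.tgz" % (
        mirror, version, version)
```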
I hit the same StackOverflowError as in the previous RC test, but, pretty
sure this is just because the increased thread stack size JVM flag isn't
applied consistently. This seems to resolve it:
https://github.com/apache/spark/pull/19820
This wouldn't block release IMHO.
I am currently
Please vote on releasing the following candidate as Apache Spark version
2.2.1. The vote is open until Friday December 1, 2017 at 8:00:00 am UTC and
passes if a majority of at least 3 PMC +1 votes are cast.
[ ] +1 Release this package as Apache Spark 2.2.1
[ ] -1 Do not release this package