parser error?

2018-05-13 Thread Reynold Xin
Just saw this in one of my PR that's doc only:

[error] warning(154): SqlBase.g4:400:0: rule fromClause contains an
optional block with at least one alternative that can match an empty
string
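For context, ANTLR reports warning 154 when an optional block contains an alternative that can itself match the empty string, making the option ambiguous. A minimal hypothetical grammar that triggers it (a sketch, not the actual SqlBase.g4 rule):

```antlr
grammar Example;

// warning(154): the optional block below has an alternative that can match
// an empty string, because both lateralView* and pivotClause? may be absent.
fromClause
    : FROM relation (lateralView* pivotClause?)?
    ;

relation     : ID ;
lateralView  : 'LATERAL' 'VIEW' ID ;
pivotClause  : 'PIVOT' ID ;

FROM : 'FROM' ;
ID   : [a-zA-Z_]+ ;
WS   : [ \t\r\n]+ -> skip ;
```

The fix is usually to restructure so the optional block cannot match empty, e.g. `lateralView* pivotClause?` without the outer `( ... )?`.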


Re: Build timeout -- continuous-integration/appveyor/pr — AppVeyor build failed

2018-05-13 Thread Holden Karau
On Sun, May 13, 2018 at 9:43 PM Hyukjin Kwon  wrote:

> From a very quick look, I believe that's just an occasional network issue in
> AppVeyor. For example, in this case:
>   Downloading:
> https://repo.maven.apache.org/maven2/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar
> This took 26-ish mins, and the subsequent jar downloads also seem to take
> much longer than usual.
>
> FYI, the build usually takes 35 ~ 40 mins and the R tests 25 ~ 30 mins, so
> a run usually ends up around 1 hour 5 mins.
> I will take another look at reducing the time if the usual run time
> approaches 1 hour and 30 mins (the current AppVeyor limit).
> I did this a few times before - https://github.com/apache/spark/pull/19722
> and https://github.com/apache/spark/pull/19816.
>
> The timeout has already been increased from 1 hour to 1 hour and 30 mins,
> and they seem unwilling to increase it any further.
> I have contacted them a few times to manually request this.
>
> Ideally, I believe we usually just rebase rather than merge the commits
> in any case, as mentioned in the contribution guide.
>
I don’t recall this being something we actually go that far in encouraging.
The guide says rebases are one of the ways folks can keep their PRs up to
date, but no actual preference is stated. I tend to see PRs from different
folks doing either rebases or merges, since we squash commits anyway.

I know that for some developers, merge commits tend to be less effort for
keeping their branch up to date, and provided the diff is still clear and
the resulting merge is clean, I don’t see an issue.
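For anyone new to the rebase side of this, here is a self-contained sketch of what a rebase does, using a scratch repo in place of a real fork (on an actual Spark PR you would instead run `git fetch upstream && git rebase upstream/master`, then `git push --force-with-lease`; the `upstream` remote name is an assumption about your local setup):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b master demo && cd demo
git config user.email you@example.com && git config user.name "You"

echo base > base.txt && git add . && git commit -qm "initial commit"

# Your PR branch with one commit on top of the old master.
git checkout -qb my-pr
echo change > pr.txt && git add . && git commit -qm "my PR change"

# Meanwhile master moves on (stands in for upstream/master advancing).
git checkout -q master
echo more > master.txt && git add . && git commit -qm "master advances"

# Rebase replays your commit on top of the new master, so the PR diff stays
# a single clean commit instead of gaining a merge commit.
git checkout -q my-pr
git rebase -q master
git log --oneline
```

Either way the squash-merge produces one commit in the end; the rebase just keeps the intermediate diff readable.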

> The test failure in the PR should be ignorable if it's not directly
> related to SparkR.
>
>
> Thanks.
>
>
>
> 2018-05-14 8:45 GMT+08:00 Ilan Filonenko :
>
>> Hi dev,
>>
>> I recently updated an on-going PR [
>> https://github.com/apache/spark/pull/21092] with a merge that included a
>> lot of commits from master, and I got the following error:
>>
*continuous-integration/appveyor/pr* — AppVeyor build failed
>>
>> due to:
>>
>> *Build execution time has reached the maximum allowed time for your plan
>> (90 minutes).*
>>
>> seen here:
>> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/2300-master
>>
>> As this is the first time I am seeing this, I am wondering if this is in
>> relation to a large merge and if it is, I am wondering if the timeout can
>> be increased.
>>
>> Thanks!
>>
>> Best,
>> Ilan Filonenko
>>
>
> --
Twitter: https://twitter.com/holdenkarau


Re: Build timeout -- continuous-integration/appveyor/pr — AppVeyor build failed

2018-05-13 Thread Hyukjin Kwon
From a very quick look, I believe that's just an occasional network issue in
AppVeyor. For example, in this case:
  Downloading:
https://repo.maven.apache.org/maven2/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar
This took 26-ish mins, and the subsequent jar downloads also seem to take
much longer than usual.
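When slow dependency downloads are the bottleneck, AppVeyor's build cache can help keep runs under the limit. A sketch of what that might look like in appveyor.yml (hypothetical paths; check what Spark's actual appveyor.yml does before relying on this):

```yaml
# Cache downloaded build dependencies between AppVeyor runs so a flaky or
# slow Maven mirror doesn't eat into the 90-minute budget.
cache:
  - C:\Users\appveyor\.m2 -> pom.xml   # invalidate when pom.xml changes
  - C:\Users\appveyor\.ivy2            # sbt/Ivy artifacts
```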

FYI, the build usually takes 35 ~ 40 mins and the R tests 25 ~ 30 mins, so
a run usually ends up around 1 hour 5 mins.
I will take another look at reducing the time if the usual run time
approaches 1 hour and 30 mins (the current AppVeyor limit).
I did this a few times before - https://github.com/apache/spark/pull/19722
and https://github.com/apache/spark/pull/19816.

The timeout has already been increased from 1 hour to 1 hour and 30 mins,
and they seem unwilling to increase it any further.
I have contacted them a few times to manually request this.

Ideally, I believe we usually just rebase rather than merge the commits
in any case, as mentioned in the contribution guide.
The test failure in the PR should be ignorable if it's not directly
related to SparkR.


Thanks.



2018-05-14 8:45 GMT+08:00 Ilan Filonenko :

> Hi dev,
>
> I recently updated an on-going PR [
> https://github.com/apache/spark/pull/21092] with a merge that included a
> lot of commits from master, and I got the following error:
>
*continuous-integration/appveyor/pr* — AppVeyor build failed
>
> due to:
>
> *Build execution time has reached the maximum allowed time for your plan
> (90 minutes).*
>
> seen here:
> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/2300-master
>
> As this is the first time I am seeing this, I am wondering if this is in
> relation to a large merge and if it is, I am wondering if the timeout can
> be increased.
>
> Thanks!
>
> Best,
> Ilan Filonenko
>


Build timeout -- continuous-integration/appveyor/pr — AppVeyor build failed

2018-05-13 Thread Ilan Filonenko
Hi dev,

I recently updated an on-going PR [
https://github.com/apache/spark/pull/21092] with a merge that included a
lot of commits from master, and I got the following error:

*continuous-integration/appveyor/pr* — AppVeyor build failed

due to:

*Build execution time has reached the maximum allowed time for your plan
(90 minutes).*

seen here:
https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/2300-master

As this is the first time I am seeing this, I am wondering if this is in
relation to a large merge and if it is, I am wondering if the timeout can
be increased.

Thanks!

Best,
Ilan Filonenko


Re: Time for 2.3.1?

2018-05-13 Thread Shivaram Venkataraman
+1 We had a SparkR fix for CRAN SystemRequirements that will also be good
to get out.

Shivaram

On Fri, May 11, 2018 at 12:34 PM, Henry Robinson  wrote:

> https://github.com/apache/spark/pull/21302
>
> On 11 May 2018 at 11:47, Henry Robinson  wrote:
>
>> I was planning to do so shortly.
>>
>> Henry
>>
>> On 11 May 2018 at 11:45, Ryan Blue  wrote:
>>
>>> The Parquet Java 1.8.3 release is out. Has anyone started a PR to
>>> update, or should I?
>>>
>>> On Fri, May 11, 2018 at 7:40 AM, Cody Koeninger 
>>> wrote:
>>>
 Sounds good, I'd like to add SPARK-24067 today, assuming there are no
 objections

 On Thu, May 10, 2018 at 1:22 PM, Henry Robinson 
 wrote:
 > +1, I'd like to get a release out with SPARK-23852 fixed. The Parquet
 > community are about to release 1.8.3 - the voting period closes
 tomorrow -
 > and I've tested it with Spark 2.3 and confirmed the bug is fixed.
 Hopefully
 > it is released and I can post the version change to branch-2.3 before
 you
 > start to roll the RC this weekend.
 >
 > Henry
 >
 > On 10 May 2018 at 11:09, Marcelo Vanzin  wrote:
 >>
 >> Hello all,
 >>
 >> It's been a while since we shipped 2.3.0 and lots of important bug
 >> fixes have gone into the branch since then. I took a look at Jira and
 >> it seems there's not a lot of things explicitly targeted at 2.3.1 -
 >> the only potential blocker (a parquet issue) is being worked on since
 >> a new parquet with the fix was just released.
 >>
 >> So I'd like to propose to release 2.3.1 soon. If there are important
 >> fixes that should go into the release, please let those be known (by
 >> replying here or updating the bug in Jira), otherwise I'm
 volunteering
 >> to prepare the first RC soon-ish (around the weekend).
 >>
 >> Thanks!
 >>
 >>
 >> --
 >> Marcelo
 >>
 >> -
 >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
 >>
 >



>>>
>>>
>>> --
>>> Ryan Blue
>>> Software Engineer
>>> Netflix
>>>
>>
>>
>
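For reference, a Parquet version bump of the kind discussed above is typically a one-line property change in Spark's root pom.xml (a sketch; see the linked PR for the actual change):

```xml
<!-- root pom.xml: bump the Parquet dependency (sketch) -->
<properties>
  <parquet.version>1.8.3</parquet.version>
</properties>
```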