BTW, does anyone know why there are two PR builder jobs? I'm confused
about why different ones execute for different PRs.

Yes, I see NewSparkPullRequestBuilder failing on a variety of PRs. I
don't think it has anything to do with Hive; these PRs touch different
parts of the code, none of them related to this failure.

On Wed, Dec 4, 2019 at 12:40 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>
> Hi, Sean.
>
> It seems that there is no failure on your other SQL PR.
>
>     https://github.com/apache/spark/pull/26748
>
> Does this failure happen consistently only on `NewSparkPullRequestBuilder`?
> Since `NewSparkPullRequestBuilder` is not the same as
> `SparkPullRequestBuilder`, there might be a root cause inside it if the
> failure happens only there.
>
> For `org.apache.hive.service.ServiceException: Failed to Start HiveServer2`,
> I've observed it before, but the root cause might be different from this
> one.
>
> BTW, to narrow the scope of the investigation, could you try with the
> `[hive-1.2]` tag in your PR?
>
> Bests,
> Dongjoon.
>
>
> On Wed, Dec 4, 2019 at 6:29 AM Sean Owen <sro...@gmail.com> wrote:
>>
>> I'm seeing consistent failures in the PR builder when touching SQL code:
>>
>> https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/4960/testReport/
>>
>> - org.apache.spark.sql.hive.thriftserver.SparkMetadataOperationSuite:
>>   "Spark's own GetSchemasOperation(SparkGetSchemasOperation)" (14 ms, age 2)
>> - org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextSuite:
>>   "(It is not a test it is a sbt.testing.SuiteSelector)"
>>
>> Looks like this has failed about 6 builds in the past few days. Has
>> anyone seen this, or have a clue what's causing it? Errors are like ...
>>
>> java.sql.SQLException: No suitable driver found for 
>> jdbc:hive2://localhost:13694/?a=avalue;b=bvalue#c=cvalue;d=dvalue
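
(For anyone trying to reproduce: "No suitable driver" generally means
java.sql.DriverManager has no registered driver that accepts the
jdbc:hive2:// URL. Below is a minimal diagnostic sketch, not part of the
suite, assuming the hive-jdbc jar is supposed to be on the test classpath:

    import java.sql.DriverManager

    object DriverCheck {
      def main(args: Array[String]): Unit = {
        // Force-load the Hive JDBC driver; a ClassNotFoundException here
        // means the hive-jdbc jar is missing from the classpath entirely.
        Class.forName("org.apache.hive.jdbc.HiveDriver")

        // List every driver DriverManager currently knows about; the Hive
        // driver should appear here if its registration worked.
        val drivers = DriverManager.getDrivers
        while (drivers.hasMoreElements) {
          println(drivers.nextElement().getClass.getName)
        }
      }
    }

If the driver loads fine in isolation, that points at how the forked test
JVM assembles its classpath rather than at the driver itself.)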
>>
>>
>> Caused by: sbt.ForkMain$ForkError: java.lang.RuntimeException: class 
>> org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl not 
>> org.apache.hadoop.hive.metastore.MetaStoreFilterHook
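
(The "class X not Y" message is the classic symptom of the interface and
its implementation being loaded from two different Hive versions on the
same classpath. A purely diagnostic sketch to print which jar each class
actually comes from, assuming both are resolvable in the test JVM:

    object ClasspathCheck {
      // Print the jar (code source) a class was loaded from; a null code
      // source (e.g. a JDK bootstrap class) is reported as "unknown".
      def whereFrom(className: String): Unit = {
        val src = Class.forName(className).getProtectionDomain.getCodeSource
        println(s"$className -> ${if (src == null) "unknown" else src.getLocation}")
      }

      def main(args: Array[String]): Unit = {
        whereFrom("org.apache.hadoop.hive.metastore.MetaStoreFilterHook")
        whereFrom("org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl")
      }
    }

If those two lines print different hive jars, mixed Hive versions on the
builder's classpath would explain the failure.)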
