[
https://issues.apache.org/jira/browse/PHOENIX-6136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17196723#comment-17196723
]
Hadoop QA commented on PHOENIX-6136:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/13011592/PHOENIX-6136.master.v1.patch
against master branch at commit .
ATTACHMENT ID: 13011592
{color:green}+1 @author{color}. The patch does not contain any @author tags.
{color:green}+0 tests included{color}. The patch appears to be a documentation, build,
or dev patch that doesn't require tests.
{color:green}+1 javac{color}. The applied patch does not increase the total number
of javac compiler warnings.
{color:green}+1 release audit{color}. The applied patch does not increase the total
number of release audit warnings.
{color:green}+1 lineLengths{color}. The patch does not introduce lines longer than 100
{color:red}-1 core tests{color}. The patch failed these unit tests:
{color:red}-1 core zombie tests{color}. There are 7 zombie test(s):
at org.apache.phoenix.end2end.InListIT.testBaseTableAndIndexTableHaveRightScan(InListIT.java:1580)
at org.apache.phoenix.end2end.EmptyColumnIT.testWhenTableWithIndexAndVariousOptions(EmptyColumnIT.java:492)
at org.apache.phoenix.end2end.InstrFunctionIT.testSingleByteInstrDescendingNoString(InstrFunctionIT.java:92)
at org.apache.phoenix.end2end.IndexToolForDeleteBeforeRebuildIT.testDeleteBeforeRebuildForGlobalIndex(IndexToolForDeleteBeforeRebuildIT.java:144)
at org.apache.phoenix.end2end.DeleteIT.testDeleteForTableWithRowTimestampCol(DeleteIT.java:694)
at org.apache.phoenix.end2end.DeleteIT.testDeleteForTableWithRowTimestampColClient(DeleteIT.java:684)
Test results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/93//testReport/
Code Coverage results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/93//artifact/phoenix-core/target/site/jacoco/index.html
Console output:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/93//console
This message is automatically generated.
> javax.servlet.UnavailableException thrown when using Spark connector
> --------------------------------------------------------------------
>
> Key: PHOENIX-6136
> URL: https://issues.apache.org/jira/browse/PHOENIX-6136
> Project: Phoenix
> Issue Type: Bug
> Components: core, spark-connector
> Affects Versions: 5.1.0
> Reporter: Tamas Adami
> Assignee: Istvan Toth
> Priority: Major
> Attachments: PHOENIX-6136.master.v1.patch
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> We get an exception from Spark when using the phoenix-spark connector.
> {noformat}
> 2020-09-11 13:29:28,252 WARN [main] component.AbstractLifeCycle: FAILED
> org.glassfish.jersey.servlet.ServletContainer-62c42a3@f5c8fa49==org.glassfish.jersey.servlet.ServletContainer,jsp=null,order=-1,inst=true,async=true:
> javax.servlet.UnavailableException: Servlet class
> org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
> javax.servlet.UnavailableException: Servlet class
> org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
> at org.spark_project.jetty.servlet.ServletHolder.checkServletType(ServletHolder.java:505)
> {noformat}
> This seems to be caused by a conflict between the servlet implementations of
> Jetty and Jersey.
> Shading both Jersey libraries in phoenix-client has solved the problem for us.
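For context, "shading" here typically means relocating the conflicting packages inside the phoenix-client uber-jar so Spark's bundled Jetty/Jersey classes can no longer collide with them. A minimal maven-shade-plugin sketch of that idea follows; the relocation patterns and the shaded prefix are illustrative assumptions, not the contents of the actual PHOENIX-6136 patch:

```xml
<!-- Hypothetical sketch: relocate Jersey packages in the shaded client jar.
     The patterns below are assumptions for illustration only. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Move Jersey classes under a Phoenix-private package name -->
          <relocation>
            <pattern>org.glassfish.jersey</pattern>
            <shadedPattern>org.apache.phoenix.shaded.org.glassfish.jersey</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With a relocation like this, the client jar carries its own copy of Jersey under a renamed package, so Jetty's servlet-type check in Spark no longer sees two incompatible `javax.servlet.Servlet` hierarchies.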
--
This message was sent by Atlassian Jira
(v8.3.4#803005)