[ 
https://issues.apache.org/jira/browse/PHOENIX-3721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16069156#comment-16069156
 ] 

Kanagha Pradha commented on PHOENIX-3721:
-----------------------------------------

This issue also occurs with Phoenix 4.10 on HBase 0.98.24 when using 
phoenix-spark (Spark 2.0.2) and calling DataFrameReader.load() via the JDBC 
connector. Thanks.

Stack trace: 
{noformat}
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:434)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$400(ConnectionQueryServicesImpl.java:260)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2402)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2378)
        at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2378)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:149)
        at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
        at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:65)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:56)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:123)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:117)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:53)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:328)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
{noformat}

{code:java}
// While running in cluster mode; ExecutionContext is our own wrapper class.
ExecutionContext ctx = ...;
final Dataset<Row> df = ctx.getInputDataFrame(..)
        .map(...)
        .orElseGet(() -> executeSQL(ctx));

// executeSQL(ctx) builds the JDBC reader and fails in load():
private Dataset<Row> executeSQL(ExecutionContext ctx) {
    DataFrameReader dfReader = ctx.sql().read().format("jdbc")
            .options(ImmutableMap.of(
                    "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                    "url", ctx.getConnectionUrl(),
                    "dbtable", tableSql));
    return dfReader.load();   // throws the SYSTEM.MUTEX TableExistsException here
}
{code}
 


> CSV bulk load doesn't work well with SYSTEM.MUTEX
> -------------------------------------------------
>
>                 Key: PHOENIX-3721
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3721
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.10.0
>            Reporter: Sergey Soldatov
>            Priority: Blocker
>
> This is quite strange. I'm using HBase 1.2.4 and the current master branch.
> While running a CSV bulk load in the regular way (see the example invocation 
> after the stack trace), I got the following exception: 
> {noformat}
> Exception in thread "main" java.sql.SQLException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hbase.TableExistsException): SYSTEM.MUTEX
>       at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2465)
>       at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2382)
>       at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>       at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2382)
>       at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
>       at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:149)
>       at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
>       at java.sql.DriverManager.getConnection(DriverManager.java:664)
>       at java.sql.DriverManager.getConnection(DriverManager.java:208)
>       at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:337)
>       at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:329)
>       at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
>       at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>       at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hbase.TableExistsException): SYSTEM.MUTEX
>       at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.prepareCreate(CreateTableProcedure.java:285)
>       at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:106)
>       at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:58)
>       at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:119)
>       at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:498)
>       at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:1061)
>       at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:856)
>       at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:809)
>       at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$400(ProcedureExecutor.java:75)
>       at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$2.run(ProcedureExecutor.java:495)
> {noformat}
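> For context, "the regular way" here is the standard CsvBulkLoadTool 
> invocation, something like (table name and input path are hypothetical):
> {noformat}
> hadoop jar phoenix-<version>-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool \
>     --table EXAMPLE --input /data/example.csv
> {noformat}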
> I checked the code, and it seems the problem is in the createSysMutexTable 
> function. It expects TableExistsException (and skips it), but in my case the 
> exception is wrapped in a RemoteException, so it isn't skipped and the init 
> fails. The easy fix is to handle RemoteException and check whether it wraps 
> TableExistsException, but that looks a bit ugly. 
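> A minimal sketch of that fix (the surrounding names are assumed for 
> illustration, not the actual createSysMutexTable code):
> {code:java}
> try {
>     admin.createTable(mutexTableDesc);   // attempt to create SYSTEM.MUTEX
> } catch (IOException e) {
>     // TableExistsException can come back wrapped in a RemoteException,
>     // so unwrap it before deciding whether it is safe to ignore.
>     Throwable cause = (e instanceof RemoteException)
>             ? ((RemoteException) e).unwrapRemoteException()
>             : e;
>     if (!(cause instanceof TableExistsException)) {
>         throw e;   // a genuine failure; re-throw it
>     }
>     // SYSTEM.MUTEX already exists (another client created it first); fine.
> }
> {code}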
> [~jamestaylor] [~samarthjain] any thoughts? 



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
