I am testing some code that uses the Phoenix Spark plugin to read a Phoenix table with 
a namespace prefix in the table name (the table was created as a Phoenix table, 
not an HBase table), but it throws a TableNotFoundException.

The table is clearly there, because I can query it with plain Phoenix SQL 
through SQuirreL, and querying it through Spark SQL works without any problem.

The error log is in the attached file: tableNoFound.txt

My test code is in the attached file: query.java

The strange thing is that in the attached code, running testSpark alone gives the 
above exception, but if I run testJdbc first, followed by testSpark, both of 
them work.

I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.

The problem did not exist at all when I ran the same code on an HDP 2.4 
cluster with Phoenix 4.4.

Nor does the problem occur on HDP 2.5 when I query a table without a namespace 
prefix in the table name.
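In case it is relevant: the `hbase list` output below shows both SYSTEM.CATALOG and SYSTEM:CATALOG, which suggests Phoenix namespace mapping is enabled on the 2.5 cluster. I assume (I have not verified every node) that this corresponds to the standard namespace-mapping properties in hbase-site.xml, shown here only as a sketch of the configuration I believe is in effect:

```xml
<!-- Assumed hbase-site.xml settings on the HDP 2.5 cluster
     (client and server side); shown for illustration only. -->
<property>
  <name>phoenix.schema.isNamespaceMappingEnabled</name>
  <value>true</value>
</property>
<property>
  <name>phoenix.schema.mapSystemTablesToNamespace</name>
  <value>true</value>
</property>
```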

By the way, here is what HBase looks like when I list the tables:

hbase(main):031:0* list
TABLE
ACME:ENDPOINT_CONFIG
ACME:ENDPOINT_STATUS
LONG:ENDPOINTS
LONG:RADIOCHANNELS
LONG:REGIONINFORMATION
LONG:TGBSTATISTICS
SENSUS1:ENDPOINTS
SENSUS1:RADIOCHANNELS
SENSUS1:REGIONINFORMATION
SENSUS1:TGBSTATISTICS
SENSUS2:ENDPOINTS
SENSUS2:RADIOCHANNELS
SENSUS2:REGIONINFORMATION
SENSUS2:TGBSTATISTICS
SENSUS:ENDPOINTS
SENSUS:RADIOCHANNELS
SENSUS:REGIONINFORMATION
SENSUS:TGBSTATISTICS
SYSTEM.CATALOG
SYSTEM:CATALOG
SYSTEM:FUNCTION
SYSTEM:SEQUENCE
SYSTEM:STATS
TENANT
24 row(s) in 0.0090 seconds

=> ["ACME:ENDPOINT_CONFIG", "ACME:ENDPOINT_STATUS", "LONG:ENDPOINTS", 
"LONG:RADIOCHANNELS", "LONG:REGIONINFORMATION", "LONG:TGBSTATISTICS", 
"SENSUS1:ENDPOINTS", "SENSUS1:RADIOCHANNELS", "SENSUS1:REGIONINFORMATION", 
"SENSUS1:TGBSTATISTICS", "SENSUS2:ENDPOINTS", "SENSUS2:RADIOCHANNELS", 
"SENSUS2:REGIONINFORMATION", "SENSUS2:TGBSTATISTICS", "SENSUS:ENDPOINTS", 
"SENSUS:RADIOCHANNELS", "SENSUS:REGIONINFORMATION", "SENSUS:TGBSTATISTICS", 
"SYSTEM.CATALOG", "SYSTEM:CATALOG", "SYSTEM:FUNCTION", "SYSTEM:SEQUENCE", 
"SYSTEM:STATS", "TENANT"]


16/11/03 16:32:25 INFO ZooKeeper: Initiating client connection, 
connectString=luna-sdp-nms-01.davis.sensus.lab:2181 sessionTimeout=90000 
watcher=org.apache.hadoop.hbase.zookeeper.PendingWatcher@27898e13
16/11/03 16:32:25 INFO ClientCnxn: Opening socket connection to server 
10.22.13.19/10.22.13.19:2181. Will not attempt to authenticate using SASL 
(unknown error)
16/11/03 16:32:25 INFO ClientCnxn: Socket connection established to 
10.22.13.19/10.22.13.19:2181, initiating session
16/11/03 16:32:25 INFO ClientCnxn: Session establishment complete on server 
10.22.13.19/10.22.13.19:2181, sessionid = 0x1582610cca900a6, negotiated timeout 
= 40000
16/11/03 16:32:25 INFO Metrics: Initializing metrics system: phoenix
16/11/03 16:32:25 WARN MetricsConfig: Cannot locate configuration: tried 
hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
16/11/03 16:32:25 INFO MetricsSystemImpl: Scheduled snapshot period at 10 
second(s).
16/11/03 16:32:25 INFO MetricsSystemImpl: phoenix metrics system started
16/11/03 16:32:26 ERROR Application: sql error: 
org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table 
undefined. tableName=ACME:ENDPOINT_STATUS
        at 
org.apache.phoenix.schema.PMetaDataImpl.getTableRef(PMetaDataImpl.java:265)
        at 
org.apache.phoenix.jdbc.PhoenixConnection.getTable(PhoenixConnection.java:449)
        at 
org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:407)
        at 
org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:433)
        at 
org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:279)
        at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:106)
        at 
org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
        at 
org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
        at 
com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:150)
        at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 ERROR Application: dataframe error: 
java.lang.NullPointerException
        at 
com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:157)
        at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 INFO SparkContext: Invoking stop() from shutdown hook
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/static/sql,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/SQL/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/SQL,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/metrics/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/api,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/static,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/executors,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/environment/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/environment,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/storage,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/pool,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/stage,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/stages,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/job,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped 
o.s.j.s.ServletContextHandler{/jobs,null}
16/11/03 16:32:26 INFO SparkUI: Stopped Spark web UI at 
http://192.168.100.10:4040
16/11/03 16:32:26 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
16/11/03 16:32:26 INFO MemoryStore: MemoryStore cleared
16/11/03 16:32:26 INFO BlockManager: BlockManager stopped
16/11/03 16:32:26 INFO BlockManagerMaster: BlockManagerMaster stopped
16/11/03 16:32:26 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
16/11/03 16:32:26 INFO SparkContext: Successfully stopped SparkContext
16/11/03 16:32:26 INFO ShutdownHookManager: Shutdown hook called
16/11/03 16:32:26 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-6121baef-3d66-473e-8799-6733fb414ddd
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down 
remote daemon.
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon 
shut down; proceeding with flushing remote transports.
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut 
down.
16/11/03 16:32:27 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-6121baef-3d66-473e-8799-6733fb414ddd/httpd-0cfae97e-687e-47ce-af4e-301b0900a4c8

Attachment: query.java
