[ https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16155492#comment-16155492 ]

Stas Sukhanov commented on PHOENIX-3460:
----------------------------------------

Hi, I have the same problem and did some investigation. The problem is in 
{{org.apache.phoenix.util.PhoenixRuntime}}, in the 
[getTable|https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java#L442]
 and 
[generateColumnInfo|https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java#L469]
 methods, when a namespace is used instead of a schema (e.g. "namespace:table").
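
To illustrate, a rough sketch of the failing path (not the actual 
{{PhoenixRuntime}} source; the exact {{SchemaUtil}} results may differ between 
versions):

{code:java}
// Illustration only: how an unquoted namespace-mapped name gets split.
import org.apache.phoenix.util.SchemaUtil;

public class NamespaceParseDemo {
    public static void main(String[] args) {
        // generateColumnInfo strips the quotes from the table name:
        String full = SchemaUtil.normalizeFullTableName("\"ACME:ENDPOINT_STATUS\"");
        System.out.println(full); // expected: ACME:ENDPOINT_STATUS (quotes gone)

        // In getTable's TableNotFoundException fallback the unquoted name is
        // split into schema and table parts, so the namespace is treated as a
        // schema and the lookup fails with the original table name:
        System.out.println(SchemaUtil.getSchemaNameFromFullName(full)); // expected: ACME
        System.out.println(SchemaUtil.getTableNameFromFullName(full));  // expected: ENDPOINT_STATUS
    }
}
{code}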

The code in phoenix-spark calls {{generateColumnInfo}}, which strips the quotes 
by calling {{SchemaUtil.normalizeFullTableName(tableName)}} and then delegates 
to {{getTable}}. When {{getTable}} fails to find the table in the cache, it 
falls back to the catch block. Without the quotes, that block treats the 
namespace as a schema and fails, throwing an exception that contains the 
original table name. Unfortunately there is no good workaround. One option is 
to call {{MetaDataClient.updateCache}} manually beforehand to fill the cache; 
{{getTable}} then works on the driver, but you will most likely still get the 
exception on the workers.
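
For illustration, a minimal driver-side sketch of that cache warm-up. The 
{{updateCache(schema, table)}} split used here is an assumption (the right 
arguments depend on how the table was created); any JDBC query against the 
quoted name fills the cache as well, which matches the reporter's observation 
that running testJdbc before testSpark makes both pass.

{code:java}
// Hedged sketch: warm the Phoenix metadata cache on the driver before the
// phoenix-spark read. The JDBC URL is a placeholder.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.apache.phoenix.jdbc.PhoenixConnection;
import org.apache.phoenix.schema.MetaDataClient;

public class WarmMetaDataCache {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")) {
            PhoenixConnection pconn = conn.unwrap(PhoenixConnection.class);

            // Option 1: resolve the table explicitly so it lands in the cache.
            // The schema/table split here is an assumption.
            new MetaDataClient(pconn).updateCache("ACME", "ENDPOINT_STATUS");

            // Option 2: any query against the quoted name fills the cache too.
            try (Statement st = conn.createStatement()) {
                st.executeQuery("SELECT COUNT(1) FROM \"ACME:ENDPOINT_STATUS\"");
            }

            // Either way this only helps on the driver; Spark executors open
            // their own connections and will most likely still fail.
        }
    }
}
{code}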

In our project we included phoenix-core in a shaded jar and replaced 
{{PhoenixRuntime}} with our own implementation that does not convert the 
namespace to a schema.
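
For reference, a hypothetical sketch of the idea behind that replacement (not 
our actual patch): look the table up under the full name, colon included, 
instead of splitting it into schema and table.

{code:java}
// Hypothetical sketch only: keep the namespace-mapped name intact instead of
// converting the namespace to a schema.
import java.sql.Connection;

import org.apache.phoenix.jdbc.PhoenixConnection;
import org.apache.phoenix.schema.PTable;
import org.apache.phoenix.schema.PTableKey;

public final class NamespaceAwareRuntime {
    private NamespaceAwareRuntime() {}

    /** Cache lookup under the full name, e.g. "ACME:ENDPOINT_STATUS". */
    public static PTable getTable(Connection conn, String fullTableName) throws Exception {
        PhoenixConnection pconn = conn.unwrap(PhoenixConnection.class);
        // No SchemaUtil.normalizeFullTableName, no namespace-to-schema split:
        // the key is exactly the name Phoenix stored for the table.
        return pconn.getTable(new PTableKey(pconn.getTenantId(), fullTableName));
    }
}
{code}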



> Phoenix Spark plugin cannot find table with a Namespace prefix
> --------------------------------------------------------------
>
>                 Key: PHOENIX-3460
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3460
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.8.0
>         Environment: HDP 2.5
>            Reporter: Xindian Long
>              Labels: namespaces, phoenix, spark
>             Fix For: 4.7.0
>
>
> I am testing some code that uses the Phoenix Spark plugin to read a Phoenix 
> table with a namespace prefix in the table name (the table is created as a 
> Phoenix table, not an HBase table), but it returns a TableNotFoundException.
> The table is obviously there, because I can query it using plain Phoenix SQL 
> through SQuirreL. In addition, querying it using Spark SQL works without any 
> problem.
> I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.
> The problem does not exist at all when I run the same code on an HDP 2.4 
> cluster with Phoenix 4.4.
> Nor does the problem occur when I query a table without a namespace prefix 
> in the DB table name, on HDP 2.5.
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that, in the attached code, if I run testSpark alone it 
> gives the above exception, but if I run testJdbc first, followed by 
> testSpark, both of them work.
>  After changing the table creation to
> create table ACME.ENDPOINT_STATUS
> the phoenix-spark plugin seems to work. I also noticed some weird behavior:
> if I do both of the following
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> both tables show up in Phoenix; the first one shows up with schema ACME and 
> table name ENDPOINT_STATUS, and the latter with no schema and table name 
> ACME:ENDPOINT_STATUS.
> However, in HBase I only see one table, ACME:ENDPOINT_STATUS. In addition, 
> upserts into the table ACME.ENDPOINT_STATUS show up in the other table, and 
> vice versa.
>  



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
