[
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16152659#comment-16152659
]
Aritomo Abe edited comment on PHOENIX-3460 at 9/4/17 2:21 PM:
--
I have the same problem (HBase 1.0.3, Phoenix 4.8.0, Spark 1.6.3), but I cannot
just use "." to solve this issue. My schema was created using the following
syntax:
create table if not exists "test_namespace:test_table" (id varchar not null primary key);
I can work with the table through JDBC, the Phoenix console, etc., but not from
Spark:
org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table
undefined. tableName=test_namespace:test_table
The Spark code is really simple:
{code:java}
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._
// A bare Hadoop configuration, not loaded from hbase-site.xml
val configuration = new Configuration()
val sqlContext = new SQLContext(sc)

// Read the namespaced table, quoting the HBase-style "namespace:table" name
val df = sqlContext.phoenixTableAsDataFrame(
  "\"test_namespace:test_table\"", Array("id"), conf = configuration
)
{code}
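For what it's worth, the plugin builds its connection from the Configuration object it is handed, so a plain {{new Configuration()}} carries none of the namespace-mapping settings from hbase-site.xml. A minimal sketch of what I would try, assuming namespace mapping is enabled on the cluster (the two property names are the standard Phoenix namespace-mapping flags; the upper-cased SCHEMA.TABLE reference assumes the table was created through a mapped schema):
{code:java}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._

// Start from HBaseConfiguration.create() so hbase-site.xml (and any
// namespace-mapping settings in it) is actually loaded.
val configuration = HBaseConfiguration.create()

// If hbase-site.xml is not on the Spark classpath, set the flags by hand.
configuration.set("phoenix.schema.isNamespaceMappingEnabled", "true")
configuration.set("phoenix.schema.mapSystemTablesToNamespace", "true")

val sqlContext = new SQLContext(sc)

// With namespace mapping on, refer to the table as SCHEMA.TABLE instead of
// quoting the raw HBase "namespace:table" name.
val df = sqlContext.phoenixTableAsDataFrame(
  "TEST_NAMESPACE.TEST_TABLE", Array("ID"), conf = configuration
)
{code}
This is only a sketch against a live cluster, not something I have verified on this exact version combination.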
> Phoenix Spark plugin cannot find table with a Namespace prefix
> --
>
> Key: PHOENIX-3460
> URL: https://issues.apache.org/jira/browse/PHOENIX-3460
> Project: Phoenix
> Issue Type: Bug
>Affects Versions: 4.8.0
> Environment: HDP 2.5
>Reporter: Xindian Long
> Labels: phoenix, spark
> Fix For: 4.7.0
>
>
> I am testing some code that uses the Phoenix Spark plugin to read a Phoenix
> table with a namespace prefix in the table name (the table is created as a
> Phoenix table, not an HBase table), but it throws a TableNotFoundException.
> The table is obviously there, because I can query it using plain Phoenix SQL
> through Squirrel. In addition, querying it through Spark SQL works without any
> problem.
> I am running on the HDP 2.5 platform, with phoenix 4.7.0.2.5.0.0-1245
> The problem does not exist at all when I was running the same code on HDP 2.4
> cluster, with phoenix 4.4.
> Neither does the problem occur when I query a table without a namespace
> prefix in the DB table name, on HDP 2.5
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that in the attached code, if I run testSpark alone it
> throws the above exception, but if I run testJdbc first, followed by
> testSpark, both of them work.
> After changing the DDL to
> create table ACME.ENDPOINT_STATUS
> the phoenix-spark plugin seems to work. I also noticed some weird behavior:
> if I run both of the following,
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> both tables show up in Phoenix, the first with schema ACME and table name
> ENDPOINT_STATUS, and the latter with no schema and table name
> ACME:ENDPOINT_STATUS.
> However, in HBase I only see one table, ACME:ENDPOINT_STATUS. In addition,
> upserts into the table ACME.ENDPOINT_STATUS show up in the other table, and
> vice versa.
>
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)