[ https://issues.apache.org/jira/browse/PHOENIX-2036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14645284#comment-14645284 ]
Dan Meany commented on PHOENIX-2036:
------------------------------------

Regarding the Spark integration, we may need to test tables both with and without schema names using DataFrames.

> PhoenixConfigurationUtil should provide a pre-normalized table name to
> PhoenixRuntime
> ------------------------------------------------------------------------------------
>
>                 Key: PHOENIX-2036
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2036
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Siddhi Mehta
>            Assignee: maghamravikiran
>            Priority: Minor
>         Attachments: PHOENIX-2036-spark-v2.patch, PHOENIX-2036-spark.patch,
> PHOENIX-2036-v1.patch, PHOENIX-2036-v1.patch, PHOENIX-2036-v2.patch,
> PHOENIX-2036.patch
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I was trying a basic store using PhoenixHBaseStorage and ran into issues
> with it complaining about TableNotFoundException.
> The table (CUSTOM_ENTITY."z02") in question exists.
> Looking at the stack trace, I think it is likely related to the change in
> PHOENIX-1682, where the Phoenix runtime expects a pre-normalized table name.
> We need to update
> PhoenixConfigurationUtil.getSelectColumnMetadataList(Configuration) to pass a
> pre-normalized table name.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
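For context, a minimal standalone sketch (not Phoenix source code; the class and method names below are hypothetical) of the SQL identifier-normalization rule at play here: unquoted identifiers are folded to upper case, while double-quoted identifiers keep their exact case. A table like CUSTOM_ENTITY."z02" therefore resolves to the lower-case table z02 only if the caller normalizes the name before handing it to the runtime, which is why passing the raw string can surface a TableNotFoundException.

```java
// Hypothetical sketch illustrating the normalization rule PhoenixRuntime
// expects its callers (e.g. PhoenixConfigurationUtil) to have applied:
// unquoted identifiers are upper-cased; double-quoted identifiers keep
// their case with the quotes stripped.
public class NormalizeSketch {
    static String normalizeIdentifier(String name) {
        if (name == null || name.isEmpty()) {
            return name;
        }
        // Quoted identifier: strip quotes, preserve case.
        if (name.length() >= 2 && name.startsWith("\"") && name.endsWith("\"")) {
            return name.substring(1, name.length() - 1);
        }
        // Unquoted identifier: fold to upper case.
        return name.toUpperCase();
    }

    // Normalize a schema-qualified name part by part.
    // (Naive split: this sketch ignores dots inside quoted identifiers.)
    static String normalizeFullTableName(String fullName) {
        StringBuilder sb = new StringBuilder();
        for (String part : fullName.split("\\.")) {
            if (sb.length() > 0) {
                sb.append('.');
            }
            sb.append(normalizeIdentifier(part));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Schema is folded to upper case; the quoted table keeps its case.
        System.out.println(normalizeFullTableName("custom_entity.\"z02\""));
        // prints: CUSTOM_ENTITY.z02
    }
}
```

The point of the patch is that this normalization should happen once, inside PhoenixConfigurationUtil, before the table name reaches PhoenixRuntime, rather than being left to each integration (Pig, Spark, MapReduce) separately.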