Hi Hussain,

I'm not familiar with the Spark temporary table syntax. Perhaps you can work around it by using other options, such as the DataFrame.save() functionality, which is documented [1] and unit tested [2].
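For reference, a rough sketch of that save path, adapted from the docs in [1] (the target table name "OUTPUT_TABLE" and the zkUrl value are placeholders for your environment, and the target table must already exist in Phoenix with matching column names):

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.{SQLContext, SaveMode}
    import org.apache.phoenix.spark._

    val sc = new SparkContext("local", "phoenix-save-example")
    val sqlContext = new SQLContext(sc)

    val configuration = new Configuration()
    configuration.set("zookeeper.znode.parent", "/hbase-unsecure")

    // Load the Phoenix table as a DataFrame, as in your snippet
    val df = sqlContext.phoenixTableAsDataFrame(
      "EMAIL_ENRON", Array("MAIL_FROM", "MAIL_TO"), conf = configuration)

    // Save the DataFrame back to an existing Phoenix table. Per the docs,
    // only SaveMode.Overwrite is supported, and it performs UPSERTs rather
    // than truncating the table.
    df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
      Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "localhost:2181"))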
I suspect what you're encountering is a valid use case. If you could also file a JIRA ticket, and as a bonus, provide a patch, that would be great.

Best of luck,
Josh

[1] https://phoenix.apache.org/phoenix_spark.html
[2] https://github.com/apache/phoenix/blob/master/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala#L297

On Wed, Nov 16, 2016 at 4:10 AM, Hussain Pirosha <[email protected]> wrote:

> I am trying to insert into a temporary table created on a Spark (v 1.6)
> DataFrame loaded using the Phoenix-Spark (v 4.4) plugin. Below is the code:
>
>     val sc = new SparkContext("local", "phoenix-test")
>     val configuration = new Configuration()
>     configuration.set("zookeeper.znode.parent", "/hbase-unsecure")
>     val df = sqlContext.phoenixTableAsDataFrame(
>       "EMAIL_ENRON", Array("MAIL_FROM", "MAIL_TO"), conf = configuration)
>     df.registerTempTable("TEMP_TABLE");
>
> The table defined in HBase using Phoenix is:
>
>     CREATE TABLE EMAIL_ENRON(MAIL_FROM BIGINT NOT NULL, MAIL_TO BIGINT NOT NULL
>       CONSTRAINT pk PRIMARY KEY(MAIL_FROM, MAIL_TO));
>
> While trying to insert into the temporary table, spark-shell gives the error below.
>
> Insert statement:
>
>     sqlContext.sql("insert into table TEMP_TABLE select t.* from (select 55,66) t");
>
> Exception:
>
>     scala> sqlContext.sql("insert into table TEMP_TABLE select t.* from (select 55,66) t");
>     16/11/16 11:15:46 INFO ParseDriver: Parsing command: insert into table TEMP_TABLE select t.* from (select 55,66) t
>     16/11/16 11:15:46 INFO ParseDriver: Parse Completed
>     org.apache.spark.sql.AnalysisException: unresolved operator 'InsertIntoTable LogicalRDD [MAIL_FROM#0L,MAIL_TO#1L], MapPartitionsRDD[3] at createDataFrame at PhoenixRDD.scala:117, Map(), false, false;
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:38)
>       at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:44)
>       at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:203)
>
> What is the correct way to insert into a temporary table? From the
> exception it looks like Phoenix-Spark does allow inserting into a
> temporary table; maybe the syntax I am using is incorrect.
