Hello,

I’m running some Spark SQL queries using the registerTempTable function, but I get the 
error below:

org.apache.spark.sql.AnalysisException: resolved attribute(s) 
day#1680,year#1678,dt#1682,month#1679,hour#1681 missing from 
searchs#1670,productsize#1666,referrer#1659,day#1653,sessionid#1674,partnerid#1657,producturl#1665,productprice#1667,docid#1675,userid#1672,landingid#1673,year#1651,productcategory#1662,month#1652,pagetype#1661,productimageurl#1664,currency#1668,cartamount#1669,productname#1663,hour#1654,category#1671,dt#1655,date#1658,currenturl#1660
 in operator !Project 
[partnerid#1657,date#1658,referrer#1659,currenturl#1660,pagetype#1661,productcategory#1662,productname#1663,productimageurl#1664,producturl#1665,productsize#1666,productprice#1667,currency#1668,cartamount#1669,searchs#1670,category#1671,userid#1672,landingid#1673,sessionid#1674,docid#1675,year#1678,month#1679,day#1680,hour#1681,dt#1682];
        at 
org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:38)


However, when I use “create table” in the SQL query instead of calling the 
registerTempTable function, it does not give the error. But the “create table” approach is much 
slower than registerTempTable.

Do you have any idea why it gives this error when I use registerTempTable?
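
For context, the partner temp table itself is registered from a DataFrame in roughly this 
way (a sketch only; the parquet path and source format are assumptions, not my exact code):

// Sketch: the actual source path/format is an assumption
val partnerDf = sqlContext.read.parquet("/data/partner_events")
partnerDf.registerTempTable("partner")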

The query that gives the error is below; it also uses tables that were registered with 
registerTempTable:

sqlContext.sql(
  """
    |select sessionid, date, nexteventdate,
    |       unix_timestamp(nexteventdate) - unix_timestamp(date) as duration
    |from (select t1.sessionid, t1.date, min(t2.date) nexteventdate
    |      from partner t1 join partner t2 on t1.sessionid = t2.sessionid
    |      where unix_timestamp(t2.date) > unix_timestamp(t1.date)
    |      group by t1.sessionid, t1.date) inq
  """.stripMargin).registerTempTable("partner_duration")


