Reposting, as I have been unable to find the root cause of where things are going wrong.
Experts, please help.

---------- Forwarded message ----------
From: Divya Gehlot <divya.htco...@gmail.com>
Date: 15 April 2016 at 19:13
Subject: [Help]: Strange Issue: Debug Spark Dataframe code
To: "user @spark" <user@spark.apache.org>

Hi,

I am using Spark 1.5.2 with Scala 2.10. Is there any other option apart from "explain(true)" to debug Spark DataFrame code?

I am facing a strange issue. I have a lookup DataFrame and am using it to join another DataFrame on different columns. I am getting an *AnalysisException* on the third join. When I checked the logical plan, the join key keeps the same attribute reference, but the references of the selected columns change.

For example, df1 = COLUMN1#15, COLUMN2#16, COLUMN3#17. In the first two joins I get the same references and the joins succeed: joining on COLUMN1#15, I select COLUMN2#16 and COLUMN3#17. But at the third join, COLUMN1#15 is still the same while the other column references have changed to COLUMN2#167, COLUMN3#168, and it throws:

> org.apache.spark.sql.AnalysisException: resolved attribute(s) COLUMN1#15 missing from

(After the two joins, the DataFrame has more than 25 columns.)

Could anybody help light the path by holding the torch? Would really appreciate the help.

Thanks,
Divya
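For anyone hitting the same wall, here is a minimal sketch of the pattern described above, with entirely hypothetical column and table names (`lookup`, `fact`, `K1`, `L3_*` are illustrative, not from the original code). Reusing one lookup DataFrame in several joins of the same lineage makes the analyzer regenerate its attribute ids (COLUMN2#16 becoming COLUMN2#167, etc.) to disambiguate the self-join, so a `Column` captured from the original DataFrame can stop resolving. One common workaround is to take a freshly re-aliased projection of the lookup for each join:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("join-debug").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Hypothetical stand-ins for the real lookup and fact tables.
val lookup = Seq((1, "a", "x"), (2, "b", "y"))
  .toDF("COLUMN1", "COLUMN2", "COLUMN3")
val fact = Seq((1, 2), (2, 1)).toDF("K1", "K3")

// Re-alias the lookup before joining it a second time, so the second
// join carries its own distinct attribute references instead of the
// regenerated ids of the original `lookup`.
val lookup3 = lookup.select(
  $"COLUMN1".as("L3_COLUMN1"),
  $"COLUMN2".as("L3_COLUMN2"),
  $"COLUMN3".as("L3_COLUMN3"))

val joined = fact
  .join(lookup, fact("K1") === lookup("COLUMN1"))
  .join(lookup3, fact("K3") === lookup3("L3_COLUMN1"))

joined.show()
```

As for debugging beyond `explain(true)`: printing `joined.queryExecution` dumps the parsed, analyzed, optimized, and physical plans in one go, which makes it easier to see at which stage the attribute ids diverge.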