Hi Rishi,

If you look in the Spark UI, do you see any executors registered?

Are you able to collect a jstack of the driver process?
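For example (assuming the driver JVM is running on the machine you submitted from; adjust if you're running in cluster mode):

    jps -l                                   # find the driver PID (e.g. org.apache.spark.deploy.SparkSubmit)
    jstack <driver-pid> > driver-stack.txt   # dump the driver's thread stacks

A couple of dumps taken a few seconds apart would show where the driver is spending its time.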

-Sandy

On Tue, Jan 20, 2015 at 9:07 PM, Rishi Yadav <ri...@infoobjects.com> wrote:

> I am joining two tables as shown below, and the program stalls at the log
> line below and never proceeds.
> What might be the issue, and what is a possible solution?
>
> >>> INFO SparkContext: Starting job: RangePartitioner at Exchange.scala:79
>
> Table 1 has 450 columns.
> Table 2 has 100 columns.
>
> Both tables have a few million rows.
>
>
>             val table1 = myTable1.as('table1)
>             val table2 = myTable2.as('table2)
>             val results = table1.join(table2, LeftOuter,
>               Some("table1.Id".attr === "table2.id".attr))
>
>             println(results.count())
>
> Thanks and Regards,
> Rishi
> @meditativesoul
>
