Re: Spark 2.0 - Join statement compile error

2016-08-30 Thread shengshanzhang
Hi, try this way: val df = sales_demand.join(product_master, sales_demand("INVENTORY_ITEM_ID") === product_master("INVENTORY_ITEM_ID"), "inner") > On 30 Aug 2016, at 17:52, Jacek Laskowski wrote: > > Hi Mich, > > This is the first time I've been told about $ for string interpolation (as
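
For reference, a minimal self-contained sketch of that join form (only the join column name and the DataFrame names come from the thread; the data, the other column names, and the app/master settings are made up for illustration):

  // Paste into spark-shell (Spark 2.0) or wrap in a main(); a sketch only.
  import org.apache.spark.sql.SparkSession
  val spark = SparkSession.builder().appName("JoinSketch").master("local[*]").getOrCreate()
  import spark.implicits._

  // Stand-in DataFrames; the real sales_demand and product_master come from the OP's data.
  val sales_demand   = Seq((100, "ORD-1"), (200, "ORD-2")).toDF("INVENTORY_ITEM_ID", "ORDER_NUMBER")
  val product_master = Seq((100, "Widget"), (300, "Gadget")).toDF("INVENTORY_ITEM_ID", "DESCRIPTION")

  // Column-to-column equality uses ===, and the join type is a plain string.
  val df = sales_demand.join(
    product_master,
    sales_demand("INVENTORY_ITEM_ID") === product_master("INVENTORY_ITEM_ID"),
    "inner")
  df.show()

Referencing each column through its parent DataFrame, sales_demand("...") and product_master("..."), also keeps the join unambiguous when both sides share the column name.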

Re: Spark 2.0 - Join statement compile error

2016-08-30 Thread Jacek Laskowski
Hi Mich, This is the first time I've been told about $ for string interpolation (as the function not the placeholder). Thanks for letting me know about it! What is often used is s"whatever you want to reference inside the string $-prefix unless it is a complex expression" i.e. scala> s"I'm
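
A small illustration of the two forms Jacek describes (plain Scala, no Spark needed; the values are made up):

  val user  = "Mich"
  val count = 3
  println(s"Hello $user")                     // simple reference: a bare $-prefixed identifier
  println(s"You have ${count * 2} messages")  // complex expression: wrap it in ${...}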

Re: Spark 2.0 - Join statement compile error

2016-08-30 Thread Mich Talebzadeh
Actually I double-checked this ‘s’ String Interpolator in Scala: scala> val chars = "This is Scala" chars: String = This is Scala scala> println($"$chars") This is Scala OK so far fine. In shell (ksh) I can do: chars="This is Scala" print "$chars" This is Scala In shell, print "$chars" and it is
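
What may be going on in the spark-shell example above: $"..." there is not the plain string interpolator but Spark's column interpolator, brought in by spark.implicits._; it interpolates the text and wraps the result in a Column, so printing it shows the column's name. A hedged sketch of the difference:

  import org.apache.spark.sql.{Column, SparkSession}
  val spark = SparkSession.builder().appName("DollarSketch").master("local[*]").getOrCreate()
  import spark.implicits._   // in spark-shell this import is already done for you

  val chars = "This is Scala"
  val col: Column = $"$chars"   // Spark's $ interpolator: a Column named "This is Scala"
  println(col)                  // prints the column expression, hence the same text
  println(s"$chars")            // plain s interpolation: just the String itself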

Re: Spark 2.0 - Join statement compile error

2016-08-28 Thread Mich Talebzadeh
Yes, I realised that. Actually I thought it was s, not $. It has been around in shell for years, say for actual values --> ${LOG_FILE}, for position 's/ etc: cat ${LOG_FILE} | egrep -v 'rows affected|return status|&&&' | sed -e 's/^[]*//g' -e 's/^//g' -e '/^$/d' > temp.out Dr

Re: Spark 2.0 - Join statement compile error

2016-08-28 Thread Jacek Laskowski
Hi Mich, This is Scala's string interpolation, which allows for replacing $-prefixed expressions with their values. It's what cool kids use in Scala to do templating and concatenation. Jacek On 23 Aug 2016 9:21 a.m., "Mich Talebzadeh" wrote: > What is --> s below
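
A one-liner in that spirit; the table and column names echo the statement quoted further down, and the cutoff value is made up purely to show the substitution:

  val table  = "sales_order_demand"
  val cutoff = "2016-08-01"   // made-up value, only to show the substitution
  val stmt   = s"SELECT * FROM $table WHERE schedule_date >= '$cutoff'"
  println(stmt)  // SELECT * FROM sales_order_demand WHERE schedule_date >= '2016-08-01'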

Re: Spark 2.0 - Join statement compile error

2016-08-23 Thread Mich Talebzadeh
What is --> s below, before the text of the SQL? var sales_order_sql_stmt = s"""SELECT ORDER_NUMBER, INVENTORY_ITEM_ID, ORGANIZATION_ID, from_unixtime(unix_timestamp(SCHEDULE_SHIP_DATE,'-MM-dd'), '-MM-dd') AS schedule_date FROM sales_order_demand WHERE
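
For context, a runnable approximation of that statement: the temp view contents, the ORGANIZATION_ID filter, and the 'yyyy-MM-dd' format are assumptions; only the table and column names come from the snippet above.

  import org.apache.spark.sql.SparkSession
  val spark = SparkSession.builder().appName("SqlTemplate").master("local[*]").getOrCreate()
  import spark.implicits._

  // Stand-in temp view with the column names quoted above.
  Seq(("ORD-1", 100, 1, "2016-08-23"))
    .toDF("ORDER_NUMBER", "INVENTORY_ITEM_ID", "ORGANIZATION_ID", "SCHEDULE_SHIP_DATE")
    .createOrReplaceTempView("sales_order_demand")

  val org_id = 1   // hypothetical value, only to show why the s prefix is there
  val sales_order_sql_stmt = s"""SELECT ORDER_NUMBER, INVENTORY_ITEM_ID, ORGANIZATION_ID,
    from_unixtime(unix_timestamp(SCHEDULE_SHIP_DATE, 'yyyy-MM-dd'), 'yyyy-MM-dd') AS schedule_date
    FROM sales_order_demand
    WHERE ORGANIZATION_ID = $org_id"""
  spark.sql(sales_order_sql_stmt).show()

The s prefix is what makes $org_id inside the triple-quoted string get replaced with the variable's value before the text reaches spark.sql; without it the literal characters $org_id would be sent to the parser.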

Re: Spark 2.0 - Join statement compile error

2016-08-23 Thread Deepak Sharma
On Tue, Aug 23, 2016 at 10:32 AM, Deepak Sharma wrote: > val df = sales_demand.join(product_master, sales_demand.$"INVENTORY_ITEM_ID" === product_master.$"INVENTORY_ITEM_ID", "inner") Ignore the last statement. It should look something

Re: Spark 2.0 - Join statement compile error

2016-08-22 Thread Deepak Sharma
Hi Subhajit, Try this in your join: val df = sales_demand.join(product_master, sales_demand.$"INVENTORY_ITEM_ID" === product_master.$"INVENTORY_ITEM_ID", "inner") On Tue, Aug 23, 2016 at 2:30 AM, Subhajit Purkayastha wrote: > All, > I

Re: Spark 2.0 - Join statement compile error

2016-08-22 Thread Vishal Maru
Try putting the join condition as a String. On Mon, Aug 22, 2016 at 5:00 PM, Subhajit Purkayastha wrote: > All, > I have the following dataFrames and the temp table. > I am trying to create a new DF, the following statement is not compiling > val df =
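
One way to read "join condition as a String" is the name-based overloads of Dataset.join in Spark 2.0, which take column names instead of a Column expression. A sketch with stand-in data (only the join column name comes from the thread):

  import org.apache.spark.sql.SparkSession
  val spark = SparkSession.builder().appName("UsingColumnJoin").master("local[*]").getOrCreate()
  import spark.implicits._

  val sales_demand   = Seq((100, "ORD-1")).toDF("INVENTORY_ITEM_ID", "ORDER_NUMBER")
  val product_master = Seq((100, "Widget")).toDF("INVENTORY_ITEM_ID", "DESCRIPTION")

  // Inner join on a shared column name (no Column expression needed).
  val byName   = sales_demand.join(product_master, "INVENTORY_ITEM_ID")
  // Or name the columns and the join type explicitly.
  val explicit = sales_demand.join(product_master, Seq("INVENTORY_ITEM_ID"), "inner")
  byName.show()
  explicit.show()

With the name-based form the join column appears only once in the result, which also sidesteps the ambiguous-column problem.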

Spark 2.0 - Join statement compile error

2016-08-22 Thread Subhajit Purkayastha
All, I have the following dataFrames and the temp table. I am trying to create a new DF; the following statement is not compiling: val df = sales_demand.join(product_master,(sales_demand.INVENTORY_ITEM_ID==product_master.INVENTORY_ITEM_ID),joinType="inner") What am I