Hi Kiran,

Thanks for responding. We would like to know how the industry handles
UPDATE-like scenarios in Spark. Here is our scenario, Manjunath: we are in
the process of migrating our SQL Server data to Spark. Our logic lives in
stored procedures, where we dynamically build a SQL string and execute it
(dynamic SQL). We would like to build the same dynamic string, submit it
to the Hive context, and execute it.

Here is the query in SQL

UPDATE table1
SET    X = A,
       Y = B
FROM   table1
WHERE  ISNULL([Z], '') <> ''
  AND  [ColumnW] NOT IN ('X', 'ACD', 'A', 'B', 'C')
  AND  [ColumnA] IS NULL
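
To illustrate, here is a rough sketch of the dynamic-SQL route we have in
mind. It assumes Spark 2.x (on 1.x, use hiveContext.sql instead), that
table1 is a registered Hive table with exactly the columns X, Y, Z,
ColumnW, and ColumnA, and that A and B are literal values. Spark SQL has
no UPDATE statement, so the sketch rewrites it as INSERT OVERWRITE with
CASE expressions and maps T-SQL's ISNULL to COALESCE:

// Keep the WHERE clause in one variable so it is written only once.
val cond =
  """COALESCE(Z, '') <> ''
    |AND ColumnW NOT IN ('X', 'ACD', 'A', 'B', 'C')
    |AND ColumnA IS NULL""".stripMargin

// Spark SQL cannot update a table in place, so select the "updated" rows
// into a separate output table (table1_updated must already exist with a
// matching schema) rather than overwriting the source being read.
val sqlText =
  s"""INSERT OVERWRITE TABLE table1_updated
     |SELECT CASE WHEN $cond THEN 'A' ELSE X END AS X,
     |       CASE WHEN $cond THEN 'B' ELSE Y END AS Y,
     |       Z, ColumnW, ColumnA
     |FROM table1""".stripMargin

spark.sql(sqlText)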


We would like to convert this using Spark SQL. The other approach I can
think of is a DataFrame with withColumn plus a WHEN condition for each
column (i.e. X and Y); there, the same condition from the WHERE clause
above has to be repeated for every column being updated. I would like to
know the industry practice for these kinds of scenarios.
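
For comparison, a sketch of that withColumn route (same assumptions:
Spark 2.x, literal values for A and B, column names taken from the query
above). Factoring the WHERE clause into a single Column expression avoids
repeating it for every updated column:

import org.apache.spark.sql.functions._

val df = spark.table("table1")

// The UPDATE's WHERE clause, expressed once as a Column.
val cond =
  coalesce(col("Z"), lit("")) =!= "" &&
    !col("ColumnW").isin("X", "ACD", "A", "B", "C") &&
    col("ColumnA").isNull

// when/otherwise keeps the existing value on rows the condition misses.
val updated = df
  .withColumn("X", when(cond, lit("A")).otherwise(col("X")))
  .withColumn("Y", when(cond, lit("B")).otherwise(col("Y")))

// Hive tables are not updatable in place from Spark, so write the result
// to a separate table rather than back over the one being read.
updated.write.mode("overwrite").saveAsTable("table1_updated")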

On 10/26/2016 4:09 AM, Manjunath, Kiran wrote:
Hi,

Can you elaborate, with a sample example, on why you would want to do
this? Ideally there would be a better approach than solving such problems
as described below.

A sample example would help in understanding the problem.

Regards,
Kiran

From: Mahender Sarangam <mahender.bigd...@outlook.com>
Date: Wednesday, October 26, 2016 at 2:05 PM
To: user <user@spark.apache.org>
Subject: Any Dynamic Compilation of Scala Query

Hi,

Is there any way to dynamically execute a string containing Scala code
against the Spark engine? We dynamically create a Scala file and would
like to submit it to Spark, but currently Spark accepts only a JAR file
as input for remote job submission. Is there any other way to submit a
.scala file instead of a .jar to Spark's REST API?
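
One workaround we have looked at is compiling the Scala string in-process
with the compiler toolbox (a sketch; it needs the scala-compiler jar on
the driver classpath, and it only evaluates code inside an already-running
application, it does not make the REST API accept .scala files):

import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

// Parse and evaluate a Scala source string at runtime.
val toolbox = currentMirror.mkToolBox()
val code = """(1 to 10).sum"""
val result = toolbox.eval(toolbox.parse(code))
println(result)  // 55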

/MS


