Sorry, I'm not quite following what your intent is.  Not sure how the TagN 
column is being derived.  Is Dataset 1 your input and Dataset 2 your output?  
I don't see the relationship between them clearly.  Can you describe your 
input and the expected output?
--
Ali

On Feb 2, 2016, at 11:28 PM, Divya Gehlot <divya.htco...@gmail.com> wrote:

> Hi,
> I have data set like :
> Dataset 1
> HeaderCol1    HeadCol2     HeadCol3
> dataset 1     dataset2     dataset 3
> dataset 11    dataset13    dataset 13
> dataset 21    dataset22    dataset 23
> 
> Dataset 2
> HeadColumn1   HeadColumn2         HeadColumn3         HeadColumn4
> Tag1          Dataset1
> Tag2          Dataset1            Dataset2
> Tag3          Dataset1            Dataset2            Dataset3
> Tag4          DifferentDataset1
> Tag5          DifferentDataset1   DifferentDataset2
> Tag6          DifferentDataset1   DifferentDataset2   DifferentDataset3
> 
> 
> My requirement is to tag the dataset (by adding one more column) based on Dataset 1.
> 
> 
> Can I implement this in Spark?
> In an RDBMS we implemented it using dynamic SQL.
> 
> Would really appreciate the help.
> 
> 
> Thanks,
> Divya 
> 
> On 3 February 2016 at 11:42, Ali Tajeldin EDU <alitedu1...@gmail.com> wrote:
> While you can construct the SQL string dynamically in Scala/Java/Python, it 
> would be best to use the DataFrame API to build queries dynamically.  See 
> http://spark.apache.org/docs/1.5.2/sql-programming-guide.html for details.
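> 
> For example, here is a rough sketch of what I mean (Spark 1.5.x Scala API; 
> the column names and input path below are just placeholders, not your 
> actual schema):
> 
>   import org.apache.spark.sql.SQLContext
>   import org.apache.spark.sql.functions.col
> 
>   val sqlContext = new SQLContext(sc)
>   // hypothetical input; substitute your real data source
>   val df = sqlContext.read.json("dataset1.json")
> 
>   // Instead of concatenating a SQL string, build the projection from a
>   // list of column names decided at runtime:
>   val wantedCols = Seq("HeaderCol1", "HeadCol2")
>   val projected = df.select(wantedCols.map(col): _*)
> 
>   // Filters can be composed dynamically the same way:
>   projected.filter(col("HeaderCol1") === "dataset1").show()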
> 
> On Feb 2, 2016, at 6:49 PM, Divya Gehlot <divya.htco...@gmail.com> wrote:
> 
>> Hi,
>> Does Spark support dynamic SQL?
>> Would really appreciate the help if anyone could share some 
>> references/examples.
>> 
>> 
>> 
>> Thanks,
>> Divya 
> 
> 
