Re: Dynamic data ingestion into SparkSQL - Interesting question
Yes, I did the same. It's working. Thanks!

On 21-Nov-2017 4:04 PM, "Fernando Pereira" wrote:
Did you consider doing string processing to build the SQL expression, which you can then execute with spark.sql(...)?

Some examples: https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables

Cheers

On 21 November 2017 at 03:27, Aakash Basu wrote:
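A minimal sketch of that string-building approach, using the four column names from the table described in the original question below (the helper name build_query and the sample values are illustrative, not from the thread):

```python
def build_query(expression, filter_condition, from_clause, group_by_columns):
    """Assemble a SELECT statement from the four config-table columns.

    Empty filter or group-by values simply omit that clause.
    """
    query = "SELECT {} FROM {}".format(expression, from_clause)
    if filter_condition:
        query += " WHERE {}".format(filter_condition)
    if group_by_columns:
        query += " GROUP BY {}".format(group_by_columns)
    return query


print(build_query("category, SUM(amount) AS total", "amount > 0",
                  "sales", "category"))
# SELECT category, SUM(amount) AS total FROM sales WHERE amount > 0 GROUP BY category
```

Each string built this way can then be handed to spark.sql(query), and the resulting DataFrame registered for later steps with df.createOrReplaceTempView(...).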
Hi all,

Any help? PFB.

Thanks,
Aakash.

On 20-Nov-2017 6:58 PM, "Aakash Basu" wrote:

> Hi all,
>
> I have a table which will have 4 columns:
>
> | Expression | filter_condition | from_clause | group_by_columns |
>
> This file may have a variable number of rows, depending on the number of
> KPIs I need to calculate.
>
> I need to write a SparkSQL program that reads this file and runs each
> row's query dynamically: fetch each column value for a particular row,
> build a SELECT query out of it, run it into a DataFrame, and save the
> result as a temporary table.
>
> Did anyone do this kind of exercise? If yes, can I get some help on it,
> please?
>
> Thanks,
> Aakash.
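The driver loop described above can be sketched as follows. This assumes a pipe-delimited config file with the four columns; the sample rows, view-name pattern, and delimiter are illustrative, and the Spark calls are shown as comments so the string-building part stands on its own:

```python
import csv
import io

# Stand-in for the KPI config file; in the real job this would be
# open("kpi_config.txt") or a path on HDFS read via Spark.
config = io.StringIO(
    "Expression|filter_condition|from_clause|group_by_columns\n"
    "category, SUM(amount) AS total|amount > 0|sales|category\n"
    "COUNT(*) AS n_orders||orders|\n"
)

queries = []
for i, row in enumerate(csv.DictReader(config, delimiter="|")):
    # Build one SELECT statement per config row, skipping empty clauses.
    query = "SELECT {} FROM {}".format(row["Expression"], row["from_clause"])
    if row["filter_condition"]:
        query += " WHERE {}".format(row["filter_condition"])
    if row["group_by_columns"]:
        query += " GROUP BY {}".format(row["group_by_columns"])
    queries.append(query)
    # In the Spark job, each query would then be executed and the result
    # registered as a temporary table for downstream steps:
    #   df = spark.sql(query)
    #   df.createOrReplaceTempView("kpi_{}".format(i))

for q in queries:
    print(q)
```

Since the number of rows varies with the number of KPIs, the loop naturally handles any config size; only the four column names need to be stable.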