Hi all,

I have a table which will have 4 columns -

| expression | filter_condition | from_clause | group_by_columns |


This file may have a variable number of rows, depending on the number of
KPIs I need to calculate.

I need to write a Spark SQL program that reads this file and, for each
row, fetches the column values, builds a SELECT query from them, runs it
to produce a DataFrame, and saves the result as a temporary table.
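In case it helps the discussion, here is a minimal sketch of the query-building step, assuming the config file is a CSV with those four columns. All file contents, column values, and view names below are hypothetical; the Spark calls are shown only in comments.

```python
import csv
import io

def build_query(expression, filter_condition, from_clause, group_by_columns):
    """Assemble a SELECT statement from one row of the KPI config table."""
    query = f"SELECT {expression} FROM {from_clause}"
    if filter_condition:
        query += f" WHERE {filter_condition}"
    if group_by_columns:
        query += f" GROUP BY {group_by_columns}"
    return query

# Hypothetical config file contents (CSV with a header row).
config = io.StringIO(
    "expression,filter_condition,from_clause,group_by_columns\n"
    '"region, sum(sales) AS total_sales",year = 2023,sales_table,region\n'
)

for i, row in enumerate(csv.DictReader(config)):
    q = build_query(row["expression"], row["filter_condition"],
                    row["from_clause"], row["group_by_columns"])
    print(q)
    # In Spark, each query would then be run and registered, e.g.:
    #   df = spark.sql(q)
    #   df.createOrReplaceTempView(f"kpi_{i}")
```

In Spark itself the config file could instead be read with `spark.read.csv(path, header=True)` and the rows collected to the driver before looping, since each row drives a separate `spark.sql` call.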

Has anyone done this kind of exercise? If yes, could I get some help on it, please?

Thanks,
Aakash.
