Hi,

I am looking for a way to load multiple folders into the Spark context and
then turn them into a single DataFrame.

Let's say I have the files below:

/daas/marts/US/file1.txt
/daas/marts/CH/file2.txt
/daas/marts/SG/file3.txt

The files above all have the same schema. I don't want to create multiple
DataFrames; instead, I want to create one DataFrame by passing the
folders/files above as a single input to SQLContext. Does anyone have a
solution for this? Something along the lines of the sketch below is what
I'm after.
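
A minimal sketch of the kind of thing I mean (assuming Spark 1.6+, where
DataFrameReader accepts Hadoop glob patterns and multiple paths; the object
name is just a placeholder, and I'm treating the files as plain
line-delimited text):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object MultiFolderLoad {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MultiFolderLoad"))
    val sqlContext = new SQLContext(sc)

    // Hadoop glob patterns are accepted wherever a path is expected,
    // so one pattern can cover all three country folders at once.
    val df = sqlContext.read.text("/daas/marts/{US,CH,SG}/*.txt")

    // Equivalently, pass the folders as explicit paths:
    // val df2 = sqlContext.read
    //   .format("text")
    //   .load("/daas/marts/US", "/daas/marts/CH", "/daas/marts/SG")

    df.show()
    sc.stop()
  }
}

If the files are actually delimited (CSV-style) rather than raw text, I
assume the same glob / multi-path trick would work with the spark-csv data
source, with the schema defined only once.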

Thanks,
Asmath.
