Hi,
I have a use case where I have one parent directory.

The file structure looks like:
hdfs:///TestDirectory/spark1/ part files (created by some Spark job)
hdfs:///TestDirectory/spark2/ part files (created by some Spark job)

spark1 and spark2 have different schemas.

For example, the spark1 part files have the schema:
carname, model, year

and the spark2 part files have the schema:
carowner, city, carcost


These spark1 and spark2 directories get created dynamically, so there could also be a spark3 directory with yet another schema.

My requirement is to read the parent directory, list its subdirectories,
and create a DataFrame for each subdirectory.

I am not able to figure out how to list the subdirectories under the parent
directory and dynamically create the DataFrames.
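
What I roughly have in mind is something like the sketch below (spark-shell style; I am assuming here that the part files are Parquet and that the Hadoop FileSystem API is the right way to list the subdirectories, so please correct me if not):

import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder().appName("ReadSubDirectories").getOrCreate()

// Parent directory that holds the spark1, spark2, ... subdirectories
val parent = new Path("hdfs:///TestDirectory")
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)

// Keep only the subdirectories under the parent
val subDirs = fs.listStatus(parent).filter(_.isDirectory).map(_.getPath)

// One DataFrame per subdirectory, keyed by the directory name;
// each DataFrame keeps the schema of its own part files
// (assuming the part files are Parquet)
val dfs: Map[String, DataFrame] = subDirs.map { dir =>
  dir.getName -> spark.read.parquet(dir.toString)
}.toMap

dfs.foreach { case (name, df) =>
  println(s"Schema of $name:")
  df.printSchema()
}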

Thanks,
Divya
