Hi All, Is there a way to access multiple dictionaries with different schema structures inside a list in a txt file, individually or in combination as needed, from the Spark shell using Scala? The need is to use information from different combinations of the dictionaries for reporting calculations. I have thought of a couple of options -
1) Load the data from the txt file into a data frame, and then segregate it into separate data frames. That way, each of the dictionaries with its own schema structure inside the list can be accessed as a separate data frame, and the calculations can be done using Spark SQL.
2) Converting the txt file to Parquet or JSON might not help much, as being able to access the different dictionaries individually or in combination would still be a challenge.
Hence I am thinking the first option would be better. Any other suggestions?
Thanks,
Srabasti
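For reference, option 1 could look roughly like this in the Spark shell. This is only a sketch under assumptions: it assumes each line of the txt file is a JSON object, and that each object carries a discriminator field (here hypothetically called `recordType`) identifying which schema it follows; the file path, field names, and view names are all placeholders.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("dict-split").getOrCreate()
import spark.implicits._

// Read the file as JSON lines; Spark infers a merged schema across
// all the records, with nulls where a field is absent for a record type.
val all = spark.read.json("/path/to/dictionaries.txt")

// Segregate into separate data frames by the (hypothetical) schema
// discriminator field, so each schema can be worked with in isolation.
val typeA = all.filter($"recordType" === "A")
val typeB = all.filter($"recordType" === "B")

// Register temp views so combinations can be queried with Spark SQL.
typeA.createOrReplaceTempView("typeA")
typeB.createOrReplaceTempView("typeB")

// Example of using a combination of the dictionaries for a report;
// the join key "id" is assumed, not from the original file.
val combined = spark.sql(
  "SELECT a.id, b.value FROM typeA a JOIN typeB b ON a.id = b.id")
```

If the records have no discriminator field, filtering on whether a schema-specific column is non-null (e.g. `all.filter($"someFieldOnlyInA".isNotNull)`) is another way to segregate them after the merged-schema read.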