You need to open the page for the Stage that is taking time and see how long
it's spending on GC, etc. It would also be good if you could post a code snippet
of that Stage and its previous transformation so we can understand it better.
Thanks
Best Regards
On Fri, Apr 3, 2015 at 1:05 PM, Vijay Innamuri vijay.innam
* is inefficient. Is
there any alternative way to process XML files?
- How can I create a Spark SQL table from the above XML data?
Regards
Vijay Innamuri
On 16 March 2015 at 12:12, Akhil Das ak...@sigmoidanalytics.com wrote:
One approach would be: if you are using fileStream you can access
Hi All,
Processing streaming JSON files with Spark features (Spark Streaming and
Spark SQL) is very efficient and works like a charm.
Below is the code snippet to process JSON files.
windowDStream.foreachRDD(IncomingFiles => {
val IncomingFilesTable =
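The snippet above is cut off mid-statement. For context, a minimal sketch of the same pattern on the Spark 1.x APIs of that era might look as follows; everything beyond `windowDStream`, `IncomingFiles`, and `IncomingFilesTable` (the table name, the use of `jsonRDD`, the empty-batch guard) is an assumption, not taken from the original mail:

```scala
// Hypothetical sketch, assuming Spark 1.x Streaming + SQL.
// Names other than windowDStream / IncomingFiles / IncomingFilesTable
// are illustrative only.
import org.apache.spark.sql.SQLContext

windowDStream.foreachRDD(IncomingFiles => {
  // Skip windows that produced no data (assumed guard, not in the original)
  if (!IncomingFiles.isEmpty()) {
    val sqlContext = new SQLContext(IncomingFiles.sparkContext)
    // Infer a schema from each windowed batch of JSON lines
    val IncomingFilesTable = sqlContext.jsonRDD(IncomingFiles)
    // Register the batch as a temporary table so it can be queried with SQL
    IncomingFilesTable.registerTempTable("incoming_files")
    sqlContext.sql("SELECT * FROM incoming_files").show()
  }
})
```

`jsonRDD` and `registerTempTable` were the contemporary Spark 1.x entry points for this; later Spark versions replaced them with `spark.read.json` and `createOrReplaceTempView`.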