hi all,
I am very new to Apache Spark. I recently tried Spark on YARN, and it works well for batch processing. Now we want to try stream processing with Spark Streaming, still using YARN as the resource scheduler, since we want to manage all of the cluster's computing resources in a unified way. Can this work?
Any suggestions are welcome!
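In case it helps to be concrete, this is roughly how I imagine submitting the streaming job (the jar, class name, and resource sizes below are only placeholders, not an actual job):

```shell
# Illustrative spark-submit invocation for a Spark Streaming job on YARN.
# The application jar, main class, and resource settings are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.StreamingJob \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2 \
  streaming-job.jar
```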
 

Best Regards!