Hi all,

When I run sc.stop() in a standalone program, does that mean all the resources
used by sc, such as memory, the processes it created, and CPU, are freed? Is it
possible to restart a SparkContext within a standalone program?

I want to use Spark to run a job on files batch by batch. Let's say there
are 100 files in one batch. What I tried is the following:

while (there are more batches) {
  create a new SparkContext, sc
  .
  .
  .
  sc.stop()
}
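
For concreteness, here is a minimal sketch of what I am attempting (the master
URL, the hasMoreBatches/nextBatch helpers, and the per-file processing are
placeholders for my actual code, not working code):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("BatchJob").setMaster("local[*]")

while (hasMoreBatches()) {           // placeholder: loop over the batches
  val sc = new SparkContext(conf)    // fresh context for this batch
  val files = nextBatch()            // placeholder: ~100 file paths
  files.foreach { path =>
    val rdd = sc.textFile(path)      // read one file of the batch
    // ... real per-file processing goes here ...
  }
  sc.stop()                          // release the context's resources
}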

But it doesn't work. Is there any way that would allow me to do it this
way?

Thanks,

Xiang

-- 
Xiang Huo
Department of Computer Science
University of Illinois at Chicago(UIC)
Chicago, Illinois
US
Email: huoxiang5...@gmail.com
           or xh...@uic.edu
