Hi,
How can I create a new SparkContext from an IPython or Jupyter session?
I mean, if I stop the current SparkContext with sc.stop(),
how can I launch a new one from IPython without restarting the
IPython session by refreshing the browser?

Also, sometimes I write a function and later figure out that I forgot
something inside it. I add such a function in one of two ways:

   by simply appending the module to the path with
          sys.path.append(pathofmodule)
   or with sc.addPyFile (when it has to be applied on every worker)
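For the sys.path.append case, importlib.reload can pick up an edit without restarting the interpreter. A minimal self-contained sketch (the module name mymod and its contents are hypothetical; in a notebook you would edit the real file instead of rewriting it from code):

```python
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True  # avoid stale .pyc caching in this demo

# Hypothetical module on disk, as if added via sys.path.append(pathofmodule)
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "mymod.py"), "w") as f:
    f.write("def answer():\n    return 1\n")

sys.path.append(tmpdir)
import mymod
assert mymod.answer() == 1

# "Fix" the forgotten piece inside the function by editing the file...
with open(os.path.join(tmpdir, "mymod.py"), "w") as f:
    f.write("def answer():\n    return 42\n")

# ...then reload it in the same session instead of restarting IPython.
importlib.reload(mymod)
print(mymod.answer())  # → 42
```

For the sc.addPyFile case, reload is not enough on its own: the file was shipped to the executors, so the updated file has to be distributed again (or the context restarted) before worker tasks see the change.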

Can someone explain how to write unit tests in PySpark?
Should I run them locally or on a cluster, and how do I do that in PySpark?

Thank you in advance.
