Hi All,

 

I want to develop a server-side application:

 

User submits a request -> server runs a Spark job and returns the result
(this might take a few seconds).

 

So I want the server to keep a long-lived context; I don't know whether
this is reasonable or not.

 

Basically I want to keep a global JavaSparkContext instance alive and
initialize some RDDs up front. Then my Java application will use it to
submit jobs (see the sketch below).
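
Roughly what I mean, as a minimal sketch (the class name SharedSpark, the
local[*] master, and the sample data are just placeholders, not my real
setup):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;

// Holds one long-lived JavaSparkContext for the whole server process.
public final class SharedSpark {

    // Created once at startup and reused by every request.
    private static final JavaSparkContext SC = new JavaSparkContext(
            new SparkConf()
                    .setAppName("long-lived-server")
                    .setMaster("local[*]")); // placeholder master for the sketch

    // An RDD initialized once and cached so later jobs can reuse it.
    private static final JavaRDD<Integer> BASE =
            SC.parallelize(Arrays.asList(1, 2, 3, 4, 5)).cache();

    private SharedSpark() {}

    public static JavaSparkContext sc() { return SC; }

    // Each incoming request triggers an action on the shared context.
    public static long handleRequest() {
        return BASE.filter(x -> x % 2 == 0).count();
    }
}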

 

So now I have some questions:

 

1. If I don't close it, is there any timeout I need to configure on the
Spark side?

2. In theory I want to design something similar to the Spark shell (which
also hosts a default sc), except it is not shell based; see the sketch
below.
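
For question 2, this is the kind of front end I am imagining, using the
JDK's built-in com.sun.net.httpserver only to keep the sketch
self-contained (the port and the /count path are made up):

import com.sun.net.httpserver.HttpServer;

import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A minimal HTTP front end: the context lives as long as the server does.
public class SparkJobServer {

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/count", exchange -> {
            // Runs a Spark job on the shared, long-lived context.
            long result = SharedSpark.handleRequest();
            byte[] body = ("count=" + result + "\n").getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start(); // JVM (and therefore the SparkContext) stays up between requests
    }
}

The idea is that the JVM, and with it the SparkContext, stays up between
requests, like the shell's sc.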

 

Any suggestions? I think this use case is very common for application
development; surely someone has done it before?

 

Regards,

 

Shawn
