keep sparkContext alive and wait for next job just like spark-shell



CondyZhou
Hi all. I am confused about how to keep a SparkContext alive. Our situation: a user writes a SQL query on a web page, and on the backend we initialize a SparkContext and then submit Spark jobs. The problem is that every time we run a query string, Spark requests resources from YARN again, and it is painful to waste so much time initializing the SparkContext. So I am looking for a way to run Spark jobs such that when a query finishes, the context is not stopped and the YARN resources are not released, just like spark-shell (I find that spark-shell keeps its resources once it has started). Does anyone have an idea or a pointer to a tutorial? Thanks to all!
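[Editor's note: for concreteness, here is a minimal sketch, not part of the original post, of the pattern being asked about: build one SparkSession when the process starts and reuse it for every incoming query, so YARN resources are requested only once. It assumes Spark 2.x; on older versions the same idea applies to a shared SparkContext/SQLContext. The object and method names are hypothetical.]

import org.apache.spark.sql.{DataFrame, SparkSession}

object LongLivedSparkService {
  // Built once at JVM startup; the YARN containers it acquires stay
  // allocated for the lifetime of the process, just like spark-shell.
  lazy val spark: SparkSession = SparkSession.builder()
    .appName("sql-query-service")
    .getOrCreate()

  // Each web request passes its SQL string here; no new context is
  // created, so there is no per-query YARN startup cost.
  def runQuery(sql: String): DataFrame = spark.sql(sql)

  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for a web request loop: run two queries
    // against the same long-lived session.
    runQuery("SELECT 1 AS x").show()
    runQuery("SELECT 2 AS y").show()
    // Deliberately do NOT call spark.stop() between queries; stopping
    // the session is what releases the YARN resources.
    spark.stop()
  }
}

[For reference, ready-made services such as Apache Livy and Spark's Thrift Server implement this long-lived-context pattern.]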

Sent from the Apache Spark User List mailing list archive at Nabble.com.