Keep the SparkContext alive and wait for the next job, just like spark-shell
I am confused about how to keep a SparkContext alive. Our situation: a user writes a SQL query in a web UI, and the backend initializes a SparkContext and then submits the Spark job. The problem is that every time we run a query string, Spark requests resources from YARN again. It is painful to waste so much time re-initializing the SparkContext.
So I am looking for a way to run Spark jobs such that when a query finishes, the context is not stopped and the YARN resources are not released, just like spark-shell (I noticed that spark-shell keeps its resources for as long as it is running).
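As a rough illustration of the idea, a long-running driver process could create the SparkContext once at startup and reuse it for every incoming query. This is only a sketch under my assumptions; the object name `QueryServer` and the method `runQuery` are hypothetical, and it assumes Spark is on the classpath:

```scala
// Sketch only: a long-lived driver that holds one SparkContext.
// YARN resources stay allocated until sc.stop() is called.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object QueryServer {
  // Initialized once when the JVM starts, not per query.
  val conf = new SparkConf().setAppName("long-lived-sql-backend")
  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  // Called by the web layer for each query string; reuses the same
  // context, so there is no new resource request to YARN.
  def runQuery(query: String): Array[String] =
    sqlContext.sql(query).collect().map(_.toString)

  def main(args: Array[String]): Unit = {
    // Start your HTTP server here and dispatch requests to runQuery;
    // keep the JVM alive instead of exiting after one job.
  }
}
```

The key point is that the web server and the Spark driver live in the same long-running process, so queries arrive as method calls rather than as separate spark-submit invocations.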
Does anyone have an idea, or a tutorial to point me to? Thanks, all!
Sent from the Apache Spark User List mailing list archive at Nabble.com.