Long-running Spark job status on remote submission
I am submitting a Spark job to a YARN cluster from a remote machine that is not part of the cluster. For jobs that take a long time, the spark-submit process never exits because it keeps polling for the status of the job, even though the job itself finishes successfully on the cluster.
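For context, this is roughly how I submit (class and jar names here are placeholders, not my actual job); in cluster mode the spark-submit process keeps reporting the application status until completion by default, which is why it does not return:

```shell
# Placeholder submission command; --deploy-mode cluster runs the driver
# on the cluster, but spark-submit still blocks, polling the app status.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyJob \
  my-job.jar
```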
How do I get the status of such long-running jobs, so that I can run further tasks on my remote machine once the job completes? Livy is one option, but I would like to do this without it if possible.
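One direction I am considering is polling the YARN ResourceManager REST API from the remote machine. A minimal sketch, assuming the ResourceManager web UI is reachable (the `rm_url` host/port and the application id are placeholders, not values from my setup):

```python
import json
import time
from urllib.request import urlopen

# States in which YARN considers an application terminal (done).
TERMINAL_STATES = {"FINISHED", "FAILED", "KILLED"}

def is_terminal(state):
    """Return True once the application has reached a terminal YARN state."""
    return state in TERMINAL_STATES

def wait_for_completion(rm_url, app_id, poll_seconds=30):
    """Poll the ResourceManager REST API until the application finishes.

    rm_url is e.g. "http://rm-host:8088" (placeholder) and app_id is the
    application id printed by spark-submit, e.g. "application_1234_0001".
    Returns the application's finalStatus, e.g. "SUCCEEDED" or "FAILED".
    """
    while True:
        # GET /ws/v1/cluster/apps/{appid} returns a JSON "app" object
        # containing "state" and "finalStatus" fields.
        with urlopen(f"{rm_url}/ws/v1/cluster/apps/{app_id}") as resp:
            app = json.load(resp)["app"]
        if is_terminal(app["state"]):
            return app["finalStatus"]
        time.sleep(poll_seconds)
```

The idea would be to fire spark-submit in the background (or have it return early) and then call `wait_for_completion(...)` before kicking off the follow-up tasks, but I am not sure this is the cleanest approach.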