Spark Streaming Job completed without executing next batches


khajaasmath786
Hi,

I have a Spark Streaming job scheduled to run every 30 minutes. It ran fine for about 32 hours, then suddenly its status changed to Finished instead of Running (it always runs in the background and shows up in the Resource Manager).

Am I doing anything wrong here? How could the job finish without picking up the next batch from Kafka?

I run it with the command below on a Cloudera cluster.

spark2-submit --class com.telematics.datascience.drivers.OCCDataPointDriver --master yarn --queue hadvaoccx_dse_pool --principal [hidden email] --keytab ./va_dflt.keytab  Telematics.jar -c /home/yyy1k78/occtelematics/application-datapoint-hdfs-dyn.properties &
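Two common causes worth ruling out (these are general Spark behaviors, not something confirmed in this thread): if the driver's main() returns after `ssc.start()` without calling `ssc.awaitTermination()`, YARN marks the application FINISHED even though batches remain; and submitting in the default yarn client mode with a trailing `&` ties the driver to the login shell, so logging out or a SIGHUP can terminate it. A sketch of a more resilient submit, assuming the same class, queue, and file paths as the command above:

```shell
# Run the driver inside the cluster (--deploy-mode cluster) so it does not
# depend on the submitting shell staying alive; nohup additionally shields
# the spark2-submit launcher process itself from SIGHUP on logout.
nohup spark2-submit \
  --class com.telematics.datascience.drivers.OCCDataPointDriver \
  --master yarn \
  --deploy-mode cluster \
  --queue hadvaoccx_dse_pool \
  --principal [hidden email] \
  --keytab ./va_dflt.keytab \
  Telematics.jar -c /home/yyy1k78/occtelematics/application-datapoint-hdfs-dyn.properties \
  > submit.log 2>&1 &
```

In cluster mode the driver runs in the YARN ApplicationMaster, so the Resource Manager status reflects the driver itself; check the application logs (`yarn logs -applicationId <appId>`) for the reason the driver exited.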

Thanks,
Asmath
Re: Spark Streaming Job completed without executing next batches

khajaasmath786
Here is a screenshot. The status shows Finished, but it should be Running so the next batch can pick up the data.

[screenshot attachment]

On Thu, Nov 16, 2017 at 10:01 PM, KhajaAsmath Mohammed <[hidden email]> wrote:
[quoted message trimmed]