Making spark streaming application single threaded

Hi All,

Is there any property which makes my Spark Streaming application single-threaded?
I looked into the property *spark.dynamicAllocation.maxExecutors=1*, but as
far as I understand it only limits the job to a single container, not a
single thread. In local mode, the number of threads can be configured with
local[N] (e.g. local[1]). How can I do the same in cluster mode?
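For reference, this is roughly how I am submitting the job in cluster mode. As far as I understand, disabling dynamic allocation and pinning the executor count and cores to 1 should allow at most one task at a time (a sketch only; the master, jar name, and main class below are placeholders):

```shell
# Sketch: restrict the job to one executor with one core, so Spark can
# schedule at most one task concurrently. Master URL, class name, and
# jar path are placeholders for my actual deployment.
spark-submit \
  --master yarn \
  --class com.example.MyStreamingApp \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.executor.instances=1 \
  --conf spark.executor.cores=1 \
  --conf spark.task.cpus=1 \
  my-streaming-app.jar
```

I am not sure whether this actually makes the application single-threaded, or just single-task.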

I am reading data from Kafka, and my logs show that every Kafka message is
being read 3 times. I want each message to be read only once. How can I
achieve this?

Thanks in advance,
