Making a Spark Streaming application single-threaded
Is there any property which makes my Spark Streaming application single-threaded?
I researched the property *spark.dynamicAllocation.maxExecutors=1*, but
as far as I understand this launches at most one container, not a single
thread. In local mode, we can configure the number of threads through the
master URL, e.g. local[*]. But how can I do the same in cluster mode?
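For reference, this is roughly how I am limiting resources today (a sketch of a spark-submit invocation; the application name and jar are placeholders, and I am unsure whether these flags actually constrain the threading):

```shell
# Attempt to pin the job to a single executor with a single core.
# Disabling dynamic allocation so maxExecutors is not the only limit.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 1 \
  --executor-cores 1 \
  --conf spark.dynamicAllocation.enabled=false \
  --class com.example.MyStreamingApp \  # placeholder class name
  my-streaming-app.jar                  # placeholder jar
```

Even with one executor and one core, I am not sure this guarantees single-threaded consumption.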
I am trying to read data from Kafka, and I see in my logs that every Kafka
message is being read 3 times. I want each message to be read only once. How
can I achieve this?