RecordTooLargeException in Spark *Structured* Streaming

3 messages

RecordTooLargeException in Spark *Structured* Streaming

Eric Beabes

I keep getting this error message:


The message is 1169350 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.


As indicated in other posts, I am trying to set the “max.request.size” configuration in the Producer as follows:


---------------------
.writeStream
.format("kafka")
.option(
  "kafka.bootstrap.servers",
  config.outputBootstrapServer
)
.option(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "10000000")
---------------------


But this is not working. Am I setting it correctly? Is there a different way to set this property in Spark Structured Streaming?


Please help. Thanks.



Re: RecordTooLargeException in Spark *Structured* Streaming

Jungtaek Lim-2
Hi,

You need to add the prefix "kafka." to configurations that should be propagated to Kafka; options without the prefix are used by the Spark data source itself (the Kafka connector, in this case).
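Applied to the snippet in the original post, that would look like this (a sketch only; config.outputBootstrapServer and the surrounding writeStream call are taken from the question):

---------------------
.writeStream
.format("kafka")
.option("kafka.bootstrap.servers", config.outputBootstrapServer)
// "kafka." prefix, so Spark passes it through to the Kafka producer
// as max.request.size instead of treating it as a connector option
.option("kafka." + ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "10000000")
---------------------

Since ProducerConfig.MAX_REQUEST_SIZE_CONFIG is the string "max.request.size", the resulting option key is "kafka.max.request.size".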


Hope this helps.

Thanks,
Jungtaek Lim (HeartSaVioR)


On Tue, May 26, 2020 at 6:42 AM Something Something <[hidden email]> wrote:

I keep getting this error message: "The message is 1169350 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration." [...]



Re: RecordTooLargeException in Spark *Structured* Streaming

Eric Beabes
Thanks. I missed that part of the documentation. Appreciate your help. Regards.

On Mon, May 25, 2020 at 10:42 PM Jungtaek Lim <[hidden email]> wrote:
Hi,

You need to add the prefix "kafka." to configurations that should be propagated to Kafka; options without the prefix are used by the Spark data source itself (the Kafka connector, in this case). [...]