Iterative Streaming with Spark

I am running a benchmark across Flink, Storm, and Spark for an iterative streaming
workload. The goal is to window a stream and perform an iterative computation
per window.

Both Flink and Storm provide a window function that exposes the window's
elements as a list or iterator, but in Spark I am not sure how to do this.
Is it possible to get all the elements of a window as a list, an iterator,
or something similar?

After the per-window computation, the goal is to run a reduce operation to
obtain a globally synchronized value.

Is this possible with Spark Streaming?
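For reference, here is a rough sketch of the shape I have in mind, using Spark
Streaming's DStream API. This is untested; the socket source, host/port, and
window sizes are placeholders, and the per-window iteration is stubbed out:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowedIteration {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WindowedIteration").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Placeholder source: one number per line over a socket.
    val nums = ssc.socketTextStream("localhost", 9999).map(_.toDouble)

    // window() yields one RDD per window; transform() exposes that RDD,
    // so the window's elements can be gathered as a local collection.
    val perWindow = nums
      .window(Seconds(10), Seconds(10))
      .transform { rdd =>
        val elems = rdd.collect()   // the whole window as a local array on the driver
        // ... run the iterative computation over `elems` here ...
        val result = elems.sum      // placeholder for the real iteration's result
        rdd.sparkContext.parallelize(Seq(result))
      }

    // A follow-up reduce gives one globally synchronized value per window.
    perWindow.reduce(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that `rdd.collect()` pulls the entire window to the driver, which only
works for windows that fit in driver memory; `rdd.glom()` would instead keep
per-partition arrays distributed, but then the iteration is no longer over the
full window at once. Is this the intended way to do it, or is there a better one?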
