Using Spark Streaming for analyzing changing data

We have a use case with a stream of events, where every event carries an ID,
its current state, and a timestamp.

We want to ask questions about the current state of the *whole dataset*,
from the beginning of time, such as:
  "how many items are now in the ongoing state?"

(Bear in mind that there are more complicated questions, and all of them
ask about the _current_ state of the dataset, from the beginning of time.)
I haven't found any simple, performant way of doing it.

The ways I've found are:
1. Using mapGroupsWithState: groupByKey on the ID, and always update the
state to the latest event by timestamp.
2. Using groupByKey on the ID, and keeping only the event whose timestamp
is the latest.
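For illustration, the per-key update logic of option 1 can be modeled in plain Python (outside Spark; all names here are hypothetical): keep only the newest (state, timestamp) pair per ID and maintain running per-state counts, which is roughly the kind of update a mapGroupsWithState state function would perform.

```python
from collections import Counter

state_store = {}    # id -> (state, timestamp), analogous to keyed state
counts = Counter()  # state -> number of IDs currently in that state

def update(item_id, state, ts):
    prev = state_store.get(item_id)
    if prev is not None and ts <= prev[1]:
        return  # stale / out-of-order event; the stored state is newer
    if prev is not None:
        counts[prev[0]] -= 1  # this ID is leaving its previous state
    state_store[item_id] = (state, ts)
    counts[state] += 1

for ev in [(1, "ongoing", 100), (1, "done", 105), (2, "ongoing", 101)]:
    update(*ev)
print(counts["ongoing"])  # 1
```

With the counts maintained incrementally, "how many items are now ongoing" becomes an O(1) lookup, at the cost of the keyed state this thread is trying to avoid.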

Neither method is good: the first involves state, which means
checkpointing, memory overhead, etc., and the second involves shuffling
and sorting.

We will have a lot of such queries running in real time.

I wonder, as a general question, what is the correct way to process this
type of data in Spark Streaming?
