Kafka streaming out of memory

Kafka streaming out of memory

yufei sun
When I use Kafka streaming with Spark, the RDDs stay in memory and are never released, so an OOM happens after some time. How can I release these RDDs? I have tried StorageLevel.NONE, but it didn't work. Please help me, thanks!
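
For context on the behaviour described above: in Spark 0.8.x, the RDDs and metadata that a DStream generates are only discarded by the periodic cleaner, and that cleaner runs only when the spark.cleaner.ttl property is set. Below is a minimal sketch of enabling it before the streaming context is created; the TTL value, master URL, application name and batch interval are placeholder assumptions, not taken from the post.

// Sketch only: turn on the periodic cleaner so old RDDs and metadata from the
// streaming job are dropped once the TTL (in seconds) has expired.
System.setProperty("spark.cleaner.ttl", "3600");   // example value

// Placeholder context creation; all three arguments are example values.
JavaStreamingContext jssc = new JavaStreamingContext(
        "local[2]",               // placeholder master URL
        "KafkaStreamingExample",  // placeholder application name
        new Duration(10000));     // placeholder 10-second batch interval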

Re: Kafka streaming out of memory

yufei sun
Here is my code; my Spark version is 0.8.1.
JavaDStream<String> kafkaStream = jssc.kafkaStream(
        p.getProperty("spark.kafka.zkquorum"),
        p.getProperty("spark.kafka.groupid"), topics);

// Parse each raw Kafka message into a SerializSparkLog using the nginx config resolver.
final NginxConfigResolver solver = new NginxConfigResolver();
solver.init(nginxconfig);
JavaDStream<SerializSparkLog> logDStream = kafkaStream
        .map(new Function<String, SerializSparkLog>() {
            @Override
            public SerializSparkLog call(String line) throws Exception {
                return solver.resolve(line);
            }
        });

// For every batch, collect the parsed records to the driver and print how many there are.
logDStream.foreach(new Function<JavaRDD<SerializSparkLog>, Void>() {
    @Override
    public Void call(JavaRDD<SerializSparkLog> rdd) throws Exception {
        List<SerializSparkLog> logs = rdd.collect();
        System.out.println(logs.size());
        return null;
    }
});

jssc.start();
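
One thing to note about the output step above: collect() materialises every record of each batch in the driver JVM, so a large batch can exhaust driver memory on its own. As a rough sketch (not from the original post), the same per-batch logging can be done with count(), which is computed on the executors and only returns a number to the driver:

logDStream.foreach(new Function<JavaRDD<SerializSparkLog>, Void>() {
    @Override
    public Void call(JavaRDD<SerializSparkLog> rdd) throws Exception {
        // count() runs on the cluster; nothing is collected into the driver.
        System.out.println(rdd.count());
        return null;
    }
});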