Error in run spark.ContextCleaner under Spark 1.0.0



Haoming Zhang
Hi all,

I tried to run a simple Spark Streaming program with sbt. It compiles fine, but when I run the program I get an error:

"ERROR spark.ContextCleaner: Error in cleaning thread"

I'm not sure whether this is a bug, because I still get the result I expected; only the error above is reported.

The following is the full log:
[info] Set current project to Simple Streaming Project (in build file:/home/feicun/workspace/tempStream/)
[info] Running SimpleStream
Before // This line is printed by my own "println"
14/06/23 12:03:24 INFO spark.SecurityManager: Changing view acls to: feicun
14/06/23 12:03:24 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(feicun)
14/06/23 12:03:24 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/06/23 12:03:24 INFO Remoting: Starting remoting
14/06/23 12:03:24 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@manjaro:37906]
14/06/23 12:03:24 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@manjaro:37906]
14/06/23 12:03:24 INFO spark.SparkEnv: Registering MapOutputTracker
14/06/23 12:03:24 INFO spark.SparkEnv: Registering BlockManagerMaster
14/06/23 12:03:24 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140623120324-3cf5
14/06/23 12:03:24 INFO storage.MemoryStore: MemoryStore started with capacity 819.3 MB.
14/06/23 12:03:24 INFO network.ConnectionManager: Bound socket to port 39964 with id = ConnectionManagerId(manjaro,39964)
14/06/23 12:03:24 INFO storage.BlockManagerMaster: Trying to register BlockManager
14/06/23 12:03:24 INFO storage.BlockManagerInfo: Registering block manager manjaro:39964 with 819.3 MB RAM
14/06/23 12:03:24 INFO storage.BlockManagerMaster: Registered BlockManager
14/06/23 12:03:24 INFO spark.HttpServer: Starting HTTP Server
14/06/23 12:03:24 INFO server.Server: jetty-8.1.14.v20131031
14/06/23 12:03:24 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:38377
14/06/23 12:03:24 INFO broadcast.HttpBroadcast: Broadcast server started at http://10.154.17.101:38377
14/06/23 12:03:24 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-f3a10cb8-bdfa-4838-97d1-11bde412f10c
14/06/23 12:03:24 INFO spark.HttpServer: Starting HTTP Server
14/06/23 12:03:24 INFO server.Server: jetty-8.1.14.v20131031
14/06/23 12:03:24 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:51366
14/06/23 12:03:24 INFO server.Server: jetty-8.1.14.v20131031
14/06/23 12:03:24 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
14/06/23 12:03:24 INFO ui.SparkUI: Started SparkUI at http://manjaro:4040
fileStreamorg.apache.spark.streaming.dstream.MappedDStream@64084936 // This is the DStream I expected
14/06/23 12:03:25 INFO network.ConnectionManager: Selector thread was interrupted!
14/06/23 12:03:25 ERROR spark.ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:117)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:115)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:115)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:114)
        at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65)
14/06/23 12:03:25 ERROR util.Utils: Uncaught exception in thread SparkListenerBus
java.lang.InterruptedException
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
        at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:48)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
[success] Total time: 2 s, completed Jun 23, 2014 12:03:25 PM


Here is my build.sbt file:
 
name := "Simple Streaming Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.0.0"

resolvers += "Sonatype release" at "https://oss.sonatype.org/content/repositories/releases"
 
resolvers += "Akka repo" at "http://repo.akka.io/releases/"
 
resolvers += "Spray repo" at "http://repo.spray.cc"


Regards,
Re: Error in run spark.ContextCleaner under Spark 1.0.0

Andrew Or
Hi Haoming,

You can safely disregard this error. It is printed at the end of execution, when we clean up and kill the daemon context-cleaning thread. In the future it would be good to silence this particular message, as it may be confusing to users.
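To illustrate what is happening (this is not Spark's actual code, just the same pattern with hypothetical names): the ContextCleaner runs a daemon thread that blocks on a ReferenceQueue waiting for garbage-collected RDDs; when the context shuts down, that thread is interrupted mid-wait, and the resulting InterruptedException is what gets logged. A minimal sketch:

```scala
import java.lang.ref.ReferenceQueue
import java.util.concurrent.atomic.AtomicBoolean

object InterruptDemo {
  // Returns true if the blocked thread observed an InterruptedException,
  // mirroring what ContextCleaner's cleaning thread sees at shutdown.
  def demo(): Boolean = {
    val queue = new ReferenceQueue[AnyRef]
    val sawInterrupt = new AtomicBoolean(false)

    // Like the cleaning thread: block on an empty reference queue.
    val cleaner = new Thread {
      override def run(): Unit = {
        try {
          queue.remove() // blocks; nothing is ever enqueued here
          ()
        } catch {
          case _: InterruptedException => sawInterrupt.set(true)
        }
      }
    }
    cleaner.setDaemon(true)
    cleaner.start()

    Thread.sleep(200)   // let the thread reach the blocking call
    cleaner.interrupt() // roughly what happens when the context stops
    cleaner.join(1000)
    sawInterrupt.get
  }

  def main(args: Array[String]): Unit =
    println(s"sawInterrupt = ${demo()}")
}
```

The exception is the normal way a blocked thread learns it should stop, which is why the program still finishes with [success].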

Andrew
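Until the message is silenced upstream, one way to hide it is a logging override — a sketch, assuming the stock log4j 1.x setup that Spark 1.0 ships with (place it in conf/log4j.properties or on the application classpath). The logger names below are taken from the log output itself; note that turning off org.apache.spark.util.Utils will also hide other uncaught-exception reports, so it is a blunt workaround:

```properties
# Silence the shutdown-time InterruptedException messages seen above.
log4j.logger.org.apache.spark.ContextCleaner=OFF
log4j.logger.org.apache.spark.util.Utils=OFF
```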


2014-06-23 12:13 GMT-07:00 Haoming Zhang <[hidden email]>: