tests that run locally fail when run through bamboo


amoc

I have a few test cases for Spark that extend TestSuiteBase from org.apache.spark.streaming.

The tests pass on my machine, but when I commit to the repo and Bamboo runs them automatically, they fail with the errors below.

How can I fix this?

21-May-2014 16:33:09
[info] StreamingZigZagSpec:
[info] - compute zigzag indicator in stream *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 1 times (most recent failure: Exception failure: java.io.StreamCorruptedException: invalid type code: AC)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag indicator in stream with intermittent empty RDDs *** FAILED ***
[info]   Operation timed out after 10042 ms (TestSuiteBase.scala:283)
[info] - compute zigzag indicator in stream with 3 empty RDDs *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 1 times (most recent failure: Exception failure: java.io.FileNotFoundException: /tmp/spark-local-20140521163241-1707/0f/shuffle_1_1_1 (No such file or directory))
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag indicator in stream w notification for each change  *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 141.0:0 failed 1 times (most recent failure: Exception failure: java.io.FileNotFoundException: http://10.10.1.9:62793/broadcast_1)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag in stream where there is 1 key/RDDs but multiple key types exist
[info] - compute zigzag in stream where RDDs have more than 1 key
16:33:09.819 [spark-akka.actor.default-dispatcher-15] INFO  Remoting - Remoting shut down
16:33:09.819 [spark-akka.actor.default-dispatcher-4] INFO  a.r.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[info] Run completed in 52 seconds, 792 milliseconds.
[info] Total number of tests run: 36
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 25, failed 11, canceled 0, ignored 0, pending 0

-Adrian

 

RE: tests that run locally fail when run through bamboo

amoc

Just found this at the top of the log:

 

17:14:41.124 [pool-7-thread-3-ScalaTest-running-StreamingSpikeSpec] WARN  o.e.j.u.component.AbstractLifeCycle - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use

build   21-May-2014 17:14:41   java.net.BindException: Address already in use

 

Is there a way to set these connections up so that they don't all start on the same port? (That's my guess at the root cause of the issue.)
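[Editor's note: a hedged sketch, not from the thread. If the guess above is right and parallel test suites are colliding on the Spark UI's default port 4040, two knobs may help: running suites serially in sbt, and letting each SparkContext pick a free UI port. This assumes an sbt build and that your Spark version accepts `0` for `spark.ui.port` (meaning "any free port"); verify both against your setup.]

```scala
// build.sbt -- run test suites one at a time, so only one
// SparkContext (and one web UI trying to bind 4040) exists
// in the JVM at any moment.
parallelExecution in Test := false
```

Alternatively, in the test's own configuration (assuming your TestSuiteBase subclass lets you override the SparkConf):

```scala
// Hypothetical test conf: port 0 asks the OS for any free port,
// so concurrent suites no longer fight over 4040.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("StreamingZigZagSpec")
  .set("spark.ui.port", "0")
```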

 

From: Adrian Mocanu [mailto:[hidden email]]
Sent: May-21-14 4:58 PM
To: [hidden email]; [hidden email]
Subject: tests that run locally fail when run through bamboo

 


Re: tests that run locally fail when run through bamboo

Tathagata Das
This does happen sometimes, but it is only a warning, because Spark is designed to try successive ports until it succeeds. So unless a crazy number of successive ports are blocked (runaway processes? insufficient clearing of ports by the OS?), these errors should not keep the tests from passing.
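[Editor's note: a standalone sketch of the retry pattern described above, using a plain `ServerSocket`; this is an illustration, not Spark's actual code.]

```scala
import java.net.{BindException, ServerSocket}

// Try port, port+1, port+2, ... until a bind succeeds, mirroring
// the "try successive ports" strategy. Only after maxRetries
// consecutive ports are taken does the BindException propagate --
// which is why the warning alone should not fail a test run.
def bindWithRetry(port: Int, retriesLeft: Int = 16): ServerSocket =
  try new ServerSocket(port)
  catch {
    case _: BindException if retriesLeft > 0 =>
      bindWithRetry(port + 1, retriesLeft - 1)
  }
```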


On Wed, May 21, 2014 at 2:31 PM, Adrian Mocanu <[hidden email]> wrote:

Just found this at the top of the log:

 

17:14:41.124 [pool-7-thread-3-ScalaTest-running-StreamingSpikeSpec] WARN  o.e.j.u.component.AbstractLifeCycle - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use

build   21-May-2014 17:14:41   java.net.BindException: Address already in use

 

Is there a way to set these connection up so that they don’t all start on the same port (that’s my guess for the root cause of the issue)

 
