java.lang.ClassNotFoundException


Jaonary Rabarisoa
Hi all,

I get java.lang.ClassNotFoundException even though "addJar" is called. The jar file is present on each node.

I'm using the version of Spark from GitHub master.

Any ideas?


Jaonary 

Re: java.lang.ClassNotFoundException

Ognen Duzlevski-2
Have you looked at the individual nodes' logs? Can you post a bit more of
the exception's output?


Re: java.lang.ClassNotFoundException

Jaonary Rabarisoa
Here is the output that I get:

[error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1011)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1009)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1009)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:596)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:146)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
at akka.actor.ActorCell.invoke(ActorCell.scala:456)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Spark says that the jar is added:

14/03/26 15:49:18 INFO SparkContext: Added JAR target/scala-2.10/value-spark_2.10-1.0.jar


Re: java.lang.ClassNotFoundException

Jaonary Rabarisoa
I noticed that I get this error when I'm trying to load an object file with: val viperReloaded = context.objectFile[ReIdDataSetEntry]("data")
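For what it's worth, this failure mode can be reproduced outside Spark with plain Java serialization: if the deserializing stream resolves classes against a loader that cannot see the class (as with a jar that is only visible to some other loader), readObject throws ClassNotFoundException. The sketch below is only an illustration of that mechanism, not Spark's actual code; Entry is a hypothetical stand-in for ReIdDataSetEntry.

```scala
import java.io._
import java.net.{URL, URLClassLoader}

// An ObjectInputStream that resolves classes against one fixed loader only,
// mimicking a deserializer that cannot see classes shipped via addJar.
class RestrictedObjectInputStream(loader: ClassLoader, in: InputStream)
    extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] =
    Class.forName(desc.getName, false, loader)
}

object FailureDemo {
  case class Entry(id: Int) // hypothetical stand-in for ReIdDataSetEntry

  // Serialize an Entry, then deserialize it through a loader that has no
  // URLs and only the bootstrap loader as parent: it can see core JDK
  // classes but not application classes, so readObject fails.
  def triggersClassNotFound(): Boolean = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(Entry(1))
    out.close()

    val blindLoader = new URLClassLoader(Array.empty[URL], null)
    val in = new RestrictedObjectInputStream(
      blindLoader, new ByteArrayInputStream(buf.toByteArray))
    try { in.readObject(); false }
    catch { case _: ClassNotFoundException => true }
    finally in.close()
  }

  def main(args: Array[String]): Unit =
    println(triggersClassNotFound()) // true
}
```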





Re: java.lang.ClassNotFoundException

Ognen Duzlevski-2
Have you looked through the logs fully? I have seen this (in my limited experience) pop up as a result of previous exceptions or errors, and also as a result of being unable to serialize objects.
Ognen



Re: java.lang.ClassNotFoundException

Jaonary Rabarisoa
In fact, it may be related to object serialization:

14/03/26 17:02:19 INFO TaskSetManager: Serialized task 3.0:1 as 2025 bytes in 1 ms
14/03/26 17:02:19 WARN TaskSetManager: Lost TID 6 (task 3.0:0)
14/03/26 17:02:19 INFO TaskSetManager: Loss was due to java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry [duplicate 3]
14/03/26 17:02:19 INFO TaskSetManager: Starting task 3.0:0 as TID 8 on executor 0: 132.166.86.13 (PROCESS_LOCAL)


In this case, what should I do?





Re: java.lang.ClassNotFoundException

Yana
I might be way off here, but are you looking at the logs on the worker
machines? I am running an older version (0.8), and when I look at the
error log for the executor process I see the exact location where the
executor process tries to load the jar from, with a line like this:

14/03/26 13:57:11 INFO executor.Executor: Adding
file:/dirs/dirs/spark/work/app-20140326135710-0029/0/./spark-test.jar
to class loader

You said "The jar file is present in each node": do you see any
information on the executor indicating that it's trying to load the
jar, or where it's loading it from? I can't tell for sure by looking at
your logs, but they seem to be logs from the master and driver, not
from the executor itself.



Re: java.lang.ClassNotFoundException

Aniket Mokashi
context.objectFile[ReIdDataSetEntry]("data"): I'm not sure how this is compiled in Scala. But if it uses some sort of ObjectInputStream, you need to be careful: ObjectInputStream uses the root class loader to load classes and does not work with jars that are added to the thread context class loader (TCCL). Apache Commons has ClassLoaderObjectInputStream to work around this.
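To illustrate the workaround being described: a minimal analogue of the Commons ClassLoaderObjectInputStream idea is an ObjectInputStream subclass that resolves classes against an explicitly supplied loader (for example, the thread context class loader) before falling back to the default resolution. This is a hedged sketch, not the Apache Commons implementation itself; the names LoaderAwareObjectInputStream, SerDemo, and roundTrip are illustrative.

```scala
import java.io._

// Resolve classes against a supplied ClassLoader first, falling back to the
// default ObjectInputStream resolution only if that loader cannot find them.
class LoaderAwareObjectInputStream(loader: ClassLoader, in: InputStream)
    extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] =
    try Class.forName(desc.getName, false, loader)
    catch { case _: ClassNotFoundException => super.resolveClass(desc) }
}

object SerDemo {
  // Round-trip a value through Java serialization, resolving classes with
  // the given loader (e.g. the thread context class loader, where jars
  // shipped via addJar would typically be visible).
  def roundTrip(value: AnyRef, loader: ClassLoader): AnyRef = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(value)
    out.close()
    val in = new LoaderAwareObjectInputStream(
      loader, new ByteArrayInputStream(buf.toByteArray))
    try in.readObject() finally in.close()
  }

  def main(args: Array[String]): Unit = {
    val loader = Thread.currentThread().getContextClassLoader
    println(roundTrip(Vector(1, 2, 3), loader)) // Vector(1, 2, 3)
  }
}
```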






--
"...:::Aniket:::... Quetzalco@tl"

Re: java.lang.ClassNotFoundException

Jaonary Rabarisoa
The issue and a workaround can be found here: https://github.com/apache/spark/pull/181

