NO SUCH METHOD EXCEPTION


NO SUCH METHOD EXCEPTION

Jeyaraj, Arockia R (Arockia)

Hi,

 

Can anyone help me resolve this issue? Why am I getting a NoSuchMethodError?

 

14/03/11 09:56:11 ERROR executor.Executor: Exception in task ID 0
java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Lscala/collection/immutable/StringOps;
        at kafka.utils.VerifiableProperties.getIntInRange(VerifiableProperties.scala:75)
        at kafka.utils.VerifiableProperties.getInt(VerifiableProperties.scala:58)
        at kafka.utils.ZKConfig.<init>(ZkUtils.scala:837)
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:73)
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:77)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:98)
        at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
        at org.apache.spark.streaming.scheduler.NetworkInputTracker$ReceiverExecutor$$anonfun$8.apply(NetworkInputTracker.scala:173)
        at org.apache.spark.streaming.scheduler.NetworkInputTracker$ReceiverExecutor$$anonfun$8.apply(NetworkInputTracker.scala:169)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:884)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:884)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:109)
        at org.apache.spark.scheduler.Task.run(Task.scala:53)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:213)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:619)
14/03/11 09:56:11 WARN scheduler.TaskSetManager: Lost TID 0 (task 0.0:0)
14/03/11 09:56:11 WARN scheduler.TaskSetManager: Loss was due to java.lang.NoSuchMethodError
java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Lscala/collection/immutable/StringOps;
        at kafka.utils.VerifiableProperties.getIntInRange(VerifiableProperties.scala:75)
        at kafka.utils.VerifiableProperties.getInt(VerifiableProperties.scala:58)
        at kafka.utils.ZKConfig.<init>(ZkUtils.scala:837)
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:73)
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:77)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:98)
        at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
        at org.apache.spark.streaming.scheduler.NetworkInputTracker$ReceiverExecutor$$anonfun$8.apply(NetworkInputTracker.scala:173)
        at org.apache.spark.streaming.scheduler.NetworkInputTracker$ReceiverExecutor$$anonfun$8.apply(NetworkInputTracker.scala:169)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:884)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:884)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:109)
        at org.apache.spark.scheduler.Task.run(Task.scala:53)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:213)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:619)
14/03/11 09:56:11 ERROR scheduler.TaskSetManager: Task 0.0:0 failed 1 times; aborting job
14/03/11 09:56:11 INFO scheduler.TaskSchedulerImpl: Remove TaskSet 0.0 from pool
14/03/11 09:56:11 INFO scheduler.DAGScheduler: Failed to run runJob at NetworkInputTracker.scala:182
[error] (Thread-34) org.apache.spark.SparkException: Job aborted: Task 0.0:0 fai

 

 

Thanks

Arockia Raja


Re: NO SUCH METHOD EXCEPTION

Matei Zaharia
Administrator
Since it’s from Scala, it might mean you’re running with a different version of Scala than you compiled Spark with. Spark 0.8 and earlier use Scala 2.9, while Spark 0.9 uses Scala 2.10.
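One way to keep the versions aligned is to pin the project's Scala version to the one the cluster's Spark build was compiled against. A minimal build.sbt sketch along those lines (the exact Scala patch version and artifact names here are illustrative, not taken from the thread):

```scala
// build.sbt -- illustrative fragment; versions must match the Spark build on the cluster.
// Spark 0.9.x is compiled against Scala 2.10; Spark 0.8 and earlier against Scala 2.9.
scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version suffix (_2.10) to the artifact name,
  // so a mismatched scalaVersion shows up at resolution or link time
  "org.apache.spark" %% "spark-core"            % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming-kafka" % "0.9.0-incubating"
)
```

If the NoSuchMethodError persists after aligning the versions, it is worth checking that no stale assembly jar built against Scala 2.9 is still on the executor classpath.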

Matei

On Mar 11, 2014, at 8:19 AM, Jeyaraj, Arockia R (Arockia) <[hidden email]> wrote:
