
AbstractMethodError

AbstractMethodError

leosandylh@gmail.com
I wrote an example, MyWordCount, that just sets spark.akka.frameSize larger than the default. But when I run the jar, there is a problem:
 
13/12/19 18:53:48 INFO ClusterTaskSetManager: Lost TID 0 (task 0.0:0)
13/12/19 18:53:48 INFO ClusterTaskSetManager: Loss was due to java.lang.AbstractMethodError
java.lang.AbstractMethodError: org.apache.spark.api.java.function.WrappedFunction1.call(Ljava/lang/Object;)Ljava/lang/Object;
        at org.apache.spark.api.java.function.WrappedFunction1.apply(WrappedFunction1.scala:31)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:90)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:90)
        at scala.collection.Iterator$$anon$21.hasNext(Iterator.scala:440)
        at scala.collection.Iterator$class.foreach(Iterator.scala:772)
        at scala.collection.Iterator$$anon$21.foreach(Iterator.scala:437)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:102)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:250)
        at scala.collection.Iterator$$anon$21.toBuffer(Iterator.scala:437)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:237)
        at scala.collection.Iterator$$anon$21.toArray(Iterator.scala:437)
        at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:560)
        at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:560)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:758)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:758)
 
It is caused by this code:
JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    public Iterable<String> call(String s) {
        return Arrays.asList(s.split(" "));
    } });
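For what it's worth, the function body itself is unremarkable; with Spark taken out of the picture the same split logic runs without error. A minimal stand-alone sketch (SplitSketch is a hypothetical name, not part of the thread):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for the anonymous FlatMapFunction above:
// the same s.split(" ") body, with no Spark involved.
public class SplitSketch {
    static List<String> words(String s) {
        // Identical to the call(String s) body in the snippet above.
        return Arrays.asList(s.split(" "));
    }

    public static void main(String[] args) {
        // prints [to, be, or, not, to, be]
        System.out.println(words("to be or not to be"));
    }
}
```

That the logic is fine on its own is consistent with the error being a binary-compatibility problem between the compiled class and the Spark runtime, not a bug in the function.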
 
Here is the parent class:
 
private[spark] abstract class WrappedFunction1[T, R] extends AbstractFunction1[T, R] {
  @throws(classOf[Exception])
  def call(t: T): R
 
  final def apply(t: T): R = call(t)
}
 
My code is the same as JavaWordCount; I don't know what the error is.
 
Thanks
 
Leo
 


Re: AbstractMethodError

Azuryy Yu
Leo,
Which Spark version are you using? This error is caused by code compiled against Scala 2.10.

Spark 0.8.x uses Scala 2.9, so you must compile your Spark code with the same major Scala version.
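In a Maven build, that version pinning can be sketched like this (the coordinates are assumptions based on the 0.8.0 release artifacts; verify them against Maven Central before relying on them):

```xml
<!-- Sketch: pin Scala and the matching Spark artifact in pom.xml.
     Coordinates are assumptions; check them against your repository. -->
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.9.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- the _2.9.3 suffix must match the Scala version above -->
    <artifactId>spark-core_2.9.3</artifactId>
    <version>0.8.0-incubating</version>
  </dependency>
</dependencies>
```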


On Mon, Dec 23, 2013 at 4:00 PM, [hidden email] <[hidden email]> wrote: [...]



Re: Re: AbstractMethodError

leosandylh@gmail.com

I used Spark 0.8.0, and the Scala version is 2.9.3.
My IDE is Eclipse Juno with the Scala plugin 2.9.3.
I wrote WordCount in Eclipse and copied it to the server,
then compiled the code with Maven (mvn package) and ran the example: mvn exec:java -Dexec.mainClass="MyWordCount" -Dexec.args="spark://xxxx:7077 hdfs://xxxx:8030/user/xx/README.md 100".
 

 
Date: 2013-12-23 16:26
Subject: Re: AbstractMethodError
[...]



Re: Re: AbstractMethodError

Josh Rosen
Spark 0.8 and earlier have a known bug when using Eclipse to compile Spark programs written with the Java API (https://spark-project.atlassian.net/browse/SPARK-902).  I fixed that bug in https://github.com/apache/incubator-spark/pull/100, which was also included in the new 0.8.1 release (https://spark.incubator.apache.org/releases/spark-release-0-8-1.html).

Is your Maven build using the Eclipse compiler plugin, by any chance?
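If it is, the pom would contain something along these lines (a sketch of how delegating Maven to the Eclipse/JDT compiler typically looks; the plugin versions here are placeholders):

```xml
<!-- Sketch: maven-compiler-plugin delegating to the Eclipse (JDT)
     compiler via plexus-compiler-eclipse. Versions are placeholders. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <compilerId>eclipse</compilerId>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-compiler-eclipse</artifactId>
      <version>2.3</version>
    </dependency>
  </dependencies>
</plugin>
```

If a block like this is present (or the project was exported from Eclipse's own builder), switching back to plain javac, or upgrading to Spark 0.8.1 where SPARK-902 is fixed, should resolve the AbstractMethodError.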


On Mon, Dec 23, 2013 at 12:46 AM, [hidden email] <[hidden email]> wrote: [...]


