NoSuchMethodError running Spark on YARN

NoSuchMethodError running Spark on YARN

Sandy Ryza
Hi,

I hit this when trying to run an example job with Spark master against YARN:

Exception in thread "main" java.lang.NoSuchMethodError: com.typesafe.config.ConfigFactory.invalidateCaches()V
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:35)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:30)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:487)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)

I pretty much followed the 0.8.1 doc:
$MY_SPARK_HOME/bin/spark-class org.apache.spark.deploy.yarn.Client --jar $MY_SPARK_HOME/examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating.jar --class org.apache.spark.examples.SparkPi --args yarn-standalone --num-workers 1 --master-memory 1g --worker-memory 1g --worker-cores 1

Any idea what's going on?  Will keep digging, but thought somebody might recognize it.

thanks for any guidance,
Sandy
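[Editor's note: a `NoSuchMethodError` means the class itself resolved at link time, but that exact method signature did not; the `()V` descriptor in the error says the method takes no arguments and returns void. As a hedged sketch (not from the thread), one can probe the runtime classpath reflectively; the class and method names below are taken from the stack trace above:]

```java
// Sketch: probe whether a class/method pair named in a stack trace is
// actually reachable on the current runtime classpath.
public class ProbeMethod {
    static String probe(String className, String methodName) {
        try {
            Class.forName(className).getMethod(methodName);
            return "present";
        } catch (ClassNotFoundException e) {
            return "class not found";
        } catch (NoSuchMethodException e) {
            // This is the situation behind NoSuchMethodError: the class
            // loads, but the copy on the path lacks the method.
            return "class found, method missing";
        }
    }

    public static void main(String[] args) {
        // Names from the stack trace in the message above.
        System.out.println(probe("com.typesafe.config.ConfigFactory",
                                 "invalidateCaches"));
    }
}
```

Run with the same classpath `spark-class` builds; "class found, method missing" would confirm an old copy of the library is winning.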

Re: NoSuchMethodError running Spark on YARN

Sandy Ryza
Verified that com.typesafe.config.ConfigFactory is present in the Spark assembly jar, so now I'm even more confused.
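[Editor's note: the class being present in the assembly doesn't rule out an older copy elsewhere on the classpath winning. A hedged sketch, not from the thread, for asking the JVM which jar or directory a class actually loaded from; in the real session one would pass `com.typesafe.config.ConfigFactory`, and JDK classes are used below only so the snippet runs anywhere:]

```java
// Sketch: report the code source (jar or directory) a loaded class came
// from; bootstrap/JDK classes report no code source.
public class WhichJar {
    static String locate(Class<?> c) {
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null || src.getLocation() == null)
                ? "bootstrap/JDK"
                : src.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(locate(String.class));   // a JDK class: no code source
        System.out.println(locate(WhichJar.class)); // wherever this class loaded from
    }
}
```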


On Mon, Jan 6, 2014 at 8:51 PM, Sandy Ryza <[hidden email]> wrote:


Re: NoSuchMethodError running Spark on YARN

srowen
Sandy, looks like a version mismatch: compiling against Spark HEAD but
running against 0.8.1?

The method was added to typesafe config on Oct 8, 2012 and released
around version 0.6:
https://github.com/typesafehub/config/commit/5f486f65ac68745ca89059a5b6b144c2daa5d157#diff-3d5ac6ed49837be68d4f47d3b96b1c81

typesafe config comes in via Akka, and Spark 0.8.1 depends on
akka-actor 2.0.5 which in turn depends on config 0.3.1:
[INFO] +- org.apache.spark:spark-core_2.9.3:jar:0.8.1-incubating:compile
...
[INFO] |  +- com.typesafe.akka:akka-actor:jar:2.0.5:compile
[INFO] |  |  \- com.typesafe:config:jar:0.3.1:compile

http://search.maven.org/#artifactdetails%7Corg.apache.spark%7Cspark-parent%7C0.8.1-incubating%7Cpom

But Spark 0.9.0-SNAPSHOT / HEAD depends on akka-actor 2.2.3 which
depends on typesafe config 1.0.2.
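
[Editor's note: given the two possible resolutions above (config 0.3.1 via akka-actor 2.0.5 vs. config 1.0.2 via akka-actor 2.2.3), one way to confirm which copy the driver JVM actually sees is to read the manifest version of whatever loads. A hedged sketch, not from the thread; it prints "unknown" when the jar's manifest carries no Implementation-Version:]

```java
// Sketch: report the Implementation-Version recorded in the manifest of
// whichever jar supplies a class; "not on classpath" / "unknown" cover
// the failure modes.
public class VersionOf {
    static String versionOf(String className) {
        try {
            Package p = Class.forName(className).getPackage();
            String v = (p == null) ? null : p.getImplementationVersion();
            return (v == null) ? "unknown" : v;
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(versionOf("com.typesafe.config.ConfigFactory"));
    }
}
```

`mvn dependency:tree` (which produced the `[INFO]` output above) shows what the build resolved; this shows what the runtime resolved, and the two disagreeing is exactly the mismatch described.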

On Tue, Jan 7, 2014 at 5:01 AM, Sandy Ryza <[hidden email]> wrote:

> Verified that com.typesafe.config.ConfigFactory is present in the Spark
> assembly jar so now I'm even more confused.
>
>
> On Mon, Jan 6, 2014 at 8:51 PM, Sandy Ryza <[hidden email]> wrote:
>>
>> Hi,
>>
>> I hit this when trying to an example job with Spark master against YARN:
>>
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> com.typesafe.config.ConfigFactory.invalidateCaches()V
>> at org.apache.spark.SparkConf.<init>(SparkConf.scala:35)
>> at org.apache.spark.SparkConf.<init>(SparkConf.scala:30)
>> at org.apache.spark.deploy.yarn.Client$.main(Client.scala:487)
>> at org.apache.spark.deploy.yarn.Client.main(Client.scala)
>>
>> I pretty much followed the 0.8.1 doc:
>> $MY_SPARK_HOME/bin/spark-class org.apache.spark.deploy.yarn.Client --jar
>> $MY_SPARK_HOME/examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating.jar
>> --class org.apache.spark.examples.SparkPi --args yarn-standalone
>> --num-workers 1 --master-memory 1g --worker-memory 1g --worker-cores 1
>>
>> Any idea what's going on?  Will keep digging, but thought somebody might
>> recognize it.
>>
>> thanks for any guidance,
>> Sandy
>
>