Use of nscala-time within spark-shell

Use of nscala-time within spark-shell

Hammam
Hi All,

Thanks in advance for your help. I have a timestamp that I need to convert to a datetime using Scala. A folder contains the three required jar files: joda-convert-1.5.jar, joda-time-2.4.jar and nscala-time_2.11-1.8.0.jar.
Starting the Scala REPL with those jars on the classpath (scala -classpath "*.jar"),
I can use nscala-time as follows:

scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> import org.joda._
import org.joda._

scala> DateTime.now
res0: org.joda.time.DateTime = 2015-02-12T15:51:46.928+01:00
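
For reference, converting an epoch-millisecond timestamp with joda's DateTime constructor would look roughly like this (a minimal sketch, assuming the timestamp is in milliseconds since the epoch; the value below is just an example):

scala> val ts = 1423752706928L
ts: Long = 1423752706928

scala> new DateTime(ts)
res1: org.joda.time.DateTime = 2015-02-12T15:51:46.928+01:00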

But when I try to use spark-shell:
ADD_JARS=/home/scala_test_class/nscala-time_2.11-1.8.0.jar,/home/scala_test_class/joda-time-2.4.jar,/home/scala_test_class/joda-convert-1.5.jar /usr/local/spark/bin/spark-shell --master local --driver-memory 2g --executor-memory 2g --executor-cores 1

The jars are added and the imports succeed:

scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> import org.joda._
import org.joda._

but it fails when actually using them:
scala> DateTime.now
java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
        at com.github.nscala_time.time.LowPriorityOrderingImplicits$class.ReadableInstantOrdering(Implicits.scala:69)
        at com.github.nscala_time.time.Imports$.ReadableInstantOrdering(Imports.scala:20)
        at com.github.nscala_time.time.OrderingImplicits$class.$init$(Implicits.scala:61)
        at com.github.nscala_time.time.Imports$.<init>(Imports.scala:20)
        at com.github.nscala_time.time.Imports$.<clinit>(Imports.scala)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
        at $iwC$$iwC$$iwC.<init>(<console>:30)
        at $iwC$$iwC.<init>(<console>:32)
        at $iwC.<init>(<console>:34)
        at <init>(<console>:36)
        at .<init>(<console>:40)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Your help is very much appreciated.

Regards,

Hammam

Re: Use of nscala-time within spark-shell

usufarif100@yahoo.com
This error is due to a Scala version mismatch: nscala-time_2.11-1.8.0.jar requires Scala 2.11.x, whereas spark-shell runs on the Scala version your Spark build was compiled against.
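
A quick way to confirm the mismatch is to check which Scala version the spark-shell itself is running on, and then pick the nscala-time artifact whose _2.x suffix matches it (the value shown below depends on your Spark build; 2.10.4 is only an example):

scala> scala.util.Properties.versionString
res0: String = version 2.10.4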

I was using Scala 2.10 and found nscala-time_2.9.3 compatible with Scala 2.10. I used the following dependency in my pom.xml to resolve the java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less; error.

        <dependency>
            <groupId>com.github.nscala-time</groupId>
            <artifactId>nscala-time_2.9.3</artifactId>
            <version>2.10.0</version>
        </dependency>
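
If you are building with sbt instead of Maven, the equivalent would be something along these lines (a sketch; using %% makes sbt append your project's Scala binary version, so the matching _2.10 or _2.11 artifact is picked automatically):

        libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"

For the spark-shell case from the original post, the same idea applies: pass a nscala-time jar whose suffix matches the Scala version Spark was built with (e.g. a _2.10 build rather than nscala-time_2.11-1.8.0.jar).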

You can find more information on the available nscala-time dependencies here.
Hope this helps you resolve your dependency issue.

Yusuf