How to compile Spark applications using sbt?


How to compile Spark applications using sbt?

Tao Xiao
Hi,

    I'm learning Spark 0.9 from its tutorial. To write my first application in Scala, I followed the instructions in "A Standalone App in Scala" but failed to compile the application. Specifically, in Spark's home directory (/home/soft/spark-0.9.0-incubating-bin-hadoop1) I created the directory src/main/scala, put SimpleApp.scala in it, and put simple.sbt in Spark's home directory.
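For reference, my simple.sbt followed the quickstart and looked roughly like this (the spark-core coordinate and the resolver are my reconstruction of the quickstart's file; the scalaVersion value is inferred from the "Getting Scala 2.10" line in the log below):

```scala
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10"  // inferred from the log below: the bare binary version

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```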
   
   Then I tried to compile the application with the command "sbt/sbt package" and got the following errors:

[root@dev4 spark-0.9.0-incubating-bin-hadoop1]# sbt/sbt package
Launching sbt from sbt/sbt-launch-0.12.4.jar
[info] Loading project definition from /home/soft/spark-0.9.0-incubating-bin-hadoop1/project/project
[info] Loading project definition from /home/soft/spark-0.9.0-incubating-bin-hadoop1/project
[info] Set current project to Simple App Project (in build file:/home/soft/spark-0.9.0-incubating-bin-hadoop1/)
Getting Scala 2.10 ...

:: problems summary ::
:::: WARNINGS
module not found: org.scala-lang#scala-compiler;2.10

==== local: tried

 /root/.ivy2/local/org.scala-lang/scala-compiler/2.10/ivys/ivy.xml

==== typesafe-ivy-releases: tried


==== Maven Central: tried


==== sonatype-snapshots: tried


module not found: org.scala-lang#scala-library;2.10

==== local: tried

 /root/.ivy2/local/org.scala-lang/scala-library/2.10/ivys/ivy.xml

==== typesafe-ivy-releases: tried


==== Maven Central: tried


==== sonatype-snapshots: tried


::::::::::::::::::::::::::::::::::::::::::::::

::          UNRESOLVED DEPENDENCIES         ::

::::::::::::::::::::::::::::::::::::::::::::::

:: org.scala-lang#scala-compiler;2.10: not found

:: org.scala-lang#scala-library;2.10: not found

::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
unresolved dependency: org.scala-lang#scala-compiler;2.10: not found
unresolved dependency: org.scala-lang#scala-library;2.10: not found
Error during sbt execution: Error retrieving required libraries
  (see /root/.sbt/boot/update.log for complete log)
xsbti.RetrieveException: Could not retrieve Scala 2.10
at xsbt.boot.ModuleDefinition.fail(ResolverHelper.java:12)
at xsbt.boot.ModuleDefinition.retrieveFailed(ResolverHelper.java:9)
at xsbt.boot.Launch.update(Launch.scala:267)
at xsbt.boot.Launch$$anonfun$xsbt$boot$Launch$$getScalaProvider0$3.apply(Launch.scala:181)
at scala.Option.getOrElse(Option.scala:108)
at xsbt.boot.Launch$$anon$3.call(Launch.scala:167)
at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
at xsbt.boot.Locks$GlobalLock.withChannelRetries$1(Locks.scala:81)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
at xsbt.boot.Using$.withResource(Using.scala:11)
at xsbt.boot.Using$.apply(Using.scala:10)
at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
at xsbt.boot.Locks$GlobalLock.liftedTree1$1(Locks.scala:52)
at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
at xsbt.boot.Locks$.apply0(Locks.scala:31)
at xsbt.boot.Locks$.apply(Locks.scala:28)
at xsbt.boot.Launch.locked(Launch.scala:165)
at xsbt.boot.Launch.getScalaProvider(Launch.scala:167)
at xsbt.boot.Launch$$anonfun$1.apply(Launch.scala:76)
at org.apache.ivy.plugins.namespace.NamespaceRule.newEntry(Cache.scala:17)
at org.apache.ivy.plugins.namespace.NamespaceRule.apply(Cache.scala:12)
at xsbt.boot.Launch.getScala(Launch.scala:79)
at xsbt.boot.Launch.getScala(Launch.scala:78)
at xsbt.boot.Launch.getScala(Launch.scala:77)
at sbt.ScalaInstance$.apply(ScalaInstance.scala:43)
at sbt.ScalaInstance$.apply(ScalaInstance.scala:34)
at sbt.Defaults$$anonfun$scalaInstanceSetting$1.apply(Defaults.scala:272)
at sbt.Defaults$$anonfun$scalaInstanceSetting$1.apply(Defaults.scala:269)
at sbt.Scoped$$anonfun$hf4$1.apply(Structure.scala:580)
at sbt.Scoped$$anonfun$hf4$1.apply(Structure.scala:580)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:49)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$12.apply(Structure.scala:311)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$12.apply(Structure.scala:311)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:41)
at sbt.std.Transform$$anon$5.work(System.scala:71)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:238)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
[error] (root/*:scala-instance) xsbti.RetrieveException: Could not retrieve Scala 2.10
[error] Total time: 9 s, completed Feb 19, 2014 5:12:37 PM



      It seems Scala 2.10 was not available, but Spark 0.9 ships with Scala 2.10, doesn't it?

      How can I solve this problem, and what is the correct way to compile Spark applications written in Scala or Java?



Re: How to compile Spark applications using sbt?

dachuan

Try using 2.10.3.

I have run into the same problem.

On Feb 19, 2014 4:21 AM, "Tao Xiao" <[hidden email]> wrote:


Re: How to compile Spark applications using sbt?

Mayur Rustagi
Just to clarify: you have to change the Scala version in simple.sbt :)
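Concretely, a sketch of the change (only this line of simple.sbt needs to move from the bare binary version to a full release):

```scala
// in simple.sbt: "2.10" alone is not a published Scala artifact,
// so point sbt at a concrete release:
scalaVersion := "2.10.3"
```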



On Wed, Feb 19, 2014 at 4:51 AM, dachuan <[hidden email]> wrote:

Try to use 2.10.3

I have met the same problem

On Feb 19, 2014 4:21 AM, "Tao Xiao" <[hidden email]> wrote:



Re: How to compile Spark applications using sbt?

Tao Xiao
dachuan and Mayur,

Thanks for your replies! With your help I have successfully built my application, and I have one more question.

I followed the tutorial here.

In the SimpleApp.scala file, we can see this line :
      val sc = new SparkContext("local", "Simple App", "YOUR_SPARK_HOME",
List("target/scala-2.10/simple-project_2.10-1.0.jar")) 

I don't know why the generated JAR file is named target/scala-2.10/simple-project_2.10-1.0.jar; I guess it has something to do with the simple.sbt file.

In the simple.sbt  file, we can see the following lines:
        name := "Simple Project"
        version := "1.0"
        scalaVersion := "2.10.3"

I suppose simple-project_2.10-1.0.jar is named as <name>_<scalaVersion>-<version>.jar, where <name> comes from the name setting in simple.sbt (each word lower-cased and joined with dashes).

Am I right? But I changed scalaVersion to 2.10.3, and the <scalaVersion> part of the generated JAR's name was still 2.10.
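A small sketch of what I think sbt is doing (ArtifactName is a hypothetical helper I wrote to mimic the naming, not sbt's actual code):

```scala
// Hypothetical helper mimicking sbt's default artifact naming:
//   <normalized-name>_<scala-binary-version>-<version>.jar
object ArtifactName {
  // sbt lower-cases the project name and joins words with dashes
  def normalize(name: String): String =
    name.toLowerCase.replaceAll("\\s+", "-")

  // keep only the first two components of the Scala version,
  // e.g. "2.10.3" -> "2.10"
  def binaryVersion(scalaVersion: String): String =
    scalaVersion.split("\\.").take(2).mkString(".")

  def apply(name: String, scalaVersion: String, version: String): String =
    s"${normalize(name)}_${binaryVersion(scalaVersion)}-$version.jar"
}
```

If sbt keys the suffix on the Scala *binary* version (the first two components of scalaVersion) rather than the full version, that would explain why the jar still says _2.10 even with scalaVersion := "2.10.3".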




2014-02-19 23:33 GMT+08:00 Mayur Rustagi <[hidden email]>:
Just to clarify. you have to change that in simple.sbt :)



On Wed, Feb 19, 2014 at 4:51 AM, dachuan <[hidden email]> wrote:

Try to use 2.10.3

I have met the same problem

On Feb 19, 2014 4:21 AM, "Tao Xiao" <[hidden email]> wrote:





Re: How to compile Spark applications using sbt?

dachuan
I wish I could also understand the cryptic sbt configuration file, because only then could I correctly compile more programs, for example programs that use GraphX, Spark Streaming, or any new dependencies.
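As a sketch of what I mean, pulling in a module like Spark Streaming should just be another libraryDependencies entry in simple.sbt (the spark-streaming coordinate below is my assumption based on the 0.9.0-incubating release naming):

```scala
// simple.sbt — sketch only; coordinates assumed for Spark 0.9.0-incubating
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

// %% appends the Scala binary version (_2.10) to each artifact name
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming" % "0.9.0-incubating"
)
```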


On Wed, Feb 19, 2014 at 9:45 PM, Tao Xiao <[hidden email]> wrote:




2014-02-19 23:33 GMT+08:00 Mayur Rustagi <[hidden email]>:

Just to clarify. you have to change that in simple.sbt :)

Mayur Rustagi
Ph: +919632149971


On Wed, Feb 19, 2014 at 4:51 AM, dachuan <[hidden email]> wrote:

Try to use 2.10.3

I have met the same problem

On Feb 19, 2014 4:21 AM, "Tao Xiao" <[hidden email]> wrote:







--
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210