libraryDependencies configuration is different for sbt assembly vs sbt run

kamatsuoka
When I run "sbt assembly", I use the "provided" configuration on the build.sbt library dependency to keep Spark's classes out of the fat jar and avoid conflicts:

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.1-incubating" % "provided"

But if I want to do "sbt run", I have to remove the "provided"; otherwise sbt doesn't find the Spark classes on the runtime classpath.
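
For comparison, the only form that works for "sbt run" is the same dependency with no configuration suffix, which then gets bundled by assembly:

```scala
// Works for "sbt run" (spark-core lands on the runtime classpath),
// but "sbt assembly" will then pack Spark and its transitive
// dependencies into the fat jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.1-incubating"
```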

Is there a way to set up my build.sbt so that it does the right thing in both cases, without monkeying with my build.sbt each time?


Re: libraryDependencies configuration is different for sbt assembly vs sbt run

kamatsuoka
It turns out there's a way to make this work!  Here's a project/Build.scala adapted from Eugene Yokota's answer on StackOverflow.  I have most of my settings in my build.sbt, so this is a minimalist version of Eugene's answer.

import sbtassembly.Plugin.AssemblyKeys
import AssemblyKeys._
import sbt._
import Keys._

object ApplicationBuild extends Build {

  // A custom configuration extending Runtime: dependencies marked
  // "unprovided" are visible to tasks scoped to this config, but
  // invisible to assembly, which only looks at the Runtime classpath.
  val Unprovided = config("unprovided") extend Runtime

  val root = project.in(file(".")).
    configs(Unprovided).
    settings(
      name := "my-project",
      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "0.9.0-incubating" % "unprovided"
      ),
      // Also keep the Scala library out of the fat jar.
      assembleArtifact in packageScala := false
    ).
    // Give the Unprovided config its own classpath settings, plus a run
    // task that uses that classpath (reusing Runtime's mainClass and runner).
    settings(inConfig(Unprovided)(Classpaths.configSettings ++ Seq(
      run <<= Defaults.runTask(fullClasspath, mainClass in (Runtime, run), runner in (Runtime, run))
    )): _*).
    // Make a plain "sbt run" delegate to the Unprovided run task.
    settings(
      run <<= (run in Unprovided)
    )
}
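
As an aside, the sbt-assembly README documents a simpler recipe for the same problem (shown below in later sbt 1.x slash syntax; treat it as a sketch if you're on an older sbt): keep the dependency marked "provided" and override run to use the compile classpath, which, unlike the runtime classpath, still includes provided dependencies.

```scala
// Keep spark-core "provided" so assembly leaves it out of the fat jar...
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating" % "provided"

// ...but have "sbt run" use the Compile classpath, where "provided"
// dependencies are still present.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated
```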