Building a Standalone App in Scala and graphX

6 messages

Building a Standalone App in Scala and graphX

xben
Hello,

I'm trying to build a very simple Scala standalone app using the GraphX library. I basically copy/pasted the triangle count example and wrote the sbt file, but I get the following error when trying to build the program:

[error] /data/home/benjamin/spark/spark-0.9.0-incubating-bin-hadoop2/ben_new/src/main/scala/testGraph.scala:2: object graphx is not a member of package org.apache.spark
[error] import org.apache.spark.graphx._
[error]                         ^
[error] /data/home/benjamin/spark/spark-0.9.0-incubating-bin-hadoop2/ben_new/src/main/scala/testGraph.scala:16: not found: value GraphLoader
[error]     val graph = GraphLoader.edgeListFile(sc, LINKS, true).partitionBy(PartitionStrategy.RandomVertexCut)
[error]                 ^
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 13 s, completed Feb 17, 2014 4:47:37 PM

Here is the sbt file I'm using:

name := "Graph Test"

version := "1.0"

scalaVersion := "2.10.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.4.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

Any idea what's wrong here?

Re: Building a Standalone App in Scala and graphX

Evan R. Sparks
You need to set your libraryDependencies to include the "spark-graphx" artifact. 

We should add a note to the graphx and mllib pages to include linking instructions (like we have for streaming: http://spark.incubator.apache.org/docs/latest/streaming-programming-guide.html#linking)
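For example, with Spark 0.9.0-incubating the added dependency line would look like the following (a sketch, reusing the same version and %% cross-build convention as the spark-core line in your sbt file):

```scala
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"
```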




Re: Building a Standalone App in Scala and graphX

xben
In reply to this post by xben
Oh... I fixed the error myself by adding the following line to my sbt file:

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

Re: Building a Standalone App in Scala and graphX

xben
Sorry, I meant the following line:

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"
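For reference, a complete build.sbt along these lines (a sketch that simply combines the original file from this thread with the corrected spark-graphx line) would be:

```scala
name := "Graph Test"

version := "1.0"

scalaVersion := "2.10.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.4.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```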

Re: Building a Standalone App in Scala and graphX

dachuan
Hi, sorry about this; my question is not related to yours. I am just looking around the mailing list in the hope of finding a solution to my own problem.

I am compiling a standalone Scala Spark program, but it fails with the error: sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10: not found.

Your graphx question is one stage later than mine. Do you know how sbt finds scala-library?

thanks,
dachuan.
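(A note on how this resolution works: sbt derives the org.scala-lang#scala-library dependency automatically from the scalaVersion setting in the build file. The ";2.10" in the error above suggests, though I am guessing from the message alone, that the build sets a two-part version, which is not a published artifact; published Scala releases use full three-part version numbers:)

```scala
// sbt resolves org.scala-lang#scala-library from this setting,
// so it must be a full, published release number:
scalaVersion := "2.10.3"   // not "2.10", which does not exist as an artifact
```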


--
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210

Re: Building a Standalone App in Scala and graphX

moxiecui
Ignore me if you have already solved this problem.

You should set your "scalaVersion" in your .sbt file to the exact scala version number you are using.

Use
   $ scala -version
to find your version number.


Hope this will be helpful.
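Concretely, the check and the matching setting look like this (a sketch; 2.10.3 here is only an example version, use whatever your installation reports):

```scala
// $ scala -version
// Scala code runner version 2.10.3 -- Copyright 2002-2013, LAMP/EPFL

// then in build.sbt, match it exactly:
scalaVersion := "2.10.3"
```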