Spark 1.0.2 Can GroupByTest example be run in Eclipse without change


Spark 1.0.2 Can GroupByTest example be run in Eclipse without change

Shing Hing Man-2
Hi,

I have noticed that the GroupByTest example in
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala
has been changed to be run using spark-submit.
Previously, I set "local" as the first command-line parameter, and this enabled me to run GroupByTest in Eclipse:
val sc = new SparkContext(args(0), "GroupBy Test",
System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass).toSeq)
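(In Eclipse this amounted to supplying program arguments along the lines of "local 2 1000 1000 2", with "local" as the master and the remaining values as numMappers, numKVPairs, valSize and numReducers; the numeric values here are just illustrative.)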


In the latest GroupByTest code, I cannot pass in "local" as the first command-line parameter:
val sparkConf = new SparkConf().setAppName("GroupBy Test")
var numMappers = if (args.length > 0) args(0).toInt else 2
var numKVPairs = if (args.length > 1) args(1).toInt else 1000
var valSize = if (args.length > 2) args(2).toInt else 1000
var numReducers = if (args.length > 3) args(3).toInt else numMappers
val sc = new SparkContext(sparkConf)


Is there a way to specify "master=local" (maybe in an environment variable), so that I can run the latest version of GroupByTest in Eclipse without changing the code?

Thanks in advance for your assistance!

Shing

Re: Spark 1.0.2 Can GroupByTest example be run in Eclipse without change

Shing Hing Man-2
After looking at the source code of SparkConf.scala, I found the following solution.
Just set the following Java system property :
-Dspark.master=local
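
In Eclipse, that property can go under Run Configurations > Arguments > VM arguments as -Dspark.master=local. An equivalent programmatic route, sketched below for an unmodified GroupByTest (the launcher object name and argument values are made up for illustration), is to set the property before delegating to the example's main:

// Hypothetical launcher, kept outside the Spark sources so GroupByTest itself stays unchanged.
object GroupByTestLocalLauncher {
  def main(args: Array[String]): Unit = {
    // SparkConf picks up any JVM system property starting with "spark.",
    // so this has the same effect as passing -Dspark.master=local.
    System.setProperty("spark.master", "local")
    // numMappers, numKVPairs, valSize, numReducers (illustrative values)
    org.apache.spark.examples.GroupByTest.main(Array("2", "1000", "1000", "2"))
  }
}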

Shing

