JavaSparkConf

Soren Macbeth
There is a JavaSparkContext, but no JavaSparkConf object. I know SparkConf is new in 0.9.x.

Is there a plan to add something like this to the Java API?

It's rather a bother to have things like setAll take a Scala Traversable[(String, String)] when using SparkConf from the Java API.

At a minimum, adding method signatures for Java collections where there are currently Scala collections would be a good start.
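
For concreteness, here's a rough, untested sketch of what using the bulk setAll/getAll methods from Java looks like today, assuming the 0.9.x SparkConf API, with the conversions done by hand through scala.Tuple2 and JavaConverters:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;

import scala.Tuple2;
import scala.collection.JavaConverters;

public class SparkConfFromJava {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("example").setMaster("local[2]");

        // setAll wants a Scala Traversable of (String, String) tuples, so the Java
        // caller has to build Tuple2 instances and convert the collection by hand.
        List<Tuple2<String, String>> settings = Arrays.asList(
                new Tuple2<String, String>("spark.executor.memory", "2g"),
                new Tuple2<String, String>("spark.ui.port", "4041"));
        conf.setAll(JavaConverters.asScalaBufferConverter(settings).asScala());

        // getAll likewise comes back as Array[(String, String)], i.e. Tuple2[] in Java.
        for (Tuple2<String, String> entry : conf.getAll()) {
            System.out.println(entry._1() + " = " + entry._2());
        }
    }
}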

TIA

Re: JavaSparkConf

Patrick Wendell
This class was made to be "Java friendly" so that we wouldn't have to use two versions. The class itself is simple. But I agree adding Java setters would be nice.
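
For what it's worth, here's a hypothetical sketch of the kind of Java-friendly setters being discussed; nothing like this exists in SparkConf today, and the helper class name is purely for illustration. It just wraps the existing per-key methods:

import java.util.Map;

import org.apache.spark.SparkConf;

// Hypothetical helper illustrating the Java-friendly bulk setters discussed above;
// SparkConf 0.9.x has no such overloads, this only delegates to the per-key methods.
public final class JavaConfSetters {

    // A setAll that accepts a plain java.util.Map instead of Traversable[(String, String)].
    public static SparkConf setAll(SparkConf conf, Map<String, String> settings) {
        for (Map.Entry<String, String> e : settings.entrySet()) {
            conf.set(e.getKey(), e.getValue());
        }
        return conf;
    }

    // A bulk setExecutorEnv that accepts a java.util.Map rather than Seq[(String, String)].
    public static SparkConf setExecutorEnv(SparkConf conf, Map<String, String> env) {
        for (Map.Entry<String, String> e : env.entrySet()) {
            conf.setExecutorEnv(e.getKey(), e.getValue());
        }
        return conf;
    }
}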


Re: JavaSparkConf

Soren Macbeth
My implication is that it isn't "Java friendly" enough. The following methods return Scala objects:

getAkkaConf
getAll
getExecutorEnv

and the following methods require Scala objects as their params:

setAll
setExecutorEnv (both of the bulk methods)

So, while it is usable from Java, I wouldn't call it friendly. All of the bulk setters and getters take and return Scala objects (the exception being setJars, luckily).
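
To illustrate the getter side, here's an untested sketch of pulling getExecutorEnv back into Java collections; it assumes the same 0.9.x API and does the Seq-to-List conversion manually:

import java.util.List;

import org.apache.spark.SparkConf;

import scala.Tuple2;
import scala.collection.JavaConverters;

public class BulkGettersFromJava {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setExecutorEnv("SPARK_LOCAL_DIRS", "/tmp/spark");

        // getExecutorEnv returns Seq[(String, String)], so the Java caller has to
        // convert the Scala Seq back into a java.util.List of Tuple2s by hand.
        List<Tuple2<String, String>> env =
                JavaConverters.seqAsJavaListConverter(conf.getExecutorEnv()).asJava();
        for (Tuple2<String, String> pair : env) {
            System.out.println(pair._1() + "=" + pair._2());
        }
    }
}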

