Defining SparkShell Init?

Defining SparkShell Init?

Kyle Ellrott
Is there a way to define a set of commands to 'initialize' the environment in the SparkShell?
I'd like to create a wrapper script that starts up the spark-shell and does some boilerplate initialization (imports and variable creation) before handing things over to me.

Kyle 

Re: Defining SparkShell Init?

Mayur Rustagi
That's actually not a bad idea: keep a shell boilerplate.scala in the same folder and use it to initialize the shell.
The shell is a script that, at the end of the day, starts a JVM with the jars from the Spark project, so mostly you'd have to modify the Spark classes and reassemble with sbt. It's messy, but there may be easier ways: feed some commands to the shell script/JVM and then connect to stdin.
Regards
Mayur
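
As a sketch of the stdin idea (illustrative only — this assumes a spark-shell launcher on the PATH, and the imports/variables are made-up examples): commands can be piped in via a here-document, though the REPL reads until EOF and then exits rather than handing over an interactive prompt, which is the main limitation of this approach:

```shell
#!/usr/bin/env bash
# Sketch: feed initialization commands to spark-shell over stdin.
# Caveat: the REPL exits at end-of-input, so this runs the boilerplate
# but does not leave you at an interactive prompt afterwards.
spark-shell <<'EOF'
import org.apache.spark.SparkContext._   // illustrative import
val data = sc.parallelize(1 to 100)      // illustrative variable creation
data.count()
EOF
```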

On Monday, February 17, 2014, Kyle Ellrott <[hidden email]> wrote:

Re: Defining SparkShell Init?

Prashant Sharma
There is a way to do this with :load in the shell, where you can specify the path of your boilerplate.scala. These things should be streamlined once we have Scala 2.11 (I hope).
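
For example, a boilerplate.scala might look like this (a sketch — the imports and the `samples` variable are illustrative; `sc` is the SparkContext the shell already provides):

```scala
// boilerplate.scala — illustrative preload file for the Spark shell.
import org.apache.spark.SparkContext._
import scala.math._

// Illustrative variable creation; anything defined here stays in scope.
val samples = sc.parallelize(1 to 1000)
```

Running `:load /path/to/boilerplate.scala` at the prompt evaluates each line as if typed, leaving the imports and variables available in the interactive session.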


On Tue, Feb 18, 2014 at 7:20 AM, Mayur Rustagi <[hidden email]> wrote:



--
Prashant

Re: Defining SparkShell Init?

Andrew Ash

Why would Scala 2.11 change things here? I'm not familiar with the features you're referring to.

I would support a prelude file in ~/.sparkrc or similar that is automatically imported on spark-shell startup if it exists.
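
Until something like that exists, a wrapper script can approximate it (a sketch — it assumes spark-shell forwards extra arguments to the underlying Scala REPL, whose -i option preloads a file before dropping to the interactive prompt):

```shell
#!/usr/bin/env bash
# Sketch: emulate a ~/.sparkrc prelude for the Spark shell.
RC="$HOME/.sparkrc"
if [ -f "$RC" ]; then
  # Preload the rc file, then continue into the interactive prompt.
  exec spark-shell -i "$RC" "$@"
else
  exec spark-shell "$@"
fi
```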


On Feb 17, 2014 9:11 PM, "Prashant Sharma" <[hidden email]> wrote: