.sparkrc for Spark shell?


.sparkrc for Spark shell?

Jianshi Huang
To make my shell experience merrier, I need to import several packages and define an implicit sparkContext and sqlContext.

Is there a startup file (e.g. ~/.sparkrc) that the Spark shell will load when it starts?
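For reference, the boilerplate I currently paste into every session looks roughly like this (the Spark 1.x-style SQLContext setup is just my own example):

   import org.apache.spark.SparkContext._
   import org.apache.spark.sql.SQLContext

   // sc is the SparkContext the shell already provides
   implicit val sparkContext = sc
   val sqlContext = new SQLContext(sc)
   import sqlContext._   // bring the SQL implicits into scope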


Cheers,
--
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/

Re: .sparkrc for Spark shell?

Prashant Sharma
Hey,

You can use spark-shell -i sparkrc to do this.
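For example, assuming a startup file like this (the name and contents are just an illustration; the REPL evaluates the file line by line on startup, and sc should already be defined by then):

   user$ cat ~/sparkrc
   import org.apache.spark.sql.SQLContext
   val sqlContext = new SQLContext(sc)
   import sqlContext._

   user$ spark-shell -i ~/sparkrc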

Prashant Sharma




On Wed, Sep 3, 2014 at 2:17 PM, Jianshi Huang <[hidden email]> wrote:
To make my shell experience merrier, I need to import several packages and define an implicit sparkContext and sqlContext.

Is there a startup file (e.g. ~/.sparkrc) that the Spark shell will load when it starts?




Re: .sparkrc for Spark shell?

prismalytics
Hello:

Question...

Is the below more or less equivalent to doing this for pyspark?

   user$ export PYTHONSTARTUP=/path/to/my/pythonStartup.py
   user$ pyspark

Actually, this is how I start pyspark: by reverse engineering how pyspark starts, I wrote a broader pythonStartup.py script so that, among other things, it adds the environment & imports that I need (numpy, matplotlib, scipy, etc.). I can also use it like this:

 >> python -i /path/to/my/pythonStartup.py
 >> bpython -i /path/to/my/pythonStartup.py (excellent for its code intelligence / completion)
 >> And for the python shell that starts in my WING IDE.

So I'm just curious about '-i'. :)
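A stripped-down sketch of what's in it (paths and the py4j zip name are specific to my setup, so treat them as placeholders):

   # pythonStartup.py -- bootstrap pyspark, then pull in my usual stack.
   import glob
   import os
   import sys

   SPARK_HOME = os.environ.get('SPARK_HOME', '/opt/spark')
   # Make the pyspark package and its bundled py4j importable:
   sys.path.insert(0, os.path.join(SPARK_HOME, 'python'))
   for zip_path in glob.glob(os.path.join(SPARK_HOME, 'python/lib/py4j-*-src.zip')):
       sys.path.insert(0, zip_path)

   from pyspark import SparkConf, SparkContext

   sc = SparkContext(conf=SparkConf().setAppName('shell').setMaster('local[*]'))

   # The environment & imports I want in every session:
   import numpy as np
   import scipy
   import matplotlib.pyplot as plt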

Thank you,
didata


On 09/03/2014 07:05 AM, Prashant Sharma wrote:
Hey,

You can use spark-shell -i sparkrc to do this.

Prashant Sharma


--
Sincerely yours,
Team Dimension Data

Dimension Data, LLC. | www.didata.us
P: 212.882.1276 | [hidden email]

Follow Us: https://www.LinkedIn.com/company/didata

Dimension Data, LLC.
Data Analytics you can literally count on.


Re: .sparkrc for Spark shell?

Jianshi Huang
In reply to this post by Prashant Sharma
I see. Thanks, Prashant!

Jianshi


On Wed, Sep 3, 2014 at 7:05 PM, Prashant Sharma <[hidden email]> wrote:
Hey,

You can use spark-shell -i sparkrc to do this.

Prashant Sharma




--
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/