Proper way to create standalone app with custom Spark version

We can create a standalone Spark application by simply adding "spark-core_2.x" to build.sbt/pom.xml and connecting it to a Spark master.
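
For reference, a minimal build.sbt for that setup might look like the sketch below; the project name, Scala version and Spark version are placeholders, not recommendations:

    name := "my-spark-app"

    scalaVersion := "2.10.4"

    // Stock Spark resolved from the central repository;
    // "%%" appends the Scala binary version (i.e. spark-core_2.10 here).
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

The application then connects to the standalone cluster with something like new SparkConf().setMaster("spark://master-host:7077") (host and port are placeholders).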

We can also compile a custom version of Spark from source (e.g. built against Hadoop 2.x) and deploy it to the cluster manually.

But what is the proper way to use a _custom version_ of Spark in a _standalone application_? We can't simply add the custom version to build.sbt/pom.xml, since it's not in the central repository.

---- 

I'm currently trying to deploy the custom version to a local Maven repository and add it to the SBT project. Another option is to add Spark as a local jar to every project. But both of these approaches look overcomplicated and generally wrong.
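
For illustration, the local-Maven-repository route corresponds roughly to the following build.sbt sketch; the custom version string ("1.0.0-custom-hadoop2") is made up and has to match whatever coordinates the custom build was actually installed under (e.g. via mvn install):

    // Resolve artifacts from the local Maven repository (~/.m2/repository),
    // which is where "mvn install" on the custom Spark build puts them.
    resolvers += "Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository"

    // Must match the version the custom build was installed as.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0-custom-hadoop2"

The "local jar" alternative amounts to dropping the Spark assembly jar into each project's lib/ directory as an unmanaged dependency, which is why it has to be repeated for every project.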

What is the intended way to solve this issue?

Thanks, 
Andrei