Running a Spark application compiled with 1.6 on a Spark 2.1 cluster

satishl wrote:
My Spark application is compiled against Spark 1.6 (spark-core and its dependencies). When I try to run this app on a Spark 2.1 cluster, it fails with:

ERROR ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: org/apache/spark/Logging

I was hoping that Spark 2.x is backward compatible and I wouldn't need to recompile my application. Is this a supported scenario, i.e. can I run an app compiled with Spark 1.6 on a Spark 2.x cluster?
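
A minimal sketch of the kind of code that triggers this (class and app names here are made up; any 1.6 code that mixes in the old Logging trait hits the same error):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.Logging // a public trait in Spark 1.6

object MyJob extends Logging {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("logging-demo"))
    logInfo("job started") // logInfo is inherited from the Logging trait
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}

Compiled against spark-core 1.6 this builds fine, but on a 2.1 cluster the JVM cannot load MyJob at all, because its supertype org/apache/spark/Logging no longer exists at that package path.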


Re: Running a Spark application compiled with 1.6 on a Spark 2.1 cluster

blyncsy.david.lewis wrote:
No! That's not possible. Spark 2.x is not binary compatible with 1.x: among other changes, the org.apache.spark.Logging trait your classes reference was made private and moved to org.apache.spark.internal.Logging in 2.0, which is exactly why the class loader can't find org/apache/spark/Logging. You need to recompile (and likely port) your application against Spark 2.x.
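
For the rebuild, the dependency change itself is small. A minimal sketch, assuming an sbt build (the project name and module list are placeholders; use whatever modules your app actually depends on):

// build.sbt
name := "my-spark-app"   // placeholder
scalaVersion := "2.11.8" // Spark 2.1 ships against Scala 2.11

// Compile against the cluster's Spark version; "provided" keeps the
// Spark jars out of your assembly because the cluster supplies them.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
)

And since org.apache.spark.internal.Logging is private to Spark in 2.x, replace any use of the old Logging trait with your own logger, e.g. SLF4J.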
