Spark - Scala-Java interoperability

Spark - Scala-Java interoperability

Ramesh Mathikumar
Hi Team,

A quick question from my side.

Can I use spark-submit with an application that contains both Java and
Scala in a single workflow? By a single workflow I mean the main
program is in Java (wrapped in Spark) and it calls a module, written
in Scala (also wrapped in Spark), that performs a calculation on the
payload.

Are there any compatibility or interoperability issues to watch out for?

Regards,
Ramster

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]

Re: Spark - Scala-Java interoperability

srowen
That should be fine. The JVM doesn't care how the bytecode it is
executing was produced. As long as you are able to compile the Java
and Scala sources together - which sometimes means using a plugin like
scala-maven-plugin for mixed compilation - the result will run as a
single application.
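For reference, here is a minimal sketch of the scala-maven-plugin setup
mentioned above, assuming a Maven build; the plugin version and
execution id are illustrative, not prescriptive:

```xml
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <!-- version is an example; use whatever is current for your project -->
  <version>4.8.1</version>
  <executions>
    <execution>
      <id>scala-compile-first</id>
      <!-- bind scalac to an early phase so it runs before javac -->
      <phase>process-resources</phase>
      <goals>
        <goal>add-source</goal>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Binding the Scala compile to an early phase makes scalac run before
javac, so the Java main class can reference the Scala module's classes
at compile time.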

On Sun, Aug 16, 2020 at 4:28 PM Ramesh Mathikumar
<[hidden email]> wrote:
