How Spark Framework works as a Compiler


How Spark Framework works as a Compiler

Renganathan M

Hi,

I have read in many blogs that the Spark framework is a compiler itself.

It generates the DAG, optimizes it, and executes it. The DAG is generated from the user-submitted code (be it in Java, Scala, Python, or R). So when we submit a JAR file (which contains the compiled classes), does Spark, as a first step, use reflection to read the class files and then generate the DAG? I am not quite getting what really happens from the point where the user submits the JAR file to DAG generation.

I tried looking for answers, but was not able to find any.

Can someone please help?

Thanks!

 


Re: How Spark Framework works as a Compiler

srowen
No, it's much simpler than that. Spark is just a set of APIs that user applications call into, and those calls cause Spark to form a DAG and execute it. There's no need for reflection or transpiling or anything like that. The user app is just calling the framework directly, not the other way around.
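A minimal sketch of what that means, in plain Python rather than real Spark (all class and method names below are invented for illustration): transformations like `map` and `filter` do no work when called; they only record a new node in a lineage graph. Only when an action such as `collect` is called does the framework walk the recorded DAG back to the source and replay the operations.

```python
# Toy illustration of lazy DAG construction, NOT real Spark code.
class ToyRDD:
    def __init__(self, data=None, parent=None, op=None):
        self._data = data      # only set on the source node
        self.parent = parent   # lineage edge back to the previous node
        self.op = op           # deferred function, applied later

    def map(self, f):
        # No computation happens here -- we just add a DAG node.
        return ToyRDD(parent=self, op=lambda rows: [f(r) for r in rows])

    def filter(self, pred):
        return ToyRDD(parent=self, op=lambda rows: [r for r in rows if pred(r)])

    def collect(self):
        # Action: walk the lineage back to the source, then replay the ops.
        chain = []
        node = self
        while node.parent is not None:
            chain.append(node.op)
            node = node.parent
        rows = node._data
        for op in reversed(chain):
            rows = op(rows)
        return rows

rdd = ToyRDD(data=[1, 2, 3, 4])
result = rdd.map(lambda x: x * 10).filter(lambda x: x > 15).collect()
# result == [20, 30, 40]
```

The point is that no reflection over the user's class files is needed: the user's own `main` method runs and, by calling the framework's API, hands Spark the DAG directly.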

On Sun, Jan 3, 2021 at 4:49 AM Renganathan M <[hidden email]> wrote:
