pyspark working with a different Python version than the cluster


pyspark working with a different Python version than the cluster

mickkelodeon
Hi,
Something is happening to me that I don't quite understand.
I ran pyspark on a machine that has Python 3.5 and managed to run some commands, even though the Spark cluster is using Python 3.4.
If I do the same with spark-submit I get the error "Python in worker has different version 3.4 than that in driver 3.5".
Why does pyspark work then?

Thanks. Regards
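[Editor's note: the error quoted above is raised because PySpark compares only the major.minor version string of the driver's and the worker's interpreters. A minimal sketch of that comparison, with illustrative function names (not PySpark's actual API):]

```python
import sys

def python_minor_version():
    # PySpark compares only "major.minor" (e.g. "3.5" vs "3.4");
    # a differing patch level is fine, a differing minor level is not.
    return "%d.%d" % sys.version_info[:2]

def check_versions(worker_ver, driver_ver):
    # Simplified sketch of the worker-side check; the real check
    # lives in PySpark's worker startup code.
    if worker_ver != driver_ver:
        raise RuntimeError(
            "Python in worker has different version %s than that in driver %s"
            % (worker_ver, driver_ver))

check_versions("3.5", "3.5")   # same minor version: passes silently
```

With spark-submit the job's tasks run on cluster workers whose interpreter is 3.4, so the check above fails against the 3.5 driver.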

Re: pyspark working with a different Python version than the cluster

Tang Jinxin
Hi Copon,
  The worker picks its interpreter via `python3`, which may resolve to Python 3.4 on some nodes. Could you check what `python3` returns on each node?
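[Editor's note: the check suggested above can be run on each node like this; paths and versions will vary per machine:]

```shell
# See which interpreter "python3" resolves to on this node,
# and which major.minor version it reports.
command -v python3
python3 --version
python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])'
```

If the nodes disagree, the usual remedy is to point `PYSPARK_PYTHON` (workers) and `PYSPARK_DRIVER_PYTHON` (driver) at the same interpreter before running spark-submit.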

Best wishes,
Jinxin

xiaoxingstack
Email: xiaoxingstack@...


On 04/23/2020 01:02, [hidden email] wrote: