Committer to use if "spark.sql.sources.partitionOverwriteMode": 'dynamic'


edge7
Hi,

I am using Spark on EMR, and I was hoping to use their optimised committer (the EMRFS S3-optimized committer), but it looks like it is not used when "spark.sql.sources.partitionOverwriteMode" is set to 'dynamic'.

What are the best practices in this case?
The rename phase in S3 is very slow and is the bottleneck in my job.
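For context, this is the kind of write the setting applies to; a minimal PySpark sketch, with a placeholder S3 path and made-up columns (not from my actual job):

from pyspark.sql import SparkSession

# Sketch of a dynamic partition overwrite to S3.
# The bucket/path and column names are placeholders.
spark = (
    SparkSession.builder
    .appName("dynamic-partition-overwrite-example")
    # "dynamic": only the partitions present in the incoming data are replaced;
    # with "static" (the default) the whole target location is truncated first.
    .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
    .getOrCreate()
)

df = spark.createDataFrame(
    [("2019-01-01", 1), ("2019-01-02", 2)],
    ["dt", "value"],
)

(
    df.write
    .mode("overwrite")
    .partitionBy("dt")
    .parquet("s3://my-bucket/my-table/")  # placeholder S3 location
)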

Thanks,

--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]