[SparkSQL] too many open files although ulimit set to 1048576
Data: a 150,000,000-row table inner joined with a 100,000,000-row table.
Spark version: 2.0.2
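
For context, a minimal sketch of the kind of job involved (the table names, column name, and output path are assumptions for illustration; the actual schemas are not shown here):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("large-inner-join")
  .getOrCreate()

// Hypothetical tables standing in for the two datasets.
val big = spark.table("table_a")    // ~150,000,000 rows
val small = spark.table("table_b")  // ~100,000,000 rows

// The inner join shuffles both sides; each shuffle map task writes
// temp/spill files under spark.local.dir (/tmp by default).
val joined = big.join(small, Seq("join_key"), "inner")
joined.write.parquet("/tmp/join_output")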
Although the open-file limit (ulimit -n) is already raised to 1048576 (raising it further causes an error), the job still fails with:
java.io.FileNotFoundException: /tmp/spark-c32ce764-44ee-457c-96ed-44c77f829845/executor-8ba31059-3d65-47b3-97d5-73da78bb9e16/blockmgr-da6afd20-0635-42e6-af6c-f70a1dec738f/01/temp_shuffle_86d4051d-9dfa-4517-bc95-6c323ae699fa (too many open files)
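
Two things may be worth checking. First, a ulimit set in an interactive shell does not necessarily propagate to executor JVMs launched by a cluster manager, so the process that hits the error may have a much lower effective limit than the shell reports. Second, the number of temp files open at once grows with the number of concurrent tasks and shuffle spills. A sketch of two settings that can reduce it, assuming the shuffle (not user code) is holding the handles:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("large-inner-join")
  // At or below spark.shuffle.sort.bypassMergeThreshold (default 200),
  // Spark 2.x can take the bypass-merge shuffle path, which opens one
  // file per reduce partition per map task at the same time. Raising
  // the partition count above the threshold keeps the fully sort-based
  // path, which writes one consolidated file per map task.
  .config("spark.sql.shuffle.partitions", "1000")
  // More execution memory means fewer sort spills, hence fewer spill
  // files that must all be opened together during the final merge.
  .config("spark.executor.memory", "8g")
  .getOrCreate()

The concrete values here are placeholders to tune, not recommendations for this data size. The effective limit of a running executor can be verified with its /proc/<pid>/limits entry rather than the shell's ulimit output.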