Spark Pipe wrapException


When I use RDD.pipe("program") to analyze data, Spark throws a wrapped exception. One thing worth noting is that the native program only does "scanf" and "printf". When the data is small everything works, but when the data grows we hit this exception.
We tried to analyze the cause: the stack trace reports a socket timeout after about 60 seconds, so we added "dfs.socket.timeout" to "dfs.xml", but it didn't help.
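For context, HDFS client timeouts are normally set in hdfs-site.xml rather than a file named "dfs.xml"; a sketch of the property we tried, with a hypothetical 10-minute value (milliseconds), would look like:

```xml
<!-- hdfs-site.xml: raise the HDFS client socket timeout
     (value is in milliseconds; 600000 = 10 minutes, an example value) -->
<property>
  <name>dfs.socket.timeout</name>
  <value>600000</value>
</property>
```

Whether this property is the one governing the 60-second timeout in our stack is exactly what we are unsure about.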
Here is the error stack; maybe someone has run into the same problem. Looking forward to a reply.