java.nio.file.FileSystemException: /tmp/spark- .._cache : No space left on device


java.nio.file.FileSystemException: /tmp/spark- .._cache : No space left on device

Polisetti, Venkata Siva Rama Gopala Krishna

Hi

I am getting the exception below when I run spark-submit on a Linux machine. Can someone suggest a quick solution, with commands?

Driver stacktrace:
- Job 0 failed: count at DailyGainersAndLosersPublisher.scala:145, took 5.749450 s
org.apache.spark.SparkException: Job aborted due to stage failure: Task 4 in stage 0.0 failed 4 times, most recent failure: Lost task 4.3 in stage 0.0 (TID 6, 172.29.62.145, executor 0): java.nio.file.FileSystemException: /tmp/spark-523d5331-3884-440c-ac0d-f46838c2029f/executor-390c9cd7-217e-42f3-97cb-fa2734405585/spark-206d92c0-f0d3-443c-97b2-39494e2c5fdd/-4230744641534510169119_cache -> ./PublishGainersandLosers-1.0-SNAPSHOT-shaded-Gopal.jar: No space left on device
        at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
        at sun.nio.fs.UnixCopyFile.copyFile(UnixCopyFile.java:253)
        at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:581)
        at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
        at java.nio.file.Files.copy(Files.java:1274)
        at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:625)
        at org.apache.spark.util.Utils$.copyFile(Utils.scala:596)
        at org.apache.spark.util.Utils$.fetchFile(Utils.scala:473)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:696)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:688)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:688)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:308)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

 




Re: java.nio.file.FileSystemException: /tmp/spark- .._cache : No space left on device

jeevan.ks
Hi Venkata,

At a quick glance, this looks like a filesystem issue more than an executor issue. If the logs are not important, I would clear the /tmp/spark-events/ directory, set suitable permissions on it (e.g., chmod 755), and rerun the application.

chmod 755 /tmp/spark-events/  
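
A minimal cleanup sketch along those lines, assuming the event logs under /tmp/spark-events/ really are disposable (the rm is destructive, so check the directory contents first):

df -h /tmp                        # confirm which filesystem is full
rm -rf /tmp/spark-events/*        # drop old event logs only if they are not needed
chmod 755 /tmp/spark-events/      # restore suitable permissions on the directory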

Thanks and regards,
Jeevan K. Srivatsa



Re: java.nio.file.FileSystemException: /tmp/spark- .._cache : No space left on device

naresh Goud
Also check that enough space is available on the /tmp directory.
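
For example (the paths below are illustrative; delete only scratch directories left behind by applications that are no longer running):

df -h /tmp                                   # free space on the filesystem backing /tmp
du -sh /tmp/spark-* 2>/dev/null | sort -h    # which Spark scratch directories are largest
rm -rf /tmp/spark-<old-app-id>               # hypothetical path: remove dirs of finished/crashed apps only

If /tmp is simply too small, Spark's scratch space can also be pointed at a larger disk. The path here is only an example, and note that SPARK_LOCAL_DIRS set in conf/spark-env.sh on standalone workers (or the resource manager's own local-dir settings on YARN) takes precedence over spark.local.dir:

spark-submit --conf spark.local.dir=/data/spark-tmp ...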
