Spark 1.5.2 error on quitting Spark in Windows 7


skypickle

If I start spark-shell and then just quit, I get an error:


scala> :q
Stopping spark context.
15/12/09 23:43:32 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\Stefan\AppData\Local\Temp\spark-68d3a813-9c55-4649-aa7a-5fc269e669e7
java.io.IOException: Failed to delete: C:\Users\Stefan\AppData\Local\Temp\spark-68d3a813-9c55-4649-aa7a-5fc269e669e7
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
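
For what it's worth, my working guess (an assumption on my part, not something I have confirmed in the Spark source) is that some file inside that temp dir is still held open when the shutdown hook runs. Unlike Linux, Windows refuses to delete a file while another handle is open on it. Pasting this little sketch into any scala> prompt shows the behavior:

import java.io.{File, FileOutputStream}

// Sketch of the suspected cause: on Windows, File.delete() returns false
// while a handle is still open on the file; the same delete succeeds on
// Linux because unlink only removes the name.
val dir = new File(System.getProperty("java.io.tmpdir"), "spark-delete-test")
dir.mkdirs()
val f = new File(dir, "held.tmp")
val out = new FileOutputStream(f)                    // hold a handle, as the JVM does for loaded jars/DLLs
println("delete while handle open: " + f.delete())   // prints false on Windows
out.close()
println("delete after close: " + f.delete())         // prints true
dir.delete()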

So, if you use winutils to examine the directory:

C:\Users\Stefan\AppData\Local\Temp>winutils ls spark-cb325426-4a3c-48ec-becc-baaa077bea1f
drwx------ 1 BloomBear-SSD\Stefan BloomBear-SSD\None 0 Dec 10 2015 spark-cb325426-4a3c-48ec-becc-baaa077bea1f

I interpret this to mean that the OWNER has read/write/execute privileges on this folder.
So why does Spark have a problem deleting it?
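
To pin down which entry actually refuses to die, a small recursive delete (my own throwaway sketch, not Spark's code; the path below is from my machine) can be pasted into the same scala> prompt:

import java.io.File

// Walk the leftover Spark temp dir bottom-up and report every entry
// that will not delete; a directory only deletes once it is empty.
def deleteReporting(f: File): Unit = {
  if (f.isDirectory) Option(f.listFiles).getOrElse(Array.empty[File]).foreach(deleteReporting)
  if (!f.delete()) println("could not delete: " + f.getAbsolutePath)
}

deleteReporting(new File("""C:\Users\Stefan\AppData\Local\Temp\spark-68d3a813-9c55-4649-aa7a-5fc269e669e7"""))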

Just for fun, I also installed a set of Windows executables that are ports of common UNIX utilities: http://sourceforge.net/projects/unxutils/?source=typ_redirect

So now I can run a command like ls and get:


C:\Users\Stefan\AppData\Local\Temp>ls -al
total 61
drwxrwxrwx   1 user     group           0 Dec  9 23:44 .
drwxrwxrwx   1 user     group           0 Dec  9 22:27 ..
drwxrwxrwx   1 user     group           0 Dec  9 23:43 61135062-623a-4624-b406-fbd0ae9308ae_resources
drwxrwxrwx   1 user     group           0 Dec  9 23:43 9cc17e8c-2941-4768-9f55-e740e54dab0b_resources
-rw-rw-rw-   1 user     group           0 Sep  4  2013 FXSAPIDebugLogFile.txt
drwxrwxrwx   1 user     group           0 Dec  9 23:43 Stefan
-rw-rw-rw-   1 user     group       16400 Dec  9 21:07 etilqs_3SQb9MejUX0BHwy
-rw-rw-rw-   1 user     group        2052 Dec  9 21:41 etilqs_8YWZWJEClIYRrKf
drwxrwxrwx   1 user     group           0 Dec  9 23:43 hsperfdata_Stefan
-rw-rw-rw-   1 user     group       19968 Dec  9 23:09 jansi-64-1-8475478299913367674.11
-rw-rw-rw-   1 user     group       18944 Dec  9 23:43 jansi-64-1.5.2.dll
-rw-rw-rw-   1 user     group        2031 Dec  9 23:15 sbt3359615202868869571.log
drwxrwxrwx   1 user     group           0 Dec  9 23:43 spark-68d3a813-9c55-4649-aa7a-5fc269e669e7

Now the Spark temp directory is shown by Windows as readable, writable, and executable by EVERYONE (drwxrwxrwx).
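
The JVM's own view may be more telling than the ACL listing. From the same scala> prompt you can ask what Java thinks it can do with the leftover directory (path again from my machine):

import java.io.File

// Permissions can be fine and the delete can still fail: a directory
// that is not empty, or that holds a file with an open handle, will
// not delete no matter what the ACL says.
val d = new File("""C:\Users\Stefan\AppData\Local\Temp\spark-68d3a813-9c55-4649-aa7a-5fc269e669e7""")
println("exists=" + d.exists + " canRead=" + d.canRead + " canWrite=" + d.canWrite)
println("entries left inside: " + Option(d.list).map(_.length).getOrElse(0))
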
In any event, can someone enlighten me about their environment so I can avoid this irritating error? Here is my environment:


Windows 7 64-bit
Spark 1.5.2
Scala 2.10.6
Python 2.7.10 (from Anaconda)

PATH includes:
C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6\bin
C:\ProgramData\Oracle\Java\javapath
C:\Users\Stefan\scala
C:\Users\Stefan\hadoop-2.6.0\bin

SYSTEM variables set are:
SPARK_HOME=C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6
JAVA_HOME=C:\Program Files\Java\jre1.8.0_65
HADOOP_HOME=C:\Users\Stefan\hadoop-2.6.0
(this is where bin\winutils.exe resides)

I also ran:

winutils.exe chmod 777 /tmp/hive

The \tmp\hive directory is at the root of the C: drive with full permissions, e.g.:
>winutils ls \tmp\hive
drwxrwxrwx 1 BloomBear-SSD\Stefan BloomBear-SSD\None 0 Dec  8 2015 \tmp\hive
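
One thing I plan to try (my own guess at a workaround, not something the Spark docs promise will silence this error) is pointing Spark's scratch space at a folder I can clean by hand, via the documented spark.local.dir setting:

>spark-shell --conf spark.local.dir=C:\Users\Stefan\spark-temp

Even if the delete still fails on shutdown, the leftover spark-* directories would at least accumulate somewhere I chose rather than in %TEMP%.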