Spark build/sbt assembly

6 messages

Spark build/sbt assembly

Rahul Palamuttam
Hi All,

I hope this is the right place to post troubleshooting questions.
I've been following the install instructions, and I get the following error when running this command from the Spark home directory:

$./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Launching sbt from build/sbt-launch-0.13.7.jar
Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

However, when I run sbt assembly directly it compiles, with a couple of warnings, but it works nonetheless.
Is the build/sbt script deprecated? I notice that it works on one node but gives the above error on the other.
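One quick check for an error like this (a sketch, not from the thread): a valid jar is a zip archive and begins with the magic bytes "PK", while a truncated download or a saved HTML error page does not. The temp file below just simulates a bad download; run the same test against build/sbt-launch-0.13.7.jar on the failing node.

```shell
# Simulate a failed launcher download (hypothetical temp path), then check
# the first two bytes: every valid jar/zip starts with the magic bytes "PK".
printf '<html>404 Not Found</html>' > /tmp/sbt-launch-bad.jar

if [ "$(head -c 2 /tmp/sbt-launch-bad.jar)" = "PK" ]; then
  echo "jar looks valid"
else
  echo "jar corrupt or truncated"   # this branch fires for the simulated file
fi
```

If the real jar fails this check, deleting it and re-running build/sbt (which re-fetches the launcher) is a reasonable first step.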

Thanks,

Rahul P

Re: Spark build/sbt assembly

pa1975
I think it has to do with a 64-bit/32-bit mismatch between the JAR version and the OS version on the node where it is happening.

Re: Spark build/sbt assembly

Kiluvya.A
In reply to this post by Rahul Palamuttam
Try using Linux kernel 3.13.xx; I encountered the same error on kernel 3.16.xx.

To check kernel version, enter "uname -r"

Rahul Palamuttam wrote
$./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Launching sbt from build/sbt-launch-0.13.7.jar
Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

Re: Spark build/sbt assembly

Torgos
In reply to this post by Rahul Palamuttam
It seems like some download link has become stale. I encountered this issue when testing out the 1.4.1 installation, but had the exact same issue when I tried reinstalling previously downloaded 1.3.0 and 1.2.0 .tgz files, which had worked at the time of download. All the Stack Overflow solutions listed tend to be convoluted or simply not to work, so hopefully this gets fixed soon.

Re: Spark build/sbt assembly

Torgos
In reply to this post by Kiluvya.A
@Kiluvya, I'm using 3.13.xx, and it still fails.

Re: Spark build/sbt assembly

rake
In reply to this post by Rahul Palamuttam
Rahul Palamuttam wrote
$./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Launching sbt from build/sbt-launch-0.13.7.jar
Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar
Are you trying to build Spark from the 1.4.1 sources?

If you got the Spark sources from the main downloads page

http://spark.apache.org/downloads.html

and selected "1.4.1" and the "Source Code" options to obtain the file spark-1.4.1.tgz, then the file build/sbt-launch-0.13.7.jar that the Spark-supplied sbt scripts install is corrupt. It's a fraction of the size it should be.
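A size sanity check along those lines can be sketched like this (the 100 KB threshold and paths are assumptions; a real sbt launcher is on the order of a megabyte, while a failed download is often a few hundred bytes of HTML):

```shell
# check_jar MIN_BYTES FILE: flag a "jar" that is suspiciously small,
# which usually means the download failed or fetched an error page instead.
check_jar() {
  min=$1; f=$2
  size=$(wc -c < "$f")
  if [ "$size" -lt "$min" ]; then
    echo "$f: $size bytes - likely a failed download"
  else
    echo "$f: $size bytes - size looks plausible"
  fi
}

# Simulate the truncated launcher described above (hypothetical path).
printf '<html>error</html>' > /tmp/fake-sbt-launch.jar
check_jar 100000 /tmp/fake-sbt-launch.jar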

I got past that problem by borrowing the "build" directory from a 1.5 version of Spark. That apparently solved it, but I then ran into another build error involving "test-jars", another thing that's fine in Maven but breaks sbt.

Rahul Palamuttam wrote
However when I run sbt assembly it compiles, with a couple of warnings, but it works none-the less.
Is the build/sbt script deprecated? I do notice on one node it works but on the other it gives me the above error.

I recall a recent discussion on one of the lists about the build/sbt file: it was not actually always running the Spark-supplied version of sbt (0.13.7), but whatever version of sbt you had installed. So having a recent version of sbt installed could improve your odds.
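To see which sbt would actually run on a given node, a quick check like this may help (a sketch; which branch you hit depends on your PATH):

```shell
# Report whether a system-wide sbt would shadow the launcher that the
# Spark build/sbt script downloads into build/.
if command -v sbt >/dev/null 2>&1; then
  echo "system sbt on PATH: $(command -v sbt)"
else
  echo "no sbt on PATH; build/sbt must use its bundled launcher"
fi
```

If the two nodes report different results here, that could explain why the build works on one and not the other.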

If someone out there has figured out how to build Spark with SBT, please share.
Randy Kerber
Data Science Consultant
San Jose, California