I'm starting to explore the Spark Job Server contributed by Ooyala, running from the master branch.
I started by developing and submitting a simple job, and the JAR check gave me errors on a seemingly good jar. I disabled the fingerprint check on the jar and could upload it, but when I tried to submit the job, the server could not find its classPath. So I decided to take a couple of steps back and work through the example in the docs.
Using the (Hello)WordCount example, the upload is OK and the jar appears in the UI as well, but when I submit the job, I get the same classPath-not-found error as before:
19:07 $ curl -d "" 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample'
"result": "classPath spark.jobserver.WordCountExample not found"
I'm not sure where it goes wrong. Here's what seems to be the relevant snippet in the server logs:
[2014-05-22 19:17:28,891] INFO .apache.spark.SparkContext  [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample] - Added JAR /tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar at http://172.17.42.1:37978/jars/test-2014-05-22T18:44:09.254+02:00.jar with timestamp 1400779048891
[2014-05-22 19:17:28,891] INFO util.ContextURLClassLoader  [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample] - Added URL file:/tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar to ContextURLClassLoader
[2014-05-22 19:17:28,891] INFO spark.jobserver.JarUtils$  [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@5deae1b7
[2014-05-22 19:17:28,892] INFO spark.jobserver.JarUtils$  [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample] - Loading class spark.jobserver.WordCountExample using loader spark.jobserver.util.ContextURLClassLoader@5deae1b7
***** all OK until here and then ...*****
[2014-05-22 19:17:28,892] INFO ocalContextSupervisorActor  [akka://JobServer/user/context-supervisor] - Shutting down context 666d021a-spark.jobserver.WordCountExample
Any ideas? Something silly I might be doing? BTW, I'm running in dev mode using sbt and the default (local) config.
We're using the Spark Job Server in production, built from the GitHub master branch and running against a recent Spark 1.0 snapshot, so it definitely works. I'm afraid the only time we've seen a similar error was an unfortunate case of PEBKAC.
First and foremost, have you tried doing an unzip -l "/tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar" on the JAR uploaded to the server to make sure the class is where you're expecting it to be?
It's not uncommon for a package statement to be neglected when moving classes around in an IDE like Eclipse.
On 22 May 2014 18:25, Gerard Maas <[hidden email]> wrote:
Thanks for the tip on the /tmp dir. I had unzipped all the jars before uploading to check for the class. The issue was that the jars were not being uploaded correctly.
I was not familiar with curl's '@' syntax and omitted it, resulting in an uploaded "jar" that contained only the jar's filename. The correct command is:
curl --data-binary @sparkjobservertest_2.10-0.1.jar localhost:8090/jars/test
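For anyone hitting the same thing: without the '@' prefix, curl's --data-binary sends its argument as a literal string rather than reading the file, so the server stores a few bytes of text instead of the jar. A quick local sketch of the difference (no job server needed; the filename is the one from this thread, the contents are fake):

```shell
jar='sparkjobservertest_2.10-0.1.jar'
printf 'fake jar bytes' > "$jar"   # stand-in for a real jar

# What `curl --data-binary "$jar"` would send: the literal string,
# i.e. the 31 characters of the filename itself.
printf '%s' "$jar" | wc -c

# What `curl --data-binary "@$jar"` sends: the file's actual contents
# (here 14 bytes; for a real jar, its full size).
wc -c < "$jar"
```

That mismatch also explains why the server-side unzip check fails: the stored "jar" isn't a zip archive at all.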
Definitely a case of PEBKAC, or 'PICNIC' as I used to know it. :-)
On Thu, May 22, 2014 at 7:52 PM, Michael Cutler <[hidden email]> wrote: