jarOfClass method not found in SparkContext


jarOfClass method not found in SparkContext

arjun biswas
Hello All ,

I have installed Spark on my machine and successfully ran both sbt/sbt package and sbt/sbt assembly. I am now trying to run the Java examples from Eclipse, specifically the JavaLogQuery example. The issue is a compilation problem: jarOfClass is reported as not available on the Java Spark context. I am using Spark 0.8.1-incubating (the latest release) with Scala 2.9.3, and I have added all the necessary jars to the classpath, to the point that I get no import errors. However, the call JavaSparkContext.jarOfClass still produces the error above, saying the jarOfClass method is unavailable in JavaSparkContext. Has anyone tried to run the Java sample examples from Eclipse? Please note that this is a compile-time error in Eclipse.

Regards
Arjun
Re: jarOfClass method not found in SparkContext

Tathagata Das
Could it be that you have an older version of JavaSparkContext (i.e. from an older version of Spark) on your path? Please check that there aren't two versions of Spark accidentally included in the classpath used by Eclipse. That would not produce import errors (the imported packages and classes are still found), but it would produce exactly this kind of error if the compiler is unfortunately resolving an older version of the JavaSparkContext class from the classpath.

TD


On Wed, Jan 15, 2014 at 4:14 PM, arjun biswas <[hidden email]> wrote:
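The two-versions hypothesis is easy to confirm at runtime with plain reflection: ask the class the JVM actually loaded whether it declares the method at all. The sketch below is not part of Spark; it uses java.lang.String so it runs without Spark on the classpath, but in the Eclipse project you would pass org.apache.spark.api.java.JavaSparkContext.class and check for "jarOfClass".

```java
import java.util.Arrays;

public class MethodCheck {
    // True if the class resolved at runtime declares a public method
    // with the given name.
    static boolean hasMethod(Class<?> clazz, String name) {
        return Arrays.stream(clazz.getMethods())
                     .anyMatch(m -> m.getName().equals(name));
    }

    public static void main(String[] args) {
        // In a Spark project, replace String.class with
        // org.apache.spark.api.java.JavaSparkContext.class and check for
        // "jarOfClass"; a false result means the classpath is resolving
        // a Spark version that predates the method.
        System.out.println(hasMethod(String.class, "substring"));  // true
        System.out.println(hasMethod(String.class, "jarOfClass")); // false
    }
}
```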

Re: jarOfClass method not found in SparkContext

arjun biswas
> Could it be possible that you have an older version of JavaSparkContext (i.e. from an older version of Spark) in your path? Please check that there aren't two versions of Spark accidentally included in your class path used in Eclipse.

I have the following three jars on the Eclipse classpath, and no other jar is currently on the classpath:
1) google-collections-0.8.jar
2) scala-library.jar
3) spark-core_2.9.3-0.8.1-incubating.jar

Am I using the correct jar files to run the Java samples from Eclipse?

Regards




On Wed, Jan 15, 2014 at 4:36 PM, Tathagata Das <[hidden email]> wrote:
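When the question is whether the right spark-core jar is being picked up, it helps to ask the JVM where a class was actually loaded from. The helper below is a sketch, not a Spark API; it resolves java.lang.String here so it runs standalone, but in the Eclipse project you would pass JavaSparkContext.class and inspect the jar path in the printed URL.

```java
import java.net.URL;

public class WhichJar {
    // Returns the URL of the .class resource the JVM actually loaded,
    // which reveals the jar (or directory) the class came from.
    static String locationOf(Class<?> clazz) {
        String resource = "/" + clazz.getName().replace('.', '/') + ".class";
        URL url = clazz.getResource(resource);
        return url == null ? "(not resolvable as a resource)" : url.toString();
    }

    public static void main(String[] args) {
        // For this thread's problem: locationOf(JavaSparkContext.class)
        // would show which spark-core jar Eclipse is really compiling
        // against when two versions are on the classpath.
        System.out.println(locationOf(String.class));
    }
}
```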


Re: jarOfClass method not found in SparkContext

Patrick Wendell
Hm, are you sure you haven't included the master branch of Spark somehow in your classpath? jarOfClass was added to the Java API in the master branch and in the Spark 0.9.0 release candidate, so it seems very likely that you have a newer (post-0.8.x) version of the examples.

- Patrick

On Wed, Jan 15, 2014 at 5:04 PM, arjun biswas <[hidden email]> wrote:

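Patrick's diagnosis (examples from master compiled against a 0.8.1 jar) can also be caught by asking which release a class's jar reports. A minimal sketch, assuming the jar records an Implementation-Version in its manifest (not all jars do, so the helper falls back to a placeholder); java.lang is used here so the snippet runs without Spark:

```java
public class VersionCheck {
    // Reads the Implementation-Version recorded in the manifest of the
    // jar (or runtime image) that supplied the class's package.
    static String versionOf(Class<?> clazz) {
        Package pkg = clazz.getPackage();
        String version = (pkg == null) ? null : pkg.getImplementationVersion();
        return (version == null) ? "(no version recorded)" : version;
    }

    public static void main(String[] args) {
        // In the Eclipse project, versionOf(JavaSparkContext.class) would
        // report the Spark release the classpath is actually supplying,
        // if the spark-core jar's manifest records one.
        System.out.println(versionOf(String.class));
    }
}
```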
Re: jarOfClass method not found in SparkContext

arjun biswas
Thanks for pointing out that mistake. Yes, I was using the Spark 0.8.1-incubating jar together with the master-branch code examples. I have corrected the mistake.

Regards


On Wed, Jan 15, 2014 at 5:51 PM, Patrick Wendell <[hidden email]> wrote: