ADD_JARS not working on 0.9

Andre Kuhnen
Hello, my spark-shell tells me that the jars are added, but it cannot import any of my classes.


When I used the same steps on 0.8, everything worked fine.

Thanks

Re: ADD_JARS not working on 0.9

Andrew Ash
Hi Andre,

I've also noticed this. The jar now needs to be added to SPARK_CLASSPATH as well.
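For example, a minimal sketch (the jar path and package name are placeholders, not taken from Andre's setup), assuming the 0.9 layout where the launcher lives in bin/:

# hypothetical jar and package, substitute your own
ADD_JARS=/path/to/my-lib.jar \
SPARK_CLASSPATH=/path/to/my-lib.jar \
./bin/spark-shell

scala> import com.example.mylib._

As I understand it, ADD_JARS still hands the jar to the SparkContext so it reaches the executors, while SPARK_CLASSPATH puts it on the shell's own JVM classpath, which is what the import in the REPL needs.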



Re: ADD_JARS not working on 0.9

ssimanta
In reply to this post by Andre Kuhnen
Use SPARK_CLASSPATH along with ADD_JARS.


Re: ADD_JARS not working on 0.9

Andrew Ash
I filed a bug so we can track the fix: https://spark-project.atlassian.net/browse/SPARK-1089


Re: ADD_JARS not working on 0.9

Andre Kuhnen

Thanks a lot.

Re: ADD_JARS not working on 0.9

Andre Kuhnen
Thanks guys, but now I am having this problem. I am compiling my jar with Scala 2.10.3 and sbt 0.13. Any ideas?

Failed to initialize compiler: NoSuchMethodError.
This is most often remedied by a full clean and recompile.
Otherwise, your classpath may continue bytecode compiled by
different and incompatible versions of scala.
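For reference, the build is roughly set up like this (a trimmed-down sbt 0.13 sketch rather than a paste of the real build.sbt, so the exact dependency lines may differ):

// build.sbt (sketch, not the actual file)
scalaVersion := "2.10.3"

// Spark 0.9.x is published for Scala 2.10, marked provided since the cluster supplies it
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating" % "provided"

The jar built from that is the one I pass via ADD_JARS/SPARK_CLASSPATH as above.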



Re: ADD_JARS not working on 0.9

Andre Kuhnen
Solved, it was the sbt version.



Re: ADD_JARS not working on 0.9

Vyacheslav Baranov
In reply to this post by Andrew Ash
Hello Andrew,

I'm running into the same problem when I try to add a jar with the ':cp' REPL command. This used to work on 0.8:

scala> import org.msgpack
<console>:10: error: msgpack is not a member of org
       import org.msgpack
              ^

scala> :cp /path/to/msgpack-0.6.8.jar
Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
"/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
14/02/14 20:04:00 INFO server.AbstractConnector: Started [hidden email]

scala> import org.msgpack
import org.msgpack

And it's not working on 0.9:

scala> import org.msgpack
<console>:10: error: object msgpack is not a member of package org
       import org.msgpack
              ^

scala> :cp /path/to/msgpack-0.6.8.jar
Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
"/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
Nothing to replay.

scala> import org.msgpack
<console>:7: error: object msgpack is not a member of package org
       import org.msgpack
              ^

It's probably worth adding this to the issue's comments.
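In the meantime, the workaround mentioned earlier in the thread should apply here too: put the jar on the classpath before the shell starts instead of adding it with ':cp' in a running session. A sketch with the same jar:

SPARK_CLASSPATH=/path/to/msgpack-0.6.8.jar \
ADD_JARS=/path/to/msgpack-0.6.8.jar \
./bin/spark-shell

after which the import should resolve in the fresh session.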

Thank you,
Vyacheslav

Re: ADD_JARS not working on 0.9

Andrew Ash
Hi Vyacheslav,

If you could add that to the ticket directly, that would be valuable, because you're more familiar with the specific problem than I am!

Andrew


Re: ADD_JARS not working on 0.9

Vyacheslav Baranov
Andrew,

I've created an account on the Amplab Jira, but unfortunately I don't have permission to comment.

Vyacheslav

Re: ADD_JARS not working on 0.9

Andrew Ash
// cc Patrick, who I think helps with the Amplab Jira

Amplab Jira admins, can we make sure that newly-created users have comment permissions?  This has been standard in the open source Jira instances I've worked with in the past (like Hadoop).

Thanks!
Andrew


Re: ADD_JARS not working on 0.9

Nan Zhu
I'm interested in fixing this.

Can anyone assign the JIRA to me?

Best,

-- 
Nan Zhu

Re: ADD_JARS not working on 0.9

Nan Zhu
Would someone like to review the fix? https://github.com/apache/incubator-spark/pull/614

Best,

-- 
Nan Zhu
