Fwd: SecurityException when running tests with Spark 1.0.0


Fwd: SecurityException when running tests with Spark 1.0.0

Mohit Nayak
Hi,
I've upgraded to Spark 1.0.0, and I'm no longer able to run any tests. They all throw a
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package

I'm using Hadoop-core 1.0.4 and running this locally.
I noticed an issue regarding this that was marked as resolved [https://issues.apache.org/jira/browse/SPARK-1693].
Please advise.

--
-Mohit
[hidden email]




Re: SecurityException when running tests with Spark 1.0.0

sowen
This ultimately means you have a couple of copies of the servlet APIs in
the build. What does your build look like (SBT? Maven?), and what exactly
are you depending on?
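
One quick way to confirm is to ask the classloader where the servlet
classes are coming from; a rough, untested sketch (run it from a test or
the sbt console, on the same classpath as your tests):

    // List every classpath entry that provides the conflicting class.
    // More than one URL printed means duplicate copies of the servlet API.
    import scala.collection.JavaConverters._

    object FindServletJars extends App {
      getClass.getClassLoader
        .getResources("javax/servlet/FilterRegistration.class")
        .asScala
        .foreach(println)
    }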


Re: SecurityException when running tests with Spark 1.0.0

Mohit Nayak
Hey,
Thanks for the reply.

I am using SBT. Here is a list of my dependencies:
    val sparkCore     = "org.apache.spark"              %  "spark-core_2.10"  % V.spark
    val hadoopCore    = "org.apache.hadoop"             %  "hadoop-core"      % V.hadoop  % "provided"
    val jodaTime      = "com.github.nscala-time"        %% "nscala-time"      % "0.8.0"
    val scalaUtil     = "com.twitter"                   %% "util-collection"  % V.util
    val logback       = "ch.qos.logback"                %  "logback-classic"  % "1.0.6"   % "runtime"
    val openCsv       = "net.sf.opencsv"                %  "opencsv"          % "2.1"
    val scalaTest     = "org.scalatest"                 %  "scalatest_2.10"   % "2.1.0"   % "test"
    val scalaIOCore   = "com.github.scala-incubator.io" %% "scala-io-core"    % V.scalaIO
    val scalaIOFile   = "com.github.scala-incubator.io" %% "scala-io-file"    % V.scalaIO
    val kryo          = "com.esotericsoftware.kryo"     %  "kryo"             % "2.16"
    val spray         = "io.spray"                      %% "spray-json"       % "1.2.5"
    val scala_reflect = "org.scala-lang"                %  "scala-reflect"    % "2.10.3"




--
-Mohit
[hidden email]

Re: SecurityException when running tests with Spark 1.0.0

sowen
If it's the SBT build, I suspect you are hitting
https://issues.apache.org/jira/browse/SPARK-1949

Can you try to apply the excludes you see at
https://github.com/apache/spark/pull/906/files to your build to see if
it resolves it?

If so, I think this would be a helpful fix to commit.
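
For reference, applied to a dependency list like yours the excludes come
out roughly like this (a sketch only; the organization/artifact names in
the PR itself are the authoritative list):

    // Sketch: drop the servlet-api 2.5 jars that hadoop-core drags in, so
    // only one copy of the javax.servlet classes (Spark's servlet 3.0 API)
    // is left on the classpath.
    val excludeServletApi     = ExclusionRule(organization = "javax.servlet", name = "servlet-api")
    val excludeMortbayServlet = ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api")

    val hadoopCore = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided" excludeAll(
      excludeServletApi, excludeMortbayServlet
    )

With only one jar supplying the javax.servlet package, the signer check
has nothing to conflict with.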


Re: SecurityException when running tests with Spark 1.0.0

Mohit Nayak
Hey,
Yup, that fixed it. Thanks so much!

Is this the only solution, or could this be resolved in future versions of Spark?





--
-Mohit
[hidden email]

Re: SecurityException when running tests with Spark 1.0.0

Matei Zaharia
You can just use the Maven build for now, even for Spark 1.0.0.
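
For Spark 1.0.0 that means building from source with Maven, roughly (per
the "Building Spark with Maven" docs of that era; exact flags may differ):

    # Sketch: build Spark with Maven against Hadoop 1.0.4
    mvn -Dhadoop.version=1.0.4 -DskipTests clean package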

Matei
