Unable to redirect Spark logs to slf4j


Unable to redirect Spark logs to slf4j

sparhomenko
Hi,

I'm trying to redirect Spark logs to slf4j. Spark seems to be using Log4J, so I followed the typical steps for forcing a Log4J-based framework onto slf4j - manually excluded slf4j-log4j12 and log4j, and included log4j-over-slf4j. When I do that, however, Spark fails on initialization with: java.lang.StackOverflowError
at java.lang.ThreadLocal.access$400(ThreadLocal.java:72)
at java.lang.ThreadLocal$ThreadLocalMap.getEntry(ThreadLocal.java:376)
at java.lang.ThreadLocal$ThreadLocalMap.access$000(ThreadLocal.java:261)
at java.lang.ThreadLocal.get(ThreadLocal.java:146)
at java.lang.StringCoding.deref(StringCoding.java:63)
at java.lang.StringCoding.encode(StringCoding.java:330)
at java.lang.String.getBytes(String.java:916)
at java.io.UnixFileSystem.getBooleanAttributes0(Native Method)
at java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:242)
at java.io.File.exists(File.java:813)
at sun.misc.URLClassPath$FileLoader.getResource(URLClassPath.java:1080)
at sun.misc.URLClassPath$FileLoader.findResource(URLClassPath.java:1047)
at sun.misc.URLClassPath.findResource(URLClassPath.java:176)
at java.net.URLClassLoader$2.run(URLClassLoader.java:551)
at java.net.URLClassLoader$2.run(URLClassLoader.java:549)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findResource(URLClassLoader.java:548)
at java.lang.ClassLoader.getResource(ClassLoader.java:1147)
at org.apache.spark.Logging$class.initializeLogging(Logging.scala:109)
at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:97)
at org.apache.spark.Logging$class.log(Logging.scala:36)
at org.apache.spark.util.Utils$.log(Utils.scala:47)
<last 4 lines repeated many, many times>
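For reference, the dependency changes described above look roughly like this in my POM (the Spark coordinates are for 0.9.0; the slf4j version is illustrative):

```xml
<!-- Exclude Spark's log4j binding and bridge log4j calls back into slf4j.
     Coordinates are for Spark 0.9.0; the slf4j version here is illustrative. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>0.9.0-incubating</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>log4j-over-slf4j</artifactId>
  <version>1.7.5</version>
</dependency>
```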

Some related work was done in SPARK-1071, but it was resolved after 0.9.0 was released. In the last comment Sean refers to a StackOverflowError that was discussed on the mailing list; I assume it may be a problem similar to mine, but I was not able to find that discussion.
Is anyone aware of a way to redirect Spark 0.9.0 logs to slf4j?

--
Best regards,
Sergey Parhomenko

Re: Unable to redirect Spark logs to slf4j

sowen
Yes I think that issue is fixed (Patrick you had the last eyes on it IIRC?)

If you are using log4j, in general, do not redirect log4j to slf4j.
Stuff using log4j is already using log4j, done.
--
Sean Owen | Director, Data Science | London



Re: Unable to redirect Spark logs to slf4j

sparhomenko
Hi Sean,

We're not using log4j actually, we're trying to redirect all logging to slf4j which then uses logback as the logging implementation.

Am I right to assume the fix you mentioned is not part of the latest released Spark version (0.9.0)? If so, are there any workarounds or advice on how to avoid this issue in 0.9.0?

--
Best regards,
Sergey Parhomenko



Re: Unable to redirect Spark logs to slf4j

Paul Brown

Hi, Sergey --

Here's my recipe, implemented via Maven; YMMV if you need to do it via sbt, etc., but it should be equivalent:

1) Replace org.apache.spark.Logging trait with this: https://gist.github.com/prb/bc239b1616f5ac40b4e5 (supplied by Patrick during the discussion on the dev list)
2) Amend your POM using the fragment that's in the same gist.

We build two shaded JARs from the same build, one for the driver and one for the worker; to ensure that our Logging trait is the one in use in the driver (where it matters), we exclude that same class from the Spark JAR in the shade plugin configuration.
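As a rough sketch (the gist has the actual fragment), a shade-plugin filter that excludes Spark's Logging classes would look something like:

```xml
<!-- Illustrative maven-shade-plugin filter: strip Spark's own Logging classes
     from the shaded JAR so the replacement trait is the one that gets loaded. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <filters>
      <filter>
        <artifact>org.apache.spark:spark-core_2.10</artifact>
        <excludes>
          <exclude>org/apache/spark/Logging*.class</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
```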

Best.
-- Paul



[hidden email] | Multifarious, Inc. | http://mult.ifario.us/




Re: Unable to redirect Spark logs to slf4j

Patrick Wendell
Hey All,

We have a fix for this but it didn't get merged yet. I'll put it as a
blocker for Spark 0.9.1.

https://github.com/pwendell/incubator-spark/commit/66594e88e5be50fca073a7ef38fa62db4082b3c8

https://spark-project.atlassian.net/browse/SPARK-1190

Sergey, if you could try compiling Spark with this patch and see if
it works, that would be great.

Thanks,
Patrick



Re: Unable to redirect Spark logs to slf4j

sparhomenko
Hi Patrick,

Thanks for the patch. I tried building a patched version of spark-core_2.10-0.9.0-incubating.jar but the Maven build fails:
[ERROR] /home/das/Work/thx/incubator-spark/core/src/main/scala/org/apache/spark/Logging.scala:22: object impl is not a member of package org.slf4j
[ERROR] import org.slf4j.impl.StaticLoggerBinder
[ERROR]                  ^
[ERROR] /home/das/Work/thx/incubator-spark/core/src/main/scala/org/apache/spark/Logging.scala:106: not found: value StaticLoggerBinder
[ERROR]     val binder = StaticLoggerBinder.getSingleton
[ERROR]                  ^
[ERROR] two errors found

The module only has a compile dependency on slf4j-api, and not on slf4j-log4j12 or any other slf4j module that provides org.slf4j.impl.StaticLoggerBinder. Adding slf4j-log4j12 with compile scope helps, and I can confirm that logging is now redirected to slf4j/Logback correctly with the patched module. I'm not sure, however, whether using compile scope for slf4j-log4j12 is a good idea.
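For clarity, this is the dependency I added to make the patched module compile (whether compile is the right scope is exactly my question; the version is illustrative):

```xml
<!-- Added so org.slf4j.impl.StaticLoggerBinder resolves at compile time.
     Version is illustrative; whether compile is the right scope is the open question. -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.5</version>
  <scope>compile</scope>
</dependency>
```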

--
Best regards,
Sergey Parhomenko




Re: Unable to redirect Spark logs to slf4j

Patrick Wendell
Hey,

Maybe I don't understand the slf4j model completely, but I think you
need to add a concrete implementation of a logger. So in your case
you'd use the logback-classic binding in place of the log4j binding at
compile time:

http://mvnrepository.com/artifact/ch.qos.logback/logback-classic/1.1.1
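That is, something along the lines of:

```xml
<!-- logback-classic provides the concrete StaticLoggerBinder for slf4j -->
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.1.1</version>
</dependency>
```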

- Patrick
