Unable to load native-hadoop library


Aureliano Buendia
Hi,

I'm using the spark-ec2 scripts, and my Spark applications do not load the native Hadoop libraries. I've set the native lib path like this:

export SPARK_LIBRARY_PATH='/root/ephemeral-hdfs/lib/native/'

But I get these warnings in the log:

WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARN LoadSnappy: Snappy native library not loaded


Is SPARK_LIBRARY_PATH the right variable for this? Does Spark use it, or does my application have to set up the native libraries itself?
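For what it's worth, a quick way to see what the JVM actually received is to print java.library.path and try the load by hand (a throwaway sketch; the class name NativeCheck is mine, not part of Spark or Hadoop — libhadoop is the standard name of the Hadoop native library):

```java
public class NativeCheck {
    // Returns true if libhadoop.so can be found on java.library.path.
    static boolean nativeHadoopAvailable() {
        try {
            System.loadLibrary("hadoop"); // resolves to libhadoop.so on Linux
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // If the exported variable never reached the JVM, the path below
        // won't contain /root/ephemeral-hdfs/lib/native/.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        System.out.println("native-hadoop loaded: " + nativeHadoopAvailable());
    }
}
```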

Re: Unable to load native-hadoop library

Aureliano Buendia
I'm compiling my application against the same Hadoop version as the Spark EC2 AMI:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>0.23.7</version>
</dependency>


I don't include this library in my shaded fat jar, though that shouldn't cause this problem.
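(For reference, the usual way to keep hadoop-client out of a shaded jar while still compiling against it is the provided scope — a sketch, assuming that's the mechanism in use here:)

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>0.23.7</version>
    <!-- available at compile time, but excluded from the shaded jar -->
    <scope>provided</scope>
</dependency>
```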



On Mon, Jan 13, 2014 at 5:28 PM, Aureliano Buendia <[hidden email]> wrote:


Re: Unable to load native-hadoop library

Aureliano Buendia
I had to set -Djava.library.path explicitly for this to work.
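Concretely, something like this in conf/spark-env.sh on each node (a sketch, assuming the ephemeral-hdfs native lib path from the spark-ec2 AMI):

```shell
# Pass the native lib dir to every Spark JVM via java.library.path
export SPARK_JAVA_OPTS="-Djava.library.path=/root/ephemeral-hdfs/lib/native/"
```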


On Mon, Jan 13, 2014 at 5:51 PM, Aureliano Buendia <[hidden email]> wrote: