Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Prasad
Hi,
I am getting a protobuf error while reading an HDFS file using Spark 0.9.0 running on Hadoop 2.2.0.

When I look through my machine, I find that I have both protobuf 2.4.1 and 2.5, and some blogs suggest that there are incompatibility issues between 2.4.1 and 2.5.

hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
/home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
/home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
/home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
/home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar


Can someone please let me know if you have faced these issues and how you fixed them?

Thanks
Prasad.
Caused by: java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
        at java.lang.Class.privateGetPublicMethods(Class.java:2651)
        at java.lang.Class.privateGetPublicMethods(Class.java:2661)
        at java.lang.Class.getMethods(Class.java:1467)
        at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
        at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
        at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
        at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)


Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)







Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Aureliano Buendia
Moving the protobuf 2.5 jar after the Spark jar on the classpath can help with your error, but then you'll face the

WARN ClusterScheduler: Initial job has not accepted any resources;...

error, which is still an unresolved issue in Spark.

I had to downgrade protobuf in my app to 2.4.1 to get it working on Spark. This is not ideal, as protobuf 2.5 comes with better performance.
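For reference, the downgrade described above amounts to pinning the application on protobuf 2.4.1 explicitly. In a Maven build that could be sketched as below (a sketch only; adapt to your build tool):

```xml
<!-- Sketch: force the application onto protobuf 2.4.1 to match a 2.4-based classpath. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.4.1</version>
</dependency>
```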


On Fri, Feb 28, 2014 at 4:51 PM, Prasad <[hidden email]> wrote:


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Ognen Duzlevski-2
In reply to this post by Prasad
I run a Hadoop 2.2.0-based HDFS cluster and use Spark 0.9.0 to read files without any problems.
Ognen



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Ognen Duzlevski-2
In reply to this post by Prasad
A stupid question, by the way: did you compile Spark with Hadoop 2.2.0 support?
Ognen


--
Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems.
-- Jamie Zawinski


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Aureliano Buendia
Doesn't hadoop 2.2 also depend on protobuf 2.4?





Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Egor Pahomov
Spark 0.9 uses protobuf 2.5.0.
Hadoop 2.2 uses protobuf 2.5.0.
Protobuf 2.5.0 can read messages serialized with protobuf 2.4.1.
So there is no reason why you can't read messages from Hadoop 2.2 with protobuf 2.5.0; you probably somehow have 2.4.1 on your classpath. It is, of course, very bad to have both 2.4.1 and 2.5.0 on your classpath. Use dependency excludes (or whatever your build tool offers) to get rid of 2.4.1.

Personally, I spent 3 days moving my project from protobuf 2.4.1 to 2.5.0, but it has to be done for your whole project.
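As a sketch of the excludes mentioned above, in Maven you can cut the 2.4.1 artifact out of whichever dependency drags it in (the dependency shown here is hypothetical; substitute the one your dependency tree reports):

```xml
<!-- Sketch: exclude the stray protobuf 2.4.1 pulled in transitively. -->
<dependency>
  <groupId>org.example</groupId>          <!-- hypothetical dependency bringing in 2.4.1 -->
  <artifactId>some-library</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Running `mvn dependency:tree -Dincludes=com.google.protobuf` afterwards shows which protobuf versions remain on the classpath.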






--
Sincerely yours
Egor Pakhomov
Scala Developer, Yandex

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Aureliano Buendia



On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <[hidden email]> wrote:
> Spark 0.9 uses protobuf 2.5.0

Is there another pom for when hadoop 2.2 is used? I don't see another branch for hadoop 2.2.

> Hadoop 2.2 uses protobuf 2.5.0
> protobuf 2.5.0 can read messages serialized with protobuf 2.4.1

Protobuf Java code generated by protoc 2.4 does not compile against the protobuf 2.5 library. This is what the OP's error message is about.
 


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Egor Pahomov
In that same pom:
    <profile>
      <id>yarn</id>
      <properties>
        <hadoop.major.version>2</hadoop.major.version>
        <hadoop.version>2.2.0</hadoop.version>
        <protobuf.version>2.5.0</protobuf.version>
      </properties>
      <modules>
        <module>yarn</module>
      </modules>

    </profile>
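Given that profile, a downstream application can avoid the mismatch by pinning itself to the same 2.5.0 that Spark 0.9 and Hadoop 2.2 use, e.g. via dependencyManagement (a sketch, not taken from the Spark pom itself):

```xml
<!-- Sketch: pin protobuf to the same 2.5.0 that Spark 0.9 / Hadoop 2.2 use. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```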


2014-02-28 23:46 GMT+04:00 Aureliano Buendia <[hidden email]>:



On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <[hidden email]> wrote:
Spark 0.9 uses protobuf 2.5.0

Is there another pom for when hadoop 2.2 is used? I don't see another branch for hadooop 2.2.
 
Hadoop 2.2 uses protobuf 2.5.0
protobuf 2.5.0 can read massages serialized with protobuf 2.4.1

Protobuf java code generated by ptotoc 2.4 does not compile with protobuf library 2.5. This is what the OP's error message is about.
 
So there is not any reason why you can't read some messages from hadoop 2.2 with protobuf 2.5.0, probably you somehow have 2.4.1 in your class path. Of course it's very bad, that you have both 2.4.1 and 2.5.0 in your classpath. Use excludes or whatever to get rid of 2.4.1.

Personally, I spend 3 days to move my project to protobuf 2.5.0 from 2.4.1. But it has to be done for the whole your project.

2014-02-28 21:49 GMT+04:00 Aureliano Buendia <[hidden email]>:

Doesn't hadoop 2.2 also depend on protobuf 2.4?


On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <[hidden email]> wrote:
A stupid question, by the way, you did compile Spark with Hadoop 2.2.0 support?

Ognen

On 2/28/14, 10:51 AM, Prasad wrote:
Hi
I am getting the protobuf error.... while reading HDFS file using spark
0.9.0 -- i am running on hadoop 2.2.0 .

When i look thru, i find that i have both 2.4.1 and 2.5 and some blogs
suggest that there is some incompatability issues betwen 2.4.1 and 2.5

hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
/home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
/home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
/home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
/home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar


Can someone please let me know if you faced these issues and how u fixed it.

Thanks
Prasad.
Caused by: java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
         at java.lang.ClassLoader.defineClass1(Native Method)
         at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
         at java.security.AccessController.doPrivileged(Native Method)
         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
         at java.lang.Class.getDeclaredMethods0(Native Method)
         at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
         at java.lang.Class.privateGetPublicMethods(Class.java:2651)
         at java.lang.Class.privateGetPublicMethods(Class.java:2661)
         at java.lang.Class.getMethods(Class.java:1467)
         at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
         at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
         at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
         at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
         at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
         at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)


Caused by: java.lang.reflect.InvocationTargetException
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)










--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems.
-- Jamie Zawinski





--
Sincerely yours
Egor Pakhomov
Scala Developer, Yandex

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Egor Pahomov
Protobuf Java code generated by protoc 2.4 does not compile against the protobuf 2.5 library - that's true. What I meant: you serialized a message with a class generated by protobuf 2.4.1. You can still read that message with a class generated by protobuf 2.5.0 from the same .proto.


2014-03-01 0:00 GMT+04:00 Egor Pahomov <[hidden email]>:
In that same pom
    <profile>
      <id>yarn</id>
      <properties>
        <hadoop.major.version>2</hadoop.major.version>
        <hadoop.version>2.2.0</hadoop.version>
        <protobuf.version>2.5.0</protobuf.version>
      </properties>
      <modules>
        <module>yarn</module>
      </modules>

    </profile>


2014-02-28 23:46 GMT+04:00 Aureliano Buendia <[hidden email]>:




On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <[hidden email]> wrote:
Spark 0.9 uses protobuf 2.5.0

Is there another pom for when hadoop 2.2 is used? I don't see another branch for hadoop 2.2.
 
Hadoop 2.2 uses protobuf 2.5.0
protobuf 2.5.0 can read messages serialized with protobuf 2.4.1

Protobuf Java code generated by protoc 2.4 does not compile against the protobuf 2.5 library. This is what the OP's error message is about.
 
So there is no reason why you can't read messages from hadoop 2.2 with protobuf 2.5.0; probably you somehow have 2.4.1 on your classpath. Of course it's very bad to have both 2.4.1 and 2.5.0 on your classpath. Use excludes or whatever to get rid of 2.4.1.
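An exclude of the kind Egor suggests might look like the fragment below in a Maven pom. This is a hedged sketch: it assumes the stray protobuf 2.4.1 arrives transitively through hadoop-client, but the artifact actually responsible may differ in your build (check with `mvn dependency:tree`):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.2.0</version>
  <exclusions>
    <!-- keep the transitive protobuf 2.4.1 off the classpath -->
    <exclusion>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With the old version excluded, an explicit protobuf-java 2.5.0 dependency can then be declared at the top level so only one version ends up on the classpath.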

Personally, I spent 3 days moving my project from protobuf 2.4.1 to 2.5.0, but it has to be done across your whole project.

2014-02-28 21:49 GMT+04:00 Aureliano Buendia <[hidden email]>:

Doesn't hadoop 2.2 also depend on protobuf 2.4?


[rest of quoted thread trimmed]








--
Sincerely yours
Egor Pakhomov
Scala Developer, Yandex

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Prasad
In reply to this post by Ognen Duzlevski-2
hi,
Yes, I did:
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
Further, when I use the spark-shell, I can read the same file and it works fine.
Thanks
Prasad.

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

1esha
In reply to this post by Egor Pahomov
The problem is in akka-remote: it contains files compiled with protobuf 2.4.*. When you run it with 2.5.* on the classpath, it fails as above.

Looks like moving to akka 2.3 will solve this issue. Check this issue - https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket:

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Aureliano Buendia
Is there a reason for spark using the older akka?


On Sun, Mar 2, 2014 at 1:53 PM, 1esha <[hidden email]> wrote:
The problem is in akka remote. It contains files compiled with 2.4.*. When
you run it with 2.5.* in classpath it fails like above.

Looks like moving to akka 2.3 will solve this issue. Check this issue -
https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket:



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p2217.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

dmpour23
On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia wrote:

> Is there a reason for spark using the older akka?
>
> [rest of quoted thread trimmed]

Is the solution to exclude the 2.4.* dependency on protobuf, or will this produce more complications?

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Ognen Duzlevski-2

On 3/18/14, 4:49 AM, [hidden email] wrote:

> On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia wrote:
>> Is there a reason for spark using the older akka?
>> [rest of quoted thread trimmed]
>> Is the solution to exclude the 2.4.* dependency on protobuf or will this produce more complications?
I am not sure I remember what the context was around this, but I run 0.9.0 with hadoop 2.2.0 just fine.
Ognen

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Aureliano Buendia



On Tue, Mar 18, 2014 at 12:56 PM, Ognen Duzlevski <[hidden email]> wrote:

On 3/18/14, 4:49 AM, [hidden email] wrote:
[quoted thread trimmed]
Is the solution to exclude the 2.4.* dependency on protobuf or will this produce more complications?
I am not sure I remember what the context was around this but I run 0.9.0 with hadoop 2.2.0 just fine.

The problem is that spark depends on an older version of akka, which depends on an older version of protobuf (2.4).

This means people cannot use protobuf 2.5 with spark.
 
Ognen


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Gary Malouf
Can anyone verify the claims from Aureliano regarding the Akka dependency protobuf collision?  Our team has a major need to upgrade to protobuf 2.5.0 up the pipe and Spark seems to be the blocker here.


On Fri, Mar 21, 2014 at 6:49 PM, Aureliano Buendia <[hidden email]> wrote:



On Tue, Mar 18, 2014 at 12:56 PM, Ognen Duzlevski <[hidden email]> wrote:

On 3/18/14, 4:49 AM, [hidden email] wrote:
[quoted thread trimmed]
Is the solution to exclude the 2.4.* dependency on protobuf or will this produce more complications?

The problem is that spark depends on an older version of akka, which depends on an older version of protobuf (2.4).

This means people cannot use protobuf 2.5 with spark.
 
Ognen



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Patrick Wendell
In reply to this post by Aureliano Buendia
Starting with Spark 0.9, the protobuf dependency we use is shaded and
cannot interfere with other protobuf libraries, including those in
Hadoop. Not sure what's going on in this case. Would someone who is
having this problem post exactly how they are building Spark?

- Patrick

On Fri, Mar 21, 2014 at 3:49 PM, Aureliano Buendia <[hidden email]> wrote:

> [quoted thread trimmed]

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

giive chen
Hi

I am quite a beginner in Spark, and I had a similar issue last week. I don't know if my issue is the same as yours. I found that my program's jar contained protobuf; when I removed this dependency from my program's pom.xml and rebuilt, it worked.

Here is how I solved my own issue.

Environment:

Spark 0.9, HDFS (Hadoop 2.3), Scala 2.10. My Spark is the Hadoop 2 HDP2 prebuilt version from http://spark.apache.org/downloads.html. I didn't build Spark myself.

Problem:

I used the word count program from Spark 0.9's examples folder to read an HDFS file on Hadoop 2.3. The command was "./bin/run-example org.apache.spark.examples.WordCount".
It showed "Caused by: java.lang.VerifyError". I searched a lot on the web but could not find any workable solution.

How I solved my issue:

I found that if I use Spark 0.9's spark-shell, it can read the HDFS file without this problem, but if I use the run-example command, it shows java.lang.VerifyError. I think the main reason is that these two commands (spark-shell and run-example) have different classpaths.

run-example's classpath is $SPARK_HOME/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar:$SPARK_HOME/conf:$SPARK_HOME/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar

spark-shell's classpath is $SPARK_HOME/conf:$SPARK_HOME/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar

The difference is $SPARK_HOME/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar, which is built from the example program. When I looked into this jar file, I found that it contained two copies of protobuf, and I don't know where they came from. I removed all dependencies from my example pom.xml, leaving only the single dependency "spark-core". I rebuilt it and it succeeded.
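This kind of diagnosis (bundled protobuf classes hiding inside an assembly jar) can be checked mechanically, since jar files are just zip archives. Below is a small illustrative sketch; the function name is my own, and the idea is simply to list any com/google/protobuf class files a given jar contains:

```python
import zipfile

def find_protobuf_classes(jar_path):
    """List entries in a jar (jars are zip archives) that look like
    bundled Google protobuf classes (com/google/protobuf/*.class)."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.startswith("com/google/protobuf/")
                and name.endswith(".class")]
```

Running this over both the examples assembly jar and the Spark assembly jar (paths as in the message above) would show whether protobuf classes appear in more than one jar on the classpath, which is the situation that triggers the VerifyError.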

I don't know if my issue is the same as yours. I hope it helps.

Wisely Chen  



On Wed, Mar 26, 2014 at 7:10 AM, Patrick Wendell <[hidden email]> wrote:
Starting with Spark 0.9, the protobuf dependency we use is shaded and
cannot interfere with other protobuf libraries, including those in
Hadoop. Not sure what's going on in this case. Would someone who is
having this problem post exactly how they are building Spark?

- Patrick

On Fri, Mar 21, 2014 at 3:49 PM, Aureliano Buendia <[hidden email]> wrote:
> [quoted thread trimmed]


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

Prasad
Hi Wisely,
Could you please post your pom.xml here.

Thanks

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

anant
In reply to this post by Prasad
I've received the same error with Spark built using Maven. It turns out that mesos-0.13.0 depends on protobuf-2.4.1, which causes the clash at runtime. The protobuf included by Akka is shaded and doesn't cause any problems.

The solution is to update the mesos dependency to 0.18.0 in spark's pom.xml. Rebuilding the JAR with this configuration solves the issue.

-Anant
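For reference, the version bump Anant describes might look like the fragment below in Spark's pom.xml. This is an illustrative sketch (the property name is assumed from common Maven conventions, not quoted from the Spark pom itself):

```xml
<properties>
  <!-- bump mesos from 0.13.0, which pulls in protobuf 2.4.1 transitively -->
  <mesos.version>0.18.0</mesos.version>
</properties>
```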