Cannot get Hadoop dependencies

Cannot get Hadoop dependencies

Kal El
I am having some trouble with Hadoop. I cannot build my project with sbt.

According to the documentation, I added a line like this in my build.sbt file:
"libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "<your-hdfs-version>""
my line being:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "0.20.2"

When I run assembly in sbt, I get the following error:

> assembly
[info] Updating {file:/home/spark2013/clusterWorkDirectory/GC/}gc...
[info] Resolving org.apache.hadoop#hadoop-client;0.20.2 ...
[warn]  module not found: org.apache.hadoop#hadoop-client;0.20.2
[warn] ==== local: tried
[warn]   /home/spark2013/.ivy2/local/org.apache.hadoop/hadoop-client/0.20.2/ivys/ivy.xml
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client/0.20.2/hadoop-client-0.20.2.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.hadoop#hadoop-client;0.20.2: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-client;0.20.2: not found
[error] Total time: 4 s, completed Jan 27, 2014 3:21:28 PM

How can I fix this?

Thanks

Re: Cannot get Hadoop dependencies

yinxusen

http://www.scala-sbt.org/release/docs/Getting-Started/Library-Dependencies

This document might be useful. You should make sure that the package you specified has the right coordinates (group ID, artifact ID, and version), and that the repository hosting it is added to your resolvers.
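For example, a minimal build.sbt sketch along these lines (the repository name, URL, and version below are illustrative assumptions, not values confirmed in this thread):

// Extra repository, only needed if the artifact is not on Maven Central;
// the URL here is an assumption for illustration.
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

// A hadoop-client version that is actually published; match it to your cluster's HDFS.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

Note that in build.sbt (for the sbt versions current at the time of this thread) each setting must be separated from the next by a blank line.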


Re: Cannot get Hadoop dependencies

Jey Kottalam
I believe that Hadoop 0.20.2 is too old for compatibility with Spark.
The hadoop-client dependency is available in the 0.23.x, 1.0.x, and
newer releases, but not in the 0.20.x releases.

Source: http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client
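For example, a sketch of a build.sbt line using a release that does publish hadoop-client (the version is only an illustration; pick the one matching your cluster's HDFS):

// Any 1.x or 2.x Hadoop release publishes hadoop-client to Maven Central, e.g.:
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"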


Re: Cannot get Hadoop dependencies

Kal El
Well, it seems that 0.20.2 is indeed not the latest version (the latest is 2.2.0).
I have the following problem:
In build.sbt I have this:

libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "0.8.0-incubating").
    exclude("org.mortbay.jetty", "servlet-api").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("commons-collections", "commons-collections").
    exclude("com.esotericsoftware.minlog", "minlog")
)

And I would like to add this too: "org.apache.hadoop" % "hadoop-client" % "0.20.2-cdh3u4"

How do I write this? I have tried several ways, and it keeps breaking my excludes.
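One way that should preserve the excludes is to keep everything in the same Seq and add the new module as another element. A sketch, assuming hadoop-client is actually published for 0.20.2-cdh3u4 and that the Cloudera repository URL below is correct (both are assumptions, not confirmed in this thread):

// CDH artifacts are not on Maven Central, so an extra resolver is needed;
// the URL is an assumption for illustration.
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "0.8.0-incubating").
    exclude("org.mortbay.jetty", "servlet-api").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("com.esotericsoftware.minlog", "minlog"),
  // the new dependency is simply one more element of the Seq
  "org.apache.hadoop" % "hadoop-client" % "0.20.2-cdh3u4"
)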


