Is it possible to build with Maven?

Is it possible to build with Maven?

Kevin Markey
I have never been able to build Spark with Maven.  I've tried 0.8.1 and 0.9.0.  I've followed all the instructions provided in http://spark.incubator.apache.org/docs/latest/building-with-maven.html, e.g.:

$ export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
$ mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package
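
In case it matters, no Zinc server was running during these builds (hence the warning in the log below). As I understand it, a standalone Zinc compile server can be started ahead of time with something like this, assuming the Typesafe Zinc distribution is on the PATH:

$ zinc -start    # compile server listens on port 3030 by default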

SBT builds work just fine, but I can't seem to get IntelliJ to find various classes and resources when I start it by pointing at the spark-parent POM and setting the defines, so I naively thought that building with Maven might help.

Instead, I get the following errors:

[INFO] --- scala-maven-plugin:3.1.5:compile (scala-compile-first) @ spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 301 Scala sources and 17 Java sources to /Code/Spark/spark-master-2014-02-09/core/target/scala-2.10/classes...
[ERROR] /Code/Spark/spark-master-2014-02-09/core/src/main/scala/org/apache/spark/Logging.scala:22: object impl is not a member of package org.slf4j
[ERROR] import org.slf4j.impl.StaticLoggerBinder
[ERROR]                  ^
[ERROR] /Code/Spark/spark-master-2014-02-09/core/src/main/scala/org/apache/spark/Logging.scala:106: not found: value StaticLoggerBinder
[ERROR]     val binder = StaticLoggerBinder.getSingleton
[ERROR]                  ^
[ERROR] two errors found
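
For what it's worth, the missing org.slf4j.impl package normally comes from an SLF4J binding jar such as slf4j-log4j12, so one way to see which SLF4J artifacts end up on the core module's classpath is something like the following (the -pl flag just limits the check to the core module):

$ mvn dependency:tree -Dincludes=org.slf4j -pl core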

And the following sorts of warnings (although none of the dependency warnings should be fatal):

[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.mockito:mockito-all:jar -> duplicate declaration of version 1.8.5 @ org.apache.spark:spark-parent:1.0.0-incubating-SNAPSHOT, /Code/Spark/spark-master-2014-02-09/pom.xml, line 377, column 18
...
[WARNING] 'dependencyManagement.dependencies.dependency.exclusions.exclusion.artifactId' for org.apache.hadoop:hadoop-client:jar with value '*' does not match a valid id pattern. @ org.apache.spark:spark-parent:1.0.0-incubating-SNAPSHOT, /Code/Spark/spark-master-2014-02-09/pom.xml, line 421, column 25
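
The duplicate declaration, at least, is easy to confirm directly against the parent POM, e.g. with a quick grep, which should show both offending mockito-all entries:

$ grep -n "mockito-all" pom.xml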

Any thoughts?

Also, the notes on IntelliJ on the "building-with-maven" web page appear to be out of date; they mention profiles that no longer exist.

Thank you.
Kevin Markey
Re: Is it possible to build with Maven?

sowen

I successfully build with Maven on the command line and from IntelliJ.

I also see that error, which only started yesterday, and I think it is due to a commit that has been reverted.

The first of two warnings is fixed in a PR I submitted yesterday and is ignorable.

The second warning is really a Maven problem, since the syntax it warns about is in fact supported. (Though I have a mind to fix it and some other warnings with some work anyway.)

You won't need profiles just to build.
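
In other words, for a default (non-YARN) build, something as simple as this should be enough; the -Pyarn and version flags in your command are only needed when targeting Hadoop 2.2 with YARN:

$ mvn -DskipTests clean package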

Re: Is it possible to build with Maven?

Chen Jin
Cool, thanks, I will give it a try!

