Development version error on sbt compile publish-local


Development version error on sbt compile publish-local

Shing Hing Man




 Hi,
   I have checked out the development version of Spark from
          git://github.com/apache/incubator-spark.git.

I am trying to compile it with Scala 2.10.3.

The following command completed successfully.

matmsh@gauss:~/Downloads/spark/github/incubator-spark> SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
But

matmsh@gauss:~/Downloads/spark/github/incubator-spark> sbt compile publish-local

gives the following error:



[info] Compiling 1 Scala source to /home/matmsh/Downloads/spark/github/incubator-spark/repl/target/scala-2.10/classes...
[info] Compiling 8 Scala sources to /home/matmsh/Downloads/spark/github/incubator-spark/streaming/target/scala-2.10/classes...
[error] /home/matmsh/Downloads/spark/github/incubator-spark/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:52: type mismatch;
[error]  found   : org.apache.spark.streaming.DStream[(K, V)]
[error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,V]
[error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
[error]     dstream.filter((x => f(x).booleanValue()))
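The compiler note points at a Scala rule: an implicit conversion that is defined later in the same source file than the point where it is needed must declare an explicit result type, or the compiler will not consider it at that earlier point. A minimal standalone sketch of that rule (hypothetical names, unrelated to the Spark sources):

```scala
object ImplicitOrderDemo {
  class Doubler(n: Int) { def doubled: Int = n * 2 }

  // Used *before* `toDoubler` is defined below. This compiles only
  // because `toDoubler` declares its result type explicitly; dropping
  // the `: Doubler` annotation would trigger the same kind of
  // "comes after the application point and it lacks an explicit
  // result type" failure seen in the build log above.
  def demo: Int = 21.doubled

  implicit def toDoubler(n: Int): Doubler = new Doubler(n)

  def main(args: Array[String]): Unit =
    println(demo)  // prints 42
}
```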


Is there any way to resolve the above issue?

Thanks in advance for your assistance!


Shing

Re: Development version error on sbt compile publish-local

Patrick Wendell
Can you try running "sbt/sbt clean"? Sometimes the build state can get
corrupted and cause errors like this.


Re: Development version error on sbt compile publish-local

Shing Hing Man
Hi,
  Thanks for your reply!

sbt/sbt clean does not help.

I did the following in the incubator-spark directory and still got the same error as before.

1) sbt/sbt clean
2) SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
3) sbt/sbt compile publish-local


Shing






Re: Development version error on sbt compile publish-local

Shing Hing Man
There is no error if I do sbt/sbt clean between "sbt compile publish-local" and "sbt/sbt assembly". Namely:

1) sbt/sbt clean
2) sbt/sbt compile publish-local
3) sbt/sbt clean
4) SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly

Now I have the Spark jars in my local ivy repository, and I can run the Spark shell and
the examples that come with the Spark distribution.

Shing






Re: Development version error on sbt compile publish-local

Patrick Wendell
Ah, okay - glad you got it working. It must be due to corruption
somewhere in sbt's state.
