java.lang.UnsupportedOperationException: No Encoder found for Set[String]

java.lang.UnsupportedOperationException: No Encoder found for Set[String]

V0lleyBallJunki3
Hello,
  I am using Spark 2.2.2 with Scala 2.11.8. I wrote a short program:

val spark = SparkSession.builder().master("local[4]").getOrCreate()

case class TestCC(i: Int, ss: Set[String])

import spark.implicits._
import spark.sqlContext.implicits._

val testCCDS = Seq(TestCC(1, Set("SS", "Salil")), TestCC(2, Set("xx", "XYZ"))).toDS()


I get:

java.lang.UnsupportedOperationException: No Encoder found for Set[String]
- field (class: "scala.collection.immutable.Set", name: "ss")
- root class: "TestCC"
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:632)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:455)
  at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
  at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:809)
  at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:39)
  at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:455)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1$$anonfun$10.apply(ScalaReflection.scala:626)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1$$anonfun$10.apply(ScalaReflection.scala:614)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)

To the best of my knowledge, implicit support for Set was added in Spark 2.2. Am I missing something?
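Until an upgrade is possible, one common workaround on Spark 2.2.x is to keep Set[String] in the domain model but give the Dataset a Seq[String] field, which does have a built-in encoder, and convert at the boundary. A minimal sketch; TestCCRow, toRow, and fromRow are hypothetical helper names, not Spark API:

```scala
// Sketch of a Spark 2.2 workaround: mirror the case class with a
// Seq[String] field (which Spark can encode) and convert at the boundary.
case class TestCC(i: Int, ss: Set[String])     // domain class (no Set encoder in 2.2)
case class TestCCRow(i: Int, ss: Seq[String])  // encodable mirror class

object SetWorkaround {
  // Domain -> row: Set becomes Seq (order is unspecified for a Set)
  def toRow(t: TestCC): TestCCRow = TestCCRow(t.i, t.ss.toSeq)
  // Row -> domain: Seq back to Set
  def fromRow(r: TestCCRow): TestCC = TestCC(r.i, r.ss.toSet)
}

// With a SparkSession and spark.implicits._ in scope you would then write:
//   val ds   = Seq(TestCC(1, Set("SS", "Salil"))).map(SetWorkaround.toRow).toDS()
//   val back = ds.collect().map(SetWorkaround.fromRow)  // plain Scala map after collect
```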



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]


Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Manu Zhang

Hi,

It's added since Spark 2.3.0.
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala#L180

Regards,
Manu Zhang

On Thu, Aug 16, 2018 at 9:59 AM V0lleyBallJunki3 <[hidden email]> wrote:
> [quoted message trimmed]


Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

V0lleyBallJunki3
We are using Spark 2.2.0. Is it possible to bring the
ExpressionEncoder from 2.3.0 and the related classes into my code base and
use them? The changes to ExpressionEncoder between 2.2.0 and 2.3.0
are small, but there may be many other classes underneath
that have changed.
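If upgrading the cluster is an option, moving to Spark 2.3.x sidesteps the backport question entirely, since spark.implicits._ provides the Set encoder there. A minimal build.sbt sketch (version numbers are illustrative):

```scala
// build.sbt sketch: bump spark-sql to a 2.3.x release, where
// spark.implicits._ encodes Set fields out of the box.
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
```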

On Thu, Aug 16, 2018 at 5:23 AM, Manu Zhang <[hidden email]> wrote:

> Hi,
>
> It's added since Spark 2.3.0.
> https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala#L180
>
> Regards,
> Manu Zhang
>
> On Thu, Aug 16, 2018 at 9:59 AM V0lleyBallJunki3 <[hidden email]> wrote:
>> [quoted message trimmed]



Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Manu Zhang
You may try applying this PR: https://github.com/apache/spark/pull/18416.
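As an alternative to patching Spark itself, a frequently used fallback on 2.2 is a Kryo-based binary encoder for the whole case class. This makes toDS() work, at the cost of storing each row as a single opaque binary column (so no columnar access, pruning, or filter pushdown on the fields). A sketch under those assumptions, requiring spark-sql on the classpath:

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

case class TestCC(i: Int, ss: Set[String])

object KryoFallback {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[4]").getOrCreate()
    import spark.implicits._

    // A locally defined implicit takes precedence over the imported ones,
    // so toDS() picks up this Kryo encoder for TestCC instead of failing
    // to derive a product encoder for the Set field.
    implicit val testCCEncoder: Encoder[TestCC] = Encoders.kryo[TestCC]

    val ds = Seq(TestCC(1, Set("SS", "Salil")), TestCC(2, Set("xx", "XYZ"))).toDS()
    ds.show()  // one binary "value" column rather than separate i/ss columns
    spark.stop()
  }
}
```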

On Fri, Aug 17, 2018 at 9:13 AM Venkat Dabri <[hidden email]> wrote:
> [quoted message trimmed]