Reasoning behind fail safe behaviour of cast expression


Reasoning behind fail safe behaviour of cast expression

vatsal
I noticed that Spark handles the CAST operation in a fail-safe manner, i.e. if
the cast fails for some record(s), Spark doesn't fail the entire query;
instead it returns a null value for those records.

For example, consider the following query:


Looking at the code, it seems that this behavior is implemented
intentionally. As per my understanding, it is meant to avoid failing the
entire query because of a few outlier records. Is this understanding
correct?

Is there any dev group mail thread where I can find a detailed discussion of
this decision?

Also, is there any configuration that can be used to make the query fail fast
instead of returning null values? Is any future development planned along
these lines?




--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]


Re: Reasoning behind fail safe behaviour of cast expression

vatsal
The example query didn't get posted in the previous email.
Adding it here:

spark.sql("select cast((id || 'name') as int) from range(10)").show
+---------------------------------------------+
|CAST(concat(CAST(id AS STRING), name) AS INT)|
+---------------------------------------------+
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
|                                         null|
+---------------------------------------------+
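For readers without a Spark shell handy, the per-record null-on-failure
semantics shown in the output above can be mimicked in plain Python. This is
only an illustration of the semantics, not Spark code; the helper name
safe_cast_int is made up for the sketch:

```python
def safe_cast_int(value):
    """Mimic Spark's fail-safe CAST: return None instead of raising."""
    try:
        return int(value)
    except (ValueError, TypeError):
        return None

# Strings like "0name", "1name", ... -- the same shape that
# id || 'name' produces in the query above.
rows = [f"{i}name" for i in range(10)]

# Every cast fails, so every result is None -- the whole "query"
# still completes instead of aborting on the first bad record.
print([safe_cast_int(r) for r in rows])
```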



