Best approach to write UDF


Nicolas Paris-2

I have written Spark UDFs and I am able to use them in Spark Scala /
PySpark through the API.

I'd like to use them in spark-sql through Thrift. I tried to register
them with "create function as ''"; however, I get the
below error when calling the function:

> org.apache.spark.sql.AnalysisException: No handler for UDF/UDAF/UDTF '';

I have read (link elided) that only the
org.apache.hadoop.hive.ql.exec.UDF API works through Thrift.
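If that is the case, a minimal sketch of such a UDF might look like the
following. The class name `MyUpper` and its behavior are hypothetical
examples, not the function from the original post; it assumes the
classic org.apache.hadoop.hive.ql.exec.UDF base class, which Hive
resolves by reflection on a method named `evaluate`:

```scala
import org.apache.hadoop.hive.ql.exec.UDF

// Hypothetical example UDF: upper-cases its string input.
// Hive/Thrift finds the `evaluate` method by reflection, so the
// method name must be exactly `evaluate`.
class MyUpper extends UDF {
  def evaluate(s: String): String =
    if (s == null) null else s.toUpperCase
}
```

After packaging this into a jar and making it visible to the Thrift
server's classpath, it would presumably be registered with something
like: CREATE FUNCTION my_upper AS 'MyUpper' USING JAR '/path/to/udf.jar';
(the function name, class name, and jar path here are placeholders).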

What is the right way to write a UDF?


