I am having a similar problem to the original post. I am trying to run
Spark 3 and connect to CDH Hive 2.1.1, and I have set the same
spark.sql.hive.metastore options. The main difference in my environment is
that I am running Spark on K8s using spark-operator, which makes it a
little more difficult to control and debug. Depending on the configuration
I get either the 'get_table_req' problem or a different error.
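
For reference, I am passing the metastore options through the operator's
SparkApplication spec, along these lines (the jars path is a placeholder
for wherever the Hive client jars sit in my image):

spec:
  sparkConf:
    "spark.sql.hive.metastore.version": "2.1.1"
    "spark.sql.hive.metastore.jars": "/opt/hive-jars/*"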
Currently my base image was created using:
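
(roughly the following; these are typical flags for a Spark-on-K8s image
with the hadoop-provided profile, not necessarily the exact command)

./dev/make-distribution.sh --name spark-k8s --tgz \
  -Pkubernetes -Phive -Phive-thriftserver -Phadoop-provided
./bin/docker-image-tool.sh -r <repo> -t <tag> build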
I do see the Hive jars in the image despite the hadoop-provided profile,
so I have been trying different jar configurations, including or omitting
them, and compiling against Hive 2.3.7 to match Spark's default (the kind
of variants I have been trying are sketched below).
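
Concretely, the variants look like this in sparkConf (the CDH jar path is
a placeholder):

# variant 1: Spark's built-in Hive 2.3.7 client
"spark.sql.hive.metastore.version": "2.3.7"
"spark.sql.hive.metastore.jars": "builtin"

# variant 2: the CDH 2.1.1 client jars baked into the image
"spark.sql.hive.metastore.version": "2.1.1"
"spark.sql.hive.metastore.jars": "/opt/cloudera/hive/lib/*"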