How to enable hive support on an existing Spark session?


Kun Huang (COSMOS)

Hi Spark experts,

I am seeking an approach to enable Hive support manually on an existing Spark session.

Currently, HiveContext seems the best fit for my scenario. However, that class has already been marked as deprecated, and the recommended replacement, SparkSession.builder.enableHiveSupport(), must be called before the Spark session is created.
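For reference, the recommended pattern looks like this (a minimal PySpark sketch; the app name is hypothetical, and it assumes a Spark build with Hive classes on the classpath):

```python
# Minimal sketch: Hive support has to be enabled when the session
# is first built, via the SparkSession builder.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-example")   # hypothetical app name
    .enableHiveSupport()       # sets spark.sql.catalogImplementation=hive
    .getOrCreate()
)

# Should report "hive" when Hive support took effect.
print(spark.conf.get("spark.sql.catalogImplementation"))
```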

I wonder if there are other workarounds.

Thanks,
Kun
Re: How to enable hive support on an existing Spark session?

Harsh
Hi Kun, 

You can set the following Spark property when launching the app, instead of enabling it manually in the code.

spark.sql.catalogImplementation=hive
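For example, the property can be passed on the command line at launch time (a sketch; the application script name is hypothetical). Note that spark.sql.catalogImplementation is a static SQL config, so it cannot be changed on an already-running session:

```shell
# Launch with the Hive catalog instead of the default in-memory catalog.
# The same --conf flag works for spark-shell and pyspark as well.
spark-submit \
  --conf spark.sql.catalogImplementation=hive \
  my_app.py   # hypothetical application script
```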


Kind Regards
Harsh

On Tue, May 26, 2020 at 9:55 PM Kun Huang (COSMOS) <[hidden email]> wrote:
