Why does Spark 2.2.1 still bundle old Hive jars?

Why does Spark 2.2.1 still bundle old Hive jars?

An Qin

Hi, all,

I want to include Sentry 2.0.0 in my Spark project, but it depends on Hive 2.3.2. I see that the newest Spark release, 2.2.1, still bundles old Hive jars, for example hive-exec-1.2.1.spark2.jar. Why hasn't it upgraded to a newer Hive? Are the two compatible?
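
For illustration, here is a minimal sketch of the kind of build change I have in mind (assuming an sbt build; the Sentry artifact name and the exclusions below are only my guess at how to keep its transitive Hive 2.3.2 jars off the classpath, not a verified setup):

// build.sbt -- sketch only
libraryDependencies ++= Seq(
  // Spark 2.2.1 ships its own forked Hive jars (e.g. hive-exec-1.2.1.spark2)
  "org.apache.spark" %% "spark-sql"  % "2.2.1" % "provided",
  "org.apache.spark" %% "spark-hive" % "2.2.1" % "provided",
  // Sentry 2.0.0 pulls in Hive 2.3.2 transitively; exclude those artifacts
  // so they do not shadow the Hive classes bundled with Spark
  ("org.apache.sentry" % "sentry-binding-hive" % "2.0.0")
    .exclude("org.apache.hive", "hive-exec")
    .exclude("org.apache.hive", "hive-metastore")
    .exclude("org.apache.hive", "hive-common")
)

Even with such exclusions, I do not know whether Sentry classes built against Hive 2.3.2 can work with Spark's forked Hive 1.2.1 classes, which is why I am asking.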

 

Regards,
Qin An.

Re: Why does Spark 2.2.1 still bundle old Hive jars?

Jacek Laskowski

Regards,
Jacek Laskowski
----
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
