Save hive table from spark in hive 2.1.0


Save hive table from spark in hive 2.1.0

Alejandro Reina
I am using Spark 2.2 with Scala, Hive 2.1.0, and Zeppelin on Ubuntu 16.04. In addition, I copied hive-site.xml to spark/conf/ and mysql-connector-java.jar from hive/lib to spark/jars.

I want to save a DataFrame as a Hive table, and I am doing this with:

    val hc = new org.apache.spark.sql.hive.HiveContext(sc)
    df.registerTempTable("myTempTable")
    hc.sql("create table store_sales as select * from myTempTable")
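For reference, in Spark 2.2 `HiveContext` and `registerTempTable` are deprecated; the same write can be expressed through `SparkSession` (a sketch, assuming the session is built with Hive support and `df` is the DataFrame above):

    // Spark 2.x style: SparkSession with Hive support enabled
    val spark = org.apache.spark.sql.SparkSession.builder()
      .appName("SaveToHive")
      .enableHiveSupport()
      .getOrCreate()

    // createOrReplaceTempView replaces the deprecated registerTempTable
    df.createOrReplaceTempView("myTempTable")
    spark.sql("create table store_sales as select * from myTempTable")

    // or, without going through SQL:
    // df.write.saveAsTable("store_sales")

Either form goes through the same Hive metastore client, so it does not by itself change which metastore schema version Spark writes.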

Afterwards, in my notebook, I run:

    %hive
    show tables;

I can see that my new Hive table store_sales was created, but I cannot run Hive after this.

This is my hive-site.xml:

    <configuration>
        <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://localhost/metastore?useSSL=false</value>
            <description>metadata is stored in a MySQL server</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
            <description>MySQL JDBC driver class</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>hive</value>
            <description>user name for connecting to mysql server</description>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>hive</value>
            <description>password for connecting to mysql server</description>
        </property>
        <property>
            <name>hive.execution.engine</name>
            <value>spark</value>
            <description>set hive on spark</description>
        </property>
    </configuration>

When I run hive:

    root@alex-bi:/usr/local/hive/bin# hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

    Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
        at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
        at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
        ... 9 more
    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
        ... 14 more
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
        ... 23 more
    Caused by: MetaException(message:Hive Schema version 2.1.0 does not match metastore's schema version 1.2.0 Metastore is not upgraded or corrupt)
        at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:7768)
        at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:7731)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
        at com.sun.proxy.$Proxy21.verifySchema(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:565)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:626)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
        ... 28 more

I think the problem is with the jars located in spark/jars:

    hive-beeline-1.2.1.spark2.jar
    hive-cli-1.2.1.spark2.jar
    hive-exec-1.2.1.spark2.jar
    hive-jdbc-1.2.1.spark2.jar
    hive-metastore-1.2.1.spark2.jar

because Spark writes Hive tables with metastore version 1.2.1 by default, but I don't know how to configure Spark to save Hive tables for version 2.1.0, which is what my local Hive installation uses. I replaced these jars with the version 2.1.0 ones from hive/lib, but that did not work, and I don't know what else to try; Google does not help.
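For context, Spark does expose settings for the metastore client version, so the jars do not have to be swapped by hand. In spark-defaults.conf, something like the following (a sketch; the classpath value is an assumption for this installation, and it must include both Hive and Hadoop dependencies):

    # spark-defaults.conf (sketch)
    spark.sql.hive.metastore.version  2.1.0
    spark.sql.hive.metastore.jars     /usr/local/hive/lib/*:/usr/local/hadoop/share/hadoop/common/lib/*

`spark.sql.hive.metastore.version` defaults to 1.2.1 in Spark 2.2, which is why a default build talks to the metastore with the 1.2 client.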

Thanks in advance




--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Save hive table from spark in hive 2.1.1

☼ R Nair (रविशंकर नायर)
There is an option in hive-site.xml to skip metastore schema validation: try setting the property below to false. The Hive schematool can also help.


<property>
    <name>hive.metastore.schema.verification</name>
    <value>true</value>
</property>

Best,
Ravion


On Dec 9, 2017 5:56 PM, "konu" <[hidden email]> wrote:



Re: Save hive table from spark in hive 2.1.0

Alejandro Reina
I have tried what you propose and added the property to hive-site.xml. Although with this option I can run Hive, it does not solve my problem; I'm sorry if I explained myself badly.

I need to save a DataFrame transformed in Spark into Hive with Hive's schema version 2.1.1 (the latest stable Hive release until two months ago). By default Spark internally uses version 1.2.1, and this conflicts with the other applications in my ecosystem.

The Hive table created from Spark is not visible to any other application because of its metadata version, so I need Spark to create the Hive table with the corresponding version.

Hive schematool output:
root@alex-bi:/usr/local/hive/bin# schematool -dbType mysql -info --verbose
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://localhost/metastore?useSSL=false
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Hive distribution version:       2.1.0
Metastore schema version:        1.2.0
org.apache.hadoop.hive.metastore.HiveMetaException: Metastore schema version is not compatible. Hive Version: 2.1.0, Database Schema Version: 1.2.0
org.apache.hadoop.hive.metastore.HiveMetaException: Metastore schema version is not compatible. Hive Version: 2.1.0, Database Schema Version: 1.2.0
        at org.apache.hive.beeline.HiveSchemaTool.assertCompatibleVersion(HiveSchemaTool.java:202)
        at org.apache.hive.beeline.HiveSchemaTool.showInfo(HiveSchemaTool.java:139)
        at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:498)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
*** schemaTool failed ***


If I try to upgrade the schema, I also get an error:
root@alex-bi:/usr/local/hive/bin# schematool -dbType mysql -upgradeSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:mysql://localhost/metastore?useSSL=false
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
Starting upgrade metastore schema from version 1.2.0 to 2.1.0
Upgrade script upgrade-1.2.0-to-2.0.0.mysql.sql
Error: Duplicate column name 'CQ_HIGHEST_TXN_ID' (state=42S21,code=1060)
org.apache.hadoop.hive.metastore.HiveMetaException: Upgrade FAILED!
Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***



Thank you very much for your time helping me.






Re: Save hive table from spark in hive 2.1.0

☼ R Nair (रविशंकर नायर)
Hi,

Good try. As you can see, when you run the upgrade using schematool, there is a duplicate-column error. Can you look at the generated script and edit it to avoid the duplicate column?

Not sure why the Hive developers made it this complicated; I ran into the same issues as you.

Can anyone else suggest a cleaner, better option?

Best, Ravion
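One way to do this by hand (a sketch; the exact script path ships with the Hive distribution, and the database name 'metastore' is taken from the connection URL above, so verify both first, and back up the database before touching it):

    # The per-version upgrade scripts live inside the Hive distribution
    cd /usr/local/hive/scripts/metastore/upgrade/mysql

    # Back up the metastore database before editing anything
    mysqldump -u hive -p metastore > metastore_backup.sql

    # Apply each step in order; when a statement fails with
    # "Duplicate column name", comment that statement out in a copy
    # of the script and re-run it
    mysql -u hive -p metastore < upgrade-1.2.0-to-2.0.0.mysql.sql
    mysql -u hive -p metastore < upgrade-2.0.0-to-2.1.0.mysql.sql

This is essentially what `schematool -upgradeSchema` runs, just with the chance to skip statements that conflict with columns the 1.2/2.1 mixture already created.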


On Dec 10, 2017 6:17 AM, "Alejandro Reina" <[hidden email]> wrote:


Re: Save hive table from spark in hive 2.1.0

Alejandro Reina
I did what you said and I was finally able to upgrade the schema. But you're right, it's very messy: I had to modify almost all of the scripts. The problems come from already having tables from the previous version; many of the tables and columns the scripts try to add already exist, which produces many errors, but after modifying the scripts everything is OK.

A cleaner way would be for Spark to save the Hive table in the corresponding version directly, but I do not know how to do that.
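For that last point, one direction worth trying (an untested sketch; the jar classpath is an assumption for this installation) is to set the metastore client version when building the session, so Spark talks to the metastore with a 2.1 client instead of its bundled 1.2.1 one:

    val spark = org.apache.spark.sql.SparkSession.builder()
      .appName("SaveToHive21")
      // must be set before the session first touches Hive
      .config("spark.sql.hive.metastore.version", "2.1.0")
      .config("spark.sql.hive.metastore.jars", "/usr/local/hive/lib/*")
      .enableHiveSupport()
      .getOrCreate()

These are documented Spark SQL options; whether they fully remove the cross-version visibility problem for other applications would need testing.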

Thank you very much ^^


