INSERT INTO TABLE_PARAMS fails during ANALYZE TABLE


smikesh
Hi everybody,

I wanted to test CBO with histograms enabled.
In order to do this, I enabled the property spark.sql.statistics.histogram.enabled.
In this test, Derby was used as the database for the Hive metastore.
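
Here is roughly what I ran (a minimal sketch; the table and column names are just placeholders):

    SET spark.sql.statistics.histogram.enabled=true;
    ANALYZE TABLE test_table COMPUTE STATISTICS FOR COLUMNS col1, col2;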

The problem is that in some cases the values inserted into the TABLE_PARAMS table exceed the maximum length of 4000 characters:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. Put request failed : INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)

org.datanucleus.exceptions.NucleusDataStoreException: Put request failed : INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)

and then

Caused by: java.sql.SQLDataException: A truncation error was encountered trying to shrink VARCHAR 'TFo0QmxvY2smMQwAAOAXAABMl6MI8TlBBw+MWLFixgAAAP7Bn9+7oD1wpMEv&' to length 4000.
The detailed stack trace can be seen here:

https://gist.github.com/mshtelma/c5ee8206200533fc1d606964dd5a30e2
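
If I read the Derby metastore schema correctly, PARAM_VALUE in TABLE_PARAMS is declared as VARCHAR(4000), roughly like this (an approximation based on the Hive Derby schema scripts, not the exact DDL):

    CREATE TABLE TABLE_PARAMS (
        TBL_ID BIGINT NOT NULL,
        PARAM_KEY VARCHAR(256) NOT NULL,
        PARAM_VALUE VARCHAR(4000)
    );

so the serialized histogram written by ANALYZE TABLE seems to be too long to fit into that column.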

Is this a known issue?

Best,
Michael