INSERT INTO TABLE_PARAMS fails during ANALYZE TABLE
I wanted to test the CBO with histograms enabled, so I set the property spark.sql.statistics.histogram.enabled to true. In this test, Derby was used as the database for the Hive metastore.
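For context, a minimal sketch of the setup described above, run from a spark-shell backed by a Derby-based Hive metastore (the default for a local session); the table and column names here are hypothetical:

```scala
// Enable histogram collection for the cost-based optimizer.
spark.sql("SET spark.sql.statistics.histogram.enabled=true")

// Hypothetical Hive table with enough distinct values that the
// serialized histogram grows large.
spark.sql("CREATE TABLE t (c1 INT) USING hive")
spark.range(100000).selectExpr("CAST(id AS INT) AS c1")
  .write.insertInto("t")

// Computing column statistics serializes the histogram into the
// metastore's TABLE_PARAMS.PARAM_VALUE column, which in Derby is a
// VARCHAR(4000) -- this is where the truncation error surfaces.
spark.sql("ANALYZE TABLE t COMPUTE STATISTICS FOR COLUMNS c1")
```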
The problem is that, in some cases, the values inserted into the TABLE_PARAMS table exceed the maximum length of 4000 characters:
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. Put request failed : INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)
org.datanucleus.exceptions.NucleusDataStoreException: Put request failed : INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)
Caused by: java.sql.SQLDataException: A truncation error was encountered trying to shrink VARCHAR 'TFo0QmxvY2smMQwAAOAXAABMl6MI8TlBBw+MWLFixgAAAP7Bn9+7oD1wpMEv&' to length 4000.
The detailed stack trace can be seen here: