Error while creating table with space, with/without partition


abhijeet bedagkar


I am facing a weird situation wherein an INSERT OVERWRITE query gives no error when executed against a partitioned table that contains a column with a space in its name. The following queries run without error:


CREATE TABLE TEST_PART (`col1 ` STRING) PARTITIONED BY (`col2` STRING) STORED AS PARQUET;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.max.dynamic.partitions = 2500;
INSERT OVERWRITE TABLE TEST_PART PARTITION (col2) select 'test' as col1, 'test2' as col2;
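For reference, this is roughly how the same partitioned case can be driven from spark-shell; this is only a minimal sketch, assuming a Hive-enabled SparkSession named spark (the SQL is exactly the set of statements above):

// Sketch: the partitioned case run through the SQL API, assuming an
// existing Hive-enabled SparkSession `spark`. Note the deliberate
// trailing space in the column name `col1 `.
spark.sql("CREATE TABLE TEST_PART (`col1 ` STRING) PARTITIONED BY (`col2` STRING) STORED AS PARQUET")
spark.sql("set hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("set hive.exec.max.dynamic.partitions = 2500")
// This insert completes without any error.
spark.sql("INSERT OVERWRITE TABLE TEST_PART PARTITION (col2) select 'test' as col1, 'test2' as col2")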


A similar table without partitions results in an AnalysisException being thrown: 'Attribute name "col1 " contains invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.;'. Below is the corresponding set of queries:


CREATE TABLE TEST (`col1 ` STRING, `col2` STRING) STORED AS PARQUET;
INSERT OVERWRITE TABLE TEST select 'test' as col1, 'test2' as col2;


However, I do get the same AnalysisException when running a SELECT * against the TEST_PART table, whereas for TEST the INSERT OVERWRITE query itself errors out. Any idea why this would happen?
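For reference, the read-side failure looks like this from spark-shell (again only a sketch, assuming the same spark session and the TEST_PART table populated by the first set of queries):

// Sketch: reading TEST_PART back; this is the query that fails with the
// AnalysisException quoted above about the attribute name "col1 ".
spark.sql("SELECT * FROM TEST_PART").show()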


We are using Spark 2.0.0. The same behavior has been noticed in Spark 2.2.1 as well.