Spark Streaming Kerberos Issue

Spark Streaming Kerberos Issue

khajaasmath786
Hi,

I have written a Spark Streaming job that runs successfully for more than 36 hours, but at around the 36-hour mark it fails with a Kerberos issue. Any suggestions on how to resolve it?

org.apache.spark.SparkException: Task failed while writing rows.

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:328)

                at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)

                at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)

                at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)

                at org.apache.spark.scheduler.Task.run(Task.scala:99)

                at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)

                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

                at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired

                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:248)

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.newOutputWriter$1(hiveWriterContainers.scala:346)

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:304)

                ... 8 more

Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:216)

                at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)

                at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1440)

                at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1542)

                at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1527)

                at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:428)

                at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:421)

                at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)

                at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:421)

                at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:362)

                at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:925)

                at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)

                at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:220)

                at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:311)

                at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:287)

                at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.<init>(ParquetRecordWriterWrapper.java:65)

                at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:125)

                at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:114)

                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:260)

                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:245)

                ... 10 more

Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired

                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

                at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

                at org.apache.hadoop.util.HttpExceptionUtils.validateResponse(HttpExceptionUtils.java:157)

                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:627)

                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:585)

                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:852)

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:209)

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:205)

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:94)

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:205)

                ... 29 more


Thanks,

Asmath


Re: Spark Streaming Kerberos Issue

geoHeil
Did you pass a keytab? Is renewal enabled in your kdc?
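For context, on YARN a long-running secured job is typically submitted with `--principal` and `--keytab`, so Spark can log in from the keytab and re-obtain delegation tokens on its own. A minimal sketch — the principal, keytab path, class, and jar names below are placeholders, not values from this thread:

```shell
# Hypothetical principal/keytab; substitute your cluster's values.
PRINCIPAL="va_dflt@EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/va_dflt.keytab"

# With --principal/--keytab, Spark on YARN periodically re-obtains
# HDFS/Hive/KMS delegation tokens instead of relying on the tokens
# issued once at submit time.
SUBMIT_CMD="spark-submit \
  --master yarn --deploy-mode cluster \
  --principal $PRINCIPAL \
  --keytab $KEYTAB \
  --class com.example.StreamingJob streaming-job.jar"

# Printed rather than executed here, since the values are placeholders.
echo "$SUBMIT_CMD"
```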


Re: Spark Streaming Kerberos Issue

khajaasmath786
I passed a keytab, and renewal is enabled via a script that runs every eight hours; the user's ticket gets renewed by that script.
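A renewal script of the kind described is usually just a periodic `kinit` from the keytab; here is a minimal sketch with placeholder values. Note that, as far as I know, this only refreshes the local Kerberos ticket cache: delegation tokens (such as the kms-dt in the stack trace) that were already distributed to running executors are not renewed by an external `kinit`, which is why Spark's own `--principal`/`--keytab` mechanism is usually needed for streaming jobs.

```shell
# Hypothetical values; substitute your own principal and keytab path.
PRINCIPAL="va_dflt@EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/va_dflt.keytab"

# The command a cron entry would run every 8 hours, e.g.:
#   0 */8 * * * /usr/local/bin/renew-tgt.sh
# This renews the local TGT only, not delegation tokens held by executors.
RENEW_CMD="kinit -kt $KEYTAB $PRINCIPAL"
echo "would run every 8 hours: $RENEW_CMD"
```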


Re: Spark Streaming Kerberos Issue

geoHeil
Do you use the Oracle JDK or OpenJDK? We recently had an issue with OpenJDK: the Java security extensions (the JCE policy files) used to be installed by default, but that is no longer the case on CentOS 7.3.

Are these installed?
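A quick way to check whether the unlimited-strength JCE policy is active is to ask the JDK for the maximum allowed AES key length: 128 indicates the limited default policy, while 2147483647 indicates unlimited strength. A sketch using `jrunscript`, which ships with the JDK:

```shell
# Query the active JDK's JCE policy. 128 => limited default policy files;
# 2147483647 => unlimited-strength policy is installed.
JCE_CHECK='print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'

if command -v jrunscript >/dev/null 2>&1; then
  jrunscript -e "$JCE_CHECK"
else
  # jrunscript not on PATH here; show the command to run on cluster nodes.
  echo "run on each node: jrunscript -e '$JCE_CHECK'"
fi
```

Run this on every node that executes tasks, since executors use the node-local JDK.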

Re: Spark Streaming Kerberos Issue

khajaasmath786
We use the Oracle JDK, and we are on Unix.

On Wed, Nov 22, 2017 at 12:31 PM, Georg Heiler <[hidden email]> wrote:
Do you use oracle or open jdk? We recently had an issue with open jdk: formerly, java Security extensions were installed by default - no longer so on centos 7.3

Are these installed?

KhajaAsmath Mohammed <[hidden email]> schrieb am Mi. 22. Nov. 2017 um 19:29:
I passed keytab, renewal is enabled by running the script every eight hours. User gets renewed by the script every eight hours.

On Wed, Nov 22, 2017 at 12:27 PM, Georg Heiler <[hidden email]> wrote:
Did you pass a keytab? Is renewal enabled in your kdc?
KhajaAsmath Mohammed <[hidden email]> schrieb am Mi. 22. Nov. 2017 um 19:25:
Hi,

I have written spark stream job and job is running successfully for more than 36 hours. After around 36 hours job gets failed with kerberos issue. Any solution on how to resolve it.

org.apache.spark.SparkException: Task failed while wri\

ting rows.

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:328)

                at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)

                at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)

                at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)

                at org.apache.spark.scheduler.Task.run(Task.scala:99)

                at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)

                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

                at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: org.apache.hadoop.security.authentication.client.\

AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, re\

alUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired

                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:248)

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.newOutputWriter$1(hiveWriterContainers.scala:346)

                at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:304)

                ... 8 more

Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.securit\

y.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=15118668\

17635, sequenceNumber=1854601, masterKeyId=3392) is expired

                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:216)

                at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
                at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1440)
                at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1542)
                at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1527)
                at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:428)
                at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:421)
                at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
                at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:421)
                at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:362)
                at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:925)
                at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
                at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:220)
                at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:311)
                at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:287)
                at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.<init>(ParquetRecordWriterWrapper.java:65)
                at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:125)
                at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:114)
                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:260)
                at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:245)
                ... 10 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
                at org.apache.hadoop.util.HttpExceptionUtils.validateResponse(HttpExceptionUtils.java:157)
                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:627)
                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:585)
                at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:852)
                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:209)
                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:205)
                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:94)
                at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:205)
                ... 29 more


Thanks,

Asmath
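A side note on the timestamps (not from the original poster): the issueDate/maxDate pair embedded in the expired kms-dt string pins the token's maximum lifetime at exactly seven days, which matches the common Hadoop delegation-token max-lifetime default. Quick arithmetic:

```shell
# Timestamps copied verbatim from the expired token in the trace (ms since epoch).
issue_date=1511262017635
max_date=1511866817635
# Lifetime in days: (maxDate - issueDate) / milliseconds-per-day
echo $(( (max_date - issue_date) / 86400000 ))   # prints 7
```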




Re: Spark Streaming Kerberos Issue

khajaasmath786
Inline image 1

This is what we are on.

On Wed, Nov 22, 2017 at 12:33 PM, KhajaAsmath Mohammed <[hidden email]> wrote:
We use the Oracle JDK; we are on Unix.

On Wed, Nov 22, 2017 at 12:31 PM, Georg Heiler <[hidden email]> wrote:
Do you use the Oracle JDK or OpenJDK? We recently had an issue with OpenJDK: formerly, the Java security extensions were installed by default - no longer so on CentOS 7.3.

Are these installed?

KhajaAsmath Mohammed <[hidden email]> wrote on Wed, Nov 22, 2017 at 19:29:
I passed a keytab, and renewal is enabled by a script that runs every eight hours; the user's credentials are renewed by that script.

On Wed, Nov 22, 2017 at 12:27 PM, Georg Heiler <[hidden email]> wrote:
Did you pass a keytab? Is renewal enabled in your KDC?
KhajaAsmath Mohammed <[hidden email]> wrote on Wed, Nov 22, 2017 at 19:25:
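A hedged observation on this approach: a periodic kinit-style script refreshes the Kerberos TGT on the node where it runs, but it does not refresh the HDFS/KMS delegation tokens that were handed to the executors when the job started. For long-running YARN jobs the usual remedy is to let Spark re-obtain tokens itself by passing --principal and --keytab to spark-submit. A sketch only - the principal, keytab path, class, and jar below are hypothetical placeholders, not from this thread:

```shell
# Sketch: with --principal/--keytab, Spark periodically logs in again from
# the keytab and distributes fresh delegation tokens to the executors,
# instead of relying on an external cron-driven kinit.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal va_dflt@EXAMPLE.COM \
  --keytab /etc/security/keytabs/va_dflt.keytab \
  --class com.example.StreamingJob \
  streaming-job.jar
```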
[quoted original message and stack trace snipped - identical to the post above]





Re: Spark Streaming Kerberos Issue

geoHeil
Did you check that the security extensions are installed (JCE)?
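One quick way to answer this from the shell (an editor's suggestion, not from the thread): jrunscript ships with JDK 7/8 and can query the JVM's effective crypto policy. A reported maximum below 256 means the limited-strength policy files are in place, and AES-256 Kerberos keys cannot be decrypted:

```shell
# Prints the maximum allowed AES key length for this JVM.
# 2147483647 (or anything >= 256) => unlimited-strength policy active;
# 128 => limited policy files installed, AES-256 Kerberos keys will fail.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```

(jrunscript was removed in later JDKs; on modern JVMs the same check can be done with a one-line Java program calling Cipher.getMaxAllowedKeyLength.)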

KhajaAsmath Mohammed <[hidden email]> wrote on Wed, Nov 22, 2017 at 19:36:
[quoted thread and stack trace snipped - identical to the messages above]