[Spark Core]: S3A with OpenStack Swift object storage not using credentials provided in SparkConf
I am currently using Spark 2.2.0 for Hadoop 2.7.x in a standalone cluster for testing. I want to access some files and share them across the nodes of the cluster using addFile. As local directories are not supported for this, I want to use S3 to do it.
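For context, this is roughly what I am trying to do; the bucket and file names below are just placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

// placeholder app setup; bucket and file names are made up
val conf = new SparkConf().setAppName("addFile-over-s3a")
val sc = new SparkContext(conf)

// register a file from the bucket so that every node can fetch its own copy
sc.addFile("s3a://shared-bucket/lookup/data.csv")

// inside a task the local copy is then resolved via SparkFiles
sc.parallelize(1 to 4).foreach { _ =>
  val localPath = SparkFiles.get("data.csv")
  // ... open and read localPath on the executor ...
}
```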
In contrast to nearly everything I have found on the internet, I am using a self-hosted OpenStack cluster with Swift as the object storage. Accessing Swift directly would be fine too, but all the tutorials I have found seem to use Keystone v2, while our deployment uses v3.
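So instead I point the S3A connector at the S3 endpoint of our Swift installation and pass the credentials through the SparkConf, roughly like this (endpoint, keys and application name are placeholders):

```scala
import org.apache.spark.SparkConf

// endpoint and credentials are placeholders for our self-hosted setup
val conf = new SparkConf()
  .setAppName("s3a-swift-test")
  // S3 endpoint exposed by the Swift installation
  .set("spark.hadoop.fs.s3a.endpoint", "https://swift-s3.example.org")
  .set("spark.hadoop.fs.s3a.access.key", "MY_ACCESS_KEY")
  .set("spark.hadoop.fs.s3a.secret.key", "MY_SECRET_KEY")
  // non-AWS endpoints usually need path-style access
  // (only honoured by newer hadoop-aws builds)
  .set("spark.hadoop.fs.s3a.path.style.access", "true")
```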
I added the following jars:
I added them both as jars and to the classpath of each executor and driver.
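Roughly like this; the jar locations below are placeholders for the jars listed above:

```scala
import org.apache.spark.SparkConf

// jar locations are placeholders for the jars listed above
val jars = "/opt/spark/extra/jar-one.jar,/opt/spark/extra/jar-two.jar"
val classpath = jars.replace(",", ":")

val conf = new SparkConf()
  // ship the jars with the application so the executors can fetch them
  .set("spark.jars", jars)
  // add them to the executor classpath
  .set("spark.executor.extraClassPath", classpath)
  // the driver classpath normally has to be set before the driver JVM starts,
  // e.g. via --driver-class-path or spark-defaults.conf
  .set("spark.driver.extraClassPath", classpath)
```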
When I try to access an S3 bucket, the following exception occurs: "Unable to load AWS credentials from any provider in the chain".