Does anybody know if (and how) it's possible to get a (dev-local) Spark
installation to talk to fakes3 for s3[n|a]:// URLs?
I have managed to connect to AWS S3 from my local installation by adding
hadoop-aws and aws-java-sdk to the jars and using s3:// URLs as arguments to
SparkContext#textFile(), but I'm at a loss how to get it to work with a
fakes3 endpoint instead.
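For context, here is the kind of thing I've been trying for the s3a:// case. This is only a sketch of what I'd expect to work, not a confirmed solution: it assumes fakes3 is listening on localhost:4567 (its default port), that the credentials can be arbitrary, and that the hadoop-aws version in use supports path-style access.

```shell
# Assumptions: fakes3 running on localhost:4567, hadoop-aws + aws-java-sdk
# already on the classpath as described above.
spark-shell \
  --conf spark.hadoop.fs.s3a.endpoint=http://localhost:4567 \
  --conf spark.hadoop.fs.s3a.access.key=dummy \
  --conf spark.hadoop.fs.s3a.secret.key=dummy \
  --conf spark.hadoop.fs.s3a.path.style.access=true \
  --conf spark.hadoop.fs.s3a.connection.ssl.enabled=false

# then inside the shell, against a bucket created in fakes3:
#   sc.textFile("s3a://some-bucket/some-key").count()
```

The `spark.hadoop.` prefix forwards the properties to the underlying Hadoop configuration; the `fs.s3a.*` keys are the standard S3A connector settings. Whether fakes3 actually accepts the requests the S3A client sends is exactly the part I can't confirm.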
The only reference I've found so far is this issue, where somebody seems
to have gotten close, but unfortunately he's forgotten about the details: