How to create security filter for Spark UI in Spark on YARN

Jhon Anderson Cardenas Diaz
Environment:
AWS EMR, YARN cluster mode.

Description:
I am trying to use a Java servlet filter to protect access to the Spark UI, by setting the property spark.ui.filters. The problem is that when Spark runs in YARN mode, that property is always overridden by Hadoop with the filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:

spark.ui.filters: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter

And these properties are automatically added:

spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS: ip-x-x-x-226.eu-west-1.compute.internal
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES: http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx
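
For reference, this is roughly the kind of filter and configuration I am trying to apply. It is only a sketch: the package, class name, header name, and token value are placeholders, not my actual code. Filter parameters follow the documented spark.<filter class>.param.<name> convention, the same one visible in the AmIpFilter properties above.

package com.example.filters;  // placeholder package

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal security filter: rejects any request that does not carry the expected token.
public class SimpleAuthFilter implements Filter {

    private String expectedToken;

    @Override
    public void init(FilterConfig conf) throws ServletException {
        // Init parameters arrive via spark.<filter class>.param.<name> properties.
        expectedToken = conf.getInitParameter("token");
    }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        String token = request.getHeader("X-Auth-Token");  // placeholder header name
        if (expectedToken != null && expectedToken.equals(token)) {
            chain.doFilter(req, res);  // authenticated: continue to the Spark UI
        } else {
            response.sendError(HttpServletResponse.SC_FORBIDDEN, "Access denied");
        }
    }

    @Override
    public void destroy() { }
}

And the configuration I pass on submission (again, placeholder names):

spark-submit \
  --conf spark.ui.filters=com.example.filters.SimpleAuthFilter \
  --conf spark.com.example.filters.SimpleAuthFilter.param.token=s3cr3t \
  ...

On YARN, however, spark.ui.filters ends up replaced by AmIpFilter as shown above, so my filter is never applied.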


Any suggestion on how to add a Java security filter so that Hadoop does not override it, or alternatively how to configure this kind of security from the Hadoop side?

Thanks.