I have a Docker-based cluster in which I am trying to schedule Spark jobs with Airflow. Airflow and Spark run in separate containers. However, I cannot run a Spark job through Airflow.
Below is my Airflow script:
from airflow import DAG
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator