but when my job runs, it still complains that there are not enough resources/workers. Connecting to the master's web UI, I can see that workers have been assigned and are in the RUNNING state, but my local Spark app doesn't agree. It's as if the workers were assigned but the driver on my PC doesn't know about them.
I can use spark-submit, but I was really hoping to be able to run Spark applications directly from IDEA. Is that possible?
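For context, this is roughly how I'm creating the context in my app (a minimal sketch; the master host/port and app name are placeholders, not my actual values):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // Point the driver at the standalone master instead of local[*].
    // spark://host:7077 is the default standalone master URL format.
    val conf = new SparkConf()
      .setAppName("my-test-app")                // hypothetical app name
      .setMaster("spark://master-host:7077")    // hypothetical master host

    val sc = new SparkContext(conf)
    try {
      // Trivial job just to check that executors actually pick up tasks.
      val sum = sc.parallelize(1 to 100).reduce(_ + _)
      println(s"sum = $sum")
    } finally {
      sc.stop()
    }
  }
}
```

When run from the IDE, the driver is the IDE process itself, so the master and workers must be able to reach it back over the network; my guess is that's where my setup is going wrong.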