[apache-spark] Spark Kill Running Application

This might not be the ideal or preferred solution, but it helps in environments where you cannot access the console to kill the job with the yarn application command.
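For reference, when console access is available, the usual approach is to kill the YARN application directly. A minimal sketch, where `<application_id>` is a placeholder for your job's actual ID:

```
# List running applications to find the ID of the Spark job
yarn application -list

# Kill the application; replace <application_id> with the real ID
yarn application -kill <application_id>
```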

The steps are:

1. Go to the Application Master page (the Spark web UI) of the running Spark job.
2. Click on the Jobs section.
3. Click on the active job's active stage.
4. You will see a "kill" button right next to the active stage.
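If the "kill" button does not appear, the UI kill feature may have been disabled. It is controlled by the `spark.ui.killEnabled` property, which defaults to true; a minimal sketch of setting it explicitly (this assumes you can restart the application with new configuration):

```
# conf/spark-defaults.conf (or pass via --conf on spark-submit)
# spark.ui.killEnabled defaults to true; the web UI only shows kill
# controls for jobs and stages when this is enabled.
spark.ui.killEnabled   true
```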

This works if the succeeding stages depend on the currently running stage, because killing that stage prevents the rest of the job from completing. Note, though, that it marks the job as "Killed By User".