[apache-spark] How do I set the driver's python version in spark?

I ran into the same error message and tried the three approaches mentioned in the answers above. I'm listing my results as a complementary reference for others.

  1. Changing the `PYSPARK_PYTHON` and `PYSPARK_DRIVER_PYTHON` values in `spark-env.sh` did not work for me (first sketch below).
  2. Setting the values inside the Python script with `os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3.5"` and `os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3.5"` did not work for me (second sketch below).
  3. Setting the values in `~/.bashrc` worked like a charm (third sketch below).
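
For completeness, this is roughly what attempt 1 looked like. The file lives under Spark's `conf/` directory; the interpreter path is the same one used in attempt 2, so adjust it for your install:

```bash
# conf/spark-env.sh -- this did not take effect in my case
export PYSPARK_PYTHON=/usr/bin/python3.5
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.5
```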
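Attempt 2 was the following. Note that the assignments have to run before the `SparkContext` is created, and my understanding is that `PYSPARK_DRIVER_PYTHON` is normally read by the launcher scripts, which may be why setting it from inside an already-running driver had no effect. The `appName` is just a placeholder:

```python
import os

# These must be set before the SparkContext is created,
# otherwise they are ignored.
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3.5"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3.5"

from pyspark import SparkContext

sc = SparkContext(appName="python-version-check")  # placeholder app name
print(sc.pythonVer)  # reports the driver's Python version
sc.stop()
```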
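And attempt 3, the one that worked, was just adding these exports to `~/.bashrc`:

```bash
# ~/.bashrc -- this is the variant that worked for me
export PYSPARK_PYTHON=/usr/bin/python3.5
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.5
```

Remember to run `source ~/.bashrc` (or open a new shell) before launching pyspark again.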