You need to make sure the standalone project you're launching is launched with Python 3. If you are submitting your standalone program through spark-submit then it should work fine, but if you are launching it with Python, make sure you use python3 to start your app.
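For example, a minimal sketch of the two launch styles (the script name my_app.py is just a placeholder):

```bash
# Launching directly: the interpreter you invoke here is the one the driver uses
python3 my_app.py

# Launching via spark-submit: the Python version comes from Spark's own
# configuration (PYSPARK_PYTHON / spark-env.sh), not from your shell's default
spark-submit my_app.py
```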
Also make sure you have set your env variables in ./conf/spark-env.sh (if it doesn't exist you can use spark-env.sh.template as a base).
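Something along these lines, assuming $SPARK_HOME points at your Spark install:

```bash
# Create spark-env.sh from the template shipped with Spark
cd "$SPARK_HOME/conf"
cp spark-env.sh.template spark-env.sh

# Then add these lines to spark-env.sh:
export PYSPARK_PYTHON=python3          # Python used by the workers
export PYSPARK_DRIVER_PYTHON=python3   # Python used by the driver
```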
@Kevin - I am having the same problem; could you please post your solution regarding what change you made in spark-env.sh? – Dev Patel Jun 22 '15 at 17:14
This is the right way of introducing these environment variables to Spark, instead of modifying .bashrc. – CᴴᴀZ Aug 3 at 12:31
Spark can run on Python 2, but in this case the user was trying to specify Python 3 in their question. Whichever Python version it is, it needs to be set consistently. – Holden Aug 26 at 9:45
Setting both PYSPARK_PYTHON=python3 and PYSPARK_DRIVER_PYTHON=python3 works for me. I did this using export in my .bashrc. In the end, these are the variables I create in .bashrc:
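A minimal sketch of those .bashrc lines, limited to the two variables the answer names:

```bash
# In ~/.bashrc — make both Spark's workers and its driver use python3
export PYSPARK_PYTHON=python3
export PYSPARK_DRIVER_PYTHON=python3
```

After adding them, reload the file with `source ~/.bashrc` (or open a new shell) before launching pyspark.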