Yesterday I finished the standalone Spark setup, but when I actually tried to connect from Python today I found it had a problem: the master and slave had never really connected. The cause was a wrong master IP address passed when starting the slave. Use netstat to check which IP address port 7077 is actually listening on; you cannot simply write 127.0.0.1.
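A minimal sketch of that check (assuming a Linux host and Spark's default sbin scripts; the worker start script is start-slave.sh in older releases and start-worker.sh in Spark 3.1+):
# on the master: see which address:port the Spark master process is bound to
netstat -tlnp | grep 7077
# on the slave: restart the worker against the address reported above, not 127.0.0.1
./sbin/start-slave.sh spark://192.168.1.20:7077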
Today I successfully used Python to connect to the Spark cluster set up yesterday and run a simple computation.
Environment:
python3
pip3
Install:
pip3 install pyspark
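A quick check that the install worked (hedged note: the pip-installed pyspark version should match the Spark version running on the cluster, at least down to major.minor, or the connection may fail):
python3 -c "import pyspark; print(pyspark.__version__)"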
Code:
cat test_connect_spark.py
from pyspark import SparkContext
logFile = "./test.txt"
# connect to the standalone Spark master
sc = SparkContext(master="spark://192.168.1.20:7077", appName="Simple App")
logData = sc.textFile(logFile).cache()
# count lines containing 'a'
numAs = logData.filter(lambda s: 'a' in s).count()
# count lines containing 'b'
numBs = logData.filter(lambda s: 'b' in s).count()
print("Lines with a: %i, lines with b: %i" % (numAs, numBs))
# shut down the connection to the cluster when done
sc.stop()
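Run it with plain python3 (spark-submit also works). One caveat: logFile is a local path, so in a multi-node cluster every worker needs to be able to read ./test.txt at that same path, or the file should live in shared storage such as HDFS. If the job hangs while connecting, re-check the master URL with the netstat check above.
python3 test_connect_spark.py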