I wrote a Python script and found that it raised an error when executed:
File "xh_spark/xh_result_check.py", line 87, in <module>
    a.show(1000, False)
File "/opt/huawei/apps/browser/fi-client/FusionInsight_Services_ClientConfig_master/Spark2x/spark/python/pyspark/sql/dataframe.py", line 382, in show
    print(self._jdf.showString(n, int(truncate), vertical))
UnicodeEncodeError: 'ascii' codec can't encode characters in position 1702-1705: ordinal not in range(128)
Symptom: the error is raised when the script is executed, but the same statements run fine in spark-shell.
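The root cause is that sys.stdout in the Python process falls back to the default 'ascii' codec when no UTF-8 locale is configured, so printing the non-ASCII characters in the DataFrame rows fails. A quick check (a minimal sketch, not part of the original script) is to inspect the stream encoding before calling show():

import sys
# If this prints 'ascii' or None, printing non-ASCII strings will raise
# UnicodeEncodeError under Python 2.
print(sys.stdout.encoding)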
Solution:
import sys
import codecs

# Wrap sys.stdout with a UTF-8 writer so that print() encodes non-ASCII
# characters as 'utf8' instead of the default 'ascii' codec (Python 2).
sys.stdout = codecs.getwriter('utf8')(sys.stdout)
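For reference, the codecs wrapper above is the Python 2 pattern (the traceback comes from a Python 2 pyspark). If the script were ever run under Python 3 (an assumption, not covered above), the stream can be reconfigured directly; setting PYTHONIOENCODING=utf-8 in the environment before spark-submit also works without touching the script:

import sys
# Python 3.7+: switch stdout to UTF-8 explicitly. hasattr() guards against
# streams that do not support reconfigure().
if hasattr(sys.stdout, "reconfigure"):
    sys.stdout.reconfigure(encoding="utf-8")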