17/11/28 09:49:24 ERROR RetryingBlockFetcher: Exception while beginning fetch of 1 outstanding blocks java.io.IOException: Failed to connect to /172.17.0.2:45442
  File "/Users/seki/git/dyd/data_foundation/python/com/dyd/data/test/TestRecommend.py", line 158, in <lambda>
    group_by_post_colums_sum = data_T.apply(lambda x:sum(x))
  File "/Users/seki/git/learn/spark/python/lib/pyspark.zip/pyspark/sql/functions.py", line 40, in _
    jc = getattr(sc._jvm.functions, name)(col._jc if isinstance(col, Column) else col)
AttributeError: ("'NoneType' object has no attribute '_jvm'", u'occurred at index 6293615542250297450')
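The traceback shows that `sum` inside the lambda resolved to pyspark.sql.functions.sum (typically after a `from pyspark.sql.functions import *`), which goes through the driver's SparkContext and fails where `sc` is None. A minimal Spark-free reproduction of the mechanism — `sql_sum` is a hypothetical stand-in for the pyspark function, not its real implementation:

```python
import builtins

# Hypothetical stand-in for pyspark.sql.functions.sum: it resolves the JVM
# aggregate through the SparkContext (sc), so on a code path where sc is
# None it fails exactly like the traceback above.
def sql_sum(col):
    sc = None                      # no live SparkContext on this code path
    return getattr(sc._jvm.functions, "sum")(col)

# `from pyspark.sql.functions import *` silently rebinds the name `sum`:
sum = sql_sum

try:
    sum([1, 2, 3])                 # what the lambda in apply() ends up calling
except AttributeError as e:
    print(e)                       # 'NoneType' object has no attribute '_jvm'

# Fix: call the builtin explicitly, or avoid the wildcard import
# (e.g. `import pyspark.sql.functions as F` and use F.sum only in SQL exprs).
print(builtins.sum([1, 2, 3]))     # 6
```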
Installing a native BLAS (OpenBLAS) — install log:
update-alternatives: using /usr/lib/atlas-base/atlas/libblas.so.3 to provide /usr/lib/libblas.so.3 (libblas.so.3) in auto mode
Setting up libopenblas-base (0.2.8-6ubuntu1) ...
update-alternatives: using /usr/lib/openblas-base/libblas.so.3 to provide /usr/lib/libblas.so.3 (libblas.so.3) in auto mode
Caused by: java.lang.NullPointerException: Cannot suppress a null exception.
	at java.lang.Throwable.addSuppressed(Throwable.java:1046)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:243)
Cause: the MySQL account has no write (INSERT) permission.
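If the missing privilege is indeed the root cause, granting write access to the account used by the JDBC writer clears this error path. The user, host, and database names below are placeholders:

```sql
-- Placeholders: replace app_user, app_host, and app_db with the real values.
GRANT INSERT, UPDATE, DELETE ON app_db.* TO 'app_user'@'app_host';
FLUSH PRIVILEGES;
```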
java.io.IOException: No space left on device
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:326)
	at org.apache.spark.storage.TimeTrackingOutputStream.write(TimeTrackingOutputStream.java:58)
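Shuffle and spill files land under spark.local.dir (default /tmp), so a common mitigation is pointing it at larger disks — noting that on YARN the NodeManager's yarn.nodemanager.local-dirs takes precedence. A spark-defaults.conf fragment; the paths are examples:

```
# spark-defaults.conf — put shuffle/spill scratch space on large disks.
# Comma-separated list spreads I/O across devices; paths are examples.
spark.local.dir  /data1/spark-tmp,/data2/spark-tmp
```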
org.apache.spark.shuffle.FetchFailedException: /data/yarn/local/usercache/root/appcache/application_1514973560079_1031/blockmgr-b8eb9d9c-e6e2-4c4c-a5ac-5120a52b1e32/1d/shuffle_9_1_0.index (No such file or directory)
Slow ALS recommendations in Spark.
Fix: adjust the blockSize used by blockify (which batches factor vectors into blocks before scoring user-item pairs).
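The idea behind the blockSize knob can be sketched without Spark: score users against items one block of users at a time, so that in the real implementation each block becomes one large BLAS call instead of many small ones. Pure-Python stand-in with illustrative names, not Spark APIs:

```python
# Sketch of the blockify idea: process users in blocks of `block_size`
# when computing top-N recommendations. In Spark MLlib this batching turns
# into fewer, larger matrix multiplications; block size is the tuning knob.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def recommend_blocked(user_factors, item_factors, top_n, block_size):
    """Return the top_n item indices for each user, block by block."""
    recs = []
    for start in range(0, len(user_factors), block_size):
        for u in user_factors[start:start + block_size]:  # one user block
            scores = sorted(
                ((dot(u, v), j) for j, v in enumerate(item_factors)),
                reverse=True,
            )
            recs.append([j for _, j in scores[:top_n]])
    return recs

users = [[1.0, 0.0], [0.0, 1.0]]
items = [[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]]
print(recommend_blocked(users, items, top_n=2, block_size=1))  # [[0, 2], [1, 2]]
```

Results are identical for any block_size; only the batching (and thus throughput) changes.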
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 222.0 failed 4 times, most recent failure: Lost task 1.3 in stage 222.0 (TID 9505, uhadoop-adarbt-core1): java.lang.ArrayIndexOutOfBoundsException at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
Debugging a locally built Spark 2.3.1: java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = -3720498261147521051, local class serialVersionUID = -6655865447853211720 at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
Fix: after compiling the core module, copy the jar from its target directory, original-spark-core_2.11-2.3.1-SNAPSHOT.jar, over the corresponding spark-core jar under the assembly project's scala-2.11/jars, so both sides deserialize the same class versions.