In Scala, I tried to pass a huge global variable into a map operation, and Spark failed with:
ERROR yarn.ApplicationMaster: User class threw exception: java.lang.*Error
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
The code looks like this:
val data = sc.textFile(inputPath).cache()

// Must be a mutable Map, otherwise `map(key) = value` does not compile
val map = scala.collection.mutable.Map[String, Int]()
for (i <- 0 to 9) {
  map(i.toString) = i
}

data.map(sample => {
  if (map.contains(sample)) "Found"
  else "Not found"
})
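The per-record logic can be checked without a Spark cluster. Below is a minimal, runnable sketch of the same map construction and lookup as plain Scala (the `MapLookupDemo` object, `buildMap`, and `classify` names are illustrative, not from the original post). Note that in real Spark jobs, a large read-only map is usually shipped to executors with `sc.broadcast` rather than captured directly in the closure, which avoids serializing it once per task.

```scala
import scala.collection.mutable

object MapLookupDemo {
  // Build the small lookup table exactly as in the question,
  // but with an explicitly mutable Map so `map(key) = value` compiles
  def buildMap(): mutable.Map[String, Int] = {
    val m = mutable.Map[String, Int]()
    for (i <- 0 to 9) {
      m(i.toString) = i
    }
    m
  }

  // The per-record logic from the RDD closure, as a plain function
  def classify(map: mutable.Map[String, Int], sample: String): String =
    if (map.contains(sample)) "Found" else "Not found"

  def main(args: Array[String]): Unit = {
    val map = buildMap()
    println(classify(map, "3"))   // key present
    println(classify(map, "42"))  // key absent
  }
}
```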
1 solution
#1

It has been solved: I changed the data type from "Array[Map[String, Int]]" to "Map[String, Int]".