When a Spark job reads external data sources such as Hive, HBase, or text files into a DataFrame, a common pattern is to map over the rows and get each field. If a field in the original data is null and you convert it to a String without checking, the job throws a java.lang.NullPointerException.
Example code:
val data = spark.sql(sql)
val rdd = data.rdd.map(record => {
  val recordSize = record.size
  for (i <- 0 until recordSize) {
    val str = record.get(i).toString  // throws NullPointerException if field i is null
    // do something...
  }
})
To avoid this, add a null check before converting the field, as shown below:
val data = spark.sql(sql)
val rdd = data.rdd.map(record => {
  val recordSize = record.size
  for (i <- 0 until recordSize) {
    // isNullAt is checked first; short-circuit && guarantees toString
    // is only called on a non-null value
    if (!record.isNullAt(i) && !record.get(i).toString.isEmpty) {
      val str = record.get(i).toString
      // do something...
    }
  }
})
record.isNullAt(i) checks whether the value of the i-th field is null; only when it is not null do we go on to call isEmpty to filter out empty strings.
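The same null-and-empty check can also be factored into a small pure function using Scala's Option, which keeps the per-field logic out of the map body. This is a sketch independent of Spark; the helper name safeString is illustrative, not part of any API:

```scala
// Hypothetical helper: wraps a possibly-null field value in an Option,
// converts it to String, and drops empty strings.
// Option(null) yields None, so toString is never called on null.
def safeString(value: Any): Option[String] =
  Option(value).map(_.toString).filter(_.nonEmpty)
```

Inside the map over rows you could then write something like `(0 until record.size).flatMap(i => safeString(record.get(i)))` to collect only the non-null, non-empty field values.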