Spark-shell Interactive Programming -- Lin Ziyu's Spark Lab 4 (1)

Time: 2024-03-03 12:38:01
1. How many students are there in the department in total?
val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
val par = lines.map(row => row.split(",")(0))   // extract the student name (column 0)
val distinct_par = par.distinct()               // deduplicate names
distinct_par.count                              // total number of students

2. How many courses does the department offer?

val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
val par = lines.map(row => row.split(",")(1))   // extract the course name (column 1)
val distinct_par = par.distinct()               // deduplicate courses
distinct_par.count                              // total number of courses

3. What is Tom's average score across all his courses?

val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
val pare = lines.filter(row => row.split(",")(0) == "Tom")   // keep only Tom's records
pare.foreach(println)
Tom,DataBase,26
Tom,Algorithm,12
Tom,OperatingSystem,16
Tom,Python,40
Tom,Software,60
pare.map(row => (row.split(",")(0), row.split(",")(2).toInt))
    .mapValues(x => (x, 1))                              // (score, 1)
    .reduceByKey((x, y) => (x._1 + y._1, x._2 + y._2))   // (sum of scores, number of courses)
    .mapValues(x => x._1 / x._2)                         // integer average
    .collect()
//res9: Array[(String, Int)] = Array((Tom,30))
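Note that the average above uses integer division: Tom's five scores sum to 154, and 154 / 5 truncates to 30 rather than 30.8. A variant (a sketch with the same logic, only the score type changed to `toDouble`) keeps the fractional part:

```scala
// Same sum-and-count aggregation, but scores parsed as Double
// so the final division is floating-point (154 / 5.0 = 30.8).
pare.map(row => (row.split(",")(0), row.split(",")(2).toDouble))
    .mapValues(x => (x, 1))
    .reduceByKey((x, y) => (x._1 + y._1, x._2 + y._2))
    .mapValues(x => x._1 / x._2)
    .collect()
```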

4. How many courses has each student taken?
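The original post leaves this question blank. Following the same pattern as questions 1 and 2, a sketch (assuming each line of `Data01.txt` is one `name,course,score` enrollment record) is:

```scala
val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
// Each record is one enrollment, so map every line to (student, 1)
// and sum the ones per student to get a per-student course count.
val pare = lines.map(row => (row.split(",")(0), 1))
pare.reduceByKey(_ + _).collect()   // Array of (student, number of courses)
```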

 

5. How many students in the department have taken the DataBase course?

val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
val pare = lines.filter(row => row.split(",")(1) == "DataBase")   // keep only DataBase records
pare.count   // number of DataBase enrollments
res1: Long = 126

 

6. What is the average score of each course?

val lines = sc.textFile("file:///usr/local/spark/sparksqldata/Data01.txt")
val pare = lines.map(row => (row.split(",")(1), row.split(",")(2).toInt))   // (course, score)
pare.mapValues(x => (x, 1))
    .reduceByKey((x, y) => (x._1 + y._1, x._2 + y._2))   // (sum, count) per course
    .mapValues(x => x._1 / x._2)                         // integer average
    .collect()
res0: Array[(String, Int)] = Array((Python,57), (OperatingSystem,54), (CLanguage,50), 
(Software,50), (Algorithm,48), (DataStructure,47), (DataBase,50), (ComputerNetwork,51))