MapReduce Programming Example 3

Posted: 2023-03-08 20:14:13

MapReduce programming examples in this series:

MapReduce Programming Example (1): running the first MapReduce program, WordCount, in an integrated environment, with code analysis

MapReduce Programming Example (2): computing students' average scores

MapReduce Programming Example (3): data deduplication

MapReduce Programming Example (4): sorting

MapReduce Programming Example (5): single-table join with MapReduce

Input:

2013-11-01 aa
2013-11-02 bb
2013-11-03 cc
2013-11-04 aa
2013-11-05 dd
2013-11-06 dd
2013-11-07 aa
2013-11-09 cc
2013-11-10 ee

2013-11-01 bb 
2013-11-02 33 
2013-11-03 cc
2013-11-04 bb
2013-11-05 23 
2013-11-06 dd
2013-11-07 99 
2013-11-09 99
2013-11-10 ee

.....

.....

.....

The input contains duplicate lines. In the map phase, each whole line is emitted as the key with an arbitrary (here empty) value. After the shuffle, every set of identical lines arrives at the reducer grouped under a single key, so the reducer only needs to write out each key once and the duplicates disappear.
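For example, the line 2013-11-03 cc appears in both input files above, so the shuffle collapses it into a single reduce call, roughly like this:

    map output:    (2013-11-03 cc, "")  and  (2013-11-03 cc, "")
    reduce input:  (2013-11-03 cc, ["", ""])
    reduce output:  2013-11-03 cc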

The code is simple enough that it hardly needs explanation; here it is:

package com.t.hadoop;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

/**
 * Data deduplication
 * @author daT dev.tao@gmail.com
 */
public class Dedup {

    // Emit every input line as the key; the value does not matter.
    public static class MyMapper extends Mapper<Object, Text, Text, Text> {
        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(value, new Text(""));
        }
    }

    // Identical lines arrive grouped under one key; write the key once.
    public static class MyReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void reduce(Text key, Iterable<Text> value, Context context)
                throws IOException, InterruptedException {
            context.write(key, new Text(""));
        }
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length < 2) {
            System.out.println("parameter errors!");
            System.exit(2);
        }

        Job job = new Job(conf, "Dedup");
        job.setJarByClass(Dedup.class);
        job.setMapperClass(MyMapper.class);
        // The reducer is idempotent, so it can also serve as the combiner.
        job.setCombinerClass(MyReducer.class);
        job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
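To run the job, package the class into a jar and submit it with hadoop jar; the jar name and HDFS paths below are only placeholders:

    hadoop jar dedup.jar com.t.hadoop.Dedup /user/hadoop/dedup/in /user/hadoop/dedup/out

The input directory should contain the files shown above; the deduplicated lines end up in part-r-00000 under the output directory.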

Output:
2013-11-01 aa
2013-11-01 bb 
2013-11-02 33 
2013-11-02 bb
2013-11-03 cc
2013-11-03 cc 
2013-11-04 98
2013-11-04 aa
2013-11-04 bb
2013-11-05 23 
2013-11-05 93
2013-11-05 dd
2013-11-06 99
2013-11-06 dd
2013-11-07 92
2013-11-07 99 
2013-11-07 aa
2013-11-09 99
2013-11-09 aa 
2013-11-09 cc
2013-11-10 ee

Copyright notice: this is an original post by the author and may not be reproduced without the author's permission.