Error message
17/07/06 17:00:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
17/07/06 17:00:27 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
17/07/06 17:00:27 INFO input.FileInputFormat: Total input paths to process : 1
17/07/06 17:00:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/06 17:00:27 WARN snappy.LoadSnappy: Snappy native library not loaded
17/07/06 17:00:27 INFO mapred.JobClient: Running job: job_201707060106_0012
17/07/06 17:00:28 INFO mapred.JobClient: map 0% reduce 0%
17/07/06 17:00:34 INFO mapred.JobClient: Task Id : attempt_201707060106_0012_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: com.mapreduce.MapUtil
The problem is that the project has no job jar. job.setJarByClass(...) can only locate a jar when the driver class was itself loaded from one; when the job is launched from the IDE, the compiled classes sit in a plain directory, so no jar is shipped to the cluster and the TaskTracker cannot load com.mapreduce.MapUtil.
Solution
Add conf.set("mapred.jar", "hadooptest.jar"); to the Configuration, where hadooptest.jar is the name of the exported jar; the key "mapred.jar" itself stays unchanged.
Export the project as a jar file, place it in the project root directory, and run again; the problem is resolved.
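The "No job jar file set" warning above also names two equivalent APIs, JobConf(Class) and JobConf#setJar(String). A minimal sketch of both, assuming the classic org.apache.hadoop.mapred API (the JarSettingExample class name is made up for illustration):

package com.mapreduce;

import org.apache.hadoop.mapred.JobConf;

// Hypothetical illustration of the two alternatives the warning mentions.
public class JarSettingExample {

    // JobConf(Class): Hadoop searches the classpath for the jar that
    // contains the given class and ships that jar with the job.
    public static JobConf viaClass() {
        return new JobConf(MapReduceMain.class);
    }

    // JobConf#setJar(String): name the exported jar explicitly, relative to
    // the working directory; same effect as conf.set("mapred.jar", ...).
    public static JobConf viaExplicitJar() {
        JobConf conf = new JobConf();
        conf.setJar("hadooptest.jar");
        return conf;
    }
}

Note that when a job is submitted with the hadoop jar command instead of from the IDE, the job jar is set automatically and none of this is necessary.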
Source code
package com.mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MapReduceMain {

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("mapred.job.tracker", "test1:9001");
        // Ship the exported jar with the job so the TaskTrackers can load user classes.
        conf.set("mapred.jar", "hadooptest.jar");
        try {
            Job job = new Job(conf);
            job.setJarByClass(MapReduceMain.class);     // set the driver class
            job.setMapperClass(MapUtil.class);          // set the map class
            job.setReducerClass(ReduceUtil.class);      // set the reduce class
            job.setOutputKeyClass(Text.class);          // set the output key type
            job.setOutputValueClass(IntWritable.class); // set the output value type
            //job.setNumReduceTasks(1);                 // set the number of reduce tasks, default 1
            // directory holding the input data
            FileInputFormat.addInputPath(job, new Path("hdfs://test1:9000/input/"));
            // directory where the MapReduce output is written
            FileOutputFormat.setOutputPath(job, new Path("hdfs://test1:9000/output/"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
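The driver references MapUtil and ReduceUtil, which are not shown in this post. Given the Text/IntWritable output types, they are presumably a word-count style Mapper and Reducer; a hypothetical sketch of what they might look like (only the class names come from the driver, the bodies are assumptions):

package com.mapreduce;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: emits (word, 1) for every token in each input line.
public class MapUtil extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokens = new StringTokenizer(value.toString());
        while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE);
        }
    }
}

package com.mapreduce;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Hypothetical reducer: sums the counts emitted by the mapper for each key.
public class ReduceUtil extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        result.set(sum);
        context.write(key, result);
    }
}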