When running a MapReduce job on Windows, the following exception is thrown:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:606)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:998)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:160)
at org.apache.hadoop.util.DiskChecker.checkDirInternal(DiskChecker.java:100)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:77)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:315)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:378)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:152)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:133)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:117)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:124)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:171)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:760)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:253)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at test.mapreduce.wordcount.WordcountDriver.main(WordcountDriver.java:56)
After searching for related issues, I solved it as follows.
Prerequisites:
Make sure the files under hadoop/bin are complete. The binaries downloaded directly from Apache are missing the Windows native files (such as hadoop.dll and winutils.exe); you can either build them yourself or download prebuilt ones from GitHub.
Make sure everything is consistently 32-bit or 64-bit: the JVM and the native binaries must match (a quick sanity check is sketched below).
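A minimal sanity check, not from the original post: it only verifies that hadoop.dll and winutils.exe exist under HADOOP_HOME/bin, falling back to the D:/DevelopTools/hadoop-2.9.2 path used later in this post; the class name is made up for illustration.

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class HadoopNativeCheck {
    public static void main(String[] args) {
        // Prefer HADOOP_HOME from the environment, otherwise fall back to the local install path
        String home = System.getenv("HADOOP_HOME");
        Path bin = Paths.get(home != null ? home : "D:/DevelopTools/hadoop-2.9.2", "bin");
        for (String name : new String[] {"hadoop.dll", "winutils.exe"}) {
            Path file = bin.resolve(name);
            System.out.println(file + (Files.exists(file) ? "  -> found" : "  -> MISSING"));
        }
    }
}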
There are two ways to fix it.
1. Manually load hadoop.dll and point Hadoop at the installation directory
static {
    try {
        // Set the HADOOP_HOME directory
        System.setProperty("hadoop.home.dir", "D:/DevelopTools/hadoop-2.9.2/");
        // Load the native library
        System.load("D:/DevelopTools/hadoop-2.9.2/bin/hadoop.dll");
    } catch (UnsatisfiedLinkError e) {
        System.err.println("Native code library failed to load.\n" + e);
        System.exit(1);
    }
}
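A sketch of where the block goes, assuming the driver class named in the stack trace above: the static initializer runs before main(), so the native library is loaded before the job is submitted (the body of main() is omitted here).

package test.mapreduce.wordcount;

public class WordcountDriver {

    static {
        try {
            // Set the HADOOP_HOME directory and load hadoop.dll before any Hadoop code runs
            System.setProperty("hadoop.home.dir", "D:/DevelopTools/hadoop-2.9.2/");
            System.load("D:/DevelopTools/hadoop-2.9.2/bin/hadoop.dll");
        } catch (UnsatisfiedLinkError e) {
            System.err.println("Native code library failed to load.\n" + e);
            System.exit(1);
        }
    }

    public static void main(String[] args) throws Exception {
        // ... configure the Job, set the mapper/reducer, then call job.waitForCompletion(true) ...
    }
}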
2. Override the Hadoop source code with a patched copy
Copy org.apache.hadoop.io.nativeio.NativeIO into your own project, keeping exactly the same package path as in the Hadoop source.
Then change the return value of its access() method: in the original code, access() delegates to the native access0() method; replace the body so it simply returns true:
private static native boolean access0(String var0, int var1);

public static boolean access(String path, NativeIO.Windows.AccessRight desiredAccess) throws IOException {
    return true;
}
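This works because the patched copy has the same fully qualified class name as the one shipped in hadoop-common; in a typical Maven/IDE setup the project's own classes are resolved before the dependency jars, so the failing native access0() call is never reached. The layout below is only illustrative:

src/main/java/
    org/apache/hadoop/io/nativeio/NativeIO.java        (copied and patched)
    test/mapreduce/wordcount/WordcountDriver.java       (the job driver)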