Question
I cannot solve this exception; I've read the Hadoop docs and all related Stack Overflow questions I could find.
My fileSystem.mkdirs(***) throws:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:316)
...
I am including the following dependencies in my app (via Maven pom.xml), all in version 2.6.0-cdh5.13.0: hadoop-common, hadoop-hdfs, hadoop-client, hadoop-minicluster.
My filesystem variable is a valid (hadoop-common) FileSystem (org.apache.hadoop.fs.FileSystem).
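For context, this is roughly the shape of the code that fails (a minimal sketch; the output path is a placeholder, not my real value):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkdirsRepro {
    public static void main(String[] args) throws Exception {
        // The default configuration resolves to the local file system (file:///) on my machine.
        Configuration conf = new Configuration();
        FileSystem fileSystem = FileSystem.get(conf);
        // This is the call that throws the UnsatisfiedLinkError: on Windows,
        // RawLocalFileSystem delegates directory creation to NativeIO$Windows.
        fileSystem.mkdirs(new Path("C:/Temp/output"));
    }
}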
I downloaded the Hadoop binaries from https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin. I stored winutils.exe and all the other files of version 2.6.0 on my local file system under C:\Temp\hadoop\bin. I set the environment variable HADOOP_HOME to C:\Temp\hadoop (yes, not the path to the bin directory).
The fallback ("using builtin-java classes") is not used; instead I am getting:
145 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
147 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
(See https://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html)
I understand that this exception can be caused by a Hadoop version mismatch, but I checked that the imported Hadoop matches the Hadoop I stored locally, version-wise.
I am working on a Windows 10 x64 system, in IntelliJ.
Does anybody have an idea what I could check, or even what I am doing wrong?
UPDATE: I run my main with the following VM options:
-Dhadoop.home.dir=C:/Temp/hadoop
-Djava.library.path=C:/Temp/hadoop/bin
Without specifying the lib path, I get:
org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
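(For reference: hadoop.home.dir can also be set from code, as long as that runs before the first Hadoop class is loaded, whereas java.library.path is normally only picked up at JVM startup. A sketch, path assumed:)

// Must run before any Hadoop class (e.g. Shell, NativeCodeLoader) is initialized.
System.setProperty("hadoop.home.dir", "C:/Temp/hadoop");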
Answer 1:
For me, setting the VM argument -Djava.library.path=C:\devTools\winutils-master\hadoop-3.0.0 resolved the issue.
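If it helps, you can confirm from code whether the native library was actually picked up before calling mkdirs (a sketch using Hadoop's own helpers; adjust the check to your setup):

import org.apache.hadoop.io.nativeio.NativeIO;
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // Both should print true once hadoop.dll/winutils.exe are found and match the jar version.
        System.out.println("native-hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
        System.out.println("NativeIO available:   " + NativeIO.isAvailable());
    }
}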
Answer 2:
The reason for this exception was:
I am importing 2.6.0-cdh5.13.0 via my maven pom, but I downloaded the pre-built files in version 2.6.0. Those are missing the changes made in the cdh5.13.0 variant (CDH is Cloudera’s platform that includes the Hadoop ecosystem). Hence, the versions are indeed in conflict.
If I import hadoop-common, hadoop-hdfs and hadoop-client in version 2.6.0 instead of 2.6.0-cdh5.13.0, the exception disappears (and I don't even need to set the VM options).
See http://archive-primary.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.13.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
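In pom.xml terms, the fix simply keeps the dependency versions aligned with the native binaries on disk, e.g. (a sketch; only hadoop-common is shown, the other hadoop-* artifacts follow the same pattern):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>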
Answer 3:
Check your Java version. If it is a 32-bit version, you need to uninstall it and re-install a 64-bit version for Hadoop.
Check with:
java -d32 -version (no error if a 32-bit JVM is installed)
java -d64 -version (no error if a 64-bit JVM is installed)
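You can also check the bitness from inside the JVM (a sketch; sun.arch.data.model is HotSpot-specific and may be absent on other JVMs):

public class BitnessCheck {
    public static void main(String[] args) {
        // Prints "64" on a 64-bit HotSpot JVM and "32" on a 32-bit one.
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
        // Prints e.g. "amd64" on 64-bit Windows JVMs and "x86" on 32-bit ones.
        System.out.println("os.arch = " + System.getProperty("os.arch"));
    }
}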
Source: https://stackoverflow.com/questions/51282184/java-lang-unsatisfiedlinkerror-org-apache-hadoop-io-nativeio-nativeiowindows-c