Question
I'm trying to write an IntelliJ plugin that reads local files using the Hadoop HDFS API (because I eventually want to read Parquet files, and the only way to do that is through Hadoop).
I have a minimal codebase, using plugins

    plugins {
        id 'java'
        id 'org.jetbrains.intellij' version '0.4.16'
    }

dependencies

    compile("org.apache.hadoop:hadoop-client:3.2.1")
and code

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // 'file' is the local file handle obtained from the plugin context
    Configuration conf = new Configuration();
    Path path = new Path(file.getPath());
    try {
        FileSystem fs = path.getFileSystem(conf);
    } catch (Exception e) {
        System.out.println(e);
    }
My unit tests run without error, but when I build the plugin and run it within IntelliJ I get the error

    org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "file"
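For context, Hadoop discovers FileSystem implementations through java.util.ServiceLoader, which reads the META-INF/services entries visible to the thread context classloader. Here is a small diagnostic sketch I can run inside the plugin (my own snippet, not from Hadoop's docs; it assumes hadoop-common is on the plugin classpath) to see which implementations are actually visible at runtime:

    import java.util.ServiceLoader;
    import org.apache.hadoop.fs.FileSystem;

    // ServiceLoader.load(Class) uses the thread context classloader, so an
    // empty result here would point at a classloader problem rather than a
    // missing jar.
    for (FileSystem impl : ServiceLoader.load(FileSystem.class)) {
        System.out.println(impl.getClass().getName());
    }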
If I follow the suggestion in this SO question and add

    conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
then I get

    java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
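Since the class is demonstrably inside the jar, my best guess is a classloader mismatch: Hadoop's Configuration resolves class names through the thread context classloader by default, and an IntelliJ plugin runs under its own plugin classloader. A quick check I put together (hypothetical snippet; it assumes it runs from a non-static method of a plugin class):

    // Check whether the class is reachable from the two classloaders involved.
    // In an IntelliJ plugin these are typically different objects.
    String cls = "org.apache.hadoop.fs.LocalFileSystem";
    ClassLoader context = Thread.currentThread().getContextClassLoader();
    ClassLoader plugin = getClass().getClassLoader();  // the plugin's classloader
    for (ClassLoader cl : new ClassLoader[] { context, plugin }) {
        try {
            Class.forName(cls, false, cl);
            System.out.println(cl + ": found");
        } catch (ClassNotFoundException e) {
            System.out.println(cl + ": NOT found");
        }
    }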
I have tried depending on several combinations of Hadoop jars, and manually replacing all of the jars in the zip's libs dir with a single shadow jar (with mergeServiceFiles() enabled), again with no luck. Everything seems to be in the right place: I can see that the META-INF/services file is correct, the classes are all in the jar, and so on.
Source: https://stackoverflow.com/questions/60019230/unable-to-read-local-file-with-hdfs-api-in-intellij-plugin