How can I make a new directory in HDFS with Java?

青春惊慌失措 2021-01-07 00:11
    import java.io.IOException;
    import java.net.URISyntaxException;

    import org.apache.hadoop.conf.Configuration;

    public static void main(String[] args) throws IOException, URISyntaxException {
        Configuration config = new Configuration();
        // ...
    }
2 Answers
  • 2021-01-07 00:37

    You are missing the Apache Commons Configuration dependency. Download the jar and add it to your build/class path.
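
    If the project is built with Maven, the same dependency can be declared instead of downloading the jar by hand. A minimal sketch, assuming the 1.6 version named in the other answer (newer Hadoop builds may want a different version):

    ```xml
    <!-- Apache Commons Configuration; Hadoop's Configuration class needs it
         on the classpath, so declare it if your build does not pull it in
         transitively. -->
    <dependency>
        <groupId>commons-configuration</groupId>
        <artifactId>commons-configuration</artifactId>
        <version>1.6</version>
    </dependency>
    ```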

  • 2021-01-07 00:45

    For your issue, you have to add commons-configuration-1.6.jar.

    I have listed the necessary jars below.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class MkdirExample {
        public static void main(String[] args) throws IOException {
            Configuration config = new Configuration();
            config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
            config.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

            // Pin the FileSystem implementations so the hdfs:// and file://
            // schemes resolve correctly.
            config.set("fs.hdfs.impl",
                    org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            config.set("fs.file.impl",
                    org.apache.hadoop.fs.LocalFileSystem.class.getName());

            FileSystem dfs = FileSystem.get(config);
            String dirName = "TestDirectory";
            System.out.println("Working directory: " + dfs.getWorkingDirectory());

            // Create the new directory under the current working directory.
            Path src = new Path(dfs.getWorkingDirectory() + "/" + dirName);
            dfs.mkdirs(src);
        }
    }
    

    You have to add the following jars to your build path:

    commons-cli-1.2.jar
    commons-collections-3.2.1.jar
    commons-configuration-1.6.jar
    commons-lang-2.5.jar
    commons-logging-1.1.1.jar
    guava-11.0.2.jar
    hadoop-auth.jar
    hadoop-common.jar
    protobuf-java-2.4.0a.jar
    slf4j-api-1.6.1.jar
    log4j-1.2.17.jar
    hadoop-hdfs.jar

    All of these jars are available in the hadoop/lib folder on a Cloudera distribution.
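
    Alternatively, if the project is built with Maven, the single hadoop-client artifact pulls in the jars above transitively, so you do not have to collect them by hand. The version below is only an example; match it to the Hadoop release running on your cluster:

    ```xml
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <!-- example version; use your cluster's Hadoop version -->
        <version>2.6.0</version>
    </dependency>
    ```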
