
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation

Ilqjx opened this issue Jul 19, 2022 · 54 comments

org.apache.hadoop.security.UserGroupInformation is part of Hadoop. This error occurs because the related jar could not be found.

What's your HADOOP_HOME setup? Do you set CLASSPATH by hand?

We have an automatic CLASSPATH setup via https://github.com/Xuanwo/hdrs/blob/main/src/client.rs#L287-L322.

If CLASSPATH is set by hand, please make sure at least the following has been added:

let paths = vec![
    format!("{hadoop_home}/share/hadoop/common"),
    format!("{hadoop_home}/share/hadoop/common/lib"),
    format!("{hadoop_home}/share/hadoop/hdfs"),
    format!("{hadoop_home}/share/hadoop/hdfs/lib"),
];

let mut jars: Vec<String> = Vec::new();
for path in paths {
    for d in fs::read_dir(&path)? {
        let p = d?.path();
        if let Some(ext) = p.extension() {
            if ext == "jar" {
                jars.push(p.to_string_lossy().to_string());
            }
        }
    }
}

Please check whether hadoop-common-x.y.z.jar has been included.
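If you want to check this programmatically, here is a small diagnostic sketch (not part of hdrs; `find_jars` and the directory list are illustrative, with `HADOOP_HOME` read from the environment):

```rust
use std::{env, fs, path::PathBuf};

/// Return all jar files directly under `dir` whose file name contains `needle`.
/// Diagnostic helper only; not part of hdrs.
fn find_jars(dir: &str, needle: &str) -> Vec<PathBuf> {
    let mut found = Vec::new();
    if let Ok(entries) = fs::read_dir(dir) {
        for entry in entries.flatten() {
            let path = entry.path();
            let name = path.file_name().and_then(|n| n.to_str()).unwrap_or("");
            if name.ends_with(".jar") && name.contains(needle) {
                found.push(path);
            }
        }
    }
    found
}

fn main() {
    let hadoop_home = env::var("HADOOP_HOME").unwrap_or_default();
    for sub in [
        "share/hadoop/common",
        "share/hadoop/common/lib",
        "share/hadoop/hdfs",
        "share/hadoop/hdfs/lib",
    ] {
        let dir = format!("{hadoop_home}/{sub}");
        println!("{dir}: {:?}", find_jars(&dir, "hadoop-common"));
    }
}
```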

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation Xuanwo/hdrs#74

I have HADOOP_HOME set and CLASSPATH is not set manually.

Suspiciously, though, I did not find hadoop-common-x.y.z.jar in any of these four paths.

let paths = vec![
    format!("{hadoop_home}/share/hadoop/common"),
    format!("{hadoop_home}/share/hadoop/common/lib"),
    format!("{hadoop_home}/share/hadoop/hdfs"),
    format!("{hadoop_home}/share/hadoop/hdfs/lib"),
];

First of all, we need to figure out why hadoop-common-x.y.z.jar is not found.

Please print the following environment variables in the Rust code (after Client::connect, but ignore all errors so that we can still get the output):

  • HADOOP_HOME
  • CLASSPATH

I manually set the CLASSPATH env, but still got the same error.

    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.3.3")
    CLASSPATH: Ok(".:/home/guozhenwei/java/jdk-18.0.1.1/lib:/home/guozhenwei/java/jdk-18.0.1.1/jre/lib:/home/guozhenwei/hadoop/hadoop-3.3.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/common/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/common/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/mapreduce/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn/*")

    2022-07-19 15:27:47,646 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-19 15:27:47,647 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-19 15:27:47,647 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-19 15:27:47,698 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=hdfs://localhost:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    ExceptionInInitializerError: Exception java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory [in thread "main"]java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:274)
    Caused by: java.lang.ExceptionInInitializerError: Exception java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory [in thread "main"]
    at org.apache.hadoop.util.ReflectionUtils.(ReflectionUtils.java:144)
    at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:102)
    at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.(MetricsSourceBuilder.java:66)
    at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:223)
    at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
    at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:149)
    at org.apache.hadoop.security.UserGroupInformation.(UserGroupInformation.java:265)
    at org.apache.hadoop.fs.viewfs.ViewFileSystem.(ViewFileSystem.java:269)
    at java.base/jdk.internal.reflect.DirectConstructorHandleAccessor.newInstance(DirectConstructorHandleAccessor.java:67)
    at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:483)
    at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:789)
    at java.base/java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:729)
    at java.base/java.util.ServiceLoader$3.next(ServiceLoader.java:1403)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3379)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    Java 18 may be too new for Hadoop. Can you try Java 11?

    As noted here: https://hadoop.apache.org/docs/stable/

    Java 11 runtime support
    Java 11 runtime support is completed.

    Unfortunately, I switched to Java 11, but the error still occurs. Do I have to switch hadoop to 3.3.2?

    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.3.3")
    CLASSPATH: Ok(".:/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:/home/guozhenwei/hadoop/hadoop-3.3.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/common/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/common/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/hdfs/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/mapreduce/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn/lib/:/home/guozhenwei/hadoop/hadoop-3.3.3/share/hadoop/yarn/*")

    2022-07-19 16:55:21,389 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-19 16:55:21,390 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-19 16:55:21,390 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-19 16:55:21,445 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=hdfs://localhost:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformationjava.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:274)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    I switched the Hadoop version to 3.2.3 and Java to 11, but the same error is still reported.

    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.2.3")
    CLASSPATH: Ok(".:/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:/home/guozhenwei/hadoop/hadoop-3.2.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/")

    2022-07-19 17:50:36,328 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-19 17:50:36,329 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-19 17:50:36,329 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-19 17:50:36,402 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=hdfs://localhost:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformationjava.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:219)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    I believe the classpath is wrong:

    Expected (match all files with *):

    :) bin/hadoop classpath
    /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/etc/hadoop:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/lib/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs/lib/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/mapreduce/lib/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/mapreduce/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/yarn:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/yarn/lib/*:/home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/yarn/*

    Instead of:

    .:/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:/home/guozhenwei/hadoop/hadoop-3.2.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/lib/:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/

    Note: hdrs has internal logic that lists all jars instead of using the directory path directly.
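This matters because a JVM created through JNI (as libhdfs does) does not expand `*` wildcards in the classpath the way the `java` launcher does. A rough sketch of the expansion idea (illustrative only; this is not the actual hdrs code):

```rust
use std::fs;

/// Expand a single `dir/*`-style classpath entry into explicit jar paths.
/// Entries without a trailing `/*` are returned unchanged.
fn expand_entry(entry: &str) -> Vec<String> {
    let dir = match entry.strip_suffix("/*") {
        Some(d) => d,
        None => return vec![entry.to_string()],
    };
    let mut jars = Vec::new();
    if let Ok(rd) = fs::read_dir(dir) {
        for e in rd.flatten() {
            let p = e.path();
            // keep only *.jar files, like the hdrs listing logic
            if p.extension().map_or(false, |ext| ext == "jar") {
                jars.push(p.to_string_lossy().into_owned());
            }
        }
    }
    jars
}

fn main() {
    // hypothetical entries, just to show the two cases
    for entry in ["/opt/hadoop/share/hadoop/common/*", "/opt/hadoop/etc/hadoop"] {
        println!("{entry} -> {:?}", expand_entry(entry));
    }
}
```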

    With CLASSPATH set to a wrong path, I can reproduce the same error:

    test metadata::tests::test_from_hdfs_file_info ... ok
    loadFileSystems error:
    (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
    hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
    hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    test client::tests::test_client_mkdir ... hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL))FAILED error:
    (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)

    I modified the environment variables as follows.

    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.2.3")
    CLASSPATH: Ok("/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:hadoop classpath")

    could not find method getRootCauseMessage from class (null) with signature (Ljava/lang/Throwable;)Ljava/lang/String;
    [1] 1038885 segmentation fault (core dumped) cargo run

    `hadoop classpath` here means the output of running that command, not the literal string.

    Sorry, I changed it, but, emm.

    cargo run
    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.2.3")
    CLASSPATH: Ok("/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:")

    could not find method getRootCauseMessage from class (null) with signature (Ljava/lang/Throwable;)Ljava/lang/String;
    [1] 1043244 segmentation fault (core dumped) cargo run

    CLASSPATH: Ok("/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:")

    It seems `hadoop classpath` returned an empty string.

    Try setting it from the shell:

    export HADOOP_CLASSPATH=`${HADOOP_HOME}/bin/hadoop classpath`
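The same thing can be done from Rust before Client::connect by shelling out (a sketch with minimal error handling; it assumes `$HADOOP_HOME/bin/hadoop` exists, and `--glob` asks Hadoop to expand the wildcards into concrete jar paths):

```rust
use std::{env, process::Command};

/// Run `$HADOOP_HOME/bin/hadoop classpath --glob` and return the trimmed output.
/// Sketch only; returns None on any failure or empty output.
fn hadoop_classpath(hadoop_home: &str) -> Option<String> {
    let out = Command::new(format!("{hadoop_home}/bin/hadoop"))
        .arg("classpath")
        .arg("--glob")
        .output()
        .ok()?;
    let s = String::from_utf8_lossy(&out.stdout).trim().to_string();
    if s.is_empty() {
        None
    } else {
        Some(s)
    }
}

fn main() {
    // Must run before the JVM is created, i.e. before Client::connect.
    if let Ok(home) = env::var("HADOOP_HOME") {
        if let Some(cp) = hadoop_classpath(&home) {
            env::set_var("CLASSPATH", cp);
        }
    }
    println!("CLASSPATH: {:?}", env::var("CLASSPATH"));
}
```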

    Dude, I'm here to bother you again.

    I thought I had successfully set the CLASSPATH, but the same thing happened.

    cargo run

    echo $CLASSPATH

    All jars from Hadoop can now be found, so next we need to address the missing java.lang.management.ManagementFactory.

    How do you install java? (From where? Which distribution?)

    Can you run the following demo?

    package test;

    import java.lang.management.ManagementFactory;

    public class Demo {
        public static void main(String[] args) {
            System.out.println("class loaded");
        }
    }

    Run it via `java Demo.java`.

    I wrote a Java demo, which proves that operating HDFS from Java is no problem.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(new URI("hdfs://localhost:9000"), conf, "guozhenwei");
            fs.mkdirs(new Path("/xx/yy/zz"));
            fs.close();
        }
    }

    I have no ideas so far...

    I'm trying to reproduce this issue based on the environment info you mentioned at Xuanwo/hdrs#73

  • Ubuntu 22.04
  • Java 11
  • Hadoop 3.2.3
  • rustc 1.62.0

    Can you also describe the following info:

  • How do you install java? (From where? Which distribution?)
  • How do you install Hadoop? (From where?)
  • Do you run on x86_64?
  • Can you share your rust test code with me?
  • Do you have the following files in your Hadoop setup?
    :( ls -lh lib/native/libhdfs.*
    Permissions Size User   Date Modified Name
    .rw-r--r--  504k xuanwo 20 Mar 09:23  lib/native/libhdfs.a
    lrwxrwxrwx    16 xuanwo 27 Apr 01:49  lib/native/libhdfs.so -> libhdfs.so.0.0.0
    .rwxr-xr-x  308k xuanwo 20 Mar 09:23  lib/native/libhdfs.so.0.0.0

    1. How do you install Java? (From where? Which distribution?)
    I installed Java using the tarball from the official site:
    https://www.oracle.com/java/technologies/javase/jdk11-archive-downloads.html

    2. How do you install Hadoop? (From where?)
    I installed Hadoop using the tarball from the official site:
    https://hadoop.apache.org/releases.html

    3. Do you run on x86_64?

    4. Can you share your Rust test code with me?

    use std::{
        env,
        io::{Read, Write},
    };

    use hdrs::Client;

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        println!("HADOOP_HOME: {:?}", env::var("HADOOP_HOME"));
        println!("CLASSPATH: {:?}", env::var("CLASSPATH"));
        println!();

        let fs = Client::connect("hdfs://localhost:9000")?;

        let mut f = fs
            .open_file()
            .write(true)
            .create(true)
            .open("/tmp/hello.txt")?;
        let _n = f.write("Hello, World!".as_bytes())?;

        let mut f = fs.open_file().read(true).open("/tmp/hello.txt")?;
        let mut buf = vec![0; 1024];
        let _n = f.read(&mut buf)?;
        println!("buf: {:?}", String::from_utf8_lossy(&buf));

        let _ = fs.remove_file("/tmp/hello.txt")?;

        Ok(())
    }

    5. Do you have the following files in your Hadoop setup?

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        println!("HADOOP_HOME: {:?}", env::var("HADOOP_HOME"));
        println!("CLASSPATH: {:?}", env::var("CLASSPATH"));
        println!();
        let fs = Client::connect("default")?;
        let mut f = fs
            .open_file()
            .write(true)
            .create(true)
            .open("/tmp/hello.txt")?;
        let _n = f.write("Hello, World!".as_bytes())?;
        let mut f = fs.open_file().read(true).open("/tmp/hello.txt")?;
        let mut buf = vec![0; 1024];
        let _n = f.read(&mut buf)?;
        println!("buf: {:?}", String::from_utf8_lossy(&buf));
        let _ = fs.remove_file("/tmp/hello.txt")?;
        Ok(())
    }

    It still doesn't work.

    2022-07-20 14:05:46,385 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 14:05:46,386 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 14:05:46,386 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 14:05:46,457 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformationjava.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:219)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    So far, we have ruled out some wrong answers:

  • Hadoop setup is good: with stable and official releases
  • Java setup is good: I tested the same JDK version.
  • Connecting to default also failed, so it's not related to the Hadoop deployment.
  • Sample code LGTM: it's not related to our testing code.

    Let's go back to the error itself.

    java.lang.NoClassDefFoundError, based on existing reports, means:

    the class is present in the classpath at Compile time, but it doesn't exist in the classpath at Runtime.

    And org.apache.hadoop.security.UserGroupInformation should be included in share/hadoop/common/hadoop-common-3.2.3.jar.
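To check by hand: a class's entry inside a jar is its fully qualified name with the dots replaced by slashes plus a `.class` suffix, so you can grep the jar listing for it. A tiny helper showing the mapping (illustrative only):

```rust
/// Map a fully qualified Java class name to its entry path inside a jar.
fn class_entry_path(class_name: &str) -> String {
    format!("{}.class", class_name.replace('.', "/"))
}

fn main() {
    // e.g. pipe the jar listing through grep for this entry:
    //   unzip -l hadoop-common-3.2.3.jar | grep <entry>
    println!("{}", class_entry_path("org.apache.hadoop.security.UserGroupInformation"));
}
```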

    Let's make log4j print more info by changing hadoop-3.2.3/etc/hadoop/log4j.properties:

    - hadoop.root.logger=INFO,console
    + hadoop.root.logger=DEBUG,console

    We will then see more logs from log4j. In my setup:

    2022-07-20 15:06:46,849 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 15:06:46,864 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:46,895 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
    2022-07-20 15:06:46,897 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
    2022-07-20 15:06:46,897 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
    2022-07-20 15:06:46,898 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
    2022-07-20 15:06:46,898 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
    2022-07-20 15:06:46,899 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
    2022-07-20 15:06:46,943 DEBUG util.Shell: setsid exited with exit code 0
    2022-07-20 15:06:46,944 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
    2022-07-20 15:06:46,950 DEBUG security.Groups:  Creating new Groups object
    2022-07-20 15:06:46,950 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
    2022-07-20 15:06:46,950 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path: [/home/xuanwo/Code/xuanwo/hdrs/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out, /home/xuanwo/Code/xuanwo/hdrs/target/debug/deps, /home/xuanwo/Code/xuanwo/hdrs/target/debug, /home/xuanwo/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib, /home/xuanwo/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib, /tmp/jdk-11.0.15.1/lib/server, /usr/java/packages/lib, /usr/lib64, /lib64, /lib, /usr/lib]
    2022-07-20 15:06:46,950 DEBUG util.NativeCodeLoader: java.library.path=/home/xuanwo/Code/xuanwo/hdrs/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out:/home/xuanwo/Code/xuanwo/hdrs/target/debug/deps:/home/xuanwo/Code/xuanwo/hdrs/target/debug:/home/xuanwo/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib:/home/xuanwo/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib:/tmp/jdk-11.0.15.1/lib/server:/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
    2022-07-20 15:06:46,950 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2022-07-20 15:06:46,950 DEBUG util.PerformanceAdvisory: Falling back to shell based
    2022-07-20 15:06:46,951 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
    2022-07-20 15:06:46,981 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
    2022-07-20 15:06:46,996 DEBUG security.UserGroupInformation: hadoop login
    2022-07-20 15:06:46,996 DEBUG security.UserGroupInformation: hadoop login commit
    2022-07-20 15:06:46,997 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: xuanwo
    2022-07-20 15:06:46,997 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: xuanwo" with name xuanwo
    2022-07-20 15:06:46,997 DEBUG security.UserGroupInformation: User entry: "xuanwo"
    2022-07-20 15:06:46,997 DEBUG security.UserGroupInformation: UGI loginUser:xuanwo (auth:SIMPLE)
    2022-07-20 15:06:46,997 DEBUG fs.FileSystem: viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:46,998 DEBUG fs.FileSystem: har:// = class org.apache.hadoop.fs.HarFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:46,999 DEBUG fs.FileSystem: http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:47,000 DEBUG fs.FileSystem: https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:47,003 DEBUG fs.FileSystem: hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:06:47,007 DEBUG fs.FileSystem: webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:06:47,007 DEBUG fs.FileSystem: swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:06:47,085 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,089 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,091 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,091 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,091 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,092 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,092 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,092 DEBUG security.UserGroupInformation: PrivilegedAction as:xuanwo (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:220)
    2022-07-20 15:06:47,094 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
    2022-07-20 15:06:47,097 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking for FS supporting file
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,097 DEBUG fs.FileSystem: looking for configuration option fs.file.impl
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,098 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
    2022-07-20 15:06:47,099 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    2022-07-20 15:06:47,099 DEBUG fs.FileSystem: FS for file is class org.apache.hadoop.fs.LocalFileSystem
    test client::tests::test_client_connect ... ok
              

    A big difference I can see in the log is that I don't have this warning:

    2022-07-19 15:27:47,647 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-19 15:27:47,647 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory

    Any ideas? Can you also print JAVA_HOME in the code? Is it possible your code is running on Android or something else which doesn't have java.lang.management?
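The printing could look like this (a sketch; the keys are just the ones discussed in this thread):

```rust
use std::env;

fn main() {
    // Print each variable, keeping Err values visible instead of panicking,
    // so we always get output even when a variable is unset.
    for key in ["JAVA_HOME", "HADOOP_HOME", "CLASSPATH"] {
        println!("{key}: {:?}", env::var(key));
    }
}
```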

    JAVA_HOME is ok.

    JAVA_HOME: Ok("/home/guozhenwei/java/jdk-11.0.15")
    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.2.3")
    CLASSPATH: Ok("/home/guozhenwei/java/jdk-11.0.15/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib:/home/guozhenwei/hadoop/hadoop-3.2.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/*")
    2022-07-20 15:19:02,861 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 15:19:02,900 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,951 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 15:19:02,952 DEBUG fs.FileSystem: Stack Trace
    java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:582)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:804)
    	at java.base/java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722)
    	at java.base/java.util.ServiceLoader$3.next(ServiceLoader.java:1395)
    	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3289)
    Caused by: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    	at org.apache.hadoop.util.ReflectionUtils.<clinit>(ReflectionUtils.java:162)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:102)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.<init>(MetricsSourceBuilder.java:66)
    	at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
    	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:223)
    	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
    	at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:143)
    	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:276)
    	at org.apache.hadoop.fs.viewfs.ViewFileSystem.<init>(ViewFileSystem.java:249)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780)
    	... 3 more
    Caused by: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
    	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    	... 17 more
    2022-07-20 15:19:02,958 DEBUG fs.FileSystem: har:// = class org.apache.hadoop.fs.HarFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,961 DEBUG fs.FileSystem: http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,962 DEBUG fs.FileSystem: https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,994 DEBUG fs.FileSystem: hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:19:03,007 DEBUG fs.FileSystem: webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:19:03,009 DEBUG fs.FileSystem: swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 15:19:03,032 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
    2022-07-20 15:19:03,032 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path: [/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/curl-sys-8990cadea4dce356/out/build, /home/guozhenwei/project/rust/hdrs-demo/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out, /home/guozhenwei/project/rust/hdrs-demo/target/debug/build/libnghttp2-sys-e9a000639cb5ea2a/out/i/lib, /home/guozhenwei/project/rust/hdrs-demo/target/debug/build/ring-c1674a59669df96e/out, /home/guozhenwei/project/rust/hdrs-demo/target/debug/deps, /home/guozhenwei/project/rust/hdrs-demo/target/debug, /home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib, /home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib, /home/guozhenwei/java/jdk-11.0.15/jre/lib/server, ., /usr/java/packages/lib, /usr/lib64, /lib64, /lib, /usr/lib]
    2022-07-20 15:19:03,032 DEBUG util.NativeCodeLoader: java.library.path=/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/curl-sys-8990cadea4dce356/out/build:/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out:/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/libnghttp2-sys-e9a000639cb5ea2a/out/i/lib:/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/ring-c1674a59669df96e/out:/home/guozhenwei/project/rust/hdrs-demo/target/debug/deps:/home/guozhenwei/project/rust/hdrs-demo/target/debug:/home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib:/home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib:/home/guozhenwei/java/jdk-11.0.15/jre/lib/server::/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
    2022-07-20 15:19:03,032 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=hdfs://localhost:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformationjava.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:219)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    Let's focus on the following part:

    2022-07-20 15:19:02,861 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 15:19:02,900 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,951 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 15:19:02,952 DEBUG fs.FileSystem: Stack Trace
    java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:582)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:804)
    	at java.base/java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722)
    	at java.base/java.util.ServiceLoader$3.next(ServiceLoader.java:1395)
    	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3289)
    Caused by: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    	at org.apache.hadoop.util.ReflectionUtils.<clinit>(ReflectionUtils.java:162)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:102)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.<init>(MetricsSourceBuilder.java:66)
    	at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
    	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:223)
    	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
    	at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:143)
    	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:276)
    	at org.apache.hadoop.fs.viewfs.ViewFileSystem.<init>(ViewFileSystem.java:249)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780)
    	... 3 more
    Caused by: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
    	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    	... 17 more

    I tried it, but it didn't work.

    Can you create a repo with your testing code and Cargo.toml (Cargo.lock is also needed)? Thanks.

    If there are any changes to hdrs, please also include them.

    The Hadoop configuration should be fine.

    I think the configuration should be OK too. But can you give it a try?

    Remove all changes, and rollback to:

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->
    <!-- Put site-specific property overrides in this file. -->
    <configuration>
    </configuration>

    And then connect to default?

    I only added one line to hadoop-env.sh: export JAVA_HOME=/home/guozhenwei/java/jdk-11.0.15/.

    If I don't add it, Hadoop reports an error when it starts.

    BTW, connecting to default (without any config) works without starting Hadoop. Please give it a try; it only loads the jars needed to read the local fs.

    I tried it. Same error.

    This error is caused by failing to load the filesystem.

    2022-07-20 17:14:54,061 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 17:14:54,061 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 17:14:54,061 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 17:14:54,062 DEBUG fs.FileSystem: Stack Trace
    java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated

    Ubuntu has its own OpenJDK; can you try sudo apt install default-jdk and switch all JAVA_HOME references to it?

    The new JAVA_HOME may be like /usr/lib/jvm/java-18-openjdk/.

    So far, I don't know why java.lang.management.ManagementFactory is not found. It's part of the Java standard library (the java.management module since JDK 9).
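    One way to narrow this down: a jlink-trimmed or otherwise minimized runtime can omit the java.management module entirely. On JDK 9+ the module list can be inspected with `java --list-modules`; a small Rust sketch that runs it and checks for the module (command and output format assumed standard):

    ```rust
    use std::process::Command;

    /// True if `java --list-modules` output contains the given module.
    /// Lines look like `java.management@11.0.15`.
    fn has_module(list_modules_output: &str, module: &str) -> bool {
        list_modules_output
            .lines()
            .any(|l| l.split('@').next() == Some(module))
    }

    fn main() {
        match Command::new("java").arg("--list-modules").output() {
            Ok(out) => {
                let listing = String::from_utf8_lossy(&out.stdout);
                if has_module(&listing, "java.management") {
                    println!("java.management is present");
                } else {
                    println!("java.management is MISSING from this runtime");
                }
            }
            Err(e) => println!("failed to run java: {e}"),
        }
    }
    ```

    If the module is missing, ManagementFactory can never resolve no matter what CLASSPATH contains.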

    This error is caused by failing to load the filesystem.

    Yes, the fs is not loaded because we can't find java.lang.management.ManagementFactory.

    If everything works as expected, we will see logs like the following:

    2022-07-20 15:06:46,849 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 15:06:46,864 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/xuanwo/Code/xuanwo/tmp/hadoop-3.2.3/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:06:46,895 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of successful kerberos logins and latency (milliseconds)"})
    2022-07-20 15:06:46,897 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Rate of failed kerberos logins and latency (milliseconds)"})
    2022-07-20 15:06:46,897 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"GetGroups"})
    2022-07-20 15:06:46,898 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since startup"})
    2022-07-20 15:06:46,898 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName="Ops", always=false, valueName="Time", about="", interval=10, type=DEFAULT, value={"Renewal failures since last successful login"})
    2022-07-20 15:06:46,899 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
    2022-07-20 15:06:46,943 DEBUG util.Shell: setsid exited with exit code 0
    2022-07-20 15:06:46,944 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
    2022-07-20 15:06:46,950 DEBUG security.Groups:  Creating new Groups object

    But we stopped at:

    2022-07-20 15:19:02,861 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 15:19:02,900 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 15:19:02,951 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 15:19:02,951 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 15:19:02,952 DEBUG fs.FileSystem: Stack Trace

    We failed again.

    JAVA_HOME: Ok("/usr/lib/jvm/java-11-openjdk-amd64")
    HADOOP_HOME: Ok("/home/guozhenwei/hadoop/hadoop-3.2.3")
    CLASSPATH: Ok("/usr/lib/jvm/java-11-openjdk-amd64/lib:/usr/lib/jvm/java-11-openjdk-amd64/jre/lib:/home/guozhenwei/hadoop/hadoop-3.2.3/etc/hadoop:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/mapreduce/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/lib/*:/home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/yarn/*")
    2022-07-20 17:53:18,185 DEBUG fs.FileSystem: Loading filesystems
    2022-07-20 17:53:18,220 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 17:53:18,266 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    2022-07-20 17:53:18,266 WARN fs.FileSystem: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    2022-07-20 17:53:18,266 WARN fs.FileSystem: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    2022-07-20 17:53:18,266 DEBUG fs.FileSystem: Stack Trace
    java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated
    	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:582)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:804)
    	at java.base/java.util.ServiceLoader$ProviderImpl.get(ServiceLoader.java:722)
    	at java.base/java.util.ServiceLoader$3.next(ServiceLoader.java:1395)
    	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3289)
    Caused by: java.lang.NoClassDefFoundError: java/lang/management/ManagementFactory
    	at org.apache.hadoop.util.ReflectionUtils.<clinit>(ReflectionUtils.java:162)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:102)
    	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.<init>(MetricsSourceBuilder.java:66)
    	at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
    	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:223)
    	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
    	at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:143)
    	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:276)
    	at org.apache.hadoop.fs.viewfs.ViewFileSystem.<init>(ViewFileSystem.java:249)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    	at java.base/java.util.ServiceLoader$ProviderImpl.newInstance(ServiceLoader.java:780)
    	... 3 more
    Caused by: java.lang.ClassNotFoundException: java.lang.management.ManagementFactory
    	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
    	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
    	... 17 more
    2022-07-20 17:53:18,272 DEBUG fs.FileSystem: har:// = class org.apache.hadoop.fs.HarFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 17:53:18,274 DEBUG fs.FileSystem: http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 17:53:18,276 DEBUG fs.FileSystem: https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/common/hadoop-common-3.2.3.jar
    2022-07-20 17:53:18,305 DEBUG fs.FileSystem: hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 17:53:18,317 DEBUG fs.FileSystem: webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 17:53:18,319 DEBUG fs.FileSystem: swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /home/guozhenwei/hadoop/hadoop-3.2.3/share/hadoop/hdfs/hadoop-hdfs-client-3.2.3.jar
    2022-07-20 17:53:18,340 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
    2022-07-20 17:53:18,340 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path: [/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out, /home/guozhenwei/project/rust/hdrs-demo/target/debug/deps, /home/guozhenwei/project/rust/hdrs-demo/target/debug, /home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib, /home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib, /usr/lib/jvm/java-11-openjdk-amd64/jre/lib/server, ., /usr/java/packages/lib, /usr/lib/x86_64-linux-gnu/jni, /lib/x86_64-linux-gnu, /usr/lib/x86_64-linux-gnu, /usr/lib/jni, /lib, /usr/lib]
    2022-07-20 17:53:18,340 DEBUG util.NativeCodeLoader: java.library.path=/home/guozhenwei/project/rust/hdrs-demo/target/debug/build/hdfs-sys-5d5ff34cc5ce2b90/out:/home/guozhenwei/project/rust/hdrs-demo/target/debug/deps:/home/guozhenwei/project/rust/hdrs-demo/target/debug:/home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib:/home/guozhenwei/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib:/usr/lib/jvm/java-11-openjdk-amd64/jre/lib/server::/usr/java/packages/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
    2022-07-20 17:53:18,340 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformationjava.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
    	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:219)
    Error: Os { code: 255, kind: Uncategorized, message: "Unknown error 255" }

    Supported Java versions are documented in the official Hadoop documentation. In my testing, openjdk-11 causes an error when YARN starts up.
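    That matches the version matrix: per the Hadoop "Java Versions" wiki page, the 3.0.x–3.2.x line (including the hadoop-3.2.3 used here) officially supports only Java 8, while Java 11 runtime support starts with 3.3. A quick hedged sketch for checking the local Java major version before connecting (banner format assumed to be the usual `... version "11.0.15"` / `... version "1.8.0_292"`):

    ```rust
    use std::process::Command;

    /// Extract the major Java version from a `java -version` banner line,
    /// e.g. `openjdk version "11.0.15"` -> 11, `java version "1.8.0_292"` -> 8.
    fn java_major(banner: &str) -> Option<u32> {
        let quoted = banner.split('"').nth(1)?; // "11.0.15" or "1.8.0_292"
        let mut parts = quoted.split('.');
        let first: u32 = parts.next()?.parse().ok()?;
        if first == 1 {
            // Pre-JDK-9 scheme: 1.<major>.<minor>_<update>
            parts.next()?.split('_').next()?.parse().ok()
        } else {
            Some(first)
        }
    }

    fn main() {
        // `java -version` prints its banner on stderr.
        if let Ok(out) = Command::new("java").arg("-version").output() {
            let banner = String::from_utf8_lossy(&out.stderr);
            match banner.lines().next().and_then(java_major) {
                Some(v) if v > 8 => println!("Java {v}: beyond Hadoop 3.2.x support"),
                Some(v) => println!("Java {v} detected"),
                None => println!("could not parse java -version output"),
            }
        } else {
            println!("java not found on PATH");
        }
    }
    ```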

    I learned a lot from this issue...

    I will add your notes and solutions to the official documents. Thanks for your patience!
