Recycler fails to load the HDFS Storage Vault correctly

W20250901 05:12:55.908783  3125 recycler.cpp:604] no accessors for instance=569741636
W20250901 05:12:55.908816  3125 recycler.cpp:259] failed to init instance recycler, instance_id=569741636 ret=-2
W20250901 06:12:55.965934  3130 hdfs_accessor.cpp:87] failed to connect hdfs: (无效的参数): UnknownHostException: nameservices
W20250901 06:12:55.965989  3130 hdfs_accessor.cpp:363] failed to init hdfs accessor. uri=hdfs://nameservices/doris/data_92DE10DD-26AC-7FA8-DCB3-D3944003C35E
W20250901 06:12:55.965996  3130 recycler.cpp:565] failed to init hdfs accessor. instance_id=569741636 resource_id=1 name=hdfs_vault_demo hdfs_vault=build_conf { fs_name: "hdfs://nameservices" user: "hdfs" hdfs_confs { key: "hdfs.config.resources" value: "/data/doris/be/conf/core-site.xml,/data/doris/be/conf/hdfs-site.xml" } hdfs_confs { key: "dfs.namenode.rpc-address.nameservices.nn1" value: "namenode-01:8020" } hdfs_confs { key: "dfs.namenode.rpc-address.nameservices.nn2" value: "namenode-02:8020" } hdfs_confs { key: "dfs.nameservices" value: "nameservices" } hdfs_confs { key: "dfs.ha.namenodes.nameservices" value: "nn1,nn2" } hdfs_confs { key: "dfs.replication" value: "2" } } prefix: "doris/data_92DE10DD-26AC-7FA8-DCB3-D3944003C35E"

Under HDFS HA mode this error occurs; it looks like the HDFS HA configuration is not being loaded correctly. Above is the meta_service.WARNING log; below is the error from doris_cloud.out, which is more explicit:

hdfsBuilderConnect(forceNewInstance=1, nn=hdfs://nameservices, port=0, kerbTicketCachePath=(NULL), userName=hdfs) error:
UnknownHostException: nameservicesjava.lang.IllegalArgumentException: java.net.UnknownHostException: nameservices
        at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:475)
        at org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithClientProtocol(NameNodeProxiesClient.java:134)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:374)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:308)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initDFSClient(DistributedFileSystem.java:204)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:189)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3624)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3725)
        at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:3682)
        at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:623)
        at org.apache.hadoop.fs.FileSystem$2.run(FileSystem.java:580)
        at org.apache.hadoop.fs.FileSystem$2.run(FileSystem.java:577)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
        at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:577)
Caused by: java.net.UnknownHostException: nameservices
        ... 17 more

Both BE and FE are working normally, and reading and writing data works fine. A related question: when creating the HDFS Storage Vault, the parameter dfs.client.failover.proxy.provider.[] is rejected:

[HY000][1105] Unexpected exception: Invalid argument dfs.client.failover.proxy.provider.nameservices

Could someone take a look and help resolve this?

1 Answer

Try adding this property; note that the suffix after `provider.` must match your `dfs.nameservices` value:
'dfs.client.failover.proxy.provider.nameservice1' = 'org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider',
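For reference, a minimal sketch of a `CREATE STORAGE VAULT` statement with all HA properties passed inline, so the MetaService/recycler does not depend on local `core-site.xml`/`hdfs-site.xml` files. Vault name, nameservice name, namenode hosts, and path prefix are placeholders taken from the question's setup; verify property names against the Doris version you run:

```sql
-- Hypothetical example matching the question's HA cluster layout.
-- The dfs.client.failover.proxy.provider.<nameservice> suffix must equal
-- the dfs.nameservices value, otherwise the client cannot resolve the
-- logical URI hdfs://nameservices and fails with UnknownHostException.
CREATE STORAGE VAULT IF NOT EXISTS hdfs_vault_demo
PROPERTIES (
    "type" = "hdfs",
    "fs.defaultFS" = "hdfs://nameservices",
    "path_prefix" = "doris",
    "hadoop.username" = "hdfs",
    "dfs.nameservices" = "nameservices",
    "dfs.ha.namenodes.nameservices" = "nn1,nn2",
    "dfs.namenode.rpc-address.nameservices.nn1" = "namenode-01:8020",
    "dfs.namenode.rpc-address.nameservices.nn2" = "namenode-02:8020",
    "dfs.client.failover.proxy.provider.nameservices" =
        "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
);
```

Without the failover proxy provider property, the HDFS client has no way to map the logical nameservice to a concrete namenode, which is consistent with the `UnknownHostException: nameservices` in the logs above.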