3.0.6: writes to Hive start failing with permission denied after running for a while


Version: 3.0.6
Runtime environment: k8s-operator

Steps to reproduce and problem description:

  • 1. When writing data into a Hive table, the insert fails with a permission error.
  • 2. After setting a property on the hive catalog with ALTER CATALOG hive SET PROPERTIES ('dfs.hadoop.username' = 'root');, writes succeed again.
  • 3. One or two days later the permission error returns. The dfs.hadoop.username property is still present on the catalog; setting dfs.hadoop.user.name=root restores writes.
  • 4. Two more days later the permission error comes back again; setting hadoop.user.name=root restores it once more.
    In short, whenever the permission error appears, writing any property at all to the catalog makes it go away (see the sketch after this list).
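This pattern suggests the configured user is only applied when the catalog's HDFS client is (re)initialized, and that any ALTER CATALOG forces that re-initialization. A minimal sketch of the workaround, assuming hadoop.username is the property Doris actually reads and that REFRESH CATALOG also rebuilds the catalog's cached state (both assumptions, not verified against 3.0.6):

-- set the documented property instead of the dfs.* variants (assumption: hadoop.username is the one Doris reads)
ALTER CATALOG hive SET PROPERTIES ('hadoop.username' = 'root');
-- refreshing the catalog may be a lighter-weight way to force re-initialization (assumption)
REFRESH CATALOG hive;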

Executing the following SQL returns an error:

insert into hive.temp.test3 values(1,'xxx');
[HY000][1105] failed to rename remote hdfs://mycluster/tmp/.doris_staging/admin/1e00f28134b0443dab60df9992b02f66/29d1839986ce420e-83b38f14abf90335_b0a99c9b-f631-4228-b007-c22ced2a6aa4-0.parquet to hdfs://mycluster/user/hive/warehouse/temp.db/test3/29d1839986ce420e-83b38f14abf90335_b0a99c9b-f631-4228-b007-c22ced2a6aa4-0.parquet, msg: Permission denied: user=hadoop, access=WRITE, inode="/user/hive/warehouse/temp.db/test3":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:506)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:346)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermissionWithContext(FSPermissionChecker.java:370)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:2 ...

show create catalog hive;

CREATE CATALOG `hive` PROPERTIES (
"type" = "hms",
"ipc.client.fallback-to-simple-auth-allowed" = "true",
"hive.metastore.uris" = "thrift://hive-metastore.hive:9083",
"hadoop.username" = "root",
"dfs.nameservices" = "mycluster",
"dfs.namenode.rpc-address.mycluster.nn1" = "hadoop-hadoop-hdfs-nn-1.hadoop-hadoop-hdfs-nn.hadoop.svc.cluster.local:9000",
"dfs.namenode.rpc-address.mycluster.nn0" = "hadoop-hadoop-hdfs-nn-0.hadoop-hadoop-hdfs-nn.hadoop.svc.cluster.local:9000",
"dfs.hadoop.username" = "root",
"dfs.hadoop.user.name" = "root",
"dfs.ha.namenodes.mycluster" = "nn1,nn0",
"dfs.client.failover.proxy.provider.mycluster" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
);
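hadoop.username appears to be the property Doris documents for choosing the HDFS user in an HMS catalog; the dfs.hadoop.username and dfs.hadoop.user.name entries accumulated by the repeated workarounds are most likely ignored. A sketch of a cleaned-up definition under that assumption (recreating the catalog rather than altering it is also an assumption; the metastore URI and namenode addresses are copied from the output above):

DROP CATALOG IF EXISTS hive;
CREATE CATALOG hive PROPERTIES (
"type" = "hms",
"hive.metastore.uris" = "thrift://hive-metastore.hive:9083",
"hadoop.username" = "root",
"ipc.client.fallback-to-simple-auth-allowed" = "true",
"dfs.nameservices" = "mycluster",
"dfs.ha.namenodes.mycluster" = "nn0,nn1",
"dfs.namenode.rpc-address.mycluster.nn0" = "hadoop-hadoop-hdfs-nn-0.hadoop-hadoop-hdfs-nn.hadoop.svc.cluster.local:9000",
"dfs.namenode.rpc-address.mycluster.nn1" = "hadoop-hadoop-hdfs-nn-1.hadoop-hadoop-hdfs-nn.hadoop.svc.cluster.local:9000",
"dfs.client.failover.proxy.provider.mycluster" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
);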