Hive Permission Management

Date: 2022-02-03 06:17:40

Overall approach: Hive's legacy authorization model only has the concept of roles, so it must be combined with Linux users and groups to build a complete permission scheme.

  • Create users and groups on Linux

[root@dn210126 ~]# useradd -s /bin/sh -d /hadoop1 -m -G adm,root hadoop1
[root@dn210126 ~]# passwd hadoop1
Changing password for user hadoop1.
New password:
BAD PASSWORD: it is based on a dictionary word
BAD PASSWORD: is too simple
Retype new password:
passwd: all authentication tokens updated successfully.
[root@dn210126 ~]# useradd -s /bin/sh -d /hadoop2 -m -G adm,root hadoop2
[root@dn210126 ~]# passwd hadoop2
Changing password for user hadoop2.
New password:
BAD PASSWORD: it is based on a dictionary word
BAD PASSWORD: is too simple
Retype new password:
Sorry, passwords do not match.
New password:
BAD PASSWORD: it is based on a dictionary word
BAD PASSWORD: is too simple
Retype new password:
passwd: all authentication tokens updated successfully.
[root@dn210126 ~]# useradd -s /bin/sh -d /hadoop3 -m -G adm,root hadoop3
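To confirm that the accounts were created with the intended shell, home directory, and group memberships, a quick check (not part of the original transcript) could be:

[root@dn210126 ~]# id hadoop1              # should list adm and root among the groups
[root@dn210126 ~]# getent passwd hadoop1   # should show /bin/sh and the /hadoop1 home directory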

  • Edit Hive's configuration file hive-site.xml
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
  <description>enable or disable the hive client authorization</description>
</property>

<property>
  <name>hive.security.authorization.createtable.owner.grants</name>
  <value>ALL</value>
  <description>the privileges automatically granted to the owner whenever a table gets created.
  An example like "select,drop" will grant select and drop privilege to the owner of the table</description>
</property>
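Besides the owner grant, the legacy authorization mode can also grant privileges automatically to specific users, groups, or roles at table-creation time. A sketch of one such optional property; it is not used in this setup, and the value shown is only an illustration:

<property>
  <!-- optional: automatically grant privileges to roles on table creation (illustrative value) -->
  <name>hive.security.authorization.createtable.role.grants</name>
  <value>role_all_db:select</value>
  <description>privileges automatically granted to the listed roles whenever a table gets created</description>
</property>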


  • Create roles in Hive

hive> create role role_db1;
OK
Time taken: 0.815 seconds
hive> create role role_db2;
OK
Time taken: 0.032 seconds
hive> create role role_all_db;
OK
Time taken: 0.037 seconds
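Depending on the Hive version, the roles just created can be listed as a sanity check (this command was not run in the original transcript):

hive> show roles;  -- should include role_db1, role_db2, role_all_db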

  • Create databases in Hive

hive> create database dc_test_db1;
OK
Time taken: 0.146 seconds
hive> create database dc_test_db2;
OK
Time taken: 0.057 seconds
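describe database prints, among other things, the HDFS location backing each database, which is exactly the directory whose permissions the next step changes (a verification step, not in the original transcript):

hive> describe database dc_test_db1;  -- location should be /hive/warehouse/dc_test_db1.db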

  • The HDFS directory backing each database must be made world-writable (777; since the default mode is 755, the a+w below has the same effect)
[root@dn210126 bin]# ./hadoop dfs -chmod a+w /hive/warehouse/dc_test_db2.db
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

14/12/01 16:33:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[root@dn210126 bin]# ./hadoop dfs -chmod a+w /hive/warehouse/dc_test_db1.db
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

14/12/01 16:33:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
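As the DEPRECATED warnings above point out, the same change can be made with the hdfs command instead, e.g.:

[root@dn210126 bin]# ./hdfs dfs -chmod a+w /hive/warehouse/dc_test_db1.db   # non-deprecated equivalent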
Without these permissions, the following error occurs:
hive> use dc_test_db1;
OK
Time taken: 0.096 seconds
hive> show tables;
OK
Time taken: 0.094 seconds
hive> create table test(id int);
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hadoop1, access=WRITE, inode="/hive/warehouse/dc_test_db1.db":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:179)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5904)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5886)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5860)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:3793)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:3763)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3737)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:778)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:573)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
)

hive> use dc_test_db2;
OK
Time taken: 0.072 seconds
hive> create table test(id int);
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hadoop2, access=WRITE, inode="/hive/warehouse/dc_test_db2.db":root:supergroup:drwxr-xr-x
... (stack trace identical to the dc_test_db1 error above)
)


  • Grant privileges to roles, and roles to users, in Hive (the group concept is ignored for now)

hive> grant SELECT,CREATE,SHOW_DATABASE on database dc_test_db1 to role role_db1;
Error rolling back: Can't call rollback when autocommit=true
OK
Time taken: 0.715 seconds
hive> grant role role_db1 to user hadoop1;
OK
Time taken: 0.222 seconds
hive> grant select,create,drop on database dc_test_db2 to role role_db2;
OK
Time taken: 0.122 seconds
hive> grant role role_db2 to user hadoop2;
OK
Time taken: 0.041 seconds
hive> grant role role_db1,role_db2 to user hadoop3;
OK
Time taken: 0.066 seconds
hive> grant all on database dc_test_db2 to role role_db2;
OK
Time taken: 0.062 seconds
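(The "Error rolling back" message on the first grant appears to be a benign metastore warning; the OK that follows shows the grant still succeeded.) The grants can be verified with SHOW GRANT; a sketch, not run above, with syntax that may vary slightly across Hive versions:

hive> show grant role role_db1 on database dc_test_db1;  -- privileges held by the role
hive> show role grant user hadoop3;                      -- roles held by the user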

  • Change read/write permissions on the local data directory (the path differs between deployments)

[root@dn210126 hadoop]# chmod -R g+w /data/hadoop/hdfs/

  • Change read/write permissions on the log directory (the path differs between deployments)

[root@dn210126 hadoop]# chmod -R g+w /data/logs/hive

  • Change read/write permissions on files in HDFS

[root@dn210126 bin]# ./hadoop dfs -chmod 777 /tmp
If the /tmp directory on HDFS is not set to 777, an error like the following is thrown:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hadoop2, access=EXECUTE, inode="/tmp":root:supergroup:drwxrw----
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:208)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:171)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5904)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3691)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:803)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:779)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

at org.apache.hadoop.ipc.Client.call(Client.java:1411)
at org.apache.hadoop.ipc.Client.call(Client.java:1364)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
... 15 more
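A common variant (an assumption beyond the original setup, which used plain 777) is mode 1777: the sticky bit keeps /tmp world-writable while preventing users from deleting or renaming each other's files:

[root@dn210126 bin]# ./hdfs dfs -chmod 1777 /tmp   # sticky bit; HDFS honors it like POSIX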

  • Test results for user hadoop1:
hive> use dc_test;
OK
Time taken: 0.088 seconds
hive> show tables;
OK
aaa
asbcd
test
Time taken: 0.083 seconds, Fetched: 3 row(s)
hive> drop table aaa;
Authorization failed:No privilege 'Drop' found for outputs { database:dc_test, table:aaa}. Use SHOW GRANT to get more details.
hive> select * from aaa;
Authorization failed:No privilege 'Select' found for inputs { database:dc_test, table:aaa, columnName:id}. Use SHOW GRANT to get more details.
hive> use dc_test_db1;
OK
Time taken: 0.096 seconds
hive> show tables;
OK
Time taken: 0.094 seconds
hive> create table test(id int);
OK
Time taken: 0.664 seconds
hive> use dc_test_db2;
OK
Time taken: 0.067 seconds
hive> create table ccc(id int);
Authorization failed:No privilege 'Create' found for outputs { database:dc_test_db2}. Use SHOW GRANT to get more details
  • Test results for user hadoop2:
hive> show databases;
OK
dc_test
dc_test_db1
dc_test_db2
default
Time taken: 1.228 seconds, Fetched: 4 row(s)
hive> use dc_test_db1;
OK
Time taken: 0.087 seconds
hive> show tables;
OK
Time taken: 0.104 seconds
hive> create table test(id int ,name string);
Authorization failed:No privilege 'Create' found for outputs { database:dc_test_db1}. Use SHOW GRANT to get more details.
hive> use dc_test_db2;
OK
Time taken: 0.087 seconds
hive> show tables;
OK
Time taken: 0.104 seconds
hive> create table test(id int ,name string);
OK
Time taken: 0.641 seconds
hive> show tables;
OK
test
Time taken: 0.059 seconds, Fetched: 1 row(s)
hive> use dc_test_db1;
OK
Time taken: 0.063 seconds
hive> show tables;
OK
test
Time taken: 0.056 seconds, Fetched: 1 row(s)
hive> create table nnn(id int);
Authorization failed:No privilege 'Create' found for outputs { database:dc_test_db1}. Use SHOW GRANT to get more details.
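These results line up with the grants: hadoop1 (role_db1) can only create tables in dc_test_db1, and hadoop2 (role_db2) only in dc_test_db2. To take privileges or roles back, legacy mode also provides revoke statements; a sketch, not run in the transcript above:

hive> revoke create on database dc_test_db2 from role role_db2;  -- remove a single privilege
hive> revoke role role_db1 from user hadoop3;                    -- remove a role from a user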

Reference: https://cwiki.apache.org/confluence/display/Hive/Hive+Default+Authorization+-+Legacy+Mode