Common Commands
1) Create a role
create role role_name;
2) Grant privileges to a role
GRANT select ON DATABASE ods to role role_name;
GRANT all ON TABLE TEST to role role_name;
3) Grant a role to a user group
GRANT ROLE role_name TO GROUP user_group_name;
4) View granted privileges
(1) View all roles (administrator)
SHOW ROLES;
(2) View the roles of a given user group (administrator)
SHOW ROLE GRANT GROUP user_group_name;
(3) View the specific privileges of a given role (administrator)
SHOW GRANT ROLE role_name;
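To undo these grants (for reference; the placeholder names mirror the examples above, and the exact syntax is covered in the Cloudera documentation linked below), the matching REVOKE and DROP ROLE statements are:
REVOKE select ON DATABASE ods FROM ROLE role_name;
REVOKE ROLE role_name FROM GROUP user_group_name;
DROP ROLE role_name;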
For the full command reference, see: https://docs.cloudera.com/documentation/enterprise/5-7-x/topics/sg_hive_sql.html
Scenario
The user fanyunli has full privileges on the Hive ODS database (reading, writing, altering table schemas, and so on), while the user zhaominhui is only granted read access to the Hive ODS database.
Hive Table Read/Write Permission Example
Grant the admin role superuser privileges and assign the admin role to the hive user group
jdbc:hive2://hadoop1:10000> create role admin;
jdbc:hive2://hadoop1:10000> grant all on server server1 to role admin;
jdbc:hive2://hadoop1:10000> grant role admin to group hive;
Note: the hive user group must be granted superuser privileges here.
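To double-check that these grants took effect, the inspection commands from the previous section can be reused (output omitted; they are expected to show the server-wide grant and the hive group mapping):
jdbc:hive2://hadoop1:10000> SHOW GRANT ROLE admin;
jdbc:hive2://hadoop1:10000> SHOW ROLE GRANT GROUP hive;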
Create the all-privilege user group and the read-only user group
On every Hive node, create the all-privilege user group all_privilege and the read-only user group reader.
[root@hadoop2]# useradd all_privilege
[root@hadoop2]# passwd all_privilege
[root@hadoop2]# useradd reader
[root@hadoop2]# passwd reader
Create the users fanyunli and zhaominhui and add them to the corresponding user groups
On every Hive node, create the users fanyunli and zhaominhui; the former goes into the all_privilege group and the latter into the reader group.
[root@hadoop2]# useradd fanyunli
[root@hadoop2]# passwd fanyunli
[root@hadoop2 ~]# usermod -a -G all_privilege fanyunli
[root@hadoop2]# useradd zhaominhui
[root@hadoop2]# passwd zhaominhui
[root@hadoop2 ~]# usermod -a -G reader zhaominhui
# Check the result: fanyunli belongs to the groups fanyunli and all_privilege, and zhaominhui belongs to the groups zhaominhui and reader.
[root@hadoop1 ~]# id fanyunli
uid=1000(fanyunli) gid=1000(fanyunli) groups=1000(fanyunli),1006(all_privilege)
[root@hadoop1 ~]# id zhaominhui
uid=1007(zhaominhui) gid=1007(zhaominhui) groups=1007(zhaominhui),1001(reader)
Create the read-only role reader and the all-privilege role all_privilege, and grant them to the reader and all_privilege user groups respectively
Sentry's three privilege levels (each maps to a GRANT statement, as sketched below):
SELECT -> read access to the underlying files
INSERT -> write access to the underlying files
ALL -> read and write access to the underlying files
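For reference, each level corresponds to a GRANT statement of the same shape; the writer role here is hypothetical and only illustrates the INSERT level, while the grants actually used in this scenario follow in the next step:
GRANT SELECT ON DATABASE ods TO ROLE reader;          -- read-only
GRANT INSERT ON DATABASE ods TO ROLE writer;          -- write-only (hypothetical role; see the summary below about pairing it with SELECT)
GRANT ALL ON DATABASE ods TO ROLE all_privilege;      -- read and write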
Process of creating the roles and granting them to the user groups:
[root@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]# ./beeline
beeline> !connect jdbc:hive2://hadoop1:10000
scan complete in 2ms
Connecting to jdbc:hive2://hadoop1:10000
Enter username for jdbc:hive2://hadoop1:10000: hive
Enter password for jdbc:hive2://hadoop1:10000:
Connected to: Apache Hive (version 1.1.0-cdh5.7.4)
Driver: Hive JDBC (version 1.1.0-cdh5.7.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Create the roles
0: jdbc:hive2://hadoop1:10000> create role reader;
0: jdbc:hive2://hadoop1:10000> create role all_privilege;
Grant privileges to the roles
0: jdbc:hive2://hadoop1:10000> GRANT select ON DATABASE ods TO ROLE reader;
0: jdbc:hive2://hadoop1:10000> GRANT all ON DATABASE ods TO ROLE all_privilege;
Grant the roles to the user groups
0: jdbc:hive2://hadoop1:10000> GRANT ROLE reader TO GROUP reader;
0: jdbc:hive2://hadoop1:10000> GRANT ROLE all_privilege TO GROUP all_privilege;
View all roles
0: jdbc:hive2://hadoop1:10000> show roles;
+----------------+--+
| role |
+----------------+--+
| reader |
| all_privilege |
| admin |
+----------------+--+
View the specific privileges of each role: as shown below, the reader role has the select privilege and the all_privilege role has *.
0: jdbc:hive2://hadoop1:10000> SHOW GRANT ROLE reader;
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
| database | table | partition | column | principal_name | principal_type | privilege | grant_option | grant_time | grantor |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
| ods | | | | reader | ROLE | select | false | 1597975087736000 | -- |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
0: jdbc:hive2://hadoop1:10000> SHOW GRANT ROLE all_privilege;
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
| database | table | partition | column | principal_name | principal_type | privilege | grant_option | grant_time | grantor |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
| ods | | | | all_privilege | ROLE | * | false | 1597975111268000 | -- |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+-------------------+----------+--+
Permission Tests
Log in to the Hive client as zhaominhui (a member of the reader group), query a table in the ods database, and then try to insert data into it, to verify that the permission settings take effect.
# Log in as a member of the reader group
[root@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]# ./beeline
beeline> !connect jdbc:hive2://hadoop1:10000
scan complete in 2ms
Connecting to jdbc:hive2://hadoop1:10000
Enter username for jdbc:hive2://hadoop1:10000: zhaominhui
Enter password for jdbc:hive2://hadoop1:10000:
Connected to: Apache Hive (version 1.1.0-cdh5.7.4)
Driver: Hive JDBC (version 1.1.0-cdh5.7.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Query the ods.student table
0: jdbc:hive2://hadoop1:10000> select * from ods.student;
+-------------+---------------+------------------+--+
| student.id | student.name | student.teacher |
+-------------+---------------+------------------+--+
| 1 | a | NULL |
| 2 | b | NULL |
| 1 | ddd | ee |
| 3 | ggg | rr |
| 4 | rrr | rrr |
| 5 | dd | ter |
+-------------+---------------+------------------+--+
Insert data into the student table: the error reports a missing query (select) privilege, whereas one would expect it to report a missing insert privilege.
0: jdbc:hive2://hadoop1:10000> insert into ods.student values(6,"ee","gg",4);
Error: Error while compiling statement: FAILED: SemanticException No valid privileges
User zhaominhui does not have privileges for QUERY
The required privileges: Server=server1->Db=default->Table=values__tmp__table__1->Column=tmp_values_col1->action=select; (state=42000,code=40000)
Insert into another table in the ods database, user_model: the error still says the user lacks privileges for QUERY, but the required privilege listed this time is correct (insert).
0: jdbc:hive2://hadoop1:10000> insert into user_model values("a");
Error: Error while compiling statement: FAILED: SemanticException No valid privileges
User zhaominhui does not have privileges for QUERY
The required privileges: Server=server1->Db=ods->Table=user_model->action=insert; (state=42000,code=40000)
# Alter the table schema: the operation is denied.
0: jdbc:hive2://hadoop1:10000> alter table ods.student add columns(age int);
Error: Error while compiling statement: FAILED: SemanticException No valid privileges
User zhaominhui does not have privileges for ALTERTABLE_ADDCOLS
# Create a table: the user lacks the create-table privilege
0: jdbc:hive2://hadoop1:10000> create table user_model(a int,b int);
Error: Error while compiling statement: FAILED: SemanticException No valid privileges
User zhaominhui does not have privileges for CREATETABLE
The required privileges: Server=server1->Db=default->action=*; (state=42000,code=40000)
Log in to the Hive beeline client as fanyunli (a member of the all_privilege group) and test queries, inserts, schema changes, and other operations.
# Log in as a member of the all_privilege group
[root@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]# ./beeline
beeline> !connect jdbc:hive2://hadoop1:10000
scan complete in 2ms
Connecting to jdbc:hive2://hadoop1:10000
Enter username for jdbc:hive2://hadoop1:10000: fanyunli
Enter password for jdbc:hive2://hadoop1:10000:
Connected to: Apache Hive (version 1.1.0-cdh5.7.4)
Driver: Hive JDBC (version 1.1.0-cdh5.7.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Query the ods.student table
0: jdbc:hive2://hadoop1:10000> select * from ods.student;
+-------------+---------------+------------------+--------------+--+
| student.id | student.name | student.teacher | student.num |
+-------------+---------------+------------------+--------------+--+
| 1 | a | NULL | NULL |
| 2 | b | NULL | NULL |
| 1 | ddd | ee | NULL |
| 3 | ggg | rr | NULL |
| 4 | rrr | rrr | NULL |
| 5 | dd | ter | NULL |
+-------------+---------------+------------------+--------------+--+
Insert data into the ods.student table: a MapReduce job runs and the insert succeeds.
0: jdbc:hive2://hadoop1:10000> insert into student values(6,"dd","dd","ddd");
INFO : Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
Query the table again: the new row has been inserted.
0: jdbc:hive2://hadoop1:10000> select * from ods.student;
+-------------+---------------+------------------+--------------+--+
| student.id | student.name | student.teacher | student.num |
+-------------+---------------+------------------+--------------+--+
| 1 | a | NULL | NULL |
| 2 | b | NULL | NULL |
| 1 | ddd | ee | NULL |
| 3 | ggg | rr | NULL |
| 4 | rrr | rrr | NULL |
| 5 | dd | ter | NULL |
| 6 | dd | dd | NULL |
+-------------+---------------+------------------+--------------+--+
Alter the table schema and describe the table: the schema change succeeds
0: jdbc:hive2://hadoop1:10000> alter table ods.student add columns(parent string);
0: jdbc:hive2://hadoop1:10000> desc ods.student;
+-----------+------------+----------+--+
| col_name | data_type | comment |
+-----------+------------+----------+--+
| id | int | |
| name | string | |
| teacher | string | |
| num | int | |
| parent | string | |
+-----------+------------+----------+--+
Create a new table in the ods database (it succeeds), then list the tables
0: jdbc:hive2://hadoop1:10000> create table user_model(a string);
0: jdbc:hive2://hadoop1:10000> show tables;
+-----------------------------------+--+
| tab_name |
+-----------------------------------+--+
| eqs_long_page |
| mall_attribute |
| mall_attribute_value_product_ref |
| mall_category_attribute_ref |
| mall_price |
| platform_scene_sharing |
| student |
| test |
| user_model |
+-----------------------------------+--+
Summary: the above walks through two common Hive user scenarios. When testing the read-only user zhaominhui, the error message on insert was not quite right (it complained about a missing query privilege). A further test with another user group that had only the insert privilege also failed with a missing query privilege when inserting data; after the select privilege was granted as well, inserts worked normally. The inference is that the insert privilege only takes effect when granted together with select. For the all-privilege user fanyunli, all of the test scenarios above behaved as expected.
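A minimal sketch of that workaround, assuming a hypothetical write-oriented role and user group both named writer (not part of the scenario above): grant INSERT together with SELECT so that the query stage of an INSERT statement can compile.
create role writer;
GRANT INSERT ON DATABASE ods TO ROLE writer;
GRANT SELECT ON DATABASE ods TO ROLE writer;
GRANT ROLE writer TO GROUP writer;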
HDFS Data Read/Write Permission Example (only for files under the Hive warehouse directory)
The permissions and ownership of the Hive warehouse directories and files need to be set consistently.
# Set the permissions of /user/hive/warehouse and its subdirectories to 771, so that only the hive user and the hive group have full access; all other users are left with execute (traverse) permission only.
[hdfs@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -chmod -R 771 /user/hive/warehouse
# Set the owner and group of /user/hive/warehouse and its subdirectories to hive
[hdfs@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -chown -R hive:hive /user/hive/warehouse
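To verify the result, the warehouse directory entry itself can be listed (a quick check; the owner and group are expected to show as hive:hive):
hadoop fs -ls -d /user/hive/warehouse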
HDFS file operations by the read-only user zhaominhui
The user group that zhaominhui belongs to has only been granted read access to the Hive ODS database.
# Test zhaominhui's read access to /user/hive/warehouse
[root@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]# su zhaominhui -
[zhaominhui@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -ls /user/hive/warehouse
ls: Permission denied: user=zhaominhui, access=READ_EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx--t
# List /user/hive/warehouse/ods.db: the contents can be read
[zhaominhui@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -ls /user/hive/warehouse/ods.db/
Found 9 items
drwxrwx--x+ - hive hive 0 2019-06-14 10:56 /user/hive/warehouse/ods.db/eqs_long_page
drwxrwx--x+ - hive hive 0 2020-08-17 14:36 /user/hive/warehouse/ods.db/mall_attribute
drwxrwx--x+ - hive hive 0 2020-08-17 14:35 /user/hive/warehouse/ods.db/mall_attribute_value_product_ref
drwxrwx--x+ - hive hive 0 2020-08-17 14:35 /user/hive/warehouse/ods.db/mall_category_attribute_ref
drwxrwx--x+ - hive hive 0 2019-07-24 16:46 /user/hive/warehouse/ods.db/mall_price
drwxrwx--x+ - hive hive 0 2019-04-23 17:58 /user/hive/warehouse/ods.db/platform_scene_sharing
drwxrwx--x+ - hive hive 0 2020-08-21 11:42 /user/hive/warehouse/ods.db/student
drwxrwx--x+ - hive hive 0 2020-08-21 11:55 /user/hive/warehouse/ods.db/test
drwxrwx--x+ - hive hive 0 2020-08-21 13:33 /user/hive/warehouse/ods.db/user_model
# Read the data file of one of the tables
[zhaominhui@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -cat /user/hive/warehouse/ods.db/test/000000_0
a,b
1,2
# Try to write a file into /user/hive/warehouse/ods.db/test/: permission denied
[zhaominhui@hadoop1 /data/work/test]$ hadoop fs -put query_oracle.py /user/hive/warehouse/ods.db/test/
put: Permission denied: user=zhaominhui, access=WRITE, inode="/user/hive/warehouse/ods.db/test":hive:hive:drwxrwx--x
HDFS file operations by the all-privilege user fanyunli
The user fanyunli has all privileges on the tables in /user/hive/warehouse/ods.db, including but not limited to read and write.
# List /user/hive/warehouse: permission denied
[fanyunli@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -ls /user/hive/warehouse/
ls: Permission denied: user=fanyunli, access=READ_EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx--x
# List /user/hive/warehouse/ods.db/: it can be read normally.
[fanyunli@hadoop1 /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/hive/bin]$ hadoop fs -ls /user/hive/warehouse/ods.db/
Found 9 items
drwxrwx--x+ - hive hive 0 2019-06-14 10:56 /user/hive/warehouse/ods.db/eqs_long_page
drwxrwx--x+ - hive hive 0 2020-08-17 14:36 /user/hive/warehouse/ods.db/mall_attribute
drwxrwx--x+ - hive hive 0 2020-08-17 14:35 /user/hive/warehouse/ods.db/mall_attribute_value_product_ref
drwxrwx--x+ - hive hive 0 2020-08-17 14:35 /user/hive/warehouse/ods.db/mall_category_attribute_ref
drwxrwx--x+ - hive hive 0 2019-07-24 16:46 /user/hive/warehouse/ods.db/mall_price
drwxrwx--x+ - hive hive 0 2019-04-23 17:58 /user/hive/warehouse/ods.db/platform_scene_sharing
drwxrwx--x+ - hive hive 0 2020-08-21 11:42 /user/hive/warehouse/ods.db/student
drwxrwx--x+ - hive hive 0 2020-08-21 11:55 /user/hive/warehouse/ods.db/test
drwxrwx--x+ - hive hive 0 2020-08-21 13:33 /user/hive/warehouse/ods.db/user_model
# Upload data to /user/hive/warehouse/ods.db/test/: the upload succeeds without any error.
[fanyunli@hadoop1 /data/work/test]$ hadoop fs -put query_oracle.py /user/hive/warehouse/ods.db/test/
Note: a prerequisite for managing HDFS file permissions this way is that a Sentry instance exists on every HDFS DataNode; that is, when installing, add a gateway instance on every node that hosts an HDFS instance. Otherwise HDFS files may be unreadable even when the user has been granted the select privilege.
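As a sanity check, when Sentry's HDFS ACL synchronization is in effect (the trailing + on the directory listings above suggests it is), the synchronized ACL entries on a table directory can be inspected; the reader and all_privilege groups are expected to appear there:
hadoop fs -getfacl /user/hive/warehouse/ods.db/student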
Summary
The above focused on how Sentry manages permissions for Hive tables and HDFS data, and noted workarounds for problems that may come up. Only database-level permissions were tested here; Sentry can also grant permissions at the table and column level. See the official documentation for details.
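As a minimal sketch of such finer-grained grants (reusing the reader role and the ods.student table from the scenario above; the column list is purely illustrative, and column-level grants require a CDH/Sentry version that supports them):
use ods;
GRANT SELECT ON TABLE student TO ROLE reader;             -- table-level read access
GRANT SELECT(id, name) ON TABLE student TO ROLE reader;   -- column-level read access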