After enabling ACL authorization on Hadoop, Spark jobs can no longer be submitted in cluster mode — does anyone understand the underlying authentication mechanism?

lmk_2000_1 2017-09-07 04:52:36
Kerberos is not enabled; we only set up service-level ACL authorization on the Hadoop cluster, restricting the authorized users in hadoop-policy.xml to the hadoop user. The cluster starts normally, but Spark jobs fail when submitted in cluster mode, with the following errors:

2017-09-07 13:55:06,984 WARN org.apache.hadoop.security.UserGroupInformation: No groups available for user appattempt_1504761460138_0077_000001
2017-09-07 13:55:06,984 WARN SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization failed for appattempt_1504761460138_0077_000001 (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.api.ContainerManagementProtocolPB, expected client Kerberos principal is null
2017-09-07 13:55:06,984 INFO org.apache.hadoop.ipc.Server: Connection from 10.10.136.126:38932 for protocol org.apache.hadoop.yarn.api.ContainerManagementProtocolPB is unauthorized for user appattempt_1504761460138_0077_000001 (auth:TOKEN)
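For context, the hadoop-policy.xml restriction described above would look roughly like this (a sketch, assuming the standard property name for ContainerManagementProtocol — the protocol named in the log — with the ACL limited to the hadoop user):

```xml
<!-- hadoop-policy.xml (sketch): service-level ACL for ContainerManagementProtocol -->
<property>
  <name>security.containermanagement.protocol.acl</name>
  <!-- ACL format: comma-separated users, then a space, then comma-separated
       groups; "*" means everyone is allowed -->
  <value>hadoop</value>
</property>
```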

I decompiled the source and found the corresponding code fragment:
    KerberosInfo krbInfo = SecurityUtil.getKerberosInfo(protocol, conf);
    String clientPrincipal = null;
    if (krbInfo != null) {
      String clientKey = krbInfo.clientPrincipal();
      if (clientKey != null && !clientKey.isEmpty()) {
        try {
          clientPrincipal = SecurityUtil.getServerPrincipal(
              conf.get(clientKey), addr);
        } catch (IOException e) {
          throw (AuthorizationException) new AuthorizationException(
              "Can't figure out Kerberos principal name for connection from "
                  + addr + " for user=" + user + " protocol=" + protocol)
              .initCause(e);
        }
      }
    }
    if ((clientPrincipal != null && !clientPrincipal.equals(user.getUserName())) ||
        acls.length != 2 || !acls[0].isUserAllowed(user) || acls[1].isUserAllowed(user)) {
      AUDITLOG.warn(AUTHZ_FAILED_FOR + user + " for protocol=" + protocol
          + ", expected client Kerberos principal is " + clientPrincipal);
      throw new AuthorizationException("User " + user +
          " is not authorized for protocol " + protocol +
          ", expected client Kerberos principal is " + clientPrincipal);
    }
    AUDITLOG.info(AUTHZ_SUCCESSFUL_FOR + user + " for protocol=" + protocol);

Judging from the if condition, the user submitting the job failed the ACL check. But the user shown in the log is appattempt_1504761460138_0077_000001 (auth:TOKEN) — shouldn't it be hadoop?
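To make the failing condition concrete, here is a plain-JDK sketch of the two-ACL check from the decompiled snippet (SimpleAcl is an illustrative stand-in for Hadoop's AccessControlList, not the real class; acls[0] plays the allow ACL and acls[1] the blocked ACL). With the allow ACL limited to hadoop, any other user name — including the appattempt_... token identity from the log — fails the check:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Stand-in for Hadoop's AccessControlList: "*" allows everyone, otherwise
// the ACL string is treated as a comma-separated list of allowed user names.
class SimpleAcl {
    private final boolean allAllowed;
    private final Set<String> users;

    SimpleAcl(String aclString) {
        this.allAllowed = "*".equals(aclString.trim());
        this.users = new HashSet<>(Arrays.asList(aclString.trim().split("\\s*,\\s*")));
    }

    boolean isUserAllowed(String user) {
        return allAllowed || users.contains(user);
    }
}

public class AclCheckSketch {
    // Mirrors the if condition above: authorization succeeds only when the
    // user is in the allow ACL AND absent from the blocked ACL.
    static boolean authorize(SimpleAcl allowAcl, SimpleAcl blockAcl, String user) {
        return allowAcl.isUserAllowed(user) && !blockAcl.isUserAllowed(user);
    }

    public static void main(String[] args) {
        SimpleAcl allow = new SimpleAcl("hadoop"); // ACL from hadoop-policy.xml
        SimpleAcl block = new SimpleAcl("");       // no blocked users
        System.out.println(authorize(allow, block, "hadoop"));                               // true
        System.out.println(authorize(allow, block, "appattempt_1504761460138_0077_000001")); // false
    }
}
```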

Is there anyone familiar with cluster-mode submission who could help explain this? Thanks.