I first followed this article (https://segmentfault.com/a/1190000019658767?utm_source=tag-newest — my comment is below it) and overrode the getConnection method. Although the Kerberos login succeeded, the connection obtained still lacked permissions.
I then read the Druid source and overrode init() instead; the Connection obtained afterwards is properly authenticated:
@Override
public void init() throws SQLException {
    ImpalaDataSource _this = this;
    // Run Druid's real initialization under the Kerberos-authenticated user
    loginUser.doAs(new PrivilegedAction<Void>() {
        public Void run() {
            try {
                _this.superInit();
            } catch (Exception e) {
                e.printStackTrace();
            }
            return null;
        }
    });
}
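UserGroupInformation.doAs follows the same pattern as the JDK's own Subject.doAs: the body of a PrivilegedAction runs with the given identity's credentials attached, and its return value is passed through. A minimal stdlib-only sketch of that pattern (no Hadoop dependency; the empty Subject here is purely illustrative and carries no real Kerberos credentials):

```java
import java.security.PrivilegedAction;
import javax.security.auth.Subject;

public class DoAsSketch {
    public static void main(String[] args) {
        // In the real code this Subject would hold the keytab-derived credentials
        Subject subject = new Subject();
        // The action runs "as" the subject; its return value is passed back out
        String result = Subject.doAs(subject, (PrivilegedAction<String>) () -> "connected");
        System.out.println(result); // prints "connected"
    }
}
```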
Stepping through with a debugger, however, I found that init() runs every time a connection is acquired, so my override goes through loginUser.doAs on every call — which is unnecessary. Looking at DruidDataSource's own init() method, I saw that it sets inited to true after the first initialization. Taking the hint, I added the same guard to my override, mirroring the original. The final source is:
import com.alibaba.druid.pool.DruidDataSource;
import lombok.Getter;
import lombok.Setter;
import org.apache.hadoop.security.UserGroupInformation;

import java.security.PrivilegedAction;
import java.sql.SQLException;

@Getter
@Setter
public class ImpalaDataSource extends DruidDataSource {

    private UserGroupInformation loginUser;

    public ImpalaDataSource(UserGroupInformation loginUser) {
        this.loginUser = loginUser;
    }

    @Override
    public void init() throws SQLException {
        // Mirror DruidDataSource.init(): once initialized, do nothing
        if (super.inited) {
            return;
        }
        ImpalaDataSource _this = this;
        // Run the real initialization under the Kerberos-authenticated user
        loginUser.doAs(new PrivilegedAction<Void>() {
            public Void run() {
                try {
                    _this.superInit();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            }
        });
    }

    // Exposes the parent's init() so the anonymous inner class can call it
    public void superInit() throws SQLException {
        super.init();
    }
}
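The inited guard plus "delegate to super" idea can be sketched generically. In this sketch, DruidDataSource is replaced by a plain stub base class (InitializingBase and its initCount counter are illustrative inventions, not Druid APIs) just to show that repeated init() calls do the expensive work only once:

```java
// Stub standing in for DruidDataSource, purely for illustration
class InitializingBase {
    protected volatile boolean inited;  // mirrors DruidDataSource's inited flag
    private int initCount;              // counts how often real init work ran

    public void init() {
        initCount++;
        inited = true;
    }

    public int getInitCount() {
        return initCount;
    }
}

// Stands in for ImpalaDataSource: guard first, then delegate to super
class GuardedSource extends InitializingBase {
    @Override
    public void init() {
        if (inited) {
            return;           // later calls are no-ops
        }
        super.init();         // real work runs exactly once
    }
}

public class GuardDemo {
    public static void main(String[] args) {
        GuardedSource ds = new GuardedSource();
        ds.init();
        ds.init();
        ds.init();
        System.out.println(ds.getInitCount()); // prints 1
    }
}
```

Without the guard, every getConnection would re-enter the doAs block; with it, only the first call pays that cost.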
Below is the authentication performed before the data source is instantiated:
@Bean(name = "impalaDataSource", initMethod = "init", destroyMethod = "close")
public ImpalaDataSource getImpalaDataSource() throws Exception {
    // Kerberos login for Impala
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    KerberosInitor.initKerberosEnv(conf, impalaPrincipalName, impalaKeytabPath, krb5ConfPath, loginConfigPath);
    // Fetch the authenticated user
    UserGroupInformation loginUser = UserGroupInformation.getLoginUser();

    ImpalaDataSource datasource = new ImpalaDataSource(loginUser);
    datasource.setUrl(impalaUrl);
    datasource.setDriverClassName(impalaDriverClassName);
    // Pool configuration
    datasource.setInitialSize(initialSize);
    datasource.setMinIdle(minIdle);
    datasource.setMaxActive(maxActive);
    datasource.setMaxWait(maxWait);
    datasource.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis);
    datasource.setValidationQuery(validationQuery);
    datasource.setTestWhileIdle(testWhileIdle);
    datasource.setTestOnBorrow(testOnBorrow);
    datasource.setTestOnReturn(testOnReturn);
    datasource.setPoolPreparedStatements(poolPreparedStatements);
    datasource.setMaxPoolPreparedStatementPerConnectionSize(maxPoolPreparedStatementPerConnectionSize);
    return datasource;
}
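The fields referenced above (impalaUrl, initialSize, and so on) are assumed to be injected, e.g. via @Value or @ConfigurationProperties bindings. A hypothetical application.properties fragment to match — every key name and value here is illustrative, and the URL options depend on your Impala JDBC driver:

```properties
# Hypothetical keys; align them with your own @Value bindings
impala.url=jdbc:impala://impala-host:21050/default;AuthMech=1;KrbServiceName=impala
impala.driver-class-name=com.cloudera.impala.jdbc41.Driver
impala.pool.initial-size=5
impala.pool.min-idle=5
impala.pool.max-active=20
impala.pool.max-wait=60000
```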
public static void initKerberosEnv(Configuration conf, String principalName, String keytabPath,
                                   String krb5ConfPath, String loginConfigPath) throws Exception {
    // If not set, Linux defaults to /etc/krb5.conf and Windows to C:/Windows/krb5.ini
    System.setProperty("java.security.krb5.conf", krb5ConfPath);
    System.setProperty("java.security.auth.login.config", loginConfigPath);
    conf.set("hadoop.security.authentication", "Kerberos");
    UserGroupInformation.setConfiguration(conf);
    UserGroupInformation.loginUserFromKeytab(principalName, keytabPath);
}