Kafka Java Producer and Kerberos

I am getting an error while sending messages to a Kafka topic in a Kerberized environment. We have a cluster on HDP 2.3.

To send messages, I have to do a kinit explicitly first; only then am I able to send messages to the Kafka topic.

I tried to do the kinit from a Java class, but that doesn't work either.

PFB code:

```java
package com.ct.test.kafka;

import java.util.Date;
import java.util.Properties;
import java.util.Random;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class TestProducer {

    public static void main(String[] args) {

        String principalName = "ctadmin";
        String keyTabPath = "/etc/security/keytabs/ctadmin.keytab";

        boolean authStatus = CTSecurityUtil.loginUserFromKeytab(principalName, keyTabPath);
        if (!authStatus) {
            System.out.println("Authentication failed, try something else " + authStatus);
        } else {
            System.out.println("Authentication successful " + authStatus);
        }

        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        System.setProperty("java.security.auth.login.config", "/etc/kafka/2.3.4.0-3485/0/kafka_jaas.conf");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        System.setProperty("sun.security.krb5.debug", "true");

        try {
            long events = Long.parseLong("3");
            Random rnd = new Random();

            Properties props = new Properties();
            System.out.println("After broker list- " + args[0]);
            props.put("metadata.broker.list", args[0]);
            props.put("serializer.class", "kafka.serializer.StringEncoder");
            props.put("request.required.acks", "1");
            props.put("security.protocol", "PLAINTEXTSASL");
            //props.put("partitioner.class", "com.ct.test.kafka.SimplePartitioner");
            System.out.println("After config prop -1");

            ProducerConfig config = new ProducerConfig(props);
            System.out.println("After config prop -2 config" + config);

            Producer<String, String> producer = new Producer<String, String>(config);
            System.out.println("After config prop -3");

            for (long nEvents = 0L; nEvents < events; nEvents += 1L) {
                Date runtime = new Date();
                String ip = "192.168.2." + rnd.nextInt(255);
                String msg = runtime + " www.example.com, " + ip;
                KeyedMessage<String, String> data = new KeyedMessage<String, String>("test_march4", ip, msg);
                System.out.println("After config prop -1 data" + data);
                producer.send(data);
            }
            producer.close();
        } catch (Throwable th) {
            th.printStackTrace();
        }
    }
}
```
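The `CTSecurityUtil` helper is not shown in the post. For reference, such a helper typically wraps Hadoop's `UserGroupInformation` (provided by the `hadoop-common` dependency listed below); the following is a hypothetical sketch, not the asker's actual implementation:

```java
package com.ct.test.kafka;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical sketch of what CTSecurityUtil.loginUserFromKeytab might look like,
// assuming it wraps Hadoop's UserGroupInformation.
public class CTSecurityUtil {

    public static boolean loginUserFromKeytab(String principal, String keytabPath) {
        try {
            Configuration conf = new Configuration();
            // Switch Hadoop's security layer from "simple" to Kerberos authentication.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            // Obtain a TGT from the keytab, the programmatic equivalent of kinit -kt.
            UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
            return true;
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        }
    }
}
```

Note that a UGI login like this only authenticates Hadoop's own security layer; the Kafka 0.9 client performs its own JAAS login through `java.security.auth.login.config` (visible in the stack traces below), so it does not pick up the ticket obtained here.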

pom.xml (all dependencies downloaded from the Hortonworks repo):

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.9.0.2.3.4.0-3485</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.9.0.2.3.4.0-3485</version>
</dependency>
<dependency>
    <groupId>org.jasypt</groupId>
    <artifactId>jasypt-spring31</artifactId>
    <version>1.9.2</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1.2.3.4.0-3485</version>
</dependency>
```

Error:

Case 1: when I specify my own MyUser_Kafka_jass.conf

```
log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
After config prop -2 configkafka.producer.ProducerConfig@643293ae
java.lang.SecurityException: Configuration Error:
	Line 6: expected [controlFlag]
	at com.sun.security.auth.login.ConfigFile.<init>(ConfigFile.java:110)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at java.lang.Class.newInstance(Class.java:379)
	at javax.security.auth.login.Configuration$2.run(Configuration.java:258)
	at javax.security.auth.login.Configuration$2.run(Configuration.java:250)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.Configuration.getConfiguration(Configuration.java:249)
	at org.apache.kafka.common.security.kerberos.Login.login(Login.java:291)
	at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
	at kafka.common.security.LoginManager$.init(LoginManager.scala:36)
	at kafka.producer.Producer.<init>(Producer.scala:50)
	at kafka.producer.Producer.<init>(Producer.scala:73)
	at kafka.javaapi.producer.Producer.<init>(Producer.scala:26)
	at com.ct.test.kafka.TestProducer.main(TestProducer.java:51)
Caused by: java.io.IOException: Configuration Error:
	Line 6: expected [controlFlag]
	at com.sun.security.auth.login.ConfigFile.match(ConfigFile.java:563)
	at com.sun.security.auth.login.ConfigFile.parseLoginEntry(ConfigFile.java:413)
	at com.sun.security.auth.login.ConfigFile.readConfig(ConfigFile.java:383)
	at com.sun.security.auth.login.ConfigFile.init(ConfigFile.java:283)
	at com.sun.security.auth.login.ConfigFile.init(ConfigFile.java:219)
	at com.sun.security.auth.login.ConfigFile.<init>(ConfigFile.java:108)
```

MyUser_Kafka_jass.conf:

```plaintext
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    doNotPrompt=true
    useTicketCache=true
    renewTicket=true
    principal="ctadmin/prod-dev1-dn1@PROD.COM";
    useKeyTab=true
    serviceName="kafka"
    keyTab="/etc/security/keytabs/ctadmin.keytab"
    client=true;
};

Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/security/keytabs/ctadmin.keytab"
    storeKey=true
    useTicketCache=true
    serviceName="zookeeper"
    principal="ctadmin/prod-dev1-dn1@PROD.COM";
};
```

Case 2: when I specify Kafka's own JAAS file

```
Java config name: /etc/krb5.conf
Loaded from Java config
javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. Make sure -Djava.security.auth.login.config property passed to JVM and the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)'. Make sure you are using FQDN of the Kafka broker you are trying to connect to. not available to garner authentication information from the user
	at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:899)
	at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:719)
	at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:584)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:762)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:203)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:690)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:688)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:687)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:595)
	at org.apache.kafka.common.security.kerberos.Login.login(Login.java:298)
	at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
	at kafka.common.security.LoginManager$.init(LoginManager.scala:36)
	at kafka.producer.Producer.<init>(Producer.scala:50)
	at kafka.producer.Producer.<init>(Producer.scala:73)
	at kafka.javaapi.producer.Producer.<init>(Producer.scala:26)
	at com.ct.test.kafka.TestProducer.main(TestProducer.java:51)
```

This works fine if I do a kinit before running the app; otherwise it throws the error above.

I can't do that in my production environment. If there is any way to do this from the app itself, please help me out.

Please let me know if you need any more details.

Thanks:)

Solution

I don't know what mistake I made the first time; I did the things below again, and it works fine.

First, grant full access on the topic:

```bash
bin/kafka-acls.sh --add --allow-principals user:ctadmin --operation ALL --topic marchTesting --authorizer-properties zookeeper.connect={hostname}:2181
```

Create the JAAS file kafka-jaas.conf:

```plaintext
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    doNotPrompt=true
    useTicketCache=true
    principal="ctadmin@HSCALE.COM"
    useKeyTab=true
    serviceName="kafka"
    keyTab="/etc/security/keytabs/ctadmin.keytab"
    client=true;
};
```

Java Program:

```java
package com.ct.test.kafka;

import java.util.Date;
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class KafkaProducer {

    public static void main(String[] args) {
        String topic = args[0];

        Properties props = new Properties();
        props.put("metadata.broker.list", "{Hostname}:6667");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");
        props.put("security.protocol", "PLAINTEXTSASL");

        ProducerConfig config = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(config);

        for (int i = 0; i < 10; i++) {
            producer.send(new KeyedMessage<String, String>(topic, "Test Date: " + new Date()));
        }
        producer.close();
    }
}
```

Run the application:

```bash
java -Djava.security.auth.login.config=/home/ctadmin/kafka-jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=true -cp kafka-testing-0.0.1-jar-with-dependencies.jar com.ct.test.kafka.KafkaProducer
```
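If passing the `-D` flags on the command line is inconvenient, the same properties can also be set programmatically before the producer is constructed, much as the original question attempts. A minimal sketch, assuming the same paths as above and the `KafkaProducer` class shown earlier (the launcher class itself is hypothetical):

```java
package com.ct.test.kafka;

// Hypothetical launcher: sets the Kerberos/JAAS locations in code instead of via
// -D flags, then delegates to the KafkaProducer class shown above.
public class SecureProducerLauncher {

    public static void main(String[] args) {
        // These must be set before the Kafka client performs its JAAS login.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        System.setProperty("java.security.auth.login.config", "/home/ctadmin/kafka-jaas.conf");

        // args[0] is the topic name, exactly as in the producer above.
        KafkaProducer.main(args);
    }
}
```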

### Java Kafka Kerberos Authentication Configuration

#### Configure environment variables

For a Java application to access a Kafka cluster through Kerberos authentication, the necessary environment variables must be set. This usually means pointing `java.security.krb5.conf` at the Kerberos configuration file and loading a JAAS (Java Authentication and Authorization Service) configuration.

```bash
export JAVA_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf"
```

For a development environment on Windows, adjust the command above to match the local path structure.

#### Write the JAAS file

Create a JAAS configuration file named `kafka_client_jaas.conf` that defines the login module and its parameters. Place it somewhere the application can read it, and write it in the following format:

```plaintext
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/path/to/keytab/file.keytab"
    serviceName="kafka"
    principal="your_principal_name@YOUR.REALM";
};
```

Replace the placeholders with the actual keytab location, service name, and service principal name. On Windows, make sure the `keyTab` field is given a valid UNC-style path.

#### Set JVM parameters

When launching the Java program, add an extra JVM option telling it to use the specific JAAS configuration file as its default security policy source:

```bash
-Djava.security.auth.login.config=path_to_kafka_client_jaas.conf
```

Likewise, confirm that these paths are valid and readable before deploying to a production environment.

#### Adjust the Kafka client properties

The final step is to pass a set of additional configuration entries when constructing the Producer or Consumer instance, instructing it to connect to the broker nodes via SASL/GSSAPI and take part in the negotiation:

```properties
sasl.mechanism=GSSAPI
security.protocol=SASL_PLAINTEXT
# or SASL_SSL if SSL transport encryption is enabled
```

When only the GSSAPI mechanism is enabled, only clients that have passed Kerberos authentication are allowed to establish a session.

#### Example code

The following simple Java example shows how to use the settings above to initialize a consumer and interact with a protected Kafka topic:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class SecureKafkaConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Basic configuration
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker_host:port");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Enable SASL/Kerberos authentication
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");

        // Create the consumer instance
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("topic-name"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                // Process the received records...
            }
        }
    }
}
```

This code shows how to construct a Kerberos-capable Kafka consumer based on the principles discussed above, subscribe to the topic of interest, and wait for incoming messages.
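For completeness, here is a minimal sketch of the producer side using the same SASL settings with the newer `org.apache.kafka.clients.producer` API; the broker address, topic name, and message contents are placeholders:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class SecureKafkaProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Basic configuration; broker_host:port and topic-name are placeholders.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker_host:port");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Enable SASL/Kerberos authentication, mirroring the consumer example above.
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");

        // try-with-resources closes the producer and flushes pending records on exit.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("topic-name", "key", "Test message"));
        }
    }
}
```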