WeChat official account: 大数据开发运维架构 (Big Data Development, Operations & Architecture)
The Kafka version here is from the 0.10 line (0.10.1.0 in the pom below), which is a bit old, but the approach is much the same on newer versions.
Once Kerberos authentication is enabled on Kafka, how do you produce or consume data with the Java API? Essentially, you just add the authentication-related configuration (jaas.conf, krb5.conf, keytab) to the producer and consumer code. Let's go straight to the code:
1. Connecting to a Kerberos-secured cluster is actually straightforward; you need the following three files:
1) The Kerberos server configuration file krb5.conf, which tells the program which KDC to authenticate against;
[libdefaults]
  udp_preference_limit = 1
  renew_lifetime = 3650d
  forwardable = true
  default_realm = CHINAUNICOM
  ticket_lifetime = 3650d
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5

[domain_realm]
  .CHINAUNICOM = CHINAUNICOM

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log

[realms]
  CHINAUNICOM = {
    admin_server = master98.hadoop.ljs
    kdc = master98.hadoop.ljs
  }
2) Authentication also needs a JAAS file that specifies the login module; a jaas.conf can usually be found under the cluster's conf directory;
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="D:\\kafkaSSL\\kafka.service.keytab"
  storeKey=true
  useTicketCache=false
  principal="kafka/salver32.hadoop.unicom@CHINAUNICOM"
  serviceName=kafka;
};
3) The user's login ticket and keytab file; the ticket and keytab themselves are not pasted here.
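Since the ticket and keytab are just files on disk, it helps to fail fast if any of the three files is missing before creating a client; otherwise you only get a confusing login error much later. A minimal stdlib-only sketch (the helper name is mine, not from the original; the paths are the ones used in this article):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AuthFileCheck {
    // Throws early if a required Kerberos file is missing or empty.
    static void requireFile(String path) throws java.io.IOException {
        Path p = Paths.get(path);
        if (!Files.isRegularFile(p) || Files.size(p) == 0) {
            throw new IllegalStateException("Missing or empty auth file: " + path);
        }
    }

    public static void main(String[] args) throws Exception {
        // In the real setup you would check the article's three files, e.g.:
        //   requireFile("D:\\kafkaSSL\\krb5.conf");
        //   requireFile("D:\\kafkaSSL\\kafka_client_jaas.conf");
        //   requireFile("D:\\kafkaSSL\\kafka.service.keytab");
        // Demonstrated here with a temporary file so the sketch runs anywhere:
        Path tmp = Files.createTempFile("krb5", ".conf");
        Files.write(tmp, "[libdefaults]".getBytes());
        requireFile(tmp.toString());
        System.out.println("auth files ok");
    }
}
```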
2. The pom.xml dependencies; you can remove any you do not need:
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <!-- the original also pins 1.2.17 and 1.7.22, most likely log4j and slf4j versions -->
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.11</artifactId>
        <version>0.10.1.0</version>
    </dependency>
    <dependency>
        <groupId>net.sf.json-lib</groupId>
        <artifactId>json-lib</artifactId>
        <version>2.4</version>
        <classifier>jdk15</classifier>
    </dependency>
</dependencies>
3. A Java producer that sends messages, example code:
package com.hadoop.ljs.kafka010.security;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Date;
import java.util.Properties;

/**
 * @author: Created By lujisen
 * @company ChinaUnicom Software JiNan
 * @date: 2020-02-28 15:47
 * @version: v1.0
 * @description: com.hadoop.ljs.kafka010.security
 */
public class KafkaKerberosProducer3 {
    public static final String krb5Conf = "D:\\kafkaSSL\\krb5.conf";
    public static final String kafkaJaasConf = "D:\\kafkaSSL\\kafka_client_jaas.conf";
    public static final String bootstrapServers = "salver31.hadoop.ljs:6667,salver32.hadoop.ljs:6667";
    public static final String topic = "topic1";
    private static long count = 5;

    public static void main(String[] args) {
        // Required for Kerberos authentication: the following two lines
        System.setProperty("java.security.krb5.conf", krb5Conf);
        System.setProperty("java.security.auth.login.config", kafkaJaasConf);

        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Required for Kerberos authentication: the following three properties
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("sasl.mechanism", "GSSAPI");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        int i = 1;
        while (true) {
            String message = "{\"id\":" + i + ",\"ip\":\"192.168.0." + i + "\",\"date\":" + new Date().toString() + "}";
            System.out.println(message);
            producer.send(new ProducerRecord<String, String>(topic, message));
            try {
                Thread.sleep(200);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            if (i++ > count) {
                break;
            }
        }
        // Flush buffered records before the JVM exits
        producer.close();
    }
}
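The producer above concatenates its JSON-like payload by hand; pulling that into a small pure function makes the message format explicit and easy to unit-test (the class and method names here are mine, not from the original):

```java
import java.util.Date;

public class ProducerPayload {
    // Builds the same payload the producer sends: {"id":N,"ip":"192.168.0.N","date":...}
    static String buildMessage(int i, Date date) {
        return "{\"id\":" + i + ",\"ip\":\"192.168.0." + i + "\",\"date\":" + date.toString() + "}";
    }

    public static void main(String[] args) {
        System.out.println(buildMessage(1, new Date()));
    }
}
```

One caveat worth noting: the date value is not quoted, so strictly speaking the payload is not valid JSON; quote it if a downstream consumer parses these messages with a JSON library.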
4. A Java consumer that receives messages, example code:
package com.hadoop.ljs.kafka010.security;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.util.Arrays;
import java.util.Properties;

/**
 * @author: Created By lujisen
 * @company ChinaUnicom Software JiNan
 * @date: 2020-02-28 15:04
 * @version: v1.0
 * @description: com.hadoop.ljs.kafka010.security
 */
public class KafkaKerberosConsumer {
    public static final String krb5Conf = "D:\\kafkaSSL\\krb5.conf";
    public static final String kafkaJaasConf = "D:\\kafkaSSL\\kafka_client_jaas.conf";
    public static final String bootstrapServers = "salver31.hadoop.ljs:6667,salver32.hadoop.ljs:6667";
    public static final String topic = "topic1";
    public static final String consumerGroup = "group_topic1";

    public static void main(String[] args) {
        /* Required for Kerberos authentication: the following two lines */
        System.setProperty("java.security.krb5.conf", krb5Conf);
        System.setProperty("java.security.auth.login.config", kafkaJaasConf);

        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("group.id", consumerGroup);
        // Auto-commit is disabled; note this loop never commits offsets,
        // so call commitSync() after processing if committed offsets matter.
        props.put("enable.auto.commit", "false");
        props.put("auto.commit.interval.ms", "1000");
        props.put("auto.offset.reset", "earliest");
        props.put("session.timeout.ms", "30000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        /* Required for Kerberos authentication: the following three properties */
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");

        KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(props);
        kafkaConsumer.subscribe(Arrays.asList(topic));
        while (true) {
            ConsumerRecords<String, String> records = kafkaConsumer.poll(1);
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Partition: " + record.partition()
                        + " Offset: " + record.offset()
                        + " Value: " + record.value()
                        + " ThreadID: " + Thread.currentThread().getId());
            }
        }
    }
}
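As the two examples show, the Kerberos-related settings are identical for the producer and the consumer, so they can live in a single helper. A minimal sketch (the class and method names are mine; the property keys and values are exactly the ones added in the code above):

```java
import java.util.Properties;

public class KerberosConfig {
    // Applies the three Kafka client properties this article adds for a Kerberized
    // cluster, plus the two JVM-wide system properties pointing at krb5.conf and
    // the JAAS file. Returns the same Properties object for chaining.
    static Properties withKerberos(Properties props, String krb5Conf, String jaasConf) {
        System.setProperty("java.security.krb5.conf", krb5Conf);
        System.setProperty("java.security.auth.login.config", jaasConf);
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");
        return props;
    }

    public static void main(String[] args) {
        Properties p = withKerberos(new Properties(),
                "D:\\kafkaSSL\\krb5.conf", "D:\\kafkaSSL\\kafka_client_jaas.conf");
        System.out.println(p.getProperty("security.protocol")); // prints SASL_PLAINTEXT
    }
}
```

Both main methods could then build their base Properties as usual and pass them through this one call, keeping the security settings in one place.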