Kafka Security: Encryption and Authentication Using SSL
Overview
For Kafka release 0.9.X, the following security measures are currently supported:
- Authentication of connections to brokers from clients (producers and consumers), other brokers and tools, using either SSL or SASL (Kerberos)
- Authentication of connections from brokers to ZooKeeper
- Encryption of data transferred between brokers and clients, between brokers, or between brokers and tools using SSL (Note that there is a performance degradation when SSL is enabled, the magnitude of which depends on the CPU type and the JVM implementation.)
- Authorization of read / write operations by clients
- Authorization is pluggable and integration with external authorization services is supported
Encryption and Authentication using SSL
I. Architecture
Suppose we have three servers. Server 1 generates the CA; server 2 and server 3 use the CA generated on server 1.
(Figure: the three-server architecture)
Note: this document focuses only on how to configure SSL, and omits the ZooKeeper and Kafka deployment itself.
II. Configuration of server 1
First, create a directory for the SSL files:
cd $KAFKA_HOME #your own Kafka home
mkdir SSL
cd SSL
Then run the following commands one by one:
keytool -keystore server.keystore.jks -alias $HOSTNAME -validity 365 -genkey    #generate the broker key pair (this walkthrough uses test123 for the keystore and key passwords)
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365    #create the CA certificate and key (test123 is used as the CA key password)
keytool -keystore server.truststore.jks -alias CARoot -import -file ca-cert    #add the CA to the broker truststore
keytool -keystore client.truststore.jks -alias CARoot -import -file ca-cert    #add the CA to the client truststore
keytool -keystore server.keystore.jks -alias $HOSTNAME -certreq -file cert-file    #export a certificate signing request
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial -passin pass:test123    #sign the request with the CA
keytool -keystore server.keystore.jks -alias CARoot -import -file ca-cert    #import the CA certificate into the broker keystore
keytool -keystore server.keystore.jks -alias $HOSTNAME -import -file cert-signed    #import the signed broker certificate
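At this point server.keystore.jks should contain both the CA certificate (CARoot) and the signed broker certificate. A quick way to check (it prompts for the keystore password, test123 in this walkthrough):
keytool -list -v -keystore server.keystore.jks    #should show the CARoot and $HOSTNAME entries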
Kafka broker configuration:
cd $KAFKA_HOME/config
vi server.properties
listeners=PLAINTEXT://:9092,SSL://:9093 #add ',SSL://:9093'
advertised.listeners=SSL://10.103.219.67:9093 #advertise the SSL listener with this broker's address
ssl.keystore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/server.keystore.jks
ssl.keystore.password=test123
ssl.key.password=test123
ssl.truststore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/server.truststore.jks
ssl.truststore.password=test123
ssl.client.auth=requested
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
ssl.keystore.type=JKS
ssl.truststore.type=JKS
security.inter.broker.protocol=SSL #enable SSL for inter-broker communication
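Once the broker has been (re)started with these settings, a quick way to confirm that the SSL listener is serving the new certificate (assuming the address 10.103.219.67:9093 configured above) is:
openssl s_client -connect 10.103.219.67:9093 -tls1    #the broker certificate should appear in the output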
Kafka client configuration:
cd $KAFKA_HOME/config
vi client-ssl.properties
#add the below configuration:
security.protocol=SSL
ssl.truststore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/client.truststore.jks
ssl.truststore.password=test123
ssl.keystore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/server.keystore.jks
ssl.keystore.password=test123
ssl.key.password=test123
#ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
ssl.truststore.type=JKS
ssl.keystore.type=JKS
III. Configuration of server 2 (server 3 is configured in the same way)
First, create a directory for the SSL files:
cd $KAFKA_HOME #your own Kafka home
mkdir SSL
cd SSL
Then run the following commands one by one:
keytool -keystore client.keystore.jks -alias $HOSTNAME -validity 365 -genkey    #generate this server's key pair
keytool -keystore client.keystore.jks -alias $HOSTNAME -certreq -file cert-file-$HOSTNAME    #export a certificate signing request
mkdir CA
Then copy the files generated on server 1 into the CA directory (in my case /usr/local/kafka_2.11-0.9.0.0/SSL/CA) by running the following on server 1:
scp -r /usr/local/kafka_2.11-0.9.0.0/SSL/* casb-68:/usr/local/kafka_2.11-0.9.0.0/SSL/CA
Then continue with the following commands, one by one:
openssl x509 -req -CA CA/ca-cert -CAkey CA/ca-key -in cert-file-$HOSTNAME -out cert-signed-$HOSTNAME -days 365 -CAcreateserial -passin pass:test123    #sign the request with the CA copied from server 1
keytool -keystore client.keystore.jks -alias CARoot -import -file CA/ca-cert    #import the CA certificate
keytool -keystore client.keystore.jks -alias $HOSTNAME -import -file cert-signed-$HOSTNAME    #import the signed certificate
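You can also confirm that the signed certificate chains back to the CA copied from server 1 (a quick check using the file names above):
openssl verify -CAfile CA/ca-cert cert-signed-$HOSTNAME    #should print "cert-signed-<hostname>: OK"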
Kafka broker configuration:
cd $KAFKA_HOME/config
vi server.properties
#add the below configuration:
listeners=PLAINTEXT://:9092,SSL://:9093 #add ',SSL://:9093'
advertised.listeners=SSL://10.103.219.68:9093 #advertise the SSL listener with this broker's address
ssl.keystore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/CA/server.keystore.jks
ssl.keystore.password=test123
ssl.key.password=test123
ssl.truststore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/CA/server.truststore.jks
ssl.truststore.password=test123
ssl.client.auth=requested
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
ssl.keystore.type=JKS
ssl.truststore.type=JKS
security.inter.broker.protocol=SSL #enable SSL for inter-broker communication
Kafka client configuration:
cd $KAFKA_HOME/config
vi client-ssl.properties
#add the below configuration:
security.protocol=SSL
ssl.truststore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/client.keystore.jks
ssl.truststore.password=test123
ssl.keystore.location=/usr/local/kafka_2.11-0.9.0.0/SSL/client.keystore.jks
ssl.keystore.password=test123
ssl.key.password=test123
#ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
ssl.truststore.type=JKS
ssl.keystore.type=JKS
IV. Example
1) Check that ZooKeeper is running.
2) Start all brokers.
3) Create a topic (a sample command follows).
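For step 3, the test topic can be created like this (a sketch for Kafka 0.9; the ZooKeeper address casb-67:2181 and the replication factor are assumptions):
kafka-topics.sh --create --zookeeper casb-67:2181 --replication-factor 3 --partitions 1 --topic test-ssl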
Then you can run the following at the command line (using the client-ssl.properties created above):
kafka-console-producer.sh --broker-list casb-68:9093 --topic test-ssl --producer.config client-ssl.properties
kafka-console-consumer.sh --bootstrap-server casb-67:9093 --topic test-ssl --new-consumer --consumer.config client-ssl.properties
Run from a Java IDE (only the SSL-related settings are shown; fill in the blank values to match client-ssl.properties):
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.config.SslConfigs;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

Properties props = new Properties();
//besides the ordinary producer/consumer properties, add the SSL properties below
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, " ");        //the broker SSL listener, e.g. host:9093
props.put(ProducerConfig.CLIENT_ID_CONFIG, " ");
props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, " ");        //same as in client-ssl.properties
props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, " ");        //same as in client-ssl.properties
props.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, " ");             //same as in client-ssl.properties
props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, " ");      //same as in client-ssl.properties
props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, " ");      //same as in client-ssl.properties
props.put(SslConfigs.SSL_TRUSTSTORE_TYPE_CONFIG, "JKS");
props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
KafkaProducer<String, String> producer = new KafkaProducer<>(props, new StringSerializer(), new StringSerializer());
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props, new StringDeserializer(), new StringDeserializer());