Distributed Message Service Developer Guide (2020-03-06)
Contents
1 Kafka Premium Developer Guide........................................................................................ 1
1.1 Overview.................................................................................................................................................................................... 1
1.2 Collecting Connection Information................................................................................................................................... 2
1.3 Java............................................................................................................................................................................................... 3
1.3.1 Configuring Kafka Clients in Java...................................................................................................................................4
1.3.2 Setting Up the Java Development Environment.....................................................................................................11
1.4 Python....................................................................................................................................................................................... 15
1.5 Recommendations on Client Usage............................................................................................................................... 17
2 DMS Kafka Developer Guide.............................................................................................. 19
2.1 Overview.................................................................................................................................................................................. 19
2.2 Java SDK................................................................................................................................................................................... 20
2.2.1 Preparing the Environment............................................................................................................................................ 20
2.2.2 Creating a Project.............................................................................................................................................................. 25
2.2.3 Configuring Parameters.................................................................................................................................................. 26
2.2.4 Running the Sample Project.......................................................................................................................................... 27
2.2.5 Compiling the Sample Project Code........................................................................................................................... 29
2.2.6 Code of the Sample Project........................................................................................................................................... 30
2.2.7 Using the Enhanced Java SDK.......................................................................................................................................30
2.2.8 FAQs....................................................................................................................................................................................... 31
2.3 Python SDK............................................................................................................................................................................. 32
2.4 Lua SDK.................................................................................................................................................................................... 34
2.5 C SDK........................................................................................................................................................................................ 36
2.6 Go SDK..................................................................................................................................................................................... 39
2.7 Recommended Parameter Settings for Kafka Clients.............................................................................................. 42
3 Standard Queue Developer Guide.....................................................................................46
3.1 Overview.................................................................................................................................................................................. 46
3.2 Preparing the Environment............................................................................................................................................... 48
3.3 Creating a Project................................................................................................................................................................. 54
3.4 Configuring Parameters...................................................................................................................................................... 54
3.5 Running the Sample Project..............................................................................................................................................56
3.6 Compiling the Sample Project Code............................................................................................................................... 57
3.7 Code of the Sample Project............................................................................................................................................... 57
4 Change History...................................................................................................................... 59
1 Kafka Premium Developer Guide
1.1 Overview
Kafka premium instances are compatible with Apache Kafka and can be accessed
using open-source Kafka clients. To access an instance in SASL mode, you also
need certificates.
This document describes how to collect instance connection information, such as
the instance connection address, certificate required for SASL connection, and
information required for public access. It also provides examples of accessing an
instance in Java and Python.
The examples only demonstrate how to invoke Kafka APIs for producing and
consuming messages. For more information about the APIs provided by Kafka,
visit the official Kafka website.
Client Network Environment
A client can access a Kafka instance in any of the following three modes:
1. Within a Virtual Private Network (VPC)
If the client runs an Elastic Cloud Server (ECS) and is in the same region and
VPC as the Kafka instance, the client can access the instance using an IP
address within a subnet in the VPC.
2. Using a VPC peering connection
If the client runs an ECS and is in the same region but not the same VPC as
the Kafka instance, the client can access the instance using an IP address
within a subnet in the VPC after a VPC peering connection has been
established.
For details, see VPC Peering Connection.
3. Over public networks
If the client is not in the same network environment or region as the Kafka
instance, the client can access the instance using a public network IP address.
For public access, modify the inbound rules of the security group configured
for the Kafka instance, allowing access over port 9095.
The three modes differ only in the connection address for the client to access the instance.
This document takes intra-VPC access as an example to describe how to set up the
development environment.
If the connection times out or fails, check the network connectivity. You can use telnet to
test the connection address and port of the instance.
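The telnet check suggested above can also be scripted. The following is a minimal sketch in Java (the class name ConnectivityCheck and the address in main are illustrative placeholders, not part of the demo project):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ConnectivityCheck {
    // Returns true if a TCP connection to host:port can be established
    // within timeoutMs, mimicking a "telnet host port" reachability test.
    public static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Replace with your instance's broker address and port (9095 for SASL access).
        System.out.println(reachable("192.168.0.1", 9095, 3000) ? "reachable" : "unreachable");
    }
}
```

If this check fails, verify the security group rules and the network environment before debugging the Kafka client itself.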
1.2 Collecting Connection Information
Obtaining Kafka Instance Information
1. Instance connection address and port
Obtain the connection addresses and port numbers on the Basic Information
tab page. Configure all three addresses listed on the page for the client.
For public access, you can use the connection addresses listed in the Public
Access section.
Figure 1-1 Viewing the connection addresses and ports of brokers of a Kafka
instance
2. Topic name
Obtain the topic name from the Topic Management tab page of the instance
as shown in the following figure.
Figure 1-2 Viewing the topic name
3. SASL connection information
If SASL access is enabled for the instance, the username, password, and SSL
certificate are required. The username and password are set during instance
creation.
Obtain the username on the Basic Information tab page. If the password is
lost, you can reset the Kafka password.
Figure 1-3 Resetting Kafka password
Figure 1-4 Viewing the username used for SASL access
1.3 Java
1.3.1 Configuring Kafka Clients in Java
This section describes how to add Kafka clients in Maven, and use the clients to
access Kafka instances and produce and consume messages. To check how the
demo project runs in IDEA, see 1.3.2 Setting Up the Java Development
Environment.
The Kafka instance connection addresses, topic name, and user information used
in the following examples are obtained in 1.2 Collecting Connection
Information.
Adding Kafka Clients in Maven
<!-- Kafka premium instances support Kafka 1.1.0. Use the same version for clients. -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.1.0</version>
</dependency>
Preparing Kafka Configuration Files
The following describes example producer and consumer configuration files. If
SASL is not enabled for the Kafka instance, comment out lines regarding SASL. If
SASL has been enabled, set SASL configurations for encrypted access.
● Producer configuration file (the dms.sdk.producer.properties file in the demo
project)
The information in bold is specific to different Kafka instances and must be
modified. Other parameters can also be added.
#The topic name is in the code for producing or consuming messages.
#######################
#Obtain the information about Kafka instance brokers on the console. For example:
#bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095
bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
#Producer acknowledgement
acks=all
#Method of turning the key into bytes
key.serializer=org.apache.kafka.common.serialization.StringSerializer
#Method of turning the value into bytes
value.serializer=org.apache.kafka.common.serialization.StringSerializer
#Memory available to the producer for buffering
buffer.memory=33554432
#Number of retries
retries=0
#######################
#Comment out the following parameters if SASL access is not enabled.
#######################
#Configure the JAAS username and password on the console.
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="username" \
password="password";
#SASL mechanism
sasl.mechanism=PLAIN
#Encryption protocol. Currently, only SASL_SSL is supported.
security.protocol=SASL_SSL
#Location of ssl.truststore
ssl.truststore.location=E:\\temp\\client.truststore.jks
#Password of ssl.truststore
ssl.truststore.password=dms@kafka
● Consumer configuration file (the dms.sdk.consumer.properties file in the
demo project)
The information in bold is specific to different Kafka instances and must be
modified. Other parameters can also be added.
#The topic name is in the code for producing or consuming messages.
#######################
#Obtain the information about Kafka instance brokers on the console. For example:
#bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095
bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
#Unique string to identify the group of consumer processes to which the consumer belongs.
#Configuring the same group.id for different processes indicates that the processes belong to the same consumer group.
group.id=1
#Method of turning bytes into the key
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
#Method of turning bytes into the value
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
#Offset reset policy
auto.offset.reset=earliest
#######################
#Comment out the following parameters if SASL access is not enabled.
#######################
#Configure the JAAS username and password on the console.
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="username" \
password="password";
#SASL mechanism
sasl.mechanism=PLAIN
#Encryption protocol. Currently, only SASL_SSL is supported.
security.protocol=SASL_SSL
#Location of ssl.truststore
ssl.truststore.location=E:\\temp\\client.truststore.jks
#Password of ssl.truststore
ssl.truststore.password=dms@kafka
Producing Messages
● Test code
package com.dms.producer;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.junit.Test;
public class DmsProducerTest {
@Test
public void testProducer() throws Exception {
DmsProducer producer = new DmsProducer();
try {
for (int i = 0; i < 10; i++){
String data = "The msg is " + i;
//Enter the name of the topic you created. There are multiple APIs for producing messages.
//For details, see the Kafka official website or the following code.
producer.produce("topic-0", data, new Callback()
{
public void onCompletion(RecordMetadata metadata,
Exception exception)
{
if (exception != null)
{
exception.printStackTrace();
return;
}
System.out.println("produce msg completed");
}
});
System.out.println("produce msg:" + data);
}
}catch (Exception e)
{
// TODO: Exception handling
e.printStackTrace();
}finally {
producer.close();
}
}
}
● Message production code
package com.dms.producer;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
public class DmsProducer<K, V> {
//Add the producer configurations that have been specified earlier.
public static final String CONFIG_PRODUCER_FILE_NAME = "dms.sdk.producer.properties";
private Producer<K, V> producer;
DmsProducer(String path)
{
Properties props = new Properties();
try {
InputStream in = new BufferedInputStream(new FileInputStream(path));
props.load(in);
}catch (IOException e)
{
e.printStackTrace();
return;
}
producer = new KafkaProducer<>(props);
}
DmsProducer()
{
Properties props = new Properties();
try {
props = loadFromClasspath(CONFIG_PRODUCER_FILE_NAME);
}catch (IOException e)
{
e.printStackTrace();
return;
}
producer = new KafkaProducer<>(props);
}
/**
* Producing messages
*
* @param topic Topic
* @param partition partition
* @param key Message key
* @param data Message data
*/
public void produce(String topic, Integer partition, K key, V data)
{
produce(topic, partition, key, data, null, (Callback)null);
}
/**
* Producing messages
*
* @param topic Topic
* @param partition partition
* @param key Message key
* @param data Message data
* @param timestamp timestamp
*/
public void produce(String topic, Integer partition, K key, V data, Long timestamp)
{
produce(topic, partition, key, data, timestamp, (Callback)null);
}
/**
* Producing messages
*
* @param topic Topic
* @param partition partition
* @param key Message key
* @param data Message data
* @param callback callback
*/
public void produce(String topic, Integer partition, K key, V data, Callback callback)
{
produce(topic, partition, key, data, null, callback);
}
public void produce(String topic, V data)
{
produce(topic, null, null, data, null, (Callback)null);
}
public void produce(String topic, V data, Callback callback)
{
produce(topic, null, null, data, null, callback);
}
/**
* Producing messages
*
* @param topic Topic
* @param partition partition
* @param key Message key
* @param data Message data
* @param timestamp timestamp
* @param callback callback
*/
public void produce(String topic, Integer partition, K key, V data, Long timestamp, Callback callback)
{
ProducerRecord<K, V> kafkaRecord =
timestamp == null ? new ProducerRecord<K, V>(topic, partition, key, data)
: new ProducerRecord<K, V>(topic, partition, timestamp, key, data);
produce(kafkaRecord, callback);
}
public void produce(ProducerRecord<K, V> kafkaRecord)
{
produce(kafkaRecord, (Callback)null);
}
public void produce(ProducerRecord<K, V> kafkaRecord, Callback callback)
{
producer.send(kafkaRecord, callback);
}
public void close()
{
producer.close();
}
/**
* get classloader from thread context if no classloader found in thread
* context return the classloader which has loaded this class
*
* @return classloader
*/
public static ClassLoader getCurrentClassLoader()
{
ClassLoader classLoader = Thread.currentThread()
.getContextClassLoader();
if (classLoader == null)
{
classLoader = DmsProducer.class.getClassLoader();
}
return classLoader;
}
/**
* Load configuration information from classpath.
*
* @param configFileName Configuration file name
* @return Configuration information
* @throws IOException
*/
public static Properties loadFromClasspath(String configFileName) throws IOException
{
ClassLoader classLoader = getCurrentClassLoader();
Properties config = new Properties();
List<URL> properties = new ArrayList<URL>();
Enumeration<URL> propertyResources = classLoader
.getResources(configFileName);
while (propertyResources.hasMoreElements())
{
properties.add(propertyResources.nextElement());
}
for (URL url : properties)
{
InputStream is = null;
try
{
is = url.openStream();
config.load(is);
}
finally
{
if (is != null)
{
is.close();
is = null;
}
}
}
return config;
}
}
Consuming Messages
● Test code
package com.dms.consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.junit.Test;
import java.util.Arrays;
public class DmsConsumerTest {
@Test
public void testConsumer() throws Exception {
DmsConsumer consumer = new DmsConsumer();
consumer.consume(Arrays.asList("topic-0"));
try {
for (int i = 0; i < 10; i++){
ConsumerRecords<Object, Object> records = consumer.poll(1000);
System.out.println("the numbers of topic:" + records.count());
for (ConsumerRecord<Object, Object> record : records)
{
System.out.println(record.toString());
}
}
}catch (Exception e)
{
// TODO: Exception handling
e.printStackTrace();
}finally {
consumer.close();
}
}
}
● Message consumption code
package com.dms.consumer;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.*;
public class DmsConsumer {
public static final String CONFIG_CONSUMER_FILE_NAME = "dms.sdk.consumer.properties";
private KafkaConsumer<Object, Object> consumer;
DmsConsumer(String path)
{
Properties props = new Properties();
try {
InputStream in = new BufferedInputStream(new FileInputStream(path));
props.load(in);
}catch (IOException e)
{
e.printStackTrace();
return;
}
consumer = new KafkaConsumer<>(props);
}
DmsConsumer()
{
Properties props = new Properties();
try {
props = loadFromClasspath(CONFIG_CONSUMER_FILE_NAME);
}catch (IOException e)
{
e.printStackTrace();
return;
}
consumer = new KafkaConsumer<>(props);
}
public void consume(List<String> topics)
{
consumer.subscribe(topics);
}
public ConsumerRecords<Object, Object> poll(long timeout)
{
return consumer.poll(timeout);
}
public void close()
{
consumer.close();
}
/**
* get classloader from thread context if no classloader found in thread
* context return the classloader which has loaded this class
*
* @return classloader
*/
public static ClassLoader getCurrentClassLoader()
{
ClassLoader classLoader = Thread.currentThread()
.getContextClassLoader();
if (classLoader == null)
{
classLoader = DmsConsumer.class.getClassLoader();
}
return classLoader;
}
/**
* Load configuration information from classpath.
*
* @param configFileName Configuration file name
* @return Configuration information
* @throws IOException
*/
public static Properties loadFromClasspath(String configFileName) throws IOException
{
ClassLoader classLoader = getCurrentClassLoader();
Properties config = new Properties();
List<URL> properties = new ArrayList<URL>();
Enumeration<URL> propertyResources = classLoader
.getResources(configFileName);
while (propertyResources.hasMoreElements())
{
properties.add(propertyResources.nextElement());
}
for (URL url : properties)
{
InputStream is = null;
try
{
is = url.openStream();
config.load(is);
}
finally
{
if (is != null)
{
is.close();
is = null;
}
}
}
return config;
}
}
1.3.2 Setting Up the Java Development Environment
With the information collected in 1.2 Collecting Connection Information and the
network environment prepared for Kafka clients, you can proceed to configuring
Kafka clients. This section describes how to configure Kafka clients to produce and
consume messages.
Preparing Tools
Table 1-1 Required tools
● Apache Maven: 3.0.3 or later. Download: http://maven.apache.org/download.cgi
● Java Development Kit (JDK): 1.8.111 or later, with Java environment variables set. Download: https://www.oracle.com/technetwork/java/javase/downloads/index.html
● IntelliJ IDEA: https://www.jetbrains.com/idea/
Procedure
Step 1 Download the demo package.
Decompress the package to obtain the following files.
Table 1-2 Files in the demo package
● DmsConsumer.java (.\src\main\java\com\dms\consumer): API for consuming messages
● DmsProducer.java (.\src\main\java\com\dms\producer): API for producing messages
● dms.sdk.consumer.properties (.\src\main\resources): Configuration information for consuming messages
● dms.sdk.producer.properties (.\src\main\resources): Configuration information for producing messages
● client.truststore.jks (.\src\main\resources): SSL certificate, used for SASL connection
● DmsConsumerTest.java (.\src\test\java\com\dms\consumer): Test code for consuming messages
● DmsProducerTest.java (.\src\test\java\com\dms\producer): Test code for producing messages
● pom.xml (.\): Maven configuration file containing the Kafka client dependency
Step 2 In IntelliJ IDEA, import the demo project.
The demo project is a Java project built in Maven. Therefore, you need the JDK
and the Maven plugin in IDEA.
1. Select Import Project.
2. Select Maven.
3. Select the JDK.
4. You can select other options or retain the default settings. Then, click Finish.
The demo project has been imported.
Step 3 Configure Maven.
Choose File > Settings, set Maven home directory correctly, and select the
required settings.xml file.
Step 4 Specify Kafka configurations.
The following is a configuration example for producing messages. Replace the
information in bold with the actual values.
#The information in bold is specific to different Kafka instances and must be modified. Other parameters can also be added.
#The topic name is in the code for producing or consuming messages.
#######################
#Obtain the information about Kafka instance brokers on the console. For example:
#bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095
bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
#Producer acknowledgement
acks=all
#Method of turning the key into bytes
key.serializer=org.apache.kafka.common.serialization.StringSerializer
#Method of turning the value into bytes
value.serializer=org.apache.kafka.common.serialization.StringSerializer
#Memory available to the producer for buffering
buffer.memory=33554432
#Number of retries
retries=0
#######################
#Comment out the following parameters if SASL access is not enabled.
#######################
#Configure the JAAS username and password on the console.
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="username" \
password="password";
#SASL mechanism
sasl.mechanism=PLAIN
#Encryption protocol. Currently, only SASL_SSL is supported.
security.protocol=SASL_SSL
#Location of ssl.truststore
ssl.truststore.location=E:\\temp\\client.truststore.jks
#Password of ssl.truststore
ssl.truststore.password=dms@kafka
Step 5 In the lower left corner of IDEA, click Terminal. In the terminal, run the mvn test
command to run the demo project.
Figure 1-5 Opening terminal in IDEA
The following information is displayed for the producer:
-------------------------------------------------------
TESTS
-------------------------------------------------------
Running com.dms.producer.DmsProducerTest
produce msg:The msg is 0
produce msg:The msg is 1
produce msg:The msg is 2
produce msg:The msg is 3
produce msg:The msg is 4
produce msg:The msg is 5
produce msg:The msg is 6
produce msg:The msg is 7
produce msg:The msg is 8
produce msg:The msg is 9
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 138.877 sec
The following information is displayed for the consumer:
-------------------------------------------------------
TESTS
-------------------------------------------------------
Running com.dms.consumer.DmsConsumerTest
the numbers of topic:0
the numbers of topic:0
the numbers of topic:6
ConsumerRecord(topic = topic-0, partition = 2, offset = 0, CreateTime = 1557059377179, serialized key size
= -1, serialized value size = 12, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value
= The msg is 2)
ConsumerRecord(topic = topic-0, partition = 2, offset = 1, CreateTime = 1557059377195, serialized key size
= -1, serialized value size = 12, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value
= The msg is 5)
----End
1.4 Python
This section describes how to access a Kafka premium instance using a Kafka
client in Python, including installing the client, and producing and consuming
messages.
Before getting started, ensure that you have collected the information listed in 1.2
Collecting Connection Information.
Preparing Tools
● Python
Generally, Python is pre-installed in the system. Enter python in a CLI. If the
following information is displayed, Python has already been installed.
[root@ecs-heru bin]# python
Python 2.7.5 (default, Oct 30 2018, 23:45:53)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-36)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
If Python has not been installed, run the following command to install it:
yum install python
● Kafka clients in Python
Run the following command to install Kafka clients:
pip install kafka-python
Alternatively, you can also run the following commands to install a client of a
desired version:
wget https://github.com/dpkp/kafka-python/archive/1.1.0.tar.gz
mv 1.1.0.tar.gz kafka-python-1.1.0.tar.gz
tar -xvf kafka-python-1.1.0.tar.gz
cd kafka-python-1.1.0
python setup.py install
Producing Messages
Replace the following information in bold with the actual values.
● Connection with SASL
from kafka import KafkaProducer
import ssl
##Connection information
conf = {
'bootstrap_servers': ["ip1:port1,ip2:port2,ip3:port3"],
'topic_name': 'topic_name',
'sasl_plain_username': 'username',
'sasl_plain_password': 'password'
}
context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
context.verify_mode = ssl.CERT_REQUIRED
##Certificate
context.load_verify_locations("phy_ca.crt")
print("start producer")
producer = KafkaProducer(bootstrap_servers=conf['bootstrap_servers'],
sasl_mechanism="PLAIN",
ssl_context=context,
security_protocol='SASL_SSL',
sasl_plain_username=conf['sasl_plain_username'],
sasl_plain_password=conf['sasl_plain_password'])
data = "hello kafka!"
producer.send(conf['topic_name'], data)
producer.close()
print("end producer")
● Connection without SASL
from kafka import KafkaProducer
conf = {
'bootstrap_servers': 'ip1:port1,ip2:port2,ip3:port3',
'topic_name': 'topic-name',
}
print("start producer")
producer = KafkaProducer(bootstrap_servers=conf['bootstrap_servers'])
data = "hello kafka!"
producer.send(conf['topic_name'], data)
producer.close()
print("end producer")
Consuming Messages
● Connection with SASL
from kafka import KafkaConsumer
import ssl
##Connection information
conf = {
'bootstrap_servers': ["ip1:port1,ip2:port2,ip3:port3"],
'topic_name': 'topic_name',
'sasl_plain_username': 'username',
'sasl_plain_password': 'password',
'consumer_id': 'consumer_id'
}
context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
context.verify_mode = ssl.CERT_REQUIRED
##Certificate
context.load_verify_locations("phy_ca.crt")
print("start consumer")
consumer = KafkaConsumer(conf['topic_name'],
bootstrap_servers=conf['bootstrap_servers'],
group_id=conf['consumer_id'],
sasl_mechanism="PLAIN",
ssl_context=context,
security_protocol='SASL_SSL',
sasl_plain_username=conf['sasl_plain_username'],
sasl_plain_password=conf['sasl_plain_password'])
for message in consumer:
    print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition, message.offset, message.key, message.value))
print("end consumer")
● Connection without SASL
Replace the information in bold with the actual values.
from kafka import KafkaConsumer
conf = {
'bootstrap_servers': 'ip1:port1,ip2:port2,ip3:port3',
'topic_name': 'topic-name',
'consumer_id': 'consumer-id'
}
print("start consumer")
consumer = KafkaConsumer(conf['topic_name'],
bootstrap_servers=conf['bootstrap_servers'],
group_id=conf['consumer_id'])
for message in consumer:
    print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition, message.offset, message.key, message.value))
print("end consumer")
1.5 Recommendations on Client Usage
● Producer parameters
– Allow for retries in cases where messages fail to be sent.
For example, allow for three retries by setting the value of retries to 3.
– Do not perform blocking operations in producer callback functions.
Otherwise, subsequent messages may fail to be sent.
For messages that need to be sent immediately, set linger.ms to 0.
Ensure that the producer has sufficient JVM memory to avoid blockages.
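The producer-side suggestions above can be collected into a small configuration sketch (ProducerTuning is a hypothetical helper name, not part of the demo project; the broker addresses are placeholders):

```java
import java.util.Properties;

public class ProducerTuning {
    // Producer settings reflecting the recommendations above.
    public static Properties recommendedProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "ip1:port1,ip2:port2,ip3:port3"); // placeholder brokers
        props.put("acks", "all");
        props.put("retries", "3");    // allow three retries when a send fails
        props.put("linger.ms", "0");  // send messages immediately, without batching delay
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        // Pass these properties to new KafkaProducer<>(props); keep callbacks
        // non-blocking and give the JVM enough heap for the send buffer.
        System.out.println(recommendedProps());
    }
}
```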
● Consumer parameters
– Ensure that the owner thread does not exit abnormally. Otherwise, the
client may fail to initiate consumption requests and the consumption will
be blocked.
– Use long polling to consume messages and do not close the consumer
connection immediately after the consumption is completed. Otherwise,
rebalancing will take place frequently, blocking consumption.
– Ensure that the consumer polls at regular intervals (for example, every
200 ms) for it to keep sending heartbeats to the server. If the consumer
stops sending heartbeats for long enough, the consumer session will time
out and the consumer will be considered to have stopped. This will also
block consumption.
– Always close the consumer before exiting. Otherwise, consumers in the
same group may be blocked for up to the duration of session.timeout.ms.
– Set the consumer session timeout to a reasonable value. For
example, set session.timeout.ms to 30000 so that the timeout is 30s.
– The number of consumers cannot be greater than the number of
partitions in the topic. Otherwise, some consumers may fail to poll for
messages.
– Commit offsets only after the corresponding messages have been
processed. Otherwise, if processing fails, the messages cannot be polled again.
– Ensure that there is a maximum limit on the size of messages buffered
locally to avoid an out-of-memory (OOM) situation.
– Kafka cannot guarantee exactly-once delivery. Therefore, ensure that
message processing is idempotent on the service side.
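Idempotent processing, as recommended above, can be sketched by deduplicating on a message ID before handling it. A minimal example (the in-memory set is illustrative only; production code would use a bounded cache or an external store):

```python
processed_ids = set()  # illustrative; use a bounded cache or external store in production

def handle_once(message_id, payload, handler):
    """Process a message at most once per ID, so redelivery is harmless."""
    if message_id in processed_ids:
        return False               # duplicate delivery: skip
    handler(payload)
    processed_ids.add(message_id)  # record only after successful processing
    return True

results = []
for mid, body in [("m1", "a"), ("m2", "b"), ("m1", "a")]:  # "m1" is redelivered
    handle_once(mid, body, results.append)

# results == ["a", "b"]  (the duplicate "m1" delivery was skipped)
```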
2 DMS Kafka Developer Guide
2.1 Overview
NOTICE
This document describes how to connect to Distributed Message Service (DMS)
Kafka (non-premium) through clients.
For details on how to connect to Kafka premium instances, see 1.1 Overview.
DMS Kafka API
DMS Kafka supports open-source Kafka application programming interfaces
(APIs). Third-party applications can implement open-source Kafka service
capabilities by directly using a Kafka client to call DMS.
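For example, a standard open-source client can publish to DMS Kafka directly. A hypothetical sketch using the Python kafka-python package (the endpoint and topic names are placeholders, and the function requires network access to DMS to actually run):

```python
def send_demo(bootstrap_servers, topic):
    # Sketch only: requires the kafka-python package and reachable DMS brokers.
    from kafka import KafkaProducer
    producer = KafkaProducer(bootstrap_servers=bootstrap_servers,
                             value_serializer=lambda v: v.encode("utf-8"))
    producer.send(topic, "hello DMS")  # ordinary open-source Kafka API call
    producer.flush()
    producer.close()
```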
Usage Restrictions
Generally, DMS Kafka can process thousands of messages per second. If more
messages need to be processed per second, submit a service ticket or contact
customer service.
The recommended Kafka client version is 0.10.2.1 or higher.
If the Kafka SDK is used to produce messages, the maximum size of a single
message is 10 MB. If the DMS console is used to produce messages, the maximum
size of a single message is 512 KB.
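These limits can be enforced on the client before sending. A sketch (the constants restate the 10 MB and 512 KB figures above; the helper name is hypothetical):

```python
MAX_SDK_MESSAGE_BYTES = 10 * 1024 * 1024   # 10 MB limit when producing via the SDK
MAX_CONSOLE_MESSAGE_BYTES = 512 * 1024     # 512 KB limit when producing via the console

def check_message_size(payload: bytes, via_console: bool = False) -> bool:
    """Return True if the payload fits within the applicable DMS limit."""
    limit = MAX_CONSOLE_MESSAGE_BYTES if via_console else MAX_SDK_MESSAGE_BYTES
    return len(payload) <= limit

check_message_size(b"x" * 1024)                            # → True (well under 10 MB)
check_message_size(b"x" * (600 * 1024), via_console=True)  # → False (over 512 KB)
```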
2.2 Java SDK
This section guides you through the development of Kafka queues (non-premium). Kafka
premium instances are developed by using an open-source SDK. For details on how to
access a Kafka premium instance, see Accessing a Kafka Premium Instance. For details on
how to use Kafka clients in different programming languages, visit https://cwiki.apache.org/confluence/display/KAFKA/Clients.
2.2.1 Preparing the Environment
Helpful Links
● DMS Kafka SDK
● DMS Kafka Enhanced SDK
● Sample Project
To create a new project, use the downloaded SDK. To write code based on the sample
project, use the SDK included in the project.
Preparing Tools
Eclipse: Download Eclipse 3.6.0 or later from the Eclipse official website.
JDK: Download Java Development Kit 1.8.111 or later from the Oracle official
website.
Apache Maven: Download Apache Maven 3.0.3 or later from the Maven official
website.
Obtaining a Topic ID and Consumer Group ID
Before accessing DMS using the SDK, create a Kafka queue and consumer group on
the DMS console, and obtain the topic ID and consumer group ID.
Step 1 Log in to the management console.
Step 2 Choose Service List > Application > Distributed Message Service to launch the
DMS console.
Step 3 In the navigation pane, choose Queue Manager.
Step 4 On the Queue Manager page, click Create Queue.
Step 5 Specify queue parameters.
Table 2-1 Parameter description

Name: Name of the queue you want to create. The name must be unique. When you create a queue, a default queue name is generated, which you can change if required. A queue name consists of 1 to 64 characters. Only letters, digits, underscores (_), and hyphens (-) are allowed. The queue name cannot be modified after the queue is created.

Type: Select Kafka queue.

Mode: Select either High throughput or High reliability. Default value: High throughput.
High throughput: All message replicas are flushed to disk asynchronously. Select this mode when high message delivery performance is required.
High reliability: All message replicas are flushed to disk synchronously. Select this mode when high message delivery reliability is required.

Message Retention Period (h): This parameter is available only for Kafka queues. The number of hours for which messages will be preserved in a Kafka queue. Messages older than that period will be deleted. Deleted messages cannot be retrieved by consumer groups. Value range: integers from 1 to 72. Default value: 72.

Description (optional): The description consists of a maximum of 160 characters and cannot contain angle brackets (< and >).
Figure 2-1 Creating a Kafka queue
Step 6 Click OK.
Step 7 Click the name of the queue. On the displayed queue details page, obtain the
Kafka topic ID.
Figure 2-2 Obtaining the Kafka topic ID
Step 8 Click Create Consumer Group. The Create Consumer Group dialog box is
displayed.
Step 9 Enter a consumer group name.
A default consumer group name is generated, which you can change if required. A
consumer group name consists of 1 to 32 characters. Only letters, digits,
underscores (_), and hyphens (-) are allowed. Consumer group names must be
unique within the same queue.
Step 10 Click OK. Obtain the ID of the consumer group in the consumer group list.
Figure 2-3 Obtaining the consumer group ID
----End
Obtaining a Project ID
When calling APIs, you need to specify project_id in API requests. Obtain a project
ID by performing the following procedure.
Step 1 Sign up and log in to the management console.
Step 2 Click the username and choose My Credentials from the drop-down list.
Step 3 On the My Credentials page, view project IDs in the project list.
----End
Obtaining an AK/SK
Step 1 Sign up and log in to the management console.
Step 2 Click the username and choose My Credentials from the drop-down list.
Step 3 On the My Credentials page, click the Access Keys tab.
Step 4 Click Create Access Keys.
Step 5 Enter the password for login.
Step 6 Enter the verification code received by email or SMS message.
Step 7 Click OK.
Keep the key secure and do not disclose it to any unauthorized people.
Step 8 Download the credentials.csv file containing your AK and SK to a local computer.
----End
Obtaining Region and Endpoint Information
Obtain the region and endpoint from Regions and Endpoints.
Creating an ECS
An Elastic Cloud Server (ECS) must be available to run the sample project.
Log in to the ECS console, create a Linux ECS, and bind an elastic IP address (EIP)
to it. Record the EIP, username, and password. If you already created an ECS, skip
this step.
EIPs are used to log in to ECSs and upload files.
Summary of Environment Information
Table 2-2 Required environment information

ECS: EIP, username, password
DMS: queue name, queue ID, queue type, topic, consumer group name, consumer group ID
Access keys: access key ID (AK), secret access key (SK)
Project: region, project name, project ID
Region and endpoint: name, endpoint
DNS: DNS server IP address
2.2.2 Creating a Project
This section uses the Maven project kafkademo as an example to describe how to
create a project.
Procedure
Step 1 Download the demo package.
1. Log in to the DMS console.
2. In the navigation pane, choose Using APIs.
3. Choose Kafka APIs.
4. Click Downloading Sample Code to download DmsKafkaDemo.zip.
Step 2 Click Download SDK to download the DMS Kafka SASL package.
Decompress the following directories from the package:
● client.truststore.jks: client certificate
● dms.kafka.sasl.client-1.0.0.jar: DMS Kafka SASL package
● dms_kafka_client_jaas.conf: client configuration file
You can also decompress the SDK package from DmsKafkaDemo.zip (the location
is DmsKafkaDemo\dist\libs\dms.kafka.sasl.client-1.0.0.jar).
Step 3 On Eclipse (the recommended version is 4.6 or later), create a Maven project. The
project name kafkademo is used as an example.
Figure 2-4 Creating a Maven project
Step 4 Click Finish.
Step 5 Import the DMS Kafka SASL package.
1. Right-click the new project kafkademo, and create a libs folder.
2. Copy dms.kafka.sasl.client-1.0.0.jar to libs.
3. Add the following information to the pom.xml file to import
dms.kafka.sasl.client-1.0.0.jar into the Maven repository:

<dependency>
    <groupId>dms</groupId>
    <artifactId>kafka.sasl.client</artifactId>
    <version>1.0.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/libs/dms.kafka.sasl.client-1.0.0.jar</systemPath>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.2.1</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
4. Save the pom.xml file.
----End
2.2.3 Configuring Parameters
Procedure
Step 1 Configure access_key, secret_key, and project_id in the
dms_kafka_client_jaas.conf file.
The three parameters are used to authenticate DMS Kafka API requests.
KafkaClient {
com.huawei.middleware.kafka.sasl.client.KafkaLoginModule required
access_key="XXXXXX"
secret_key="XXXXXX"
project_id="XXXXXX";
};
Replace them with the actual access_key, secret_key, and project_id of your
account.
To access the queues authorized by other tenants, set target_project_id to the
project ID of the authorizing tenant.
KafkaClient {
com.huawei.middleware.kafka.sasl.client.KafkaLoginModule required
access_key="XXXXXX"
secret_key="XXXXXX"
project_id="XXXXXX"
target_project_id="";
};
Step 2 Configure SASL access so that it is initialized when the process starts, using
either of the following methods. In both methods, replace /path with the actual path name.
1. Method 1: Configure the following JVM parameter to add the location of
SASL configuration file:
-Djava.security.auth.login.config=/path/kafka_client_jaas.conf
2. Method 2: Add the following information to project code so that SASL access
can start before Kafka Producer and Consumer start:
System.setProperty("java.security.auth.login.config", "/path/kafka_client_jaas.conf");
Step 3 Add the following information to the consumer.properties file:
connections.max.idle.ms=30000
Step 4 Configure key parameters in the consumer.properties/producer.properties file.
Table 2-3 Key parameters in the consumer.properties/producer.properties file

bootstrap.servers: IP address or domain name of the DMS server.
ssl.truststore.location: Path in which the client certificate client.truststore.jks is located, for example /path/client.truststore.jks, where /path must be replaced with the actual path name.
ssl.truststore.password: Client certificate password.
security.protocol: Security protocol. Set this parameter to SASL_SSL.
sasl.mechanism: Service name. Set this parameter to DMS (all letters must be capitalized).
For details about other Kafka parameters, visit the official Kafka website.
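The entries in Table 2-3 are plain key=value lines, so a quick sanity check can parse the file and verify the security settings before starting the client. A hypothetical helper (shown in Python for brevity; the parser handles only the simple key=value layout used here):

```python
def load_properties(text):
    """Parse Java-style key=value properties lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = """\
security.protocol=SASL_SSL
sasl.mechanism=DMS
ssl.truststore.location=/path/client.truststore.jks
"""
props = load_properties(sample)
# props["sasl.mechanism"] is "DMS" -- must be fully capitalized
```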
Step 5 Enable Kafka debug logging by modifying the log4j.properties file.
log4j.rootLogger=DEBUG, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
log4j.logger.org.apache.kafka.clients=DEBUG
log4j.logger.kafka=INFO, stdout
log4j.additivity.kafka=false
log4j.logger.org.apache.kafka=DEBUG, stdout
log4j.additivity.org.apache.kafka=false
Step 6 Write code. For details about APIs, visit the official Kafka website.
----End
2.2.4 Running the Sample Project
The following describes how to access DMS Kafka queues to produce and
consume messages in Java.
Procedure
Step 1 Log in to the ECS.
You can run the sample project on an ECS with an IP address in the 192 network segment.
Step 2 Install JDK or Java runtime environment (JRE). Add the following settings of
environment variables JAVA_HOME and PATH to the ~/.bash_profile:
export JAVA_HOME=/opt/java/jdk1.8.0_151
export PATH=$JAVA_HOME/bin:$PATH
Run the source ~/.bash_profile command for the modification to take effect.
Use Oracle JDK instead of the ECS's default JDK (for example, OpenJDK), because the
default JDK may not be suitable for the sample project. To obtain Oracle JDK, download
Java Development Kit 1.8.111 or a later version from
https://www.oracle.com/technetwork/java/javase/downloads/index.html.
Step 3 Run the following command to download the code package of the sample project
DmsKafkaDemo.zip.
$ wget https://dms-demo.obs.myhwclouds.com/DmsKafkaDemo.zip
Step 4 Run the following command to decompress DmsKafkaDemo.zip.
$ unzip DmsKafkaDemo.zip
Step 5 Run the following command to navigate to the DmsKafkaDemo/dist directory,
which contains pre-compiled binary files and executable scripts.
$ cd DmsKafkaDemo/dist
Step 6 Edit the config/dms_kafka_client_jaas.conf file and configure access_key,
secret_key, and project_id.
$ vim config/dms_kafka_client_jaas.conf
The values in bold are examples. Replace them with actual values.
KafkaClient {
com.huawei.middleware.kafka.sasl.client.KafkaLoginModule required
access_key="********************"
secret_key="**********"
project_id="bd67aaead60940d688b872c31bdc653b"
target_project_id="bd67aaead60940d688b872c31bdc6539";
};
To access the queues authorized by other tenants, set target_project_id to the
project ID of the authorizing tenant.
Step 7 Edit the config/producer.properties file and configure topic and
bootstrap.servers.
$ vim config/producer.properties
The values in bold are examples. Replace them with actual values.
topic=k-bd67aaead60940d688b872c31bdc653b-4df89da6-ede4-4072-93e0-28dc6e866299
bootstrap.servers=dms-kafka.cn-north-1.myhuaweicloud.com:37000
ssl.truststore.password=************
acks=all
retries=1
batch.size=16384
buffer.memory=33554432
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
security.protocol=SASL_SSL
sasl.mechanism=DMS
The parameter topic can be set to a queue name or a Kafka topic name. For more
information, see Table 2-2.
Step 8 Edit the config/consumer.properties file and configure topic, bootstrap.servers,
and group.id.
$ vim config/consumer.properties
The values in bold are examples. Replace them with actual values.
topic=k-bd67aaead60940d688b872c31bdc653b-4df89da6-ede4-4072-93e0-28dc6e866299
bootstrap.servers=dms-kafka.cn-north-1.myhuaweicloud.com:37000
group.id=g-7ec0caac-01fb-4f91-a4f2-0a9dd48f8af7
ssl.truststore.password=************
security.protocol=SASL_SSL
sasl.mechanism=DMS
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
auto.offset.reset=earliest
enable.auto.commit=false
The parameter topic can be set to a queue name or a Kafka topic name. For more
information, see Table 2-2.
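Note that with enable.auto.commit=false above, offsets must be committed explicitly after each message is processed. A sketch of the equivalent consume loop (shown in Python with kafka-python for illustration; the Java sample uses the analogous commit call, and the loop is not runnable without a connected broker):

```python
# Mirrors the consumer.properties settings above (kafka-python parameter names).
consumer_conf = {
    "enable_auto_commit": False,      # enable.auto.commit=false
    "auto_offset_reset": "earliest",  # auto.offset.reset=earliest
}

def consume_loop(consumer, handle):
    # Sketch: 'consumer' would be a kafka-python KafkaConsumer created with
    # **consumer_conf and connected to the brokers.
    for message in consumer:
        handle(message.value)
        consumer.commit()  # commit the offset only after processing succeeds
```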
Step 9 Run the sample project to produce messages:
$ bash produce.sh
After the command is run, 10 messages are automatically sent to the Kafka
queue.
Step 10 Run the sample project to consume messages:
$ bash consume.sh
----End
2.2.5 Compiling the Sample Project Code
Step 1 Download and decompress DmsKafkaDemo.zip.
Step 2 Import the sample project code.
1. On Eclipse, choose File > Import.
2. In the Select an import source area, select Existing Projects into Workspace.
3. Select the directory to which the sample project code DmsKafkaDemo is
decompressed.
Step 3 Choose Project > Build Project to build the project.
Step 4 Export the new JAR file.
1. Right-click the sample project DmsKafkaDemo and choose Export from the
shortcut menu.
2. Choose Java > JAR file. Enter the path and name of the JAR file to be
generated.
Step 5 Replace the dms.kafka.demo.jar file in the DmsKafkaDemo/dist/libs directory
with the new JAR file. Run the newly built project by following the procedure in
Running the Sample Project.
----End
2.2.6 Code of the Sample Project
Producer
DMS Kafka APIs are compatible with native open-source Kafka clients. Compared
with native Kafka service code, the sample project code additionally contains a
client certificate and simple authentication and security layer (SASL)
configuration, which are used for identity authentication and secure
communication. To realize smooth migration of producer applications, you only
need to import the client certificate and SASL configuration before creating the
Kafka Producer without modifying any other Kafka service code.
Code pertaining to client certificate and SASL:
Properties producerConfig = Config.getProducerConfig();
producerConfig.put("ssl.truststore.location", Config.getTrustStorePath());
System.setProperty("java.security.auth.login.config", Config.getSaslConfig());
The code for creating a producer and sending messages does not need to be
modified.
Consumer
DMS Kafka APIs are compatible with native open-source Kafka clients. Compared
with native Kafka service code, the sample project code additionally contains a
client certificate and SASL configuration, which are used for identity
authentication and secure communication. To realize smooth migration of
consumer applications, you only need to import the client certificate and SASL
configuration before creating the Kafka Consumer without modifying any other
Kafka service code.
Code pertaining to client certificate and SASL:
Properties consumerConfig = Config.getConsumerConfig();
consumerConfig.put("ssl.truststore.location", Config.getTrustStorePath());
System.setProperty("java.security.auth.login.config", Config.getSaslConfig());
The code for creating a consumer and consuming messages does not need to be
modified.
2.2.7 Using the Enhanced Java SDK
The enhanced Kafka Java SDK is optimized based on the open-source Kafka
0.10.2.1 client, greatly improving system performance.
Procedure
Step 1 Download the open-source Kafka 0.10.2.1 client package.
Step 2 Decompress the following from the open-source client package:
Step 3 Download the enhanced Kafka 0.10.2.1 SDK package.
Step 4 Decompress the following from the SDK package:
Step 5 Copy all JAR packages in the dms_kafka_0.10.2.1-client/ directory to the libs
folder in the directory where the open-source Kafka 0.10.2.1 client package is
decompressed, and overwrite the kafka-clients-0.10.2.1.jar package with the
same name.
Step 6 Perform the same configuration as other open-source clients.
----End
2.2.8 FAQs
What If the Consumer Group Failed to Be Found?
Symptom: An error message similar to the following is displayed. k-xx represents a
created Kafka queue.
[2019-04-29 11:30:02,173] WARN Error while fetching metadata with correlation id 2 :
{k-8*******8cd=UNKNOWN}
Possible causes and solutions:
● If OpenJDK is used, use Oracle JDK instead.
● Check whether the consumer group matches the topic. By default, a
consumer group name starts with the letter g, while a Kafka topic ID starts with k-.
● Check whether project_id and target_project_id in the dms_kafka_client_jaas.conf
file are consistent. The two parameters must have the same value.
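The last check can be automated by parsing the JAAS file. A hypothetical helper (the regex assumes the key="value" layout shown in 2.2.3):

```python
import re

def jaas_values(text, key):
    """Extract key="value" entries from a dms_kafka_client_jaas.conf-style file."""
    # \b keeps 'project_id' from also matching inside 'target_project_id'.
    return re.findall(r'\b%s\s*=\s*"([^"]*)"' % re.escape(key), text)

sample = '''
KafkaClient {
  com.huawei.middleware.kafka.sasl.client.KafkaLoginModule required
  access_key="AK"
  secret_key="SK"
  project_id="p1"
  target_project_id="p1";
};
'''
pid = jaas_values(sample, "project_id")          # ["p1"]
tpid = jaas_values(sample, "target_project_id")  # ["p1"]
# Consistent when accessing your own queues: pid == tpid
```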
2.3 Python SDK
This chapter describes how to access a DMS Kafka queue by using a Linux Python
client. To obtain related configuration information, see 2.2.1 Preparing the
Environment.
DMS is compatible with native Kafka APIs. For details on how to call native Kafka
APIs, see the Kafka Documentation.
Preparing the Client Environment
Step 1 Select an SDK package of the required version based on the Python encoding
format.
1. Download the SDK package and decompress the following from the
package:
– kafka-dms-python-code2.tar.gz
– kafka-dms-python-code4.tar.gz
2. Run the following command in any directory to add the test.py file:
vi test.py
Add the following content to the file and save the file.
import sys
if sys.maxunicode > 65535:
    print 'Your python is suitable for kafka-dms-python-code4.tar.gz'
else:
    print 'Your python is suitable for kafka-dms-python-code2.tar.gz'
3. Run the following command to select a Python SDK package:
python test.py
4. Select the required SDK package as prompted.
Step 2 Decompress the SDK package.
The following assumes that the SDK package is decompressed into the {root_dir}
directory.
Step 3 Run the following commands to configure environment variables:
vi ~/.bash_profile
Add the following content to the configuration file, and then save and exit.
export LD_LIBRARY_PATH={root_dir}/kafka-dms-python/confluent_kafka/lib:{root_dir}/kafka-dms-python/confluent_kafka/thirdPart
Run the following command to make the modification take effect:
source ~/.bash_profile
Step 4 Configure the following parameters in the {root_dir}/kafka-dms-python/conf.ini
file:
sasl.project.id=your project
sasl.access.key=your ak
sasl.security.key=your sk