Developer Guide Distributed Message Service - Date - T-Systems

Distributed Message Service

Developer Guide

Date   2020-02-20
Contents

1 Kafka Premium Developer Guide
1.1 Overview
1.2 Collecting Connection Information
1.3 Java
1.3.1 Configuring Kafka Clients in Java
1.3.2 Setting Up the Java Development Environment
1.4 Python
1.5 Recommendations on Client Usage

2 DMS Kafka Developer Guide
2.1 Overview
2.2 Preparing the Environment
2.3 Creating a Project
2.4 Configuring Parameters
2.5 Running the Sample Project
2.6 Compiling the Sample Project Code
2.7 Code of the Sample Project
2.8 Using the Enhanced Java SDK
2.9 Recommended Parameter Settings for Kafka Clients

3 Change History


1 Kafka Premium Developer Guide

1.1 Overview
                 Kafka premium instances are compatible with Apache Kafka and can be accessed
                 using open-source Kafka clients. To access an instance in SASL mode,
                 certificates are also required.

                 This document describes how to collect instance connection information, such as
                 the instance connection address, certificate required for SASL connection, and
                 information required for public access. It also provides examples of accessing an
                 instance in Java and Python.

                 The examples only demonstrate how to invoke Kafka APIs for producing and
                 consuming messages. For more information about the APIs provided by Kafka,
                 visit the official Kafka website.

Client Network Environment
                 A client can access a Kafka instance in any of the following three modes:

                 1.   Within a Virtual Private Network (VPC)
                      If the client runs an Elastic Cloud Server (ECS) and is in the same region and
                      VPC as the Kafka instance, the client can access the instance using an IP
                      address within a subnet in the VPC.
                 2.   Using a VPC peering connection
                      If the client runs an ECS and is in the same region but not the same VPC as
                      the Kafka instance, the client can access the instance using an IP address
                      within a subnet in the VPC after a VPC peering connection has been
                      established.
                 3.   Over public networks
                      If the client is not in the same network environment or region as the Kafka
                      instance, the client can access the instance using a public network IP address.
                      For public access, modify the inbound rules of the security group configured
                      for the Kafka instance, allowing access over port 9095.


                      The three modes differ only in the connection address for the client to access the instance.
                      This document takes intra-VPC access as an example to describe how to set up the
                      development environment.
                      If the connection times out or fails, check the network connectivity. You can use telnet to
                      test the connection address and port of the instance.
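As the note says, a timeout usually points to a network problem. Besides telnet, the same check can be done from Java; the sketch below simply attempts a TCP connection with a timeout. The broker address in `main` is a placeholder in the style used later in this guide, not a real endpoint.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BrokerProbe {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Replace with your instance's broker address; port 9095 is used for public access.
        System.out.println(isReachable("100.xxx.xxx.87", 9095, 3000));
    }
}
```

This only verifies TCP reachability; a successful probe does not guarantee that SASL authentication or the truststore configuration is correct.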

1.2 Collecting Connection Information
Obtaining Kafka Instance Information
                 1.   Instance connection address and port
                      Obtain the connection addresses and port numbers on the Basic Information
                      tab page. Configure all three addresses listed on the page for the client.
                      For public access, you can use the connection addresses listed in the Public
                      Access section.

                      Figure 1-1 Viewing the connection addresses and ports of brokers of a Kafka instance

                 2.   Topic name
                      Obtain the topic name from the Topic Management tab page of the instance
                      as shown in the following figure.

                      Figure 1-2 Viewing the topic name

                 3.   SASL connection information
                      If SASL access is enabled for the instance, the username, password, and SSL
                      certificate are required. The username and password are set during instance
                      creation.
                      Obtain the username on the Basic Information tab page. If the password is
                      lost, you can reset the Kafka password.


                      Figure 1-3 Resetting Kafka password

                      Figure 1-4 Viewing the username used for SASL access

1.3 Java

1.3.1 Configuring Kafka Clients in Java
                 This section describes how to add Kafka clients in Maven, and use the clients to
                 access Kafka instances and produce and consume messages. To check how the
                 demo project runs in IDEA, see Setting Up the Java Development Environment.
                 The Kafka instance connection addresses, topic name, and user information used
                 in the following examples are obtained in Collecting Connection Information.

Adding Kafka Clients in Maven
                 //Kafka premium instances support Kafka 2.3.1. Use the same version for clients.
                 <dependency>
                     <groupId>org.apache.kafka</groupId>
                     <artifactId>kafka-clients</artifactId>
                     <version>2.3.1</version>
                 </dependency>
Preparing Kafka Configuration Files
                 The following describes example producer and consumer configuration files. If
                 SASL is not enabled for the Kafka instance, comment out lines regarding SASL. If
                 SASL has been enabled, set SASL configurations for encrypted access.
                 ●    Producer configuration file (the dms.sdk.producer.properties file in the demo
                      project)
                      The instance-specific values (bootstrap.servers and, if enabled, the SASL
                      settings) must be modified for your Kafka instance. Other parameters can
                      also be added.
                      #The topic name is in the code for producing or consuming messages.
                      #######################
                      #Obtain the information about Kafka instance brokers on the console. For example:
                      #bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095


                      bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
                      #Producer acknowledgement
                      acks=all
                      #Method of turning the key into bytes
                      key.serializer=org.apache.kafka.common.serialization.StringSerializer
                      #Method of turning the value into bytes
                      value.serializer=org.apache.kafka.common.serialization.StringSerializer
                      #Memory available to the producer for buffering
                      buffer.memory=33554432
                      #Number of retries
                      retries=0
                      #######################
                      #Comment out the following parameters if SASL access is not enabled.
                      #######################
                      #Configure the JAAS username and password on the console.
                      sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
                          username="username" \
                          password="password";
                      #SASL mechanism
                      sasl.mechanism=PLAIN
                      #Encryption protocol. Currently, only SASL_SSL is supported.
                      security.protocol=SASL_SSL
                      #Location of ssl.truststore
                      ssl.truststore.location=E:\\temp\\client.truststore.jks
                      #Password of ssl.truststore
                      ssl.truststore.password=dms@kafka
                      # Disable certificate domain name verification.
                      ssl.endpoint.identification.algorithm=

                 ●    Consumer configuration file (the dms.sdk.consumer.properties file in the
                      demo project)
                      The instance-specific values (bootstrap.servers and, if enabled, the SASL
                      settings) must be modified for your Kafka instance. Other parameters can
                      also be added.
                      #The topic name is in the code for producing or consuming messages.
                      #######################
                      #Obtain the information about Kafka instance brokers on the console. For example:
                      #bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095
                      bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
                      #Unique string identifying the consumer group that the consumer belongs to.
                      #Processes configured with the same group.id belong to the same consumer group.
                      group.id=1
                      #Method of turning bytes into the key
                      key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
                      #Method of turning bytes into the value
                      value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
                      #Offset reset policy
                      auto.offset.reset=earliest
                      #######################
                      #Comment out the following parameters if SASL access is not enabled.
                      #######################
                      #Configure the JAAS username and password on the console.
                      sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
                          username="username" \
                          password="password";
                      #SASL mechanism
                      sasl.mechanism=PLAIN
                      #Encryption protocol. Currently, only SASL_SSL is supported.
                      security.protocol=SASL_SSL
                      #Location of ssl.truststore
                      ssl.truststore.location=E:\\temp\\client.truststore.jks
                      #Password of ssl.truststore
                      ssl.truststore.password=dms@kafka
                      # Disable certificate domain name verification.
                      ssl.endpoint.identification.algorithm=
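Two details of the files above are easy to trip over. The trailing backslashes in the sasl.jaas.config entry are java.util.Properties line continuations, so the three lines are read back as a single value, and the doubled backslashes in ssl.truststore.location are needed because Properties treats a single backslash as an escape. A standalone check of both behaviors, using the same placeholder values as the files:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class DmsPropertiesDemo {
    /** Loads the two entries exactly as they appear in the configuration files above. */
    public static Properties load() throws IOException {
        String fileText =
              "sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \\\n"
            + "    username=\"username\" \\\n"
            + "    password=\"password\";\n"
            + "ssl.truststore.location=E:\\\\temp\\\\client.truststore.jks\n";
        Properties props = new Properties();
        props.load(new StringReader(fileText));
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties props = load();
        // The three continued lines come back as one single-line value ending with ';'.
        System.out.println(props.getProperty("sasl.jaas.config"));
        // The doubled backslashes come back as single ones: E:\temp\client.truststore.jks
        System.out.println(props.getProperty("ssl.truststore.location"));
    }
}
```

If the truststore path is written with single backslashes in the file, Properties silently drops them and the SSL handshake fails with a file-not-found error.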


Producing Messages
                 ●    Test code
                      package com.dms.producer;

                      import org.apache.kafka.clients.producer.Callback;
                      import org.apache.kafka.clients.producer.RecordMetadata;
                      import org.junit.Test;

                      public class DmsProducerTest {
                        @Test
                        public void testProducer() throws Exception {
                            DmsProducer producer = new DmsProducer();

                           try {
                              for (int i = 0; i < 10; i++){
                                  String data = "The msg is " + i;
                                  //Enter the name of the topic you created. There are multiple APIs for producing messages. For details, see the Kafka official website or the following code.
                                  producer.produce("topic-0", data, new Callback()
                                  {
                                      public void onCompletion(RecordMetadata metadata,
                                                        Exception exception)
                                      {
                                        if (exception != null)
                                        {
                                            exception.printStackTrace();
                                            return;
                                        }
                                        System.out.println("produce msg completed");
                                      }
                                  });
                                  System.out.println("produce msg:" + data);
                              }
                           }catch (Exception e)
                           {
                              // TODO: Exception handling
                              e.printStackTrace();
                           }finally {
                              producer.close();
                           }

                          }
                      }
                 ●    Message production code
                      package com.dms.producer;

                      import   java.io.BufferedInputStream;
                      import   java.io.FileInputStream;
                      import   java.io.IOException;
                      import   java.io.InputStream;
                      import   java.net.URL;
                      import   java.util.ArrayList;
                      import   java.util.Enumeration;
                      import   java.util.List;
                      import   java.util.Properties;

                      import   org.apache.kafka.clients.producer.Callback;
                      import   org.apache.kafka.clients.producer.KafkaProducer;
                      import   org.apache.kafka.clients.producer.Producer;
                      import   org.apache.kafka.clients.producer.ProducerRecord;

                      public class DmsProducer<K, V> {
                         //Add the producer configurations that have been specified earlier.
                        public static final String CONFIG_PRODUCER_FILE_NAME = "dms.sdk.producer.properties";

                          private Producer<K, V> producer;

                          DmsProducer(String path)


                        {
                            Properties props = new Properties();
                            try {
                               InputStream in = new BufferedInputStream(new FileInputStream(path));
                               props.load(in);
                            }catch (IOException e)
                            {
                               e.printStackTrace();
                               return;
                            }
                            producer = new KafkaProducer<K, V>(props);
                        }
                        DmsProducer()
                        {
                          Properties props = new Properties();
                          try {
                             props = loadFromClasspath(CONFIG_PRODUCER_FILE_NAME);
                          }catch (IOException e)
                          {
                             e.printStackTrace();
                             return;
                          }
                          producer = new KafkaProducer<K, V>(props);
                        }

                        /**
                         * Producing messages
                         *
                         * @param topic        Topic
                         * @param partition partition
                         * @param key          Message key
                         * @param data         Message data
                         */
                        public void produce(String topic, Integer partition, K key, V data)
                        {
                            produce(topic, partition, key, data, null, (Callback)null);
                        }

                        /**
                         * Producing messages
                         *
                         * @param topic        Topic
                         * @param partition partition
                         * @param key          Message key
                         * @param data         Message data
                         * @param timestamp timestamp
                         */
                        public void produce(String topic, Integer partition, K key, V data, Long timestamp)
                        {
                            produce(topic, partition, key, data, timestamp, (Callback)null);
                        }
                        /**
                         * Producing messages
                         *
                         * @param topic        Topic
                         * @param partition partition
                         * @param key          Message key
                         * @param data         Message data
                         * @param callback callback
                         */
                        public void produce(String topic, Integer partition, K key, V data, Callback callback)
                        {
                            produce(topic, partition, key, data, null, callback);
                        }

                        public void produce(String topic, V data)
                        {
                          produce(topic, null, null, data, null, (Callback)null);
                        }

                        public void produce(String topic, V data, Callback callback)
                        {
                          produce(topic, null, null, data, null, callback);
                        }


                         /**
                          * Producing messages
                          *
                          * @param topic       Topic
                          * @param partition partition
                          * @param key         Message key
                          * @param data        Message data
                          * @param timestamp timestamp
                          * @param callback callback
                          */
                         public void produce(String topic, Integer partition, K key, V data, Long timestamp, Callback callback)
                         {
                             ProducerRecord<K, V> kafkaRecord =
                                  timestamp == null ? new ProducerRecord<K, V>(topic, partition, key, data)
                                       : new ProducerRecord<K, V>(topic, partition, timestamp, key, data);
                             produce(kafkaRecord, callback);
                         }

                        public void produce(ProducerRecord<K, V> kafkaRecord)
                        {
                          produce(kafkaRecord, (Callback)null);
                        }

                        public void produce(ProducerRecord<K, V> kafkaRecord, Callback callback)
                        {
                          producer.send(kafkaRecord, callback);
                        }

                        public void close()
                        {
                          producer.close();
                        }

                        /**
                         * Gets the classloader from the thread context; if none is found,
                         * returns the classloader that loaded this class.
                         *
                         * @return classloader
                         */
                        public static ClassLoader getCurrentClassLoader()
                        {
                            ClassLoader classLoader = Thread.currentThread()
                                   .getContextClassLoader();
                            if (classLoader == null)
                            {
                                classLoader = DmsProducer.class.getClassLoader();
                            }
                            return classLoader;
                        }

                        /**
                         * Load configuration information from classpath.
                         *
                         * @param configFileName Configuration file name
                         * @return Configuration information
                         * @throws IOException
                         */
                        public static Properties loadFromClasspath(String configFileName) throws IOException
                        {
                            ClassLoader classLoader = getCurrentClassLoader();
                            Properties config = new Properties();

                           List<URL> properties = new ArrayList<URL>();
                           Enumeration<URL> propertyResources = classLoader
                                 .getResources(configFileName);
                           while (propertyResources.hasMoreElements())
                           {


                                  properties.add(propertyResources.nextElement());
                              }

                              for (URL url : properties)
                              {
                                 InputStream is = null;
                                 try
                                 {
                                     is = url.openStream();
                                     config.load(is);
                                 }
                                 finally
                                 {
                                     if (is != null)
                                     {
                                         is.close();
                                         is = null;
                                     }
                                 }
                              }

                              return config;
                          }
                      }
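The loadFromClasspath helper above merges every resource with the given name found on the classpath, so if two classpath entries both carry a dms.sdk.producer.properties, later entries overwrite earlier keys. The sketch below reproduces the same lookup with an explicit URLClassLoader pointed at a temporary directory, so the behavior can be checked without a real project layout (class and file names here are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Enumeration;
import java.util.Properties;

public class ClasspathLoadDemo {
    // Same approach as DmsProducer.loadFromClasspath, but taking the ClassLoader
    // explicitly so it can be pointed at any directory for testing.
    public static Properties load(ClassLoader classLoader, String name) throws IOException {
        Properties config = new Properties();
        Enumeration<URL> resources = classLoader.getResources(name);
        while (resources.hasMoreElements()) {
            try (InputStream is = resources.nextElement().openStream()) {
                config.load(is);
            }
        }
        return config;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a classpath entry with a temporary directory.
        Path dir = Files.createTempDirectory("dms-demo");
        Files.write(dir.resolve("dms.sdk.producer.properties"), "acks=all\n".getBytes());
        try (URLClassLoader cl = new URLClassLoader(new URL[] { dir.toUri().toURL() })) {
            System.out.println(load(cl, "dms.sdk.producer.properties").getProperty("acks"));
        }
    }
}
```

This also explains why the demo constructor with a file path exists: placing the properties file outside the classpath avoids accidental merging with a copy bundled in a jar.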

Consuming Messages
                 ●    Test code
                      package com.dms.consumer;

                      import      org.apache.kafka.clients.consumer.ConsumerRecord;
                      import      org.apache.kafka.clients.consumer.ConsumerRecords;
                      import      org.junit.Test;
                      import      java.util.Arrays;

                      public class DmsConsumerTest {
                        @Test
                        public void testConsumer() throws Exception {
                            DmsConsumer consumer = new DmsConsumer();
                            consumer.consume(Arrays.asList("topic-0"));
                            try {
                               for (int i = 0; i < 10; i++){
                                  ConsumerRecords<Object, Object> records = consumer.poll(1000);
                                  System.out.println("the number of records:" + records.count());
                                  for (ConsumerRecord<Object, Object> record : records)
                                  {
                                      System.out.println(record.toString());
                                  }
                               }
                            }catch (Exception e)
                            {
                               // TODO: Exception handling
                               e.printStackTrace();
                            }finally {
                               consumer.close();
                            }
                        }
                      }

                 ●    Message consumption code
                      package com.dms.consumer;

                      import      org.apache.kafka.clients.consumer.ConsumerRecords;
                      import      org.apache.kafka.clients.consumer.KafkaConsumer;
                      import      java.io.BufferedInputStream;
                      import      java.io.FileInputStream;
                      import      java.io.IOException;
                      import      java.io.InputStream;
                      import      java.net.URL;


                      import java.util.*;

                      public class DmsConsumer {

                         public static final String CONFIG_CONSUMER_FILE_NAME = "dms.sdk.consumer.properties";

                         private KafkaConsumer<Object, Object> consumer;

                         DmsConsumer(String path)
                         {
                           Properties props = new Properties();
                           try {
                              InputStream in = new BufferedInputStream(new FileInputStream(path));
                              props.load(in);
                           }catch (IOException e)
                           {
                              e.printStackTrace();
                              return;
                           }
                           consumer = new KafkaConsumer<Object, Object>(props);
                         }

                         DmsConsumer()
                         {
                           Properties props = new Properties();
                           try {
                               props = loadFromClasspath(CONFIG_CONSUMER_FILE_NAME);
                           }catch (IOException e)
                           {
                               e.printStackTrace();
                               return;
                           }
                           consumer = new KafkaConsumer<Object, Object>(props);
                         }
                         public void consume(List<String> topics)
                         {
                           consumer.subscribe(topics);
                         }

                         public ConsumerRecords<Object, Object> poll(long timeout)
                         {
                           return consumer.poll(timeout);
                         }

                         public void close()
                         {
                           consumer.close();
                         }

                         /**
                          * Gets the classloader from the thread context; if none is found,
                          * returns the classloader that loaded this class.
                          *
                          * @return classloader
                          */
                         public static ClassLoader getCurrentClassLoader()
                         {
                             ClassLoader classLoader = Thread.currentThread()
                                    .getContextClassLoader();
                             if (classLoader == null)
                             {
                                 classLoader = DmsConsumer.class.getClassLoader();
                             }
                             return classLoader;
                         }

                         /**
                          * Load configuration information from classpath.
                          *


                           * @param configFileName Configuration file name
                           * @return Configuration information
                           * @throws IOException
                           */
                          public static Properties loadFromClasspath(String configFileName) throws IOException
                          {
                              ClassLoader classLoader = getCurrentClassLoader();
                              Properties config = new Properties();

                              List<URL> properties = new ArrayList<URL>();
                              Enumeration<URL> propertyResources = classLoader
                                    .getResources(configFileName);
                              while (propertyResources.hasMoreElements())
                              {
                                 properties.add(propertyResources.nextElement());
                              }

                              for (URL url : properties)
                              {
                                 InputStream is = null;
                                 try
                                 {
                                     is = url.openStream();
                                     config.load(is);
                                 }
                                 finally
                                 {
                                     if (is != null)
                                     {
                                         is.close();
                                         is = null;
                                     }
                                 }
                              }

                              return config;
                          }
                      }

1.3.2 Setting Up the Java Development Environment
                 With the information collected in Collecting Connection Information and the
                 network environment prepared for Kafka clients, you can set up the
                 development environment. This section describes how to set up the Java
                 development environment and use the demo project to produce and consume
                 messages.

Preparing Tools

                 Table 1-1 Required tools

                   Tool                                   Required Version    How to Obtain

                   Apache Maven                           3.0.3 or later      http://maven.apache.org/download.cgi

                   Java Development Kit (JDK), with       1.8.111 or later    https://www.oracle.com/technetwork/java/javase/downloads/index.html
                   Java environment variables set

                   IntelliJ IDEA                          -                   https://www.jetbrains.com/idea/

Procedure
         Step 1 Download the demo package.
                 Decompress the package to obtain the following files.

                 Table 1-2 Files in the demo package

                   File                          Directory                          Description

                   DmsConsumer.java              .\src\main\java\com\dms\consumer   API for consuming messages

                   DmsProducer.java              .\src\main\java\com\dms\producer   API for producing messages

                   dms.sdk.consumer.properties   .\src\main\resources               Configuration information for consuming messages

                   dms.sdk.producer.properties   .\src\main\resources               Configuration information for producing messages

                   client.truststore.jks         .\src\main\resources               SSL certificate, used for SASL connections

                   DmsConsumerTest.java          .\src\test\java\com\dms\consumer   Test code for consuming messages

                   DmsProducerTest.java          .\src\test\java\com\dms\producer   Test code for producing messages

                   pom.xml                       .\                                 Maven configuration file, containing the Kafka client dependencies

         Step 2 In IntelliJ IDEA, import the demo project.
                 The demo project is a Java project built in Maven. Therefore, you need the JDK
                 and the Maven plugin in IDEA.
                 1.   Select Import Project.


                 2.   Select Maven.

                 3.   Select the JDK.

                 4.   You can select other options or retain the default settings. Then, click Finish.
                      The demo project has been imported.


         Step 3 Configure Maven.

                  Choose File > Settings, set Maven home directory correctly, and select the
                  required settings.xml file.

         Step 4 Specify Kafka configurations.


                  The following is a configuration example for producing messages. Replace the
                  information in bold with the actual values.
                  #The information in bold is specific to your Kafka instance and must be modified.
                  #Other parameters can also be added.
                  #The topic name is specified in the code for producing or consuming messages.
                  #######################
                  #Obtain the Kafka instance broker addresses on the console, for example:
                  #bootstrap.servers=100.xxx.xxx.87:9095,100.xxx.xxx.69:9095,100.xxx.xxx.155:9095
                  bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
                 #Producer acknowledgement
                 acks=all
                  #Serializer that converts message keys to bytes
                  key.serializer=org.apache.kafka.common.serialization.StringSerializer
                  #Serializer that converts message values to bytes
                  value.serializer=org.apache.kafka.common.serialization.StringSerializer
                 #Memory available to the producer for buffering
                 buffer.memory=33554432
                 #Number of retries
                 retries=0
                 #######################
                 #Comment out the following parameters if SASL access is not enabled.
                 #######################
                 #Configure the JAAS username and password on the console.
                 sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
                     username="username" \
                     password="password";
                 #SASL mechanism
                 sasl.mechanism=PLAIN
                 #Encryption protocol. Currently, only SASL_SSL is supported.
                 security.protocol=SASL_SSL
                 #Location of ssl.truststore
                 ssl.truststore.location=E:\\temp\\client.truststore.jks
                 #Password of ssl.truststore
                 ssl.truststore.password=dms@kafka
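
                  The file above is a plain key=value Java properties file, so it can also be read
                  outside the Java client with a few lines of code. A minimal sketch in Python (the
                  parsing helper is ours, not part of any SDK; it ignores line continuations such as
                  the backslashes in sasl.jaas.config):

```python
def load_properties(text):
    """Parse simple key=value .properties content into a dict.

    Skips blank lines and '#' comments; splits on the first '=' only,
    so values that themselves contain '=' survive intact.
    """
    props = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:
            props[key.strip()] = value.strip()
    return props

sample = """
# Producer settings
bootstrap.servers=ip1:port1,ip2:port2,ip3:port3
acks=all
retries=0
"""
conf = load_properties(sample)
print(conf["acks"])               # all
print(conf["bootstrap.servers"])  # ip1:port1,ip2:port2,ip3:port3
```

                  The same helper works for the consumer properties file, since both use the
                  identical key=value layout.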

         Step 5 In the lower-left corner of IDEA, click Terminal. In the terminal, run the mvn test
                command to run the demo tests.

                 Figure 1-5 Opening terminal in IDEA

                 The following information is displayed for the producer:
                 -------------------------------------------------------
                  TESTS
                 -------------------------------------------------------
                 Running com.dms.producer.DmsProducerTest
                 produce msg:The msg is 0
                 produce msg:The msg is 1
                 produce msg:The msg is 2
                 produce msg:The msg is 3
                 produce msg:The msg is 4
                 produce msg:The msg is 5
                  produce msg:The msg is 6
                  produce msg:The msg is 7
                 produce msg:The msg is 8
                 produce msg:The msg is 9
                 Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 138.877 sec

                 The following information is displayed for the consumer:
                 -------------------------------------------------------
                  TESTS
                 -------------------------------------------------------
                 Running com.dms.consumer.DmsConsumerTest
                 the numbers of topic:0
                 the numbers of topic:0
                 the numbers of topic:6
                 ConsumerRecord(topic = topic-0, partition = 2, offset = 0, CreateTime = 1557059377179, serialized key size
                 = -1, serialized value size = 12, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value
                 = The msg is 2)
                 ConsumerRecord(topic = topic-0, partition = 2, offset = 1, CreateTime = 1557059377195, serialized key size
                 = -1, serialized value size = 12, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value
                 = The msg is 5)

                 ----End

1.4 Python
                 This section describes how to access a Kafka premium instance using a Kafka
                 client in Python, including installing the client, and producing and consuming
                 messages.
                 Before getting started, ensure that you have collected the information listed in
                 Collecting Connection Information.

Preparing Tools
                 ●     Python
                       Generally, Python is pre-installed in the system. Enter python in a CLI. If the
                       following information is displayed, Python has already been installed.
                       [root@ecs-heru bin]# python
                       Python 2.7.5 (default, Oct 30 2018, 23:45:53)
                       [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)] on linux2
                       Type "help", "copyright", "credits" or "license" for more information.
                       >>>

                        If Python has not been installed, run the following command to install it:
                       yum install python
                 ●     Kafka clients in Python
                       Run the following command to install Kafka clients:
                       pip install kafka-python

Producing Messages

                      Replace the following information in bold with the actual values.
                 ●     Connection with SASL
                       from kafka import KafkaProducer
                       import ssl
                       ##Connection information
                       conf = {
                           'bootstrap_servers': ["ip1:port1,ip2:port2,ip3:port3"],
                           'topic_name': 'topic_name',
                           'sasl_plain_username': 'username',
                           'sasl_plain_password': 'password'
                       }

                       context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
                       context.verify_mode = ssl.CERT_REQUIRED
                       ##Certificate
                       context.load_verify_locations("phy_ca.crt")

                       print("start producer")
                       producer = KafkaProducer(bootstrap_servers=conf['bootstrap_servers'],
                                        sasl_mechanism="PLAIN",
                                        ssl_context=context,
                                        security_protocol='SASL_SSL',
                                        sasl_plain_username=conf['sasl_plain_username'],
                                        sasl_plain_password=conf['sasl_plain_password'])

                       data = "hello kafka!"
                       producer.send(conf['topic_name'], data)
                       producer.close()
                       print("end producer")

                 ●    Connection without SASL
                      from kafka import KafkaProducer

                      conf = {
                        'bootstrap_servers': 'ip1:port1,ip2:port2,ip3:port3',
                        'topic_name': 'topic-name',
                      }

                       print("start producer")
                      producer = KafkaProducer(bootstrap_servers=conf['bootstrap_servers'])

                      data = "hello kafka!"
                      producer.send(conf['topic_name'], data)
                      producer.close()
                       print("end producer")
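
                      One caveat with the producer code above: without a value_serializer,
                      kafka-python expects the message value to be bytes. A plain string such as
                      "hello kafka!" works under Python 2 (where str is bytes) but fails under
                      Python 3. A small helper (ours, not part of kafka-python) makes the payload
                      safe either way:

```python
def to_kafka_value(payload, encoding="utf-8"):
    """Return a bytes value suitable for KafkaProducer.send() without a serializer."""
    if isinstance(payload, bytes):
        return payload  # already bytes: pass through unchanged
    return payload.encode(encoding)

print(to_kafka_value("hello kafka!"))  # b'hello kafka!'
```

                      With this, the send call becomes
                      producer.send(conf['topic_name'], to_kafka_value(data)).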

Consuming Messages
                 ●    Connection with SASL
                      from kafka import KafkaConsumer
                      import ssl
                      ##Connection information
                      conf = {
                         'bootstrap_servers': ["ip1:port1,ip2:port2,ip3:port3"],
                         'topic_name': 'topic_name',
                         'sasl_plain_username': 'username',
                         'sasl_plain_password': 'password',
                         'consumer_id': 'consumer_id'
                      }

                      context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
                      context.verify_mode = ssl.CERT_REQUIRED
                      ##Certificate
                      context.load_verify_locations("phy_ca.crt")

                      print("start consumer")
                      consumer = KafkaConsumer(conf['topic_name'],
                                       bootstrap_servers=conf['bootstrap_servers'],
                                       group_id=conf['consumer_id'],
                                       sasl_mechanism="PLAIN",
                                       ssl_context=context,
                                       security_protocol='SASL_SSL',
                                       sasl_plain_username=conf['sasl_plain_username'],
                                       sasl_plain_password=conf['sasl_plain_password'])

                      for message in consumer:
                         print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition,
                               message.offset, message.key, message.value))

                      print("end consumer")
                 ●    Connection without SASL
                      Replace the information in bold with the actual values.
                      from kafka import KafkaConsumer

                      conf = {
                        'bootstrap_servers': 'ip1:port1,ip2:port2,ip3:port3',
                        'topic_name': 'topic-name',
                        'consumer_id': 'consumer-id'
                      }

                      print("start consumer")
                      consumer = KafkaConsumer(conf['topic_name'],
                                       bootstrap_servers=conf['bootstrap_servers'],
                                       group_id=conf['consumer_id'])

                      for message in consumer:
                         print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition,message.offset,
                      message.key,message.value))

                      print("end consumer")

1.5 Recommendations on Client Usage
                 ●    Producer parameters
                      –    Allow for retries in cases where messages fail to be sent.
                           For example, allow for three retries by setting the value of retries to 3.
                      –    Do not block in the producer's callback functions. Otherwise, messages
                           may fail to be sent.
                           For messages that need to be sent immediately, set linger.ms to 0.
                           Ensure that the producer has sufficient JVM memory to avoid blockages.
                 ●    Consumer parameters
                      –    Ensure that the owner thread does not exit abnormally. Otherwise, the
                           client may fail to initiate consumption requests and the consumption will
                           be blocked.
                      –    Use long polling to consume messages and do not close the consumer
                           connection immediately after the consumption is completed. Otherwise,
                           rebalancing will take place frequently, blocking consumption.
                      –    Ensure that the consumer polls at regular intervals (for example, every
                           200 ms) for it to keep sending heartbeats to the server. If the consumer
                           stops sending heartbeats for long enough, the consumer session will time
                           out and the consumer will be considered to have stopped. This will also
                           block consumption.
                      –    Always close the consumer before exiting. Otherwise, other consumers
                           in the same group may be blocked until session.timeout.ms elapses.
                      –    Set the timeout time for the consumer session to a reasonable value. For
                           example, set session.timeout.ms to 30000 so that the timeout time is
                           30s.


                      –   The number of consumers cannot be greater than the number of
                          partitions in the topic. Otherwise, some consumers may fail to poll for
                          messages.
                      –   Commit offsets only after the corresponding messages have been
                          processed. Otherwise, messages whose processing fails may never be
                          polled again.
                      –   Ensure that there is a maximum limit on the size of messages buffered
                          locally to avoid an out-of-memory (OOM) situation.
                      –   Kafka provides at-least-once rather than exactly-once delivery, so a
                          message may occasionally be delivered more than once. Ensure that
                          message processing is idempotent on the service side.
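
                      Because delivery is at least once, the same message can arrive again after
                      a retry or a rebalance. The idempotency recommendation above can be
                      sketched as follows (the message-ID field and in-memory store are
                      illustrative assumptions; a real service would use a durable store such as
                      a database table):

```python
def make_idempotent_handler(process):
    """Wrap a processing function so each message ID is handled at most once."""
    seen = set()  # illustrative; use a durable store in production

    def handle(msg_id, payload):
        if msg_id in seen:
            return False      # duplicate delivery: skip
        process(payload)      # do the real work first ...
        seen.add(msg_id)      # ... then record success
        return True

    return handle

processed = []
handle = make_idempotent_handler(processed.append)

# A duplicate delivery of message ID 1 is processed exactly once:
results = [handle(i, "msg-%d" % i) for i in (1, 2, 1)]
print(results)    # [True, True, False]
print(processed)  # ['msg-1', 'msg-2']
```

                      Recording the ID only after processing succeeds means a crash between the
                      two steps causes a redelivery, never a lost message.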


                              2      DMS Kafka Developer Guide

2.1 Overview

                     NOTICE

                 This document describes how to connect to Distributed Message Service (DMS)
                 Kafka (non-premium) through clients.

DMS Kafka API
                 DMS Kafka supports open-source Kafka application programming interfaces
                 (APIs). Third-party applications can implement open-source Kafka service
                 capabilities by directly using a Kafka client to call DMS.

Usage Restrictions
                 Generally, DMS Kafka can process thousands of messages per second. If more
                 messages need to be processed per second, submit a service ticket or contact the
                 customer service.

                 The recommended Kafka client version is 0.10.2.1 or higher.

                 If the Kafka SDK is used to produce messages, the maximum size of a single
                 message is 10 MB. If the DMS console is used to produce messages, the maximum
                 size of a single message is 512 KB.
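
                  The limits above can be checked on the client before producing, so oversized
                  messages fail fast instead of being rejected by the service. A minimal sketch
                  (the helper and constant names are ours; the 10 MB figure is the SDK limit
                  stated above):

```python
MAX_SDK_MESSAGE_BYTES = 10 * 1024 * 1024  # 10 MB limit when producing via the SDK

def within_sdk_limit(payload):
    """Return True if the UTF-8-encoded payload fits the SDK message size limit."""
    data = payload.encode("utf-8") if isinstance(payload, str) else payload
    return len(data) <= MAX_SDK_MESSAGE_BYTES

print(within_sdk_limit("hello kafka!"))                      # True
print(within_sdk_limit(b"x" * (MAX_SDK_MESSAGE_BYTES + 1)))  # False
```

                  The same check with a 512 KB constant would apply when producing through the
                  DMS console.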

2.2 Preparing the Environment
Helpful Links
                 ●    Download DMS Kafka SDK
                 ●    Download Sample Project


                     To create a new project, use the downloaded SDK. To write code based on the sample
                     project, use the SDK included in the project.

Preparing Tools
                 Eclipse: Download Eclipse 3.6.0 or later from the Eclipse official website.
                 JDK: Download Java Development Kit 1.8.111 or later from the Oracle official
                 website.
                 Apache Maven: Download Apache Maven 3.0.3 or later from the Maven official
                 website.

Obtaining a Topic ID and Consumer Group ID
                 Before accessing DMS using the SDK, create a Kafka queue and consumer group
                 on the DMS console, and obtain the topic ID and consumer group ID.

         Step 1 Log in to the management console.
         Step 2 Choose Service List > Application > Distributed Message Service to launch the
                DMS console.
         Step 3 In the navigation pane, choose Queue Manager.
         Step 4 On the Queue Manager page, click Create Queue.
         Step 5 Specify queue parameters.

                 Table 2-1 Parameter description

                   Parameter       Description

                   Name            Name of the queue you want to create. The name must be
                                   unique.
                                   When you create a queue, a default queue name is generated,
                                   which you can change if required. A queue name is 1 to 64
                                   characters long. Only letters, digits, underscores (_), and
                                   hyphens (-) are allowed.
                                   The queue name cannot be modified after creation of the
                                   queue.

                   Type            Select Kafka queue.

                   Mode            Select either High throughput or High reliability.
                                   Default value: High throughput.
                                   High throughput: All message replicas are flushed to disk
                                   asynchronously. Select this mode when high message delivery
                                   performance is required.
                                   High reliability: All message replicas are flushed to disk
                                   synchronously. Select this mode when high message delivery
                                   reliability is required.

                   Message       This parameter is available only for Kafka queues.
                   Retention     The number of hours for which messages will be preserved in a
                   Period (h)    Kafka queue. Messages older than that period will be deleted.
                                 Deleted messages are not retrievable to consumer groups.
                                 Value range: integers from 1 to 72
                                 Default value: 72

                   Tags          When creating a queue, you can add tags to identify the queue.
                                 You can classify and search for queues by tags.
                                 ● Tags of the same queue cannot have the same key.
                                 ● You can customize tags or use tags predefined by Tag
                                   Management Service (TMS).
                                 ● You can add a maximum of 10 tag keys to a queue.

                   Description   The description consists of a maximum of 160 characters and
                   (optional)    cannot contain angle brackets (< and >).

                 Figure 2-1 Creating a Kafka queue

         Step 6 Click OK.
         Step 7 Click the name of the queue. On the displayed queue details page, obtain the
                Kafka topic ID.


                 Figure 2-2 Obtaining the Kafka topic ID

         Step 8 Click Create Consumer Group. The Create Consumer Group dialog box is
                displayed.

          Step 9 Enter a consumer group name.

                  A default consumer group name is generated, which you can change if required.
                  A consumer group name consists of 1 to 32 characters. Only letters, digits,
                  underscores (_), and hyphens (-) are allowed. Consumer group names must be
                  unique within the same queue.

        Step 10 Click OK. Obtain the ID of the consumer group in the consumer group list.

                 Figure 2-3 Obtaining the consumer group ID

                 ----End

Obtaining a Project ID
                 When calling APIs, you need to specify project_id in API requests. Obtain a project
                 ID by performing the following procedure.

         Step 1 Sign up and log in to the management console.

         Step 2 Click the username and choose My Credentials from the drop-down list.


         Step 3 On the My Credentials page, view project IDs in the project list.

                 Figure 2-4 Obtaining a project ID

                 ----End

Obtaining an AK/SK
         Step 1 Sign up and log in to the management console.
         Step 2 Click the username and choose My Credentials from the drop-down list.
         Step 3 On the My Credentials page, click the Access Keys tab.
         Step 4 Click Create Access Keys.
         Step 5 Enter the password for login.
         Step 6 Enter the verification code received by email or SMS message.
         Step 7 Click OK.

                     Keep the key secure and do not disclose it to any unauthorized people.

         Step 8 Download the credentials.csv file containing your AK and SK to a local computer.

                 ----End

Obtaining Region and Endpoint Information
                 Obtain the region and endpoint from Regions and Endpoints.


Summary of Environment Information

                  Table 2-2 Required environment information

                    Category              Information            Example

                    ECS                   EIP                    x.x.x.x
                                          Username               name
                                          Password               password

                    DMS                   Queue name             my-kafka-queue
                                          Queue ID               4df89da6-ede4-4072-93e0-28dc6e866299
                                          Queue type             Kafka
                                          Kafka topic            k-bd67aaead60940d688b872c31bdc653b-4df89da6-ede4-4072-93e0-28dc6e866299
                                          Consumer group name    my-consumer-group
                                          Consumer group ID      g-7ec0caac-01fb-4f91-a4f2-0a9dd48f8af7

                    AK/SK                 AK                     VAODAIIJGPUAYTJRRL**
                                          SK                     ZHN49c6bpwDiQvPqKJ5CxutJxqc04Glt9xSzxY**

                    Project               Region                 eu-de
                                          Project name           eu-de
                                          Project ID             bd67aaead60940d688b872c31bdc653b

                    Region and endpoint   Region                 eu-de
                                          Endpoint               dms-kafka.eu-de.otc.t-systems.com:37000

                    DNS                   DNS server IP address  172.16.16.65

2.3 Creating a Project
                 This section uses the Maven project kafkademo as an example to describe how to
                 create a project.

Procedure
         Step 1 Download the demo package.
                 1.   Log in to the DMS console.


                 2.   In the navigation pane, choose Using APIs.
                 3.   Choose Kafka APIs.
                 4.   Click Download Sample Code to download KafkaDemo.zip.
          Step 2 Click Download SDK to download the DMS Kafka SASL package.
                  Decompress the package to obtain the following files:
                  ●    client.truststore.jks: client certificate
                  ●    dms.kafka.sasl.client-1.0.0.jar: DMS Kafka SASL package
                  ●    dms_kafka_client_jaas.conf: client configuration file
                  Alternatively, obtain the SDK JAR file from \KafkaDemo\dist\libs
                  \dms.kafka.sasl.client-1.0.0.jar in the decompressed demo package.
         Step 3 On Eclipse (the recommended version is 4.6 or later), create a Maven project. The
                project name kafkademo is used as an example.

                 Figure 2-5 Creating a Maven project

         Step 4 Click Finish.
         Step 5 Import the DMS Kafka SASL package.
                 1.   Right-click the new project kafkademo, and create a libs folder.
                 2.   Copy dms.kafka.sasl.client-1.0.0.jar to libs.
                 3.   Add the following information to the pom.xml file to import
                      dms.kafka.sasl.client-1.0.0.jar into the Maven repository:
                      <dependency>
                          <groupId>dms</groupId>
                          <artifactId>kafka.sasl.client</artifactId>
                          <version>1.0.0</version>
                          <scope>system</scope>
                          <systemPath>${project.basedir}/libs/dms.kafka.sasl.client-1.0.0.jar</systemPath>
                      </dependency>
                      <dependency>
                          <groupId>org.apache.kafka</groupId>
                          <artifactId>kafka-clients</artifactId>
                          <version>0.10.2.1</version>
                      </dependency>
                      <dependency>
                          <groupId>org.slf4j</groupId>
                          <artifactId>slf4j-api</artifactId>
                          <version>1.7.7</version>
                      </dependency>
                      <dependency>
                          <groupId>org.slf4j</groupId>
                          <artifactId>slf4j-log4j12</artifactId>
                          <version>1.7.7</version>
                      </dependency>
                      <dependency>
                          <groupId>log4j</groupId>
                          <artifactId>log4j</artifactId>
                          <version>1.2.17</version>
                      </dependency>
                 4.   Save the pom.xml file.

                 ----End

2.4 Configuring Parameters
Procedure
         Step 1 (Optional) Configure a private DNS server.

                 You need an ECS to connect to DMS Kafka. A newly created ECS does not
                 require a private DNS server configuration. If an existing ECS is used,
                 you need to configure the DNS server IP address.

                 The DNS server IP address in the eu-de region is 172.16.16.65.

         Step 2 Configure access_key, secret_key, and project_id in the
                dms_kafka_client_jaas.conf file.

                 The three parameters are used to authenticate DMS Kafka API requests.
                 KafkaClient {
                   com.dms.kafka.sasl.client.KafkaLoginModule required
                   access_key="your ak"
                   secret_key="your sk"
                   project_id="projectID";
                 };

                 Replace them with the actual access_key, secret_key, and project_id of your
                 account.

                 To access the queues authorized by other tenants, set target_project_id to the
                 project ID of the authorizing tenant.
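
                 For cross-tenant access, the JAAS file would then carry one extra
                 option. This is a sketch only; the exact placement of
                 target_project_id among the module options is an assumption based on
                 the file format shown above:

                 ```
                 KafkaClient {
                   com.dms.kafka.sasl.client.KafkaLoginModule required
                   access_key="your ak"
                   secret_key="your sk"
                   project_id="your project ID"
                   target_project_id="authorizing tenant's project ID";
                 };
                 ```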

         Step 3 Configure SASL access to start with the process using either of the
                following methods. In both methods, replace /path with the actual path
                name.
                 1.   Method 1: Configure the following JVM parameter to specify the location
                      of the SASL configuration file:
                      -Djava.security.auth.login.config=/path/kafka_client_jaas.conf

                 2.   Method 2: Add the following information to the project code so that SASL
                      access starts before the Kafka Producer and Consumer start:
                      System.setProperty("java.security.auth.login.config", "/path/kafka_client_jaas.conf");
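
                 As a minimal illustration of Method 2, the system property only has to
                 be set before the first Kafka client is constructed, because the JAAS
                 configuration is read when the login module first loads. The path and
                 remaining property values below are placeholders:

```java
import java.util.Properties;

public class SaslBootstrapSketch {
    public static void main(String[] args) {
        // Must run before any KafkaProducer/KafkaConsumer is created:
        // the JVM reads the JAAS file when the login module first loads.
        System.setProperty("java.security.auth.login.config",
                "/path/kafka_client_jaas.conf"); // placeholder path

        Properties props = new Properties();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "DMS");
        // ... construct the Kafka client with props here ...

        System.out.println(System.getProperty("java.security.auth.login.config"));
    }
}
```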

         Step 4 Add the following information to the consumer.properties file:
                 connections.max.idle.ms=30000

         Step 5 Configure key parameters in the consumer.properties/producer.properties file.


                 Table 2-3 Key parameters in the consumer.properties/producer.properties file

                   Parameter                 Description                          Setting

                   bootstrap.servers         IP address or domain name of         dms-kafka.eu-de.otc.t-systems.com:37000
                                             the DMS server

                   ssl.truststore.location   Path in which the client             /path/client.truststore.jks, where /path
                                             certificate client.truststore.jks    must be replaced with the actual
                                             is located                           path name

                   ssl.truststore.password   Client certificate password          -

                   security.protocol         Security protocol                    SASL_SSL

                   sasl.mechanism            Service name                         DMS (Note: All letters in the entered
                                                                                  service name must be capitalized.)

                 For details about other Kafka parameters, visit the official Kafka website.

         Step 6 Enable Kafka debug logging by modifying the log4j.properties file.
                 log4j.rootLogger=DEBUG, stdout
                 log4j.appender.stdout=org.apache.log4j.ConsoleAppender
                 log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
                 log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
                 log4j.logger.org.apache.kafka.clients=DEBUG
                 log4j.logger.kafka=INFO, stdout
                 log4j.additivity.kafka=false
                 log4j.logger.org.apache.kafka=DEBUG, stdout
                 log4j.additivity.org.apache.kafka=false

         Step 7 Write code. For details about APIs, visit the official Kafka website.

                 ----End

2.5 Running the Sample Project
                 The following describes how to access DMS Kafka queues to produce and
                 consume messages in Java.

Procedure
         Step 1 Log in to the ECS.

                      You can run the sample project on an ECS with an IP address in the 192 network segment.

         Step 2 Install JDK or Java runtime environment (JRE). Add the following settings of
                environment variables JAVA_HOME and PATH to the ~/.bash_profile:
                 export JAVA_HOME=/opt/java/jdk1.8.0_151
                 export PATH=$JAVA_HOME/bin:$PATH


                 Run the source ~/.bash_profile command for the modification to take effect.

                      Use Oracle JDK instead of ECS's default JDK (for example, OpenJDK), because ECS's default
                      JDK may not be suitable for the sample project. To obtain Oracle JDK, download Java
                      Development Kit 1.8.111 or a later version from
                      https://www.oracle.com/technetwork/java/javase/downloads/index.html.

         Step 3 Add the DNS server IP address to the /etc/resolv.conf file as the root user.
                 Add the following content to the first line in the file:
                 nameserver 172.16.16.65

         Step 4 Run the following command to download the code package of the sample project
                KafkaDemo.zip.
                 $ wget https://obs.eu-de.otc.t-systems.com/dms-demo/KafkaDemo.zip

         Step 5 Run the following command to decompress KafkaDemo.zip.
                 $ unzip KafkaDemo.zip

         Step 6 Run the following command to navigate to the KafkaDemo/dist directory, which
                contains pre-compiled binary files and executable scripts.
                 $ cd KafkaDemo/dist

         Step 7 Edit the config/dms_kafka_client_jaas.conf file and configure access_key,
                secret_key, and project_id.
                 $ vim config/dms_kafka_client_jaas.conf

                 The values in bold are examples. Replace them with actual values.
                 KafkaClient {
                   com.dms.kafka.sasl.client.KafkaLoginModule required
                   access_key="******************"
                   secret_key="******************"
                   project_id="bd67aaead60940d688b872c31bdc653b";
                 };

         Step 8 Edit the config/producer.properties file and configure topic and
                bootstrap.servers.
                 $ vim config/producer.properties

                 The values in bold are examples. Replace them with actual values.
                 topic=k-bd67aaead60940d688b872c31bdc653b-4df89da6-ede4-4072-93e0-28dc6e866299
                 bootstrap.servers=dms-kafka.eu-de.otc.t-systems.com:37000
                 ssl.truststore.password=************
                 acks=all
                 retries=1
                 batch.size=16384
                 buffer.memory=33554432
                 key.serializer=org.apache.kafka.common.serialization.StringSerializer
                 value.serializer=org.apache.kafka.common.serialization.StringSerializer
                 security.protocol=SASL_SSL
                 sasl.mechanism=DMS

                      The parameter topic can be set to a queue name or a Kafka topic name. For more
                      information, see Table 2-2.

         Step 9 Edit the config/consumer.properties file and configure topic, bootstrap.servers,
                and group.id.


                 $ vim config/consumer.properties

                 The values in bold are examples. Replace them with actual values.
                 topic=k-bd67aaead60940d688b872c31bdc653b-4df89da6-ede4-4072-93e0-28dc6e866299
                 bootstrap.servers=dms-kafka.eu-de.otc.t-systems.com:37000
                 group.id=g-7ec0caac-01fb-4f91-a4f2-0a9dd48f8af7
                 ssl.truststore.password=************
                 security.protocol=SASL_SSL
                 sasl.mechanism=DMS
                 key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
                 value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
                 auto.offset.reset=earliest
                 enable.auto.commit=false

                      The parameter topic can be set to a queue name or a Kafka topic name. For more
                      information, see Table 2-2.

        Step 10 Run the sample project to produce messages:
                 $ bash produce.sh

                 After the command is run, 10 messages are automatically sent to the Kafka
                 queue.

        Step 11 Run the sample project to consume messages:
                 $ bash consume.sh

                 ----End

2.6 Compiling the Sample Project Code
         Step 1 Download and decompress the sample project code KafkaDemo.zip.

         Step 2 Import the sample project code.
                 1.   On Eclipse, choose File > Import.
                 2.   In the Select an import source area, select Existing Projects into Workspace.
                 3.   Select the directory to which the sample project code KafkaDemo is
                      decompressed.

         Step 3 Choose Project > Build Project to build the project.

         Step 4 Export the new JAR file.
                 1.   Right-click the sample project KafkaDemo and choose Export from the
                      shortcut menu.
                 2.   Choose Java > JAR file. Enter the path and name of the JAR file to be
                      generated.

         Step 5 Replace the dms.kafka.demo.jar file in the KafkaDemo/dist/libs directory with
                the new JAR file. Run the newly built project by following the procedure in
                Running the Sample Project.

                 ----End


2.7 Code of the Sample Project
Producer
                 DMS Kafka APIs are compatible with native open-source Kafka clients. Compared
                 with native Kafka service code, the sample project code additionally contains a
                 client certificate and simple authentication and security layer (SASL)
                 configuration, which are used for identity authentication and secure
                 communication. To realize smooth migration of producer applications, you only
                 need to import the client certificate and SASL configuration before creating the
                 Kafka Producer without modifying any other Kafka service code.

                 Code pertaining to client certificate and SASL:
                  Properties producerConfig = Config.getProducerConfig();
                    producerConfig.put("ssl.truststore.location", Config.getTrustStorePath());
                    System.setProperty("java.security.auth.login.config", Config.getSaslConfig());

                 The code for creating a producer and sending messages does not need to be
                 modified.
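
                 To see the two additions in context, the following sketch assembles a
                 producer configuration inline. The Config helper from the sample
                 project is replaced here by hard-coded placeholder values, and the
                 actual KafkaProducer construction is left as a comment because it
                 requires the Kafka client JAR and a reachable broker:

```java
import java.util.Properties;

public class ProducerBootstrapSketch {
    public static void main(String[] args) {
        // Stand-ins for Config.getProducerConfig() from the sample project.
        Properties producerConfig = new Properties();
        producerConfig.put("bootstrap.servers",
                "dms-kafka.eu-de.otc.t-systems.com:37000");
        producerConfig.put("security.protocol", "SASL_SSL");
        producerConfig.put("sasl.mechanism", "DMS");
        producerConfig.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producerConfig.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        // The two DMS-specific additions: client certificate and SASL config.
        producerConfig.put("ssl.truststore.location",
                "/path/client.truststore.jks");   // placeholder path
        System.setProperty("java.security.auth.login.config",
                "/path/dms_kafka_client_jaas.conf"); // placeholder path

        // From here on, standard Kafka producer code, e.g.:
        // KafkaProducer<String, String> producer = new KafkaProducer<>(producerConfig);
        // producer.send(new ProducerRecord<>(topic, "message"));

        System.out.println(producerConfig.getProperty("ssl.truststore.location"));
    }
}
```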

Consumer
                 DMS Kafka APIs are compatible with native open-source Kafka clients. Compared
                 with native Kafka service code, the sample project code additionally contains a
                 client certificate and SASL configuration, which are used for identity
                 authentication and secure communication. To realize smooth migration of
                 consumer applications, you only need to import the client certificate and SASL
                 configuration before creating the Kafka Consumer without modifying any other
                 Kafka service code.

                 Code pertaining to client certificate and SASL:
                  Properties consumerConfig = Config.getConsumerConfig();
                    consumerConfig.put("ssl.truststore.location", Config.getTrustStorePath());
                    System.setProperty("java.security.auth.login.config", Config.getSaslConfig());

                 The code for creating a consumer and consuming messages does not need to be
                 modified.

2.8 Using the Enhanced Java SDK
                 The enhanced Kafka Java SDK is optimized based on the open-source Kafka
                 0.10.2.1 client, greatly improving system performance.

Procedure
         Step 1 Download the open-source Kafka 0.10.2.1 client package.

         Step 2 Decompress the following from the open-source client package:


         Step 3 Download the enhanced Kafka 0.10.2.1 SDK package.
         Step 4 Decompress the following from the SDK package:

         Step 5 Copy all JAR packages in the dms_kafka_0.10.2.1-client/ directory to the libs
                folder in the directory where the open-source Kafka 0.10.2.1 client package is
                decompressed, and overwrite the kafka-clients-0.10.2.1.jar package with the
                same name.
         Step 6 Perform the same configuration as other open-source clients.

                 ----End


2.9 Recommended Parameter Settings for Kafka Clients
                 Table 2-4 Producer parameters

                   acks
                     Default value: 1
                     Recommended value: all (if high reliability mode is selected);
                     1 (if high throughput mode is selected)
                     Description: Indicates the number of acknowledgments the producer
                     requires the server to return before considering a request complete.
                     This controls the durability of records that are sent. The value of
                     this parameter can be any of the following:
                     0: The producer will not wait for any acknowledgment from the server
                     at all. The record will be immediately added to the socket buffer and
                     considered sent. No guarantee can be made that the server has received
                     the record, and the retries configuration will not take effect (as the
                     client generally does not know of any failures). The offset given back
                     for each record will always be set to -1.
                     1: The leader will write the record to its local log but will respond
                     without waiting for full acknowledgement from all followers. If the
                     leader fails immediately after acknowledging the record but before the
                     followers have replicated it, the record will be lost.
                     all: The leader will wait for the full set of replicas to acknowledge
                     the record. This is the strongest available guarantee that the record
                     will not be lost as long as there is at least one replica.

                   retries
                     Default value: 0
                     Recommended value: 0
                     Description: Setting this parameter to a value greater than zero will
                     cause the client to resend any record that failed to be sent due to a
                     potentially transient error. Note that this retry is no different than
                     if the client resent the record upon receiving the error. Allowing
                     retries will potentially change the ordering of records because if two
                     batches are sent to the same partition, and the first fails and is
                     retried but the second succeeds, then the records in the second batch
                     may appear first.

                   request.timeout.ms
                     Default value: 30000
                     Recommended value: 120000
                     Description: Indicates the maximum amount of time the client will wait
                     for the response of a request. If the response is not received before
                     the timeout elapses, the client will throw a Timeout exception.

                   block.on.buffer.full
                     Default value: TRUE
                     Recommended value: default value
                     Description: When buffer memory is exhausted, the producer must either
                     stop receiving new message records or throw an exception. By default,
                     this parameter is set to TRUE. However, in some cases, non-blocking
                     usage is desired and it is better to throw an exception immediately.
                     Setting this parameter to FALSE will cause the producer to instead
                     throw "BufferExhaustedException" when buffer memory is exhausted.

                   batch.size
                     Default value: 16384
                     Recommended value: 262144
                     Description: The producer will attempt to batch records together into
                     fewer requests whenever multiple records are being sent to the same
                     partition. This helps improve performance of both the client and the
                     server. This parameter controls the default batch size in bytes. No
                     attempt will be made to batch records larger than this size. Requests
                     sent to brokers will contain multiple batches, one for each partition
                     with data available to be sent. A smaller batch size will make
                     batching less common and may reduce throughput (a batch size of zero
                     will disable batching entirely). A larger batch size may use more
                     memory, as a buffer of the specified batch size will always be
                     allocated in anticipation of additional records.

                   buffer.memory
                     Default value: 33554432
                     Recommended value: 536870912
                     Description: The total bytes of memory the producer can use to buffer
                     records waiting to be sent to the server. If records are sent faster
                     than they can be delivered to the broker, the producer will either
                     stop sending records or throw an exception, depending on
                     block.on.buffer.full. This setting should correspond roughly to the
                     total memory the producer will use, but it is not a rigid bound since
                     not all memory the producer uses is used for buffering. Some
                     additional memory will be used for compression (if compression is
                     enabled) as well as for maintaining in-flight requests.
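
                 Applied in code, the high-reliability recommendations from Table 2-4
                 might look like the following sketch; only the table's parameters are
                 set, with values copied from the recommended column:

```java
import java.util.Properties;

public class RecommendedProducerProps {
    public static void main(String[] args) {
        Properties props = new Properties();
        // High-reliability recommendations from Table 2-4.
        // For high throughput mode, acks would be "1" instead of "all".
        props.put("acks", "all");
        props.put("retries", "0");
        props.put("request.timeout.ms", "120000");
        props.put("batch.size", "262144");
        props.put("buffer.memory", "536870912");

        System.out.println(props.getProperty("acks") + " "
                + props.getProperty("batch.size"));
    }
}
```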

                 Table 2-5 Consumer parameters

                   auto.commit.enable
                     Default value: TRUE
                     Recommended value: FALSE
                     Description: If this parameter is set to TRUE, the offset of messages
                     already fetched by the consumer will be periodically committed to
                     ZooKeeper. This committed offset will be used when the process fails
                     as the position from which the new consumer will begin.
                     Constraints: If this parameter is set to FALSE, to avoid message
                     loss, an offset must be committed to ZooKeeper after the messages are
                     successfully consumed.

                   auto.offset.reset
                     Default value: latest
                     Recommended value: earliest
                     Description: Indicates what to do when there is no initial offset in
                     ZooKeeper or if the current offset has been deleted. Options:
                     earliest: The offset is automatically reset to the smallest offset.
                     latest: The offset is automatically reset to the largest offset.
                     none: The system throws an exception to the consumer if no offset is
                     available.
                     anything else: The system throws an exception to the consumer.

                   connections.max.idle.ms
                     Default value: 600000
                     Recommended value: 30000
                     Description: Indicates the timeout interval for an idle connection.
                     The server closes the idle connection after this period of time ends.
                     Setting this parameter to 30000 can reduce server response failures
                     when the network condition is poor.