README.md (+8-6)
@@ -2,7 +2,7 @@
Wrapper package around [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) where only the configuration is required, and the package can be used instantly with just the essentials. Don't be scared of the name; Kafka is cool, and the name is a nod to the [Undertaker's](https://en.wikipedia.org/wiki/The_Undertaker) biker persona in the early 2000s.
-The purpose of this package is to provide a battery-included package where one does not have to worry about configuring the [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) package for using Kafka client's functions like sending a message to a topic and consuming a message from a topic. The package handles producer/consumer connection internally and only allows disconnecting both producer and consumer connection.
+The purpose is to provide a batteries-included package where one does not have to worry about configuring [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) for sending a message to a topic and consuming a message from a topic. The package handles the producer/consumer connections internally and only allows disconnecting the producer/consumer externally.
## Getting started
@@ -14,6 +14,8 @@ npm i @cinemataztic/big-evil-kafka
## Prerequisites
+Node.js version should be >=16
+
This package uses [confluent-schema-registry](https://www.npmjs.com/package/@kafkajs/confluent-schema-registry) and assumes that the schema registry is in place along with the Kafka server running in the background.
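The Node.js version requirement above can be enforced at startup. A minimal runtime guard might look like the following (a sketch only; this check is not part of the package):

```js
// Sketch: fail fast when the running Node.js version is below the
// documented minimum (>= 16). Not part of big-evil-kafka itself.
const [major] = process.versions.node.split('.').map(Number);

if (major < 16) {
  throw new Error(
    `Node.js >= 16 is required, but ${process.versions.node} was found`,
  );
}
```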
## Usage
@@ -34,7 +36,7 @@ Configurations must be passed to the KafkaClient to initialize node-rdkafka prod
The unique identifier of both the producer and consumer instances. It is meant as a label and is not to be confused with the group ID.
-Default value is `default-client`.
+Default value is `default-client-id`.
-`groupId?: string`
@@ -44,7 +46,7 @@ Configurations must be passed to the KafkaClient to initialize node-rdkafka prod
-`brokers?: Array`
-The list of brokers that specifies the Kafka broker(s), the producer and consumer should connect to. Brokers need to be passed as an array, i.e, `['localhost:9092', 'kafka:29092']` because the package internally converts them to string as per the requirement for node-rdkafka that requires `metadata.broker.list` as a string.
+The list of Kafka broker(s) the producer and consumer should connect to. Brokers must be passed as an array, e.g. `['localhost:9092', 'kafka:29092']`, because the package internally joins them into the comma-separated string that node-rdkafka requires for `metadata.broker.list`.
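The array-to-string conversion described above amounts to a simple join. Roughly (a sketch of the behaviour, not the package's actual code):

```js
// node-rdkafka expects `metadata.broker.list` as a single comma-separated
// string, so the array of brokers is joined before being handed over.
const brokers = ['localhost:9092', 'kafka:29092'];
const brokerList = brokers.join(','); // 'localhost:9092,kafka:29092'
```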
-To disconnect either the producer or consumer, call the following methods for both producer and consumer respectively.
+To disconnect either the producer or consumer, call the following methods for producer and consumer respectively.
```js
client.disconnectProducer();
client.disconnectConsumer();
```
@@ -95,6 +97,6 @@ client.disconnectConsumer();
## Motivation
-Many of our services are relying upon the Kafka message queue system. The problem with using node-rdkafka in each of the different services is that in case of any change to kafka configuration, it had to be replicated across different services for consistency and also the manual setup and configuration of node-rdkafka is not simple and requires a lot of effort to set it up in a way that ensures maintainability.
+Many of our services rely on the Kafka message queue system. The problem with using node-rdkafka in multiple services was that any change to the Kafka configuration had to be replicated across those services for consistency. Manual setup and configuration of node-rdkafka is also not simple and requires significant effort to do in a way that ensures maintainability.
-Having a wrapper package around node-rdkafka allows us to not only utilize [exponential backoff](https://www.npmjs.com/package/exponential-backoff) for consumer/producer retry mechanism but also to provide a batteries-included package that would simply allow users to send and consume messages, and with additional ability to disconnect them in case of an error in the services.
+Having a wrapper package around node-rdkafka allows us not only to utilize [exponential backoff](https://www.npmjs.com/package/exponential-backoff) for the consumer/producer retry mechanism, but also to provide a batteries-included package that simply allows users to send and consume messages.
src/index.js (+2-9)
@@ -75,13 +75,13 @@ class KafkaClient {
* @constructor
* @public
* @param {Object} config The configuration object for kafka client initialization
-* @param {String} config.clientId The client identifier (default: 'default-client')
+* @param {String} config.clientId The client identifier (default: 'default-client-id')
* @param {String} config.groupId The client group id string. All clients sharing the same groupId belong to the same group (default: 'default-group-id')
* @param {Array} config.brokers The initial list of brokers as a CSV list of broker host or host:port (default: ['localhost:9092'])
* @param {String} config.avroSchemaRegistry The schema registry host for encoding and decoding the messages as per the avro schemas wrt a subject (default: 'http://localhost:8081')
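Collecting the defaults documented in the JSDoc above, a configuration would look roughly like the following. This is a sketch based only on the documented config shape; the `new KafkaClient(config)` call is shown as a comment since the class itself is not reproduced here.

```js
// All values below are the defaults stated in the JSDoc above.
const config = {
  clientId: 'default-client-id',                // label for producer/consumer instances
  groupId: 'default-group-id',                  // consumer group id
  brokers: ['localhost:9092'],                  // joined to a CSV string internally
  avroSchemaRegistry: 'http://localhost:8081',  // schema registry host
};

// Hypothetical usage, per the @constructor documentation:
// const client = new KafkaClient(config);
```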