Commit d653f08

Merge pull request #6 from cinemataztic/remove-unnecessary-console-logs
Remove unnecessary console logs
2 parents: a8ce6fe + 8eca1f5

File tree: 5 files changed (+14, -36)

- README.md (+8, -6)
- __test__/index.test.js (+1, -18)
- package-lock.json (+2, -2)
- package.json (+1, -1)
- src/index.js (+2, -9)

README.md (+8, -6)

@@ -2,7 +2,7 @@
 
 Wrapper package around [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) where only the configuration is required, and the package can be used instantly with just the essentials. Don't be scared from the name, Kafka is cool and the name is a nod to the [Undertaker's](https://en.wikipedia.org/wiki/The_Undertaker) biker persona in the early 2000s.
 
-The purpose of this package is to provide a battery-included package where one does not have to worry about configuring the [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) package for using Kafka client's functions like sending a message to a topic and consuming a message from a topic. The package handles producer/consumer connection internally and only allows disconnecting both producer and consumer connection.
+The purpose is to provide a batteries-included package where one does not have to worry about configuring [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) for sending a message to a topic and consuming a message from a topic. The package handles producer/consumer connections internally and only allows disconnecting the producer/consumer externally.
 
 ## Getting started
 
@@ -14,6 +14,8 @@ npm i @cinemataztic/big-evil-kafka
 
 ## Prerequisites
 
+Node.js version should be >=16.
+
 This package uses [confluent-schema-registry](https://www.npmjs.com/package/@kafkajs/confluent-schema-registry) and assumes that the schema registry is in place along with the Kafka server running in the background.
 
 ## Usage
@@ -34,7 +36,7 @@ Configurations must be passed to the KafkaClient to initialize node-rdkafka prod
 
 The unique identifier of both producer and consumer instance. It is meant as a label and is not to be confused with the group ID.
 
-Default value is `default-client`.
+Default value is `default-client-id`.
 
 - `groupId?: string`
 
@@ -44,7 +46,7 @@ Configurations must be passed to the KafkaClient to initialize node-rdkafka prod
 
 - `brokers?: Array`
 
-The list of brokers that specifies the Kafka broker(s), the producer and consumer should connect to. Brokers need to be passed as an array, i.e, `['localhost:9092', 'kafka:29092']` because the package internally converts them to string as per the requirement for node-rdkafka that requires `metadata.broker.list` as a string.
+The list of Kafka broker(s) the producer and consumer should connect to. Brokers must be passed as an array, e.g. `['localhost:9092', 'kafka:29092']`, because the package internally joins them into the string that `metadata.broker.list` requires.
 
 Default value is `['localhost:9092']`.
 
@@ -85,7 +87,7 @@ client.consumeMessage(topic, onMessage);
 
 ## Disconnection
 
-To disconnect either the producer or consumer, call the following methods for both producer and consumer respectively.
+To disconnect either the producer or consumer, call the following methods for the producer and consumer respectively.
 ```js
 client.disconnectProducer();
 
@@ -95,6 +97,6 @@ client.disconnectConsumer();
 
 ## Motivation
 
-Many of our services are relying upon the Kafka message queue system. The problem with using node-rdkafka in each of the different services is that in case of any change to kafka configuration, it had to be replicated across different services for consistency and also the manual setup and configuration of node-rdkafka is not simple and requires a lot of effort to set it up in a way that ensures maintainability.
+Many of our services rely on the Kafka message queue system. The problem with using node-rdkafka in multiple services was that any change to the Kafka configuration had to be replicated across all of them for consistency. Moreover, the manual setup and configuration of node-rdkafka is not simple, and it takes considerable effort to set it up in a way that ensures maintainability.
 
-Having a wrapper package around node-rdkafka allows us to not only utilize [exponential backoff](https://www.npmjs.com/package/exponential-backoff) for consumer/producer retry mechanism but also to provide a batteries-included package that would simply allow users to send and consume messages, and with additional ability to disconnect them in case of an error in the services.
+Having a wrapper package around node-rdkafka allows us not only to use [exponential backoff](https://www.npmjs.com/package/exponential-backoff) as the consumer/producer retry mechanism, but also to provide a batteries-included package that lets users simply send and consume messages.
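Taken together, the README now documents a small surface: construct a client, send, consume, disconnect. The following is a minimal usage sketch of that documented API, assuming the package's entry point (`src/index.js`) exports the `KafkaClient` class; the import form and the topic/client/group names are placeholders, not taken from this diff.

```js
// Minimal sketch of the API documented in the README above.
// Assumption: the package's entry point exports the KafkaClient class.
const KafkaClient = require('@cinemataztic/big-evil-kafka');

const client = new KafkaClient({
  clientId: 'reporting-service', // default: 'default-client-id'
  groupId: 'reporting-group', // default: 'default-group-id'
  brokers: ['localhost:9092'], // joined internally into metadata.broker.list
  avroSchemaRegistry: 'http://localhost:8081', // default registry host
});

async function main() {
  // Publish: the payload is Avro-encoded against the `<topic>-value` subject.
  await client.sendMessage('example-topic', { message: 'Hello Cinemataztic' });

  // Consume: the callback receives `{ value: <decoded payload> }`.
  await client.consumeMessage('example-topic', (data) => {
    console.log(data.value);
  });
}

main().catch(console.error);

// On shutdown or error, disconnect explicitly:
// client.disconnectProducer();
// client.disconnectConsumer();
```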

__test__/index.test.js (+1, -18)

@@ -16,7 +16,7 @@ beforeAll(async () => {
   logSpy = jest.spyOn(console, 'log').mockImplementation();
 });
 
-describe('Kafka producer integration tests', () => {
+describe('Kafka client integration tests', () => {
   beforeEach(async () => {
     jest.clearAllMocks();
   });
@@ -27,18 +27,6 @@ describe('Kafka producer integration tests', () => {
       'Kafka producer successfully connected',
     );
   });
-  test('should log message when producer sends a message', async () => {
-    await kafkaClient.sendMessage(topic, { message: 'Hello Cinemataztic' });
-    expect(logSpy).toHaveBeenCalledWith(
-      `Successfully published data to topic: ${topic}`,
-    );
-  });
-});
-
-describe('Kafka consumer integration tests', () => {
-  beforeEach(async () => {
-    jest.clearAllMocks();
-  });
 
   test('should log message when consumer is connected', async () => {
     await kafkaClient.consumeMessage(topic, () => {});
@@ -49,7 +37,6 @@ describe('Kafka consumer integration tests', () => {
 
   test('should log message when consumer receives a message', async () => {
     await kafkaClient.consumeMessage(topic, (data) => {
-      console.log(`Message received by consumer on topic: ${topic}`);
       expect(data).toHaveProperty('value');
       expect(data.value).toHaveProperty('message', 'Hello Cinemataztic');
     });
@@ -62,10 +49,6 @@ describe('Kafka consumer integration tests', () => {
 
     // Wait for the polling (via setInterval) to pick up the message.
     await new Promise((resolve) => setTimeout(resolve, 5000));
-
-    expect(logSpy).toHaveBeenCalledWith(
-      `Message received by consumer on topic: ${topic}`,
-    );
   });
 
   afterEach(() => {

package-lock.json (+2, -2)

Some generated files are not rendered by default.

package.json (+1, -1)

@@ -1,6 +1,6 @@
 {
   "name": "@cinemataztic/big-evil-kafka",
-  "version": "1.0.9",
+  "version": "1.0.10",
   "description": "A wrapper around node-rdkafka",
   "main": "src/index.js",
   "scripts": {

src/index.js (+2, -9)

@@ -75,13 +75,13 @@ class KafkaClient {
    * @constructor
    * @public
    * @param {Object} config The configuration object for kafka client initialization
-   * @param {String} config.clientId The client identifier (default: 'default-client')
+   * @param {String} config.clientId The client identifier (default: 'default-client-id')
    * @param {String} config.groupId The client group id string. All clients sharing the same groupId belong to the same group (default: 'default-group-id')
    * @param {Array} config.brokers The initial list of brokers as a CSV list of broker host or host:port (default: ['localhost:9092'])
    * @param {String} config.avroSchemaRegistry The schema registry host for encoding and decoding the messages as per the avro schemas wrt a subject (default: 'http://localhost:8081')
    */
   constructor(config = {}) {
-    this.#clientId = config.clientId || 'default-client';
+    this.#clientId = config.clientId || 'default-client-id';
     this.#groupId = config.groupId || 'default-group-id';
     this.#brokers = config.brokers || ['localhost:9092'];
     this.#avroSchemaRegistry =
@@ -251,8 +251,6 @@ class KafkaClient {
       const subject = `${topic}-value`;
       const id = await this.#registry.getRegistryId(subject, 'latest');
 
-      console.log(`Using schema ${topic}-value@latest (id: ${id})`);
-
       const encodedMessage = await this.#registry.encode(id, message);
 
       this.#producer.produce(
@@ -261,8 +259,6 @@ class KafkaClient {
         Buffer.from(encodedMessage),
         `${topic}-schema`, // Key
       );
-
-      console.log(`Successfully published data to topic: ${topic}`);
     }
   } catch (error) {
     console.error(
@@ -287,7 +283,6 @@ class KafkaClient {
 
     if (this.#isConsumerConnected) {
       this.#consumer.subscribe([topic]);
-      console.log(`Subscribed to topic ${topic}`);
 
       if (!this.#intervalId) {
         this.#intervalId = setInterval(() => {
@@ -299,8 +294,6 @@ class KafkaClient {
         try {
           const decodedValue = await this.#registry.decode(data.value);
 
-          console.log(`Message received by consumer on topic: ${topic}`);
-
           onMessage({ value: decodedValue });
         } catch (error) {
           console.error(
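For context, the producer path that remains after these log removals reduces to roughly the sketch below. This is a reconstruction from the context lines in the hunks above, not the package's verbatim source; the connected `producer` argument and the `null` partition are assumptions.

```js
const { SchemaRegistry } = require('@kafkajs/confluent-schema-registry');

const registry = new SchemaRegistry({ host: 'http://localhost:8081' });

// Encode `message` against the latest schema registered for the
// `<topic>-value` subject, then produce it with `<topic>-schema` as the key.
async function sendMessage(producer, topic, message) {
  const subject = `${topic}-value`;
  const id = await registry.getRegistryId(subject, 'latest');
  const encodedMessage = await registry.encode(id, message);

  producer.produce(
    topic,
    null, // partition: assumed, not visible in the hunk above
    Buffer.from(encodedMessage),
    `${topic}-schema`, // key
  );
}
```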
