Commit 2a1ed6a

Merge pull request #2 from cinemataztic/release/1.0.0
1.0.5
2 parents 0a82ce4 + 2abc255 commit 2a1ed6a

File tree

5 files changed: +81 −5 lines changed

.github/workflows/npm-publish.yaml

Lines changed: 1 addition & 1 deletion
@@ -18,6 +18,6 @@ jobs:
       registry-url: https://registry.npmjs.org/
       scope: "@cinemataztic"
     - run: npm ci
-    - run: npm publish --provenance --access public # For more information on --provenance see: https://docs.github.com/en/actions/use-cases-and-examples/publishing-packages/publishing-nodejs-packages#publishing-packages-to-the-npm-registry
+    - run: npm publish --access public
      env:
        NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}

README.md

Lines changed: 74 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,74 @@
# Big Evil Kafka

A wrapper package around [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) where only the configuration is required, so the package can be used instantly with just the essentials. Don't be scared by the name: Kafka is cool, and the name is a nod to [The Undertaker's](https://en.wikipedia.org/wiki/The_Undertaker) biker persona in the early 2000s.

The purpose of this package is to provide a batteries-included wrapper, so you do not have to configure [node-rdkafka](https://www.npmjs.com/package/node-rdkafka) yourself to use Kafka client functions such as sending messages to a topic and consuming messages from a topic. The package manages the producer and consumer connections internally and only exposes methods to disconnect the producer and consumer.
7+
## Getting started

The package is available on [npm](https://www.npmjs.com/package/@cinemataztic/big-evil-kafka) and can be installed with:

```sh
npm i @cinemataztic/big-evil-kafka
```

## Prerequisites

This package uses [confluent-schema-registry](https://www.npmjs.com/package/@kafkajs/confluent-schema-registry) and assumes that a schema registry is available and a Kafka broker is running in the background.
19+
## Usage

To use the module, require it:

```js
const { KafkaClient } = require('@cinemataztic/big-evil-kafka');
```

## Configuration

Configuration must be passed to the `KafkaClient` in order to initialize the node-rdkafka producer and consumer internally.

The broker list is provided as a comma-separated string, e.g. `localhost:9092, kafka:29092`, and split into an array:

```js
const client = new KafkaClient({
  clientId: process.env.KAFKA_CLIENT_ID,
  groupId: process.env.KAFKA_GROUP_ID,
  brokers: process.env.KAFKA_BROKERS.split(','), // e.g. KAFKA_BROKERS=localhost:9092
  avroSchemaRegistry: process.env.SCHEMA_REGISTRY_URL,
});
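Note that a plain `split(',')` keeps any whitespace around the broker addresses. A small defensive-parsing sketch (the variable names are illustrative, not part of the package's API):

```js
// Illustrative only: trim whitespace around each broker address
// before handing the list to the client.
const raw = 'localhost:9092, kafka:29092';
const brokers = raw.split(',').map((b) => b.trim());

console.log(brokers); // [ 'localhost:9092', 'kafka:29092' ]
```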
41+
42+
## Sending Messages

To send a message to a topic, provide the topic name and the message:

```js
client.sendMessage(topic, message);
```
49+
50+
## Consuming Messages

The package uses the non-flowing consumer mode with `enable.auto.commit` enabled and `auto.offset.reset` set to `earliest`.

To consume messages from a topic, provide the topic name and a callback function that receives each message:

```js
client.consumeMessage(topic, onMessage);
```
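The settings mentioned above correspond roughly to the following node-rdkafka configuration keys. This is only a sketch of what the description implies; the package's actual internal configuration may differ, and the group id here is hypothetical:

```js
// Sketch only: the consumer settings the README describes,
// spelled as node-rdkafka configuration keys.
const consumerConfig = {
  'group.id': 'my-group',          // hypothetical group id
  'enable.auto.commit': true,      // offsets are committed automatically
};

const topicConfig = {
  'auto.offset.reset': 'earliest', // read from the beginning when no offset is stored
};
```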
59+
60+
## Disconnecting

To disconnect the producer or the consumer, call the corresponding method:

```js
client.disconnectProducer();

client.disconnectConsumer();
```
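A common pattern is to disconnect both connections when the process shuts down. A sketch of that pattern follows; the `client` object below is a stub standing in for the `KafkaClient` instance created in the Configuration section, so the snippet is self-contained:

```js
// Stub standing in for the KafkaClient instance from the Configuration section.
let producerDisconnected = false;
let consumerDisconnected = false;
const client = {
  disconnectProducer: () => { producerDisconnected = true; },
  disconnectConsumer: () => { consumerDisconnected = true; },
};

// Disconnect both ends when the process is asked to stop.
function shutdown() {
  client.disconnectProducer();
  client.disconnectConsumer();
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);
```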
69+
70+
## Motivation

Many of our services rely on Kafka as a message queue. Using node-rdkafka directly in each service meant that any change to the Kafka configuration had to be replicated across services to stay consistent. Setting up node-rdkafka manually is also not simple, and doing it in a way that stays maintainable takes considerable effort.

A wrapper package around node-rdkafka lets us apply [exponential backoff](https://www.npmjs.com/package/exponential-backoff) as a retry mechanism for both the producer and the consumer, and provides a batteries-included package: users can send and consume messages, with the additional ability to disconnect either connection if a service hits an error. Only the configuration needs to be supplied; there is no need to understand node-rdkafka's connection handling and methods, which can be hard to implement without prior experience of how node-rdkafka works.

__test__/index.test.js

Lines changed: 2 additions & 0 deletions
@@ -50,6 +50,8 @@ describe('Kafka consumer integration tests', () => {
   test('should log message when consumer receives a message', async () => {
     await kafkaClient.consumeMessage(topic, (data) => {
       console.log(`Message received by consumer on topic: ${topic}`);
+      expect(data).toHaveProperty('value');
+      expect(data.value).toHaveProperty('message', 'Hello Cinemataztic');
     });

     // Wait for consumer to connect and subscribe.

package-lock.json

Lines changed: 2 additions & 2 deletions
Some generated files are not rendered by default.

package.json

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 {
-  "name": "big-evil-kafka",
-  "version": "0.1.0",
+  "name": "@cinemataztic/big-evil-kafka",
+  "version": "1.0.5",
   "description": "A wrapper around node-rdkafka",
   "main": "src/index.js",
   "scripts": {
