Commit 6b79b0c

CLIENTS-2607: Updated documentation to emphasize the use of logger instead of log_cb (confluentinc#1447)
Updated documentation to reflect the use of logger instead of log_cb. Added testcase as well for logger.
1 parent dd7121a commit 6b79b0c

File tree

2 files changed: +51 -6 lines changed
docs/index.rst

+7 -6

@@ -720,12 +720,8 @@ providing a dict of configuration properties to the instance constructor, e.g.
     consumer = confluent_kafka.Consumer(conf)
 
 
-The supported configuration values are dictated by the underlying
-librdkafka C library. For the full range of configuration properties
-please consult librdkafka's documentation:
-https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
-
-The Python bindings also provide some additional configuration properties:
+The Python client provides the following configuration properties in
+addition to the properties dictated by the underlying librdkafka C library:
 
 * ``default.topic.config``: value is a dict of client topic-level configuration
   properties that are applied to all used topics for the instance. **DEPRECATED:**
@@ -778,3 +774,8 @@ The Python bindings also provide some additional configuration properties:
     mylogger.addHandler(logging.StreamHandler())
     producer = confluent_kafka.Producer({'bootstrap.servers': 'mybroker.com'}, logger=mylogger)
 
+.. note::
+    In the Python client, the ``logger`` configuration property is used for the log handler, not ``log_cb``.
+
+For the full range of configuration properties, please consult librdkafka's documentation:
+https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md

tests/test_log.py

+44 -0

@@ -1,5 +1,6 @@
 #!/usr/bin/env python
 
+from io import StringIO
 import confluent_kafka
 import confluent_kafka.avro
 import logging
@@ -114,3 +115,46 @@ def test_logging_constructor():
     p.poll(timeout=0.5)
 
     print('%s: %s: %d log messages seen' % (how, f.name, f.cnt))
+
+
+def test_producer_logger_logging_in_given_format():
+    """Test that asserts that logging is working by matching part of the log message"""
+
+    stringBuffer = StringIO()
+    logger = logging.getLogger('Producer')
+    logger.setLevel(logging.DEBUG)
+    handler = logging.StreamHandler(stringBuffer)
+    handler.setFormatter(logging.Formatter('%(name)s Logger | %(message)s'))
+    logger.addHandler(handler)
+
+    p = confluent_kafka.Producer(
+        {"bootstrap.servers": "test", "logger": logger, "debug": "msg"})
+    val = 1
+    while val > 0:
+        val = p.flush()
+    logMessage = stringBuffer.getvalue().strip()
+    stringBuffer.close()
+    print(logMessage)
+
+    assert "Producer Logger | INIT" in logMessage
+
+
+def test_consumer_logger_logging_in_given_format():
+    """Test that asserts that logging is working by matching part of the log message"""
+
+    stringBuffer = StringIO()
+    logger = logging.getLogger('Consumer')
+    logger.setLevel(logging.DEBUG)
+    handler = logging.StreamHandler(stringBuffer)
+    handler.setFormatter(logging.Formatter('%(name)s Logger | %(message)s'))
+    logger.addHandler(handler)
+
+    c = confluent_kafka.Consumer(
+        {"bootstrap.servers": "test", "group.id": "test", "logger": logger, "debug": "msg"})
+    c.poll(0)
+
+    logMessage = stringBuffer.getvalue().strip()
+    stringBuffer.close()
+    c.close()
+
+    assert "Consumer Logger | INIT" in logMessage
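The new tests rely on a standard-library capture pattern: attach a `logging.StreamHandler` backed by a `StringIO` to a named logger, then assert on the formatted output. A minimal, broker-free sketch of just that pattern follows; the `INIT` message here is a hypothetical stand-in for the debug line librdkafka would emit through the configured logger, so no Kafka client is needed to run it:

```python
import logging
from io import StringIO

# Route a named logger into an in-memory buffer so the formatted
# output can be inspected programmatically.
buf = StringIO()
logger = logging.getLogger('Producer')
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter('%(name)s Logger | %(message)s'))
logger.addHandler(handler)

# Stand-in for the INIT debug line librdkafka would emit via the logger.
logger.debug('INIT [rdkafka#producer-1] ...')

message = buf.getvalue().strip()
print(message)
assert 'Producer Logger | INIT' in message
```

The real tests follow the same shape, with `Producer`/`Consumer` created with `"logger": logger` and `"debug": "msg"` so that librdkafka's debug output flows through the handler instead of a `log_cb` callback.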
