@@ -24,7 +24,7 @@ with Apache Kafka at its core. It's high priority for us that client features keep
pace with core Apache Kafka and components of the [Confluent Platform](https://www.confluent.io/product/compare/).

The Python bindings provide a high-level Producer and Consumer with support
- for the balanced consumer groups of Apache Kafka >= 0.9.
+ for the balanced consumer groups of Apache Kafka 0.9.

See the [API documentation](http://docs.confluent.io/current/clients/confluent-kafka-python/index.html) for more info.
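A minimal sketch of what the balanced consumer group support mentioned above looks like with this client (broker, group, and topic names are placeholders; the optional `on_assign` callback simply reports the partitions handed to this group member):

```python
from confluent_kafka import Consumer

def print_assignment(consumer, partitions):
    # Invoked when the group coordinator (re)assigns partitions to this member.
    print('Assigned partitions: {}'.format(partitions))

c = Consumer({'bootstrap.servers': 'mybroker', 'group.id': 'mygroup'})
c.subscribe(['mytopic'], on_assign=print_assignment)

msg = c.poll(1.0)   # poll in a loop in real code
if msg is not None and not msg.error():
    print(msg.value())
c.close()
```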
@@ -42,7 +42,6 @@ from confluent_kafka import Producer
p = Producer({'bootstrap.servers': 'mybroker,mybroker2'})
for data in some_data_source:
    p.produce('mytopic', data.encode('utf-8'))
-     p.poll(0)
p.flush()
```
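`produce()` is asynchronous; a minimal sketch of one way to observe delivery results, assuming the optional `callback` argument of `produce()` (the `delivery_report` function name is illustrative):

```python
from confluent_kafka import Producer

def delivery_report(err, msg):
    # Invoked once per produced message, from poll() or flush(),
    # with either a delivery error or the delivered message's metadata.
    if err is not None:
        print('Delivery failed: {}'.format(err))
    else:
        print('Delivered to {} [{}]'.format(msg.topic(), msg.partition()))

p = Producer({'bootstrap.servers': 'mybroker'})
p.produce('mytopic', 'some value'.encode('utf-8'), callback=delivery_report)
p.flush()  # serves the callback and waits for any outstanding messages
```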
@@ -111,72 +110,64 @@ c.close()
See [examples](examples) for more examples.


- Install
- =======
-
- **Install self-contained binary wheels for OSX and Linux from PyPi:**
-
-     $ pip install confluent-kafka
-
- **Install AvroProducer and AvroConsumer:**
-
-     $ pip install confluent-kafka[avro]
-
- **Install from source from PyPi** *(requires librdkafka + dependencies to be installed separately)*:
-
-     $ pip install --no-binary :all: confluent-kafka
+ Broker Compatibility
+ ====================
+ The Python client (as well as the underlying C library librdkafka) supports
+ all broker versions >= 0.8.
+ But due to the nature of the Kafka protocol in broker versions 0.8 and 0.9 it
+ is not safe for a client to assume what protocol version is actually supported
+ by the broker, thus you will need to hint the Python client what protocol
+ version it may use. This is done through two configuration settings:

- **Install from source directory:**
+ * `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
+ * `api.version.request=true|false` (default true)

-     $ pip install .
+ When using a Kafka 0.10 broker or later you don't need to do anything
+ (`api.version.request=true` is the default).
+ If you use Kafka broker 0.9 or 0.8 you must set
+ `api.version.request=false` and set
+ `broker.version.fallback` to your broker version,
+ e.g. `broker.version.fallback=0.9.0.1`.

-     # for AvroProducer or AvroConsumer
-     $ pip install .[avro]
+ More info here:
+ https://github.com/edenhill/librdkafka/wiki/Broker-version-compatibility
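For example, a minimal sketch of a client configuration for a 0.9 broker using the two settings above (the broker address and group id are placeholders):

```python
from confluent_kafka import Consumer

# For 0.8/0.9 brokers the protocol version cannot be queried, so disable the
# probe and tell the client which broker version to assume instead.
c = Consumer({
    'bootstrap.servers': 'mybroker',        # placeholder
    'group.id': 'mygroup',                  # placeholder
    'api.version.request': 'false',
    'broker.version.fallback': '0.9.0.1',   # set to your broker version
})
```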

Prerequisites
=============

- * Python >= 2.7 or Python 3.x
- * ([librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.2)
+ * Python >= 2.6 or Python 3.x
+ * [librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.1 (embedded in Linux wheels)

- The latest version of librdkafka is embedded in the OSX and Linux wheels,
- for other platforms or when a specific version of librdkafka is desired, follow these guidelines:
+ librdkafka is embedded in the manylinux wheels, for other platforms or
+ when a specific version of librdkafka is desired, follow these guidelines:

- * For **Debian/Ubuntu** based systems, add the Confluent APT repository and then do `sudo apt-get install librdkafka-dev python-dev`:
+ * For **Debian/Ubuntu** based systems, add this APT repo and then do `sudo apt-get install librdkafka-dev python-dev`:
  http://docs.confluent.io/current/installation.html#installation-apt

- * For **RedHat** and **RPM**-based distros, add the Confluent YUM repository and then do `sudo yum install librdkafka-devel python-devel`:
+ * For **RedHat** and **RPM**-based distros, add this YUM repo and then do `sudo yum install librdkafka-devel python-devel`:
  http://docs.confluent.io/current/installation.html#rpm-packages-via-yum

* On **OSX**, use **homebrew** and do `sudo brew install librdkafka`

- **NOTE:** The pre-built Linux wheels do NOT contain SASL Kerberos support. If you need SASL Kerberos support you must install librdkafka and its dependencies using the above repositories and then build confluent-kafka from source using the instructions below.

+ Install
+ =======

+ **Install from PyPi:**

- Broker Compatibility
- ====================
- The Python client (as well as the underlying C library librdkafka) supports
- all broker versions >= 0.8.
- But due to the nature of the Kafka protocol in broker versions 0.8 and 0.9 it
- is not safe for a client to assume what protocol version is actually supported
- by the broker, thus you will need to hint the Python client what protocol
- version it may use. This is done through two configuration settings:
+     $ pip install confluent-kafka

- * `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
- * `api.version.request=true|false` (default true)
+     # for AvroProducer or AvroConsumer
+     $ pip install confluent-kafka[avro]

- When using a Kafka 0.10 broker or later you don't need to do anything
- (`api.version.request=true` is the default).
- If you use Kafka broker 0.9 or 0.8 you must set `api.version.request=false`
- and set `broker.version.fallback` to your broker version,
- e.g. `broker.version.fallback=0.9.0.1`.

- More info here:
- https://github.com/edenhill/librdkafka/wiki/Broker-version-compatibility
+ **Install from source / tarball:**

+     $ pip install .

+     # for AvroProducer or AvroConsumer
+     $ pip install .[avro]
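After installing, a quick sanity check is to print the client and librdkafka versions in use (a minimal sketch using the module's `version()` and `libversion()` helpers):

```python
import confluent_kafka

# Each call returns a (version_string, version_int) tuple.
print('confluent-kafka-python {}'.format(confluent_kafka.version()))
print('librdkafka {}'.format(confluent_kafka.libversion()))
```
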
Build