@@ -24,7 +24,7 @@ with Apache Kafka at its core. It's high priority for us that client features keep
 pace with core Apache Kafka and components of the [Confluent Platform](https://www.confluent.io/product/compare/).
 
 The Python bindings provide a high-level Producer and Consumer with support
-for the balanced consumer groups of Apache Kafka >= 0.9.
+for the balanced consumer groups of Apache Kafka 0.9.
 
 See the [API documentation](http://docs.confluent.io/current/clients/confluent-kafka-python/index.html) for more info.
 
@@ -42,7 +42,6 @@ from confluent_kafka import Producer
 p = Producer({'bootstrap.servers': 'mybroker,mybroker2'})
 for data in some_data_source:
     p.produce('mytopic', data.encode('utf-8'))
-    p.poll(0)
 p.flush()
 ```
 
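For context on the `p.poll(0)` line this hunk touches: in confluent-kafka-python, `produce()` is asynchronous and delivery callbacks are only dispatched from `poll()` (or `flush()`). The sketch below mimics that contract with a plain-Python stand-in so it runs without a broker or the library; `FakeProducer` is purely illustrative and is not the library's implementation.

```python
# Plain-Python stand-in for the produce/poll pattern shown in the diff.
# FakeProducer is hypothetical: produce() only enqueues, and queued
# delivery callbacks run when poll() is called - mirroring the contract
# of confluent_kafka's Producer, not its internals.
class FakeProducer:
    def __init__(self):
        self._pending = []

    def produce(self, topic, value, callback=None):
        # enqueue the message; delivery is reported later via poll()
        self._pending.append((topic, value, callback))

    def poll(self, timeout=0):
        # dispatch queued delivery reports; returns how many were served
        served = 0
        for topic, value, cb in self._pending:
            if cb:
                cb(None, (topic, value))  # err=None signals success
            served += 1
        self._pending.clear()
        return served

    def flush(self):
        # flush also serves outstanding delivery reports
        return self.poll(0)


delivered = []
p = FakeProducer()
for data in ['a', 'b']:
    p.produce('mytopic', data.encode('utf-8'),
              callback=lambda err, msg: delivered.append(msg))
    p.poll(0)  # serve delivery callbacks as we produce
p.flush()
print(len(delivered))  # → 2
```

Without periodic `poll()` calls in a long-running produce loop, callbacks pile up until `flush()`; the loop above serves them as it goes.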
@@ -111,72 +110,64 @@ c.close()
 See [examples](examples) for more examples.
 
 
-Install
-=======
-
-**Install self-contained binary wheels for OSX and Linux from PyPi:**
-
-    $ pip install confluent-kafka
-
-**Install AvroProducer and AvroConsumer:**
-
-    $ pip install confluent-kafka[avro]
-
-**Install from source from PyPi** *(requires librdkafka + dependencies to be installed separately)*:
-
-    $ pip install --no-binary :all: confluent-kafka
+Broker Compatibility
+====================
+The Python client (as well as the underlying C library librdkafka) supports
+all broker versions >= 0.8.
+But due to the nature of the Kafka protocol in broker versions 0.8 and 0.9 it
+is not safe for a client to assume what protocol version is actually supported
+by the broker, thus you will need to hint the Python client what protocol
+version it may use. This is done through two configuration settings:
 
-**Install from source directory:**
+* `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
+* `api.version.request=true|false` (default true)
 
-    $ pip install .
+When using a Kafka 0.10 broker or later you don't need to do anything
+(`api.version.request=true` is the default).
+If you use Kafka broker 0.9 or 0.8 you must set
+`api.version.request=false` and set
+`broker.version.fallback` to your broker version,
+e.g. `broker.version.fallback=0.9.0.1`.
 
-    # for AvroProducer or AvroConsumer
-    $ pip install .[avro]
+More info here:
+https://github.com/edenhill/librdkafka/wiki/Broker-version-compatibility
 
 
 Prerequisites
 =============
 
-* Python >= 2.7 or Python 3.x
-* ([librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.2)
+* Python >= 2.6 or Python 3.x
+* [librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.1 (embedded in Linux wheels)
 
-The latest version of librdkafka is embedded in the OSX and Linux wheels,
-for other platforms or when a specific version of librdkafka is desired, follow these guidelines:
+librdkafka is embedded in the manylinux wheels; for other platforms or
+when a specific version of librdkafka is desired, follow these guidelines:
-* For **Debian/Ubuntu** based systems, add the Confluent APT repository and then do `sudo apt-get install librdkafka-dev python-dev`:
+* For **Debian/Ubuntu** based systems, add this APT repo and then do `sudo apt-get install librdkafka-dev python-dev`:
 http://docs.confluent.io/current/installation.html#installation-apt
 
-* For **RedHat** and **RPM**-based distros, add the Confluent YUM repository and then do `sudo yum install librdkafka-devel python-devel`:
+* For **RedHat** and **RPM**-based distros, add this YUM repo and then do `sudo yum install librdkafka-devel python-devel`:
 http://docs.confluent.io/current/installation.html#rpm-packages-via-yum
 
 * On **OSX**, use **homebrew** and do `sudo brew install librdkafka`
 
-**NOTE:** The pre-built Linux wheels do NOT contain SASL Kerberos support. If you need SASL Kerberos support you must install librdkafka and its dependencies using the above repositories and then build confluent-kafka from source using the instructions below.
 
+Install
+=======
 
+**Install from PyPi:**
 
-Broker Compatibility
-====================
-The Python client (as well as the underlying C library librdkafka) supports
-all broker versions >= 0.8.
-But due to the nature of the Kafka protocol in broker versions 0.8 and 0.9 it
-is not safe for a client to assume what protocol version is actually supported
-by the broker, thus you will need to hint the Python client what protocol
-version it may use. This is done through two configuration settings:
+    $ pip install confluent-kafka
 
-* `broker.version.fallback=YOUR_BROKER_VERSION` (default 0.9.0.1)
-* `api.version.request=true|false` (default true)
+    # for AvroProducer or AvroConsumer
+    $ pip install confluent-kafka[avro]
 
-When using a Kafka 0.10 broker or later you don't need to do anything
-(`api.version.request=true` is the default).
-If you use Kafka broker 0.9 or 0.8 you must set `api.version.request=false`
-and set `broker.version.fallback` to your broker version,
-e.g. `broker.version.fallback=0.9.0.1`.
 
-More info here:
-https://github.com/edenhill/librdkafka/wiki/Broker-version-compatibility
+**Install from source / tarball:**
 
+    $ pip install .
 
+    # for AvroProducer or AvroConsumer
+    $ pip install .[avro]
 
 
 Build
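As a concrete sketch of the two settings the Broker Compatibility section in this diff describes, here is what a client configuration for a pre-0.10 broker might look like. The broker address and version values are illustrative, not prescriptive:

```python
# Hypothetical client config for a Kafka 0.8/0.9 broker, per the
# Broker Compatibility section: disable the version probe (which
# pre-0.10 brokers cannot answer) and state the broker version.
conf = {
    'bootstrap.servers': 'mybroker',
    'api.version.request': False,          # 0.8/0.9 brokers can't answer ApiVersionRequest
    'broker.version.fallback': '0.9.0.1',  # hint librdkafka the protocol level to use
}
# A 0.10+ broker needs neither setting: 'api.version.request' defaults to true.
print(conf['broker.version.fallback'])  # → 0.9.0.1
```

This dict would be passed straight to `Producer(conf)` or `Consumer(conf)` as in the examples earlier in the README.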