Commit 7e04f4e

Committed Apr 14, 2016
Initial version
0 parents, commit 7e04f4e

25 files changed: +4,400 -0 lines
 

‎.dir-locals.el

+1
( (c-mode . ((c-file-style . "linux"))) )

‎.gitignore

+13
build
_build
dist
*~
\#*
MANIFEST
core
vgcore*
__pycache__
*.so
confluent?kafka.egg-info
*.pyc
.cache

‎Consumer.c

+658
Large diffs are not rendered by default.

‎LICENSE

+202
@@ -0,0 +1,202 @@
1+
Apache License
2+
Version 2.0, January 2004
3+
http://www.apache.org/licenses/
4+
5+
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6+
7+
1. Definitions.
8+
9+
"License" shall mean the terms and conditions for use, reproduction,
10+
and distribution as defined by Sections 1 through 9 of this document.
11+
12+
"Licensor" shall mean the copyright owner or entity authorized by
13+
the copyright owner that is granting the License.
14+
15+
"Legal Entity" shall mean the union of the acting entity and all
16+
other entities that control, are controlled by, or are under common
17+
control with that entity. For the purposes of this definition,
18+
"control" means (i) the power, direct or indirect, to cause the
19+
direction or management of such entity, whether by contract or
20+
otherwise, or (ii) ownership of fifty percent (50%) or more of the
21+
outstanding shares, or (iii) beneficial ownership of such entity.
22+
23+
"You" (or "Your") shall mean an individual or Legal Entity
24+
exercising permissions granted by this License.
25+
26+
"Source" form shall mean the preferred form for making modifications,
27+
including but not limited to software source code, documentation
28+
source, and configuration files.
29+
30+
"Object" form shall mean any form resulting from mechanical
31+
transformation or translation of a Source form, including but
32+
not limited to compiled object code, generated documentation,
33+
and conversions to other media types.
34+
35+
"Work" shall mean the work of authorship, whether in Source or
36+
Object form, made available under the License, as indicated by a
37+
copyright notice that is included in or attached to the work
38+
(an example is provided in the Appendix below).
39+
40+
"Derivative Works" shall mean any work, whether in Source or Object
41+
form, that is based on (or derived from) the Work and for which the
42+
editorial revisions, annotations, elaborations, or other modifications
43+
represent, as a whole, an original work of authorship. For the purposes
44+
of this License, Derivative Works shall not include works that remain
45+
separable from, or merely link (or bind by name) to the interfaces of,
46+
the Work and Derivative Works thereof.
47+
48+
"Contribution" shall mean any work of authorship, including
49+
the original version of the Work and any modifications or additions
50+
to that Work or Derivative Works thereof, that is intentionally
51+
submitted to Licensor for inclusion in the Work by the copyright owner
52+
or by an individual or Legal Entity authorized to submit on behalf of
53+
the copyright owner. For the purposes of this definition, "submitted"
54+
means any form of electronic, verbal, or written communication sent
55+
to the Licensor or its representatives, including but not limited to
56+
communication on electronic mailing lists, source code control systems,
57+
and issue tracking systems that are managed by, or on behalf of, the
58+
Licensor for the purpose of discussing and improving the Work, but
59+
excluding communication that is conspicuously marked or otherwise
60+
designated in writing by the copyright owner as "Not a Contribution."
61+
62+
"Contributor" shall mean Licensor and any individual or Legal Entity
63+
on behalf of whom a Contribution has been received by Licensor and
64+
subsequently incorporated within the Work.
65+
66+
2. Grant of Copyright License. Subject to the terms and conditions of
67+
this License, each Contributor hereby grants to You a perpetual,
68+
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69+
copyright license to reproduce, prepare Derivative Works of,
70+
publicly display, publicly perform, sublicense, and distribute the
71+
Work and such Derivative Works in Source or Object form.
72+
73+
3. Grant of Patent License. Subject to the terms and conditions of
74+
this License, each Contributor hereby grants to You a perpetual,
75+
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76+
(except as stated in this section) patent license to make, have made,
77+
use, offer to sell, sell, import, and otherwise transfer the Work,
78+
where such license applies only to those patent claims licensable
79+
by such Contributor that are necessarily infringed by their
80+
Contribution(s) alone or by combination of their Contribution(s)
81+
with the Work to which such Contribution(s) was submitted. If You
82+
institute patent litigation against any entity (including a
83+
cross-claim or counterclaim in a lawsuit) alleging that the Work
84+
or a Contribution incorporated within the Work constitutes direct
85+
or contributory patent infringement, then any patent licenses
86+
granted to You under this License for that Work shall terminate
87+
as of the date such litigation is filed.
88+
89+
4. Redistribution. You may reproduce and distribute copies of the
90+
Work or Derivative Works thereof in any medium, with or without
91+
modifications, and in Source or Object form, provided that You
92+
meet the following conditions:
93+
94+
(a) You must give any other recipients of the Work or
95+
Derivative Works a copy of this License; and
96+
97+
(b) You must cause any modified files to carry prominent notices
98+
stating that You changed the files; and
99+
100+
(c) You must retain, in the Source form of any Derivative Works
101+
that You distribute, all copyright, patent, trademark, and
102+
attribution notices from the Source form of the Work,
103+
excluding those notices that do not pertain to any part of
104+
the Derivative Works; and
105+
106+
(d) If the Work includes a "NOTICE" text file as part of its
107+
distribution, then any Derivative Works that You distribute must
108+
include a readable copy of the attribution notices contained
109+
within such NOTICE file, excluding those notices that do not
110+
pertain to any part of the Derivative Works, in at least one
111+
of the following places: within a NOTICE text file distributed
112+
as part of the Derivative Works; within the Source form or
113+
documentation, if provided along with the Derivative Works; or,
114+
within a display generated by the Derivative Works, if and
115+
wherever such third-party notices normally appear. The contents
116+
of the NOTICE file are for informational purposes only and
117+
do not modify the License. You may add Your own attribution
118+
notices within Derivative Works that You distribute, alongside
119+
or as an addendum to the NOTICE text from the Work, provided
120+
that such additional attribution notices cannot be construed
121+
as modifying the License.
122+
123+
You may add Your own copyright statement to Your modifications and
124+
may provide additional or different license terms and conditions
125+
for use, reproduction, or distribution of Your modifications, or
126+
for any such Derivative Works as a whole, provided Your use,
127+
reproduction, and distribution of the Work otherwise complies with
128+
the conditions stated in this License.
129+
130+
5. Submission of Contributions. Unless You explicitly state otherwise,
131+
any Contribution intentionally submitted for inclusion in the Work
132+
by You to the Licensor shall be under the terms and conditions of
133+
this License, without any additional terms or conditions.
134+
Notwithstanding the above, nothing herein shall supersede or modify
135+
the terms of any separate license agreement you may have executed
136+
with Licensor regarding such Contributions.
137+
138+
6. Trademarks. This License does not grant permission to use the trade
139+
names, trademarks, service marks, or product names of the Licensor,
140+
except as required for reasonable and customary use in describing the
141+
origin of the Work and reproducing the content of the NOTICE file.
142+
143+
7. Disclaimer of Warranty. Unless required by applicable law or
144+
agreed to in writing, Licensor provides the Work (and each
145+
Contributor provides its Contributions) on an "AS IS" BASIS,
146+
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147+
implied, including, without limitation, any warranties or conditions
148+
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149+
PARTICULAR PURPOSE. You are solely responsible for determining the
150+
appropriateness of using or redistributing the Work and assume any
151+
risks associated with Your exercise of permissions under this License.
152+
153+
8. Limitation of Liability. In no event and under no legal theory,
154+
whether in tort (including negligence), contract, or otherwise,
155+
unless required by applicable law (such as deliberate and grossly
156+
negligent acts) or agreed to in writing, shall any Contributor be
157+
liable to You for damages, including any direct, indirect, special,
158+
incidental, or consequential damages of any character arising as a
159+
result of this License or out of the use or inability to use the
160+
Work (including but not limited to damages for loss of goodwill,
161+
work stoppage, computer failure or malfunction, or any and all
162+
other commercial damages or losses), even if such Contributor
163+
has been advised of the possibility of such damages.
164+
165+
9. Accepting Warranty or Additional Liability. While redistributing
166+
the Work or Derivative Works thereof, You may choose to offer,
167+
and charge a fee for, acceptance of support, warranty, indemnity,
168+
or other liability obligations and/or rights consistent with this
169+
License. However, in accepting such obligations, You may act only
170+
on Your own behalf and on Your sole responsibility, not on behalf
171+
of any other Contributor, and only if You agree to indemnify,
172+
defend, and hold each Contributor harmless for any liability
173+
incurred by, or claims asserted against, such Contributor by reason
174+
of your accepting any such warranty or additional liability.
175+
176+
END OF TERMS AND CONDITIONS
177+
178+
APPENDIX: How to apply the Apache License to your work.
179+
180+
To apply the Apache License to your work, attach the following
181+
boilerplate notice, with the fields enclosed by brackets "{}"
182+
replaced with your own identifying information. (Don't include
183+
the brackets!) The text should be enclosed in the appropriate
184+
comment syntax for the file format. We also recommend that a
185+
file or class name and description of purpose be included on the
186+
same "printed page" as the copyright notice for easier
187+
identification within third-party archives.
188+
189+
Copyright {yyyy} {name of copyright owner}
190+
191+
Licensed under the Apache License, Version 2.0 (the "License");
192+
you may not use this file except in compliance with the License.
193+
You may obtain a copy of the License at
194+
195+
http://www.apache.org/licenses/LICENSE-2.0
196+
197+
Unless required by applicable law or agreed to in writing, software
198+
distributed under the License is distributed on an "AS IS" BASIS,
199+
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200+
See the License for the specific language governing permissions and
201+
limitations under the License.
202+

‎MANIFEST.in

+3
include README.md
include *.c *.h

‎Makefile

+14
all:
	@echo "Targets:"
	@echo " clean"
	@echo " docs"


clean:
	python setup.py clean
	make -C docs clean

.PHONY: docs

docs:
	$(MAKE) -C docs html

‎Producer.c

+516
Large diffs are not rendered by default.

‎README.md

+59
Confluent's Apache Kafka client for Python
==========================================


Prerequisites
===============

 librdkafka >=0.9.1 (or master>=2016-04-13)
 py.test (pip install pytest)


Build
=====

For Python 2:

    python setup.py build


For Python 3:

    python3 setup.py build


Install
=======
Preferably in a virtualenv:

    pip install .


Run unit-tests
==============

    py.test


Run integration tests
=====================
WARNING: These tests require an active Kafka cluster and will make use of
a topic named 'test'.

    ./integration_test.py <kafka-broker>



Generate documentation
======================

    make docs

or:

    python setup.py build_sphinx


Documentation will be generated in docs/_build/
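Quick sanity check
==================

After installing, a minimal produce/consume round trip can be sketched as
below. This is an illustrative sketch only, assuming a broker reachable at
localhost:9092 and an existing topic named 'test' (the same topic the
integration tests use); the API calls follow the examples shipped in this
commit:

    from confluent_kafka import Producer, Consumer

    # Produce one message and wait for delivery
    p = Producer(**{'bootstrap.servers': 'localhost:9092'})
    p.produce('test', value='hello from the quick-start sketch')
    p.flush()

    # Read it back with a throw-away consumer group
    c = Consumer(**{'bootstrap.servers': 'localhost:9092',
                    'group.id': 'quickstart-sketch',
                    'default.topic.config': {'auto.offset.reset': 'smallest'}})
    c.subscribe(['test'])
    msg = c.poll(timeout=10.0)
    if msg is not None and not msg.error():
        print(msg.value())
    c.close()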

‎confluent_kafka.c

+1,283
Large diffs are not rendered by default.

‎confluent_kafka.h

+205
/**
 * Copyright 2016 Confluent Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include <Python.h>
#include <structmember.h>

#include <librdkafka/rdkafka.h>

#if PY_MAJOR_VERSION >= 3
#define PY3
#include <bytesobject.h>
#endif



/****************************************************************************
 *
 *
 * Python 2 & 3 portability
 *
 * Binary data (we call it cfl_PyBin):
 *   Python 2: string
 *   Python 3: bytes
 *
 * Unicode Strings (we call it cfl_PyUnistr):
 *   Python 2: unicode
 *   Python 3: strings
 *
 ****************************************************************************/

#ifdef PY3
/**
 * @brief Binary type, use as cfl_PyBin(_X(A,B)) where _X() is the type-less
 *        suffix of a PyBinary/Str_X() function
 */
#define cfl_PyBin(X)    PyBytes ## X

/**
 * @brief Unicode type, same usage as PyBin()
 */
#define cfl_PyUnistr(X)  PyUnicode ## X

/**
 * @returns Unicode Python object as char * in UTF-8 encoding
 */
#define cfl_PyUnistr_AsUTF8(X)  PyUnicode_AsUTF8(X)

/**
 * @returns Unicode Python string object
 */
#define cfl_PyObject_Unistr(X)  PyObject_Str(X)
#else

/* See comments above */
#define cfl_PyBin(X)    PyString ## X
#define cfl_PyUnistr(X)  PyUnicode ## X
#define cfl_PyUnistr_AsUTF8(X)  PyBytes_AsString(PyUnicode_AsUTF8String(X))
#define cfl_PyObject_Unistr(X)  PyObject_Unicode(X)
#endif


/****************************************************************************
 *
 *
 * KafkaError
 *
 *
 ****************************************************************************/
extern PyObject *KafkaException;

PyObject *KafkaError_new0 (rd_kafka_resp_err_t err, const char *fmt, ...);


/**
 * @brief Raise an exception using KafkaError.
 * \p err and \p ... (string representation of error) is set on the returned
 * KafkaError object.
 */
#define cfl_PyErr_Format(err,...) do {                                  \
                PyObject *_eo = KafkaError_new0(err, __VA_ARGS__);      \
                PyErr_SetObject(KafkaException, _eo);                   \
        } while (0)

/****************************************************************************
 *
 *
 * Common
 *
 *
 ****************************************************************************/
rd_kafka_conf_t *common_conf_setup (rd_kafka_type_t ktype,
                                    void *self0,
                                    PyObject *args,
                                    PyObject *kwargs);
PyObject *c_parts_to_py (const rd_kafka_topic_partition_list_t *c_parts);
rd_kafka_topic_partition_list_t *py_to_c_parts (PyObject *plist);


/****************************************************************************
 *
 *
 * Message
 *
 *
 ****************************************************************************/

/**
 * @brief confluent_kafka.Message object
 */
typedef struct {
        PyObject_HEAD
        PyObject *topic;
        PyObject *value;
        PyObject *key;
        PyObject *error;
        int32_t partition;
        int64_t offset;
} Message;

extern PyTypeObject MessageType;

PyObject *Message_new0 (const rd_kafka_message_t *rkm);
PyObject *Message_error (Message *self, PyObject *ignore);


/****************************************************************************
 *
 *
 * Producer
 *
 *
 ****************************************************************************/

/**
 * @brief confluent_kafka.Producer object
 */
typedef struct {
        PyObject_HEAD
        rd_kafka_t *rk;
        PyObject *default_dr_cb;
        PyObject *partitioner_cb;       /**< Registered Python partitioner */
        int32_t (*c_partitioner_cb) (
                const rd_kafka_topic_t *,
                const void *, size_t, int32_t,
                void *, void *);        /**< Fallback C partitioner*/
        int callback_crashed;
        PyThreadState *thread_state;
} Producer;


extern PyTypeObject ProducerType;

int32_t Producer_partitioner_cb (const rd_kafka_topic_t *rkt,
                                 const void *keydata,
                                 size_t keylen,
                                 int32_t partition_cnt,
                                 void *rkt_opaque, void *msg_opaque);


/****************************************************************************
 *
 *
 * Consumer
 *
 *
 ****************************************************************************/

/**
 * @brief confluent_kafka.Consumer object
 */
typedef struct {
        PyObject_HEAD
        rd_kafka_t *rk;
        int rebalance_assigned;  /* Rebalance: Callback performed assign() call.*/
        PyObject *on_assign;     /* Rebalance: on_assign callback */
        PyObject *on_revoke;     /* Rebalance: on_revoke callback */
        int callback_crashed;
        PyThreadState *thread_state;
} Consumer;

extern PyTypeObject ConsumerType;
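The structs above back the Python-visible objects: the Message fields
(topic, value, key, error, partition, offset) surface as accessor methods
on the objects returned by Consumer.poll(). A rough Python-side sketch of
that mapping, assuming a broker at localhost:9092 and the 'test' topic
(method names taken from the examples and tests in this commit):

    from confluent_kafka import Consumer, KafkaError

    c = Consumer(**{'bootstrap.servers': 'localhost:9092',  # assumed broker
                    'group.id': 'header-sketch'})
    c.subscribe(['test'])                                   # assumed existing topic

    msg = c.poll(timeout=5.0)                               # returns a Message or None
    if msg is not None:
        if msg.error() and msg.error().code() != KafkaError._PARTITION_EOF:
            print('error/event:', msg.error())
        else:
            # Each accessor corresponds to a field of the Message struct above
            print(msg.topic(), msg.partition(), msg.offset(), msg.key(), msg.value())
    c.close()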

‎docs/Makefile

+177
@@ -0,0 +1,177 @@
1+
# Makefile for Sphinx documentation
2+
#
3+
4+
# You can set these variables from the command line.
5+
SPHINXOPTS =
6+
SPHINXBUILD = sphinx-build
7+
PAPER =
8+
BUILDDIR = _build
9+
10+
# User-friendly check for sphinx-build
11+
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
12+
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
13+
endif
14+
15+
# Internal variables.
16+
PAPEROPT_a4 = -D latex_paper_size=a4
17+
PAPEROPT_letter = -D latex_paper_size=letter
18+
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
19+
# the i18n builder cannot share the environment and doctrees with the others
20+
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
21+
22+
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
23+
24+
help:
25+
@echo "Please use \`make <target>' where <target> is one of"
26+
@echo " html to make standalone HTML files"
27+
@echo " dirhtml to make HTML files named index.html in directories"
28+
@echo " singlehtml to make a single large HTML file"
29+
@echo " pickle to make pickle files"
30+
@echo " json to make JSON files"
31+
@echo " htmlhelp to make HTML files and a HTML help project"
32+
@echo " qthelp to make HTML files and a qthelp project"
33+
@echo " devhelp to make HTML files and a Devhelp project"
34+
@echo " epub to make an epub"
35+
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
36+
@echo " latexpdf to make LaTeX files and run them through pdflatex"
37+
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
38+
@echo " text to make text files"
39+
@echo " man to make manual pages"
40+
@echo " texinfo to make Texinfo files"
41+
@echo " info to make Texinfo files and run them through makeinfo"
42+
@echo " gettext to make PO message catalogs"
43+
@echo " changes to make an overview of all changed/added/deprecated items"
44+
@echo " xml to make Docutils-native XML files"
45+
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
46+
@echo " linkcheck to check all external links for integrity"
47+
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
48+
49+
clean:
50+
rm -rf $(BUILDDIR)/*
51+
52+
html:
53+
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
54+
@echo
55+
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
56+
57+
dirhtml:
58+
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
59+
@echo
60+
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
61+
62+
singlehtml:
63+
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
64+
@echo
65+
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
66+
67+
pickle:
68+
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
69+
@echo
70+
@echo "Build finished; now you can process the pickle files."
71+
72+
json:
73+
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
74+
@echo
75+
@echo "Build finished; now you can process the JSON files."
76+
77+
htmlhelp:
78+
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
79+
@echo
80+
@echo "Build finished; now you can run HTML Help Workshop with the" \
81+
".hhp project file in $(BUILDDIR)/htmlhelp."
82+
83+
qthelp:
84+
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
85+
@echo
86+
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
87+
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
88+
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/confluent-kafka.qhcp"
89+
@echo "To view the help file:"
90+
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/confluent-kafka.qhc"
91+
92+
devhelp:
93+
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
94+
@echo
95+
@echo "Build finished."
96+
@echo "To view the help file:"
97+
@echo "# mkdir -p $$HOME/.local/share/devhelp/confluent-kafka"
98+
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/confluent-kafka"
99+
@echo "# devhelp"
100+
101+
epub:
102+
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
103+
@echo
104+
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
105+
106+
latex:
107+
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
108+
@echo
109+
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
110+
@echo "Run \`make' in that directory to run these through (pdf)latex" \
111+
"(use \`make latexpdf' here to do that automatically)."
112+
113+
latexpdf:
114+
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
115+
@echo "Running LaTeX files through pdflatex..."
116+
$(MAKE) -C $(BUILDDIR)/latex all-pdf
117+
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
118+
119+
latexpdfja:
120+
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
121+
@echo "Running LaTeX files through platex and dvipdfmx..."
122+
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
123+
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
124+
125+
text:
126+
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
127+
@echo
128+
@echo "Build finished. The text files are in $(BUILDDIR)/text."
129+
130+
man:
131+
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
132+
@echo
133+
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
134+
135+
texinfo:
136+
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
137+
@echo
138+
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
139+
@echo "Run \`make' in that directory to run these through makeinfo" \
140+
"(use \`make info' here to do that automatically)."
141+
142+
info:
143+
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
144+
@echo "Running Texinfo files through makeinfo..."
145+
make -C $(BUILDDIR)/texinfo info
146+
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
147+
148+
gettext:
149+
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
150+
@echo
151+
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
152+
153+
changes:
154+
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
155+
@echo
156+
@echo "The overview file is in $(BUILDDIR)/changes."
157+
158+
linkcheck:
159+
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
160+
@echo
161+
@echo "Link check complete; look for any errors in the above output " \
162+
"or in $(BUILDDIR)/linkcheck/output.txt."
163+
164+
doctest:
165+
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
166+
@echo "Testing of doctests in the sources finished, look at the " \
167+
"results in $(BUILDDIR)/doctest/output.txt."
168+
169+
xml:
170+
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
171+
@echo
172+
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
173+
174+
pseudoxml:
175+
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
176+
@echo
177+
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

‎docs/conf.py

+262
@@ -0,0 +1,262 @@
1+
# -*- coding: utf-8 -*-
2+
#
3+
# confluent-kafka documentation build configuration file, created by
4+
# sphinx-quickstart on Mon Feb 1 15:12:01 2016.
5+
#
6+
# This file is execfile()d with the current directory set to its
7+
# containing dir.
8+
#
9+
# Note that not all possible configuration values are present in this
10+
# autogenerated file.
11+
#
12+
# All configuration values have a default; values that are commented out
13+
# serve to show the default.
14+
15+
import sys
16+
import os
17+
from glob import glob
18+
19+
# If extensions (or modules to document with autodoc) are in another directory,
20+
# add these directories to sys.path here. If the directory is relative to the
21+
# documentation root, use os.path.abspath to make it absolute, like shown here.
22+
sys.path[:0] = [os.path.abspath(x) for x in glob('../build/lib.*')]
23+
24+
# -- General configuration ------------------------------------------------
25+
26+
# If your documentation needs a minimal Sphinx version, state it here.
27+
#needs_sphinx = '1.0'
28+
29+
# Add any Sphinx extension module names here, as strings. They can be
30+
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
31+
# ones.
32+
extensions = [
33+
'sphinx.ext.autodoc',
34+
'sphinx.ext.coverage',
35+
]
36+
37+
# Add any paths that contain templates here, relative to this directory.
38+
templates_path = ['_templates']
39+
40+
# The suffix of source filenames.
41+
source_suffix = '.rst'
42+
43+
# The encoding of source files.
44+
#source_encoding = 'utf-8-sig'
45+
46+
# The master toctree document.
47+
master_doc = 'index'
48+
49+
# General information about the project.
50+
project = u'confluent-kafka'
51+
copyright = u'2016, Confluent Inc.'
52+
53+
# The version info for the project you're documenting, acts as replacement for
54+
# |version| and |release|, also used in various other places throughout the
55+
# built documents.
56+
#
57+
# The short X.Y version.
58+
version = '0.9.1'
59+
# The full version, including alpha/beta/rc tags.
60+
release = '0.9.1'
61+
62+
# The language for content autogenerated by Sphinx. Refer to documentation
63+
# for a list of supported languages.
64+
#language = None
65+
66+
# There are two options for replacing |today|: either, you set today to some
67+
# non-false value, then it is used:
68+
#today = ''
69+
# Else, today_fmt is used as the format for a strftime call.
70+
#today_fmt = '%B %d, %Y'
71+
72+
# List of patterns, relative to source directory, that match files and
73+
# directories to ignore when looking for source files.
74+
exclude_patterns = ['_build']
75+
76+
# The reST default role (used for this markup: `text`) to use for all
77+
# documents.
78+
#default_role = None
79+
80+
# If true, '()' will be appended to :func: etc. cross-reference text.
81+
#add_function_parentheses = True
82+
83+
# If true, the current module name will be prepended to all description
84+
# unit titles (such as .. function::).
85+
#add_module_names = True
86+
87+
# If true, sectionauthor and moduleauthor directives will be shown in the
88+
# output. They are ignored by default.
89+
#show_authors = False
90+
91+
# The name of the Pygments (syntax highlighting) style to use.
92+
pygments_style = 'sphinx'
93+
94+
# A list of ignored prefixes for module index sorting.
95+
#modindex_common_prefix = []
96+
97+
# If true, keep warnings as "system message" paragraphs in the built documents.
98+
#keep_warnings = False
99+
100+
101+
# -- Options for HTML output ----------------------------------------------
102+
103+
# The theme to use for HTML and HTML Help pages. See the documentation for
104+
# a list of builtin themes.
105+
html_theme = 'default'
106+
107+
# Theme options are theme-specific and customize the look and feel of a theme
108+
# further. For a list of options available for each theme, see the
109+
# documentation.
110+
#html_theme_options = {}
111+
112+
# Add any paths that contain custom themes here, relative to this directory.
113+
#html_theme_path = []
114+
115+
# The name for this set of Sphinx documents. If None, it defaults to
116+
# "<project> v<release> documentation".
117+
#html_title = None
118+
119+
# A shorter title for the navigation bar. Default is the same as html_title.
120+
#html_short_title = None
121+
122+
# The name of an image file (relative to this directory) to place at the top
123+
# of the sidebar.
124+
#html_logo = None
125+
126+
# The name of an image file (within the static path) to use as favicon of the
127+
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
128+
# pixels large.
129+
#html_favicon = None
130+
131+
# Add any paths that contain custom static files (such as style sheets) here,
132+
# relative to this directory. They are copied after the builtin static files,
133+
# so a file named "default.css" will overwrite the builtin "default.css".
134+
html_static_path = ['_static']
135+
136+
# Add any extra paths that contain custom files (such as robots.txt or
137+
# .htaccess) here, relative to this directory. These files are copied
138+
# directly to the root of the documentation.
139+
#html_extra_path = []
140+
141+
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
142+
# using the given strftime format.
143+
#html_last_updated_fmt = '%b %d, %Y'
144+
145+
# If true, SmartyPants will be used to convert quotes and dashes to
146+
# typographically correct entities.
147+
#html_use_smartypants = True
148+
149+
# Custom sidebar templates, maps document names to template names.
150+
#html_sidebars = {}
151+
152+
# Additional templates that should be rendered to pages, maps page names to
153+
# template names.
154+
#html_additional_pages = {}
155+
156+
# If false, no module index is generated.
157+
#html_domain_indices = True
158+
159+
# If false, no index is generated.
160+
#html_use_index = True
161+
162+
# If true, the index is split into individual pages for each letter.
163+
#html_split_index = False
164+
165+
# If true, links to the reST sources are added to the pages.
166+
#html_show_sourcelink = True
167+
168+
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
169+
#html_show_sphinx = True
170+
171+
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
172+
#html_show_copyright = True
173+
174+
# If true, an OpenSearch description file will be output, and all pages will
175+
# contain a <link> tag referring to it. The value of this option must be the
176+
# base URL from which the finished HTML is served.
177+
#html_use_opensearch = ''
178+
179+
# This is the file name suffix for HTML files (e.g. ".xhtml").
180+
#html_file_suffix = None
181+
182+
# Output file base name for HTML help builder.
183+
htmlhelp_basename = 'confluent-kafkadoc'
184+
185+
186+
# -- Options for LaTeX output ---------------------------------------------
187+
188+
latex_elements = {
189+
# The paper size ('letterpaper' or 'a4paper').
190+
#'papersize': 'letterpaper',
191+
192+
# The font size ('10pt', '11pt' or '12pt').
193+
#'pointsize': '10pt',
194+
195+
# Additional stuff for the LaTeX preamble.
196+
#'preamble': '',
197+
}
198+
199+
# Grouping the document tree into LaTeX files. List of tuples
200+
# (source start file, target name, title,
201+
# author, documentclass [howto, manual, or own class]).
202+
latex_documents = [
203+
('index', 'confluent-kafka.tex', u'confluent-kafka Documentation',
204+
u'Magnus Edenhill', 'manual'),
205+
]
206+
207+
# The name of an image file (relative to this directory) to place at the top of
208+
# the title page.
209+
#latex_logo = None
210+
211+
# For "manual" documents, if this is true, then toplevel headings are parts,
212+
# not chapters.
213+
#latex_use_parts = False
214+
215+
# If true, show page references after internal links.
216+
#latex_show_pagerefs = False
217+
218+
# If true, show URL addresses after external links.
219+
#latex_show_urls = False
220+
221+
# Documents to append as an appendix to all manuals.
222+
#latex_appendices = []
223+
224+
# If false, no module index is generated.
225+
#latex_domain_indices = True
226+
227+
228+
# -- Options for manual page output ---------------------------------------
229+
230+
# One entry per manual page. List of tuples
231+
# (source start file, name, description, authors, manual section).
232+
man_pages = [
233+
('index', 'confluent-kafka', u'confluent-kafka Documentation',
234+
[u'Magnus Edenhill'], 1)
235+
]
236+
237+
# If true, show URL addresses after external links.
238+
#man_show_urls = False
239+
240+
241+
# -- Options for Texinfo output -------------------------------------------
242+
243+
# Grouping the document tree into Texinfo files. List of tuples
244+
# (source start file, target name, title, author,
245+
# dir menu entry, description, category)
246+
texinfo_documents = [
247+
('index', 'confluent-kafka', u'confluent-kafka Documentation',
248+
u'Magnus Edenhill', 'confluent-kafka', 'One line description of project.',
249+
'Miscellaneous'),
250+
]
251+
252+
# Documents to append as an appendix to all manuals.
253+
#texinfo_appendices = []
254+
255+
# If false, no module index is generated.
256+
#texinfo_domain_indices = True
257+
258+
# How to display URL addresses: 'footnote', 'no', or 'inline'.
259+
#texinfo_show_urls = 'footnote'
260+
261+
# If true, do not generate a @detailmenu in the "Top" node's menu.
262+
#texinfo_no_detailmenu = False

‎docs/index.rst

+44
Welcome to Confluent's Apache Kafka Python client documentation
===============================================================

Indices and tables
==================

* :ref:`genindex`

:mod:`confluent_kafka` --- Confluent's Apache Kafka Python client
*****************************************************************

.. automodule:: confluent_kafka
   :synopsis: Confluent's Apache Kafka Python client.
   :members:


Configuration
=============
Configuration of producer and consumer instances is performed by
providing a dict of configuration properties to the instance constructor, e.g.::

  conf = {'bootstrap.servers': 'mybroker.com',
          'group.id': 'mygroup', 'session.timeout.ms': 6000,
          'default.topic.config': {'auto.offset.reset': 'smallest'}}
  consumer = confluent_kafka.Consumer(**conf)

The supported configuration values are dictated by the underlying
librdkafka C library. For the full range of configuration properties
please consult librdkafka's documentation:
https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md

The Python bindings also provide some additional configuration properties:

* ``default.topic.config``: value is a dict of topic-level configuration
  properties that are applied to all used topics for the instance.

* ``delivery_callback`` (**Producer**): value is a Python function reference
  that is called once for each produced message to indicate the final
  delivery result (success or failure).
  This property may also be set per-message by passing ``callback=somefunc``
  to the confluent_kafka.Producer.produce() function.
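For example, the per-message form of the delivery callback described above
can be exercised with a short sketch like the following (an illustrative
sketch, assuming a broker at localhost:9092 and a topic named 'test'; see
examples/producer.py in this commit for the complete version):

    from confluent_kafka import Producer

    def on_delivery(err, msg):
        # Final delivery result for one message: err is None/falsy on success
        if err:
            print('delivery failed: %s' % err)
        else:
            print('delivered to %s [%d] at offset %d'
                  % (msg.topic(), msg.partition(), msg.offset()))

    p = Producer(**{'bootstrap.servers': 'localhost:9092'})   # assumed broker
    p.produce('test', value='delivery callback demo', callback=on_delivery)
    p.poll(0)    # serve queued delivery callbacks
    p.flush()    # wait for all outstanding deliveries before exiting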

‎docs/make-doc.bat

+242
@@ -0,0 +1,242 @@
1+
@ECHO OFF
2+
3+
REM Command file for Sphinx documentation
4+
5+
if "%SPHINXBUILD%" == "" (
6+
set SPHINXBUILD=sphinx-build
7+
)
8+
set BUILDDIR=_build
9+
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
10+
set I18NSPHINXOPTS=%SPHINXOPTS% .
11+
if NOT "%PAPER%" == "" (
12+
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
13+
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
14+
)
15+
16+
if "%1" == "" goto help
17+
18+
if "%1" == "help" (
19+
:help
20+
echo.Please use `make ^<target^>` where ^<target^> is one of
21+
echo. html to make standalone HTML files
22+
echo. dirhtml to make HTML files named index.html in directories
23+
echo. singlehtml to make a single large HTML file
24+
echo. pickle to make pickle files
25+
echo. json to make JSON files
26+
echo. htmlhelp to make HTML files and a HTML help project
27+
echo. qthelp to make HTML files and a qthelp project
28+
echo. devhelp to make HTML files and a Devhelp project
29+
echo. epub to make an epub
30+
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
31+
echo. text to make text files
32+
echo. man to make manual pages
33+
echo. texinfo to make Texinfo files
34+
echo. gettext to make PO message catalogs
35+
echo. changes to make an overview over all changed/added/deprecated items
36+
echo. xml to make Docutils-native XML files
37+
echo. pseudoxml to make pseudoxml-XML files for display purposes
38+
echo. linkcheck to check all external links for integrity
39+
echo. doctest to run all doctests embedded in the documentation if enabled
40+
goto end
41+
)
42+
43+
if "%1" == "clean" (
44+
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
45+
del /q /s %BUILDDIR%\*
46+
goto end
47+
)
48+
49+
50+
%SPHINXBUILD% 2> nul
51+
if errorlevel 9009 (
52+
echo.
53+
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
54+
echo.installed, then set the SPHINXBUILD environment variable to point
55+
echo.to the full path of the 'sphinx-build' executable. Alternatively you
56+
echo.may add the Sphinx directory to PATH.
57+
echo.
58+
echo.If you don't have Sphinx installed, grab it from
59+
echo.http://sphinx-doc.org/
60+
exit /b 1
61+
)
62+
63+
if "%1" == "html" (
64+
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
65+
if errorlevel 1 exit /b 1
66+
echo.
67+
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
68+
goto end
69+
)
70+
71+
if "%1" == "dirhtml" (
72+
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
73+
if errorlevel 1 exit /b 1
74+
echo.
75+
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
76+
goto end
77+
)
78+
79+
if "%1" == "singlehtml" (
80+
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
81+
if errorlevel 1 exit /b 1
82+
echo.
83+
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
84+
goto end
85+
)
86+
87+
if "%1" == "pickle" (
88+
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
89+
if errorlevel 1 exit /b 1
90+
echo.
91+
echo.Build finished; now you can process the pickle files.
92+
goto end
93+
)
94+
95+
if "%1" == "json" (
96+
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
97+
if errorlevel 1 exit /b 1
98+
echo.
99+
echo.Build finished; now you can process the JSON files.
100+
goto end
101+
)
102+
103+
if "%1" == "htmlhelp" (
104+
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
105+
if errorlevel 1 exit /b 1
106+
echo.
107+
echo.Build finished; now you can run HTML Help Workshop with the ^
108+
.hhp project file in %BUILDDIR%/htmlhelp.
109+
goto end
110+
)
111+
112+
if "%1" == "qthelp" (
113+
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
114+
if errorlevel 1 exit /b 1
115+
echo.
116+
echo.Build finished; now you can run "qcollectiongenerator" with the ^
117+
.qhcp project file in %BUILDDIR%/qthelp, like this:
118+
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\confluent-kafka.qhcp
119+
echo.To view the help file:
120+
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\confluent-kafka.ghc
121+
goto end
122+
)
123+
124+
if "%1" == "devhelp" (
125+
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
126+
if errorlevel 1 exit /b 1
127+
echo.
128+
echo.Build finished.
129+
goto end
130+
)
131+
132+
if "%1" == "epub" (
133+
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
134+
if errorlevel 1 exit /b 1
135+
echo.
136+
echo.Build finished. The epub file is in %BUILDDIR%/epub.
137+
goto end
138+
)
139+
140+
if "%1" == "latex" (
141+
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
142+
if errorlevel 1 exit /b 1
143+
echo.
144+
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
145+
goto end
146+
)
147+
148+
if "%1" == "latexpdf" (
149+
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
150+
cd %BUILDDIR%/latex
151+
make all-pdf
152+
cd %BUILDDIR%/..
153+
echo.
154+
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
155+
goto end
156+
)
157+
158+
if "%1" == "latexpdfja" (
159+
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
160+
cd %BUILDDIR%/latex
161+
make all-pdf-ja
162+
cd %BUILDDIR%/..
163+
echo.
164+
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
165+
goto end
166+
)
167+
168+
if "%1" == "text" (
169+
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
170+
if errorlevel 1 exit /b 1
171+
echo.
172+
echo.Build finished. The text files are in %BUILDDIR%/text.
173+
goto end
174+
)
175+
176+
if "%1" == "man" (
177+
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
178+
if errorlevel 1 exit /b 1
179+
echo.
180+
echo.Build finished. The manual pages are in %BUILDDIR%/man.
181+
goto end
182+
)
183+
184+
if "%1" == "texinfo" (
185+
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
186+
if errorlevel 1 exit /b 1
187+
echo.
188+
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
189+
goto end
190+
)
191+
192+
if "%1" == "gettext" (
193+
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
194+
if errorlevel 1 exit /b 1
195+
echo.
196+
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
197+
goto end
198+
)
199+
200+
if "%1" == "changes" (
201+
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
202+
if errorlevel 1 exit /b 1
203+
echo.
204+
echo.The overview file is in %BUILDDIR%/changes.
205+
goto end
206+
)
207+
208+
if "%1" == "linkcheck" (
209+
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
210+
if errorlevel 1 exit /b 1
211+
echo.
212+
echo.Link check complete; look for any errors in the above output ^
213+
or in %BUILDDIR%/linkcheck/output.txt.
214+
goto end
215+
)
216+
217+
if "%1" == "doctest" (
218+
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
219+
if errorlevel 1 exit /b 1
220+
echo.
221+
echo.Testing of doctests in the sources finished, look at the ^
222+
results in %BUILDDIR%/doctest/output.txt.
223+
goto end
224+
)
225+
226+
if "%1" == "xml" (
227+
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
228+
if errorlevel 1 exit /b 1
229+
echo.
230+
echo.Build finished. The XML files are in %BUILDDIR%/xml.
231+
goto end
232+
)
233+
234+
if "%1" == "pseudoxml" (
235+
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
236+
if errorlevel 1 exit /b 1
237+
echo.
238+
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
239+
goto end
240+
)
241+
242+
:end

‎examples/consumer.py

+75
#!/usr/bin/env python
#
# Copyright 2016 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Example high-level Kafka 0.9 balanced Consumer
#

from confluent_kafka import Consumer, KafkaException, KafkaError
import sys

if __name__ == '__main__':
    if len(sys.argv) < 4:
        sys.stderr.write('Usage: %s <bootstrap-brokers> <group> <topic1> <topic2> ..\n' % sys.argv[0])
        sys.exit(1)

    broker = sys.argv[1]
    group = sys.argv[2]
    topics = sys.argv[3:]

    # Consumer configuration
    # See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
    conf = {'bootstrap.servers': broker, 'group.id': group, 'session.timeout.ms': 6000,
            'default.topic.config': {'auto.offset.reset': 'smallest'}}


    # Create Consumer instance
    c = Consumer(**conf)

    def print_assignment (consumer, partitions):
        print('Assignment:', partitions)

    # Subscribe to topics
    c.subscribe(topics, on_assign=print_assignment)

    # Read messages from Kafka, print to stdout
    try:
        while True:
            msg = c.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                # Error or event
                if msg.error().code() == KafkaError._PARTITION_EOF:
                    # End of partition event
                    sys.stderr.write('%% %s [%d] reached end at offset %d\n' %
                                     (msg.topic(), msg.partition(), msg.offset()))
                elif msg.error():
                    # Error
                    raise KafkaException(msg.error())
            else:
                # Proper message
                sys.stderr.write('%% %s [%d] at offset %d with key %s:\n' %
                                 (msg.topic(), msg.partition(), msg.offset(),
                                  str(msg.key())))
                print(msg.value())

    except KeyboardInterrupt:
        sys.stderr.write('%% Aborted by user\n')

    # Close down consumer to commit final offsets.
    c.close()
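A typical invocation of this example, assuming a broker at localhost:9092, a
consumer group of your choosing and the 'test' topic, would look like:

    ./examples/consumer.py localhost:9092 mygroup test

Press Ctrl-C to stop; the consumer is then closed so that final offsets are
committed.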

‎examples/producer.py

+70
#!/usr/bin/env python
#
# Copyright 2016 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Example Kafka Producer.
# Reads lines from stdin and sends to Kafka.
#

from confluent_kafka import Producer
import sys

if __name__ == '__main__':
    if len(sys.argv) != 3:
        sys.stderr.write('Usage: %s <bootstrap-brokers> <topic>\n' % sys.argv[0])
        sys.exit(1)

    broker = sys.argv[1]
    topic = sys.argv[2]

    # Producer configuration
    # See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
    conf = {'bootstrap.servers': broker}

    # Create Producer instance
    p = Producer(**conf)

    # Optional per-message delivery callback (triggered by poll() or flush())
    # when a message has been successfully delivered or permanently
    # failed delivery (after retries).
    def delivery_callback (err, msg):
        if err:
            sys.stderr.write('%% Message failed delivery: %s\n' % err)
        else:
            sys.stderr.write('%% Message delivered to %s [%d]\n' % \
                             (msg.topic(), msg.partition()))

    # Read lines from stdin, produce each line to Kafka
    for line in sys.stdin:
        try:
            # Produce line (without newline)
            p.produce(topic, line.rstrip(), callback=delivery_callback)

        except BufferError as e:
            sys.stderr.write('%% Local producer queue is full ' \
                             '(%d messages awaiting delivery): try again\n' %
                             len(p))

        # Serve delivery callback queue.
        # NOTE: Since produce() is an asynchronous API this poll() call
        #       will most likely not serve the delivery callback for the
        #       last produce()d message.
        p.poll(0)

    # Wait until all messages have been delivered
    sys.stderr.write('%% Waiting for %d deliveries\n' % len(p))
    p.flush()
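Since the example reads from stdin, it can be driven from a shell pipeline,
for instance (assuming a broker at localhost:9092 and the 'test' topic):

    echo "hello kafka" | ./examples/producer.py localhost:9092 test

Each input line becomes one message; delivery results are reported by the
callback above once poll() or flush() serves the callback queue.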

‎integration_test.py

+368
#!/usr/bin/env python
#
#
# Copyright 2016 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#


""" Test script for confluent_kafka module """

import confluent_kafka
import re
import time
import uuid
import sys

try:
    from progress.bar import Bar
    with_progress = True
except ImportError as e:
    with_progress = False

# Kafka bootstrap server(s)
bootstrap_servers = 'localhost'



class MyTestDr(object):
    """ Producer: Delivery report callback """

    def __init__(self, silent=False):
        super(MyTestDr, self).__init__()
        self.msgs_delivered = 0
        self.bytes_delivered = 0
        self.silent = silent

    @staticmethod
    def _delivery(err, msg, silent=False):
        if err:
            print('Message delivery failed (%s [%s]): %s' % \
                  (msg.topic(), str(msg.partition()), err))
            return 0
        else:
            if not silent:
                print('Message delivered to %s [%s] at offset [%s]: %s' % \
                      (msg.topic(), msg.partition(), msg.offset(), msg.value()))
            return 1

    def delivery(self, err, msg):
        if err:
            print('Message delivery failed (%s [%s]): %s' % \
                  (msg.topic(), str(msg.partition()), err))
            return
        elif not self.silent:
            print('Message delivered to %s [%s] at offset [%s]: %s' % \
                  (msg.topic(), msg.partition(), msg.offset(), msg.value()))
        self.msgs_delivered += 1
        self.bytes_delivered += len(msg)



def verify_producer():
    """ Verify basic Producer functionality """

    # Producer config
    conf = {'bootstrap.servers': bootstrap_servers,
            'default.topic.config':{'produce.offset.report': True}}

    # Create producer
    p = confluent_kafka.Producer(**conf)
    print('producer at %s' % p)

    # Produce some messages
    p.produce('test', 'Hello Python!')
    p.produce('test', key='Just a key')
    p.produce('test', partition=1, value='Strictly for partition 1',
              key='mykey')

    # Produce more messages, now with delivery report callbacks in various forms.
    mydr = MyTestDr()
    p.produce('test', value='This one has a dr callback',
              callback=mydr.delivery)
    p.produce('test', value='This one has a lambda',
              callback=lambda err, msg: MyTestDr._delivery(err, msg))
    p.produce('test', value='This one has neither')

    # Produce even more messages
    for i in range(0, 10):
        p.produce('test', value='Message #%d' % i, key=str(i),
                  callback=mydr.delivery)
        p.poll(0)

    print('Waiting for %d messages to be delivered' % len(p))

    # Block until all messages are delivered/failed
    p.flush()



def verify_producer_performance(with_dr_cb=True):
    """ Time how long it takes to produce and deliver X messages """
    conf = {'bootstrap.servers': bootstrap_servers}

    p = confluent_kafka.Producer(**conf)

    topic = 'test'
    msgcnt = 1000000
    msgsize = 100
    msg_pattern = 'test.py performance'
    msg_payload = (msg_pattern * int(msgsize / len(msg_pattern)))[0:msgsize]

    dr = MyTestDr(silent=True)

    t_produce_start = time.time()
    msgs_produced = 0
    msgs_backpressure = 0
    print('# producing %d messages to topic %s' % (msgcnt, topic))

    if with_progress:
        bar = Bar('Producing', max=msgcnt)
    else:
        bar = None

    for i in range(0, msgcnt):
        try:
            if with_dr_cb:
                p.produce('test', value=msg_payload, callback=dr.delivery)
            else:
                p.produce('test', value=msg_payload)
        except BufferError as e:
            # Local queue is full (slow broker connection?)
            msgs_backpressure += 1
            if bar is not None and (msgs_backpressure % 1000) == 0:
                bar.next(n=0)
            p.poll(0)
            continue

        if bar is not None and (msgs_produced % 5000) == 0:
            bar.next(n=5000)
        msgs_produced += 1
        p.poll(0)

    t_produce_spent = time.time() - t_produce_start

    bytecnt = msgs_produced * msgsize

    if bar is not None:
        bar.finish()

    print('# producing %d messages (%.2fMb) took %.3fs: %d msgs/s, %.2f Mb/s' % \
          (msgs_produced, bytecnt / (1024*1024), t_produce_spent,
           msgs_produced / t_produce_spent,
           (bytecnt/t_produce_spent) / (1024*1024)))
    print('# %d messages not produce()d due to backpressure (local queue full)' % msgs_backpressure)

    print('waiting for %d/%d deliveries' % (len(p), msgs_produced))
    # Wait for deliveries
    p.flush()
    t_delivery_spent = time.time() - t_produce_start


    print('# producing %d messages (%.2fMb) took %.3fs: %d msgs/s, %.2f Mb/s' % \
          (msgs_produced, bytecnt / (1024*1024), t_produce_spent,
           msgs_produced / t_produce_spent,
           (bytecnt/t_produce_spent) / (1024*1024)))

    # Fake numbers if not using a dr_cb
    if not with_dr_cb:
        print('# not using dr_cb')
        dr.msgs_delivered = msgs_produced
        dr.bytes_delivered = bytecnt

    print('# delivering %d messages (%.2fMb) took %.3fs: %d msgs/s, %.2f Mb/s' % \
          (dr.msgs_delivered, dr.bytes_delivered / (1024*1024), t_delivery_spent,
           dr.msgs_delivered / t_delivery_spent,
           (dr.bytes_delivered/t_delivery_spent) / (1024*1024)))
    print('# post-produce delivery wait took %.3fs' % \
          (t_delivery_spent - t_produce_spent))


def verify_consumer():
    """ Verify basic Consumer functionality """

    # Consumer config
    conf = {'bootstrap.servers': bootstrap_servers,
            'group.id': 'test.py',
            'session.timeout.ms': 6000,
            'enable.auto.commit': False,
            'default.topic.config': {
                'auto.offset.reset': 'earliest'
            }}

    # Create consumer
    c = confluent_kafka.Consumer(**conf)

    # Subscribe to a list of topics
    c.subscribe(["test"])

    max_msgcnt = 100
    msgcnt = 0

    while True:
        # Consume until EOF or error

        # Consume message (error()==0) or event (error()!=0)
        msg = c.poll()
        if msg is None:
            raise Exception('Got timeout from poll() without a timeout set: %s' % msg)

        if msg.error():
            if msg.error().code() == confluent_kafka.KafkaError._PARTITION_EOF:
                print('Reached end of %s [%d] at offset %d' % \
                      (msg.topic(), msg.partition(), msg.offset()))
                break
            else:
                print('Consumer error: %s: ignoring' % msg.error())
                break

        if False:
            print('%s[%d]@%d: key=%s, value=%s' % \
                  (msg.topic(), msg.partition(), msg.offset(),
                   msg.key(), msg.value()))

        if (msg.offset() % 5) == 0:
            # Async commit
            c.commit(msg, async=True)
        elif (msg.offset() % 4) == 0:
            c.commit(msg, async=False)

        msgcnt += 1
        if msgcnt >= max_msgcnt:
            print('max_msgcnt %d reached' % msgcnt)
            break


    # Close consumer
    c.close()


    # Start a new client and get the committed offsets
    c = confluent_kafka.Consumer(**conf)
    offsets = c.committed(list(map(lambda p: confluent_kafka.TopicPartition("test", p), range(0,3))))
    for tp in offsets:
        print(tp)

    c.close()




def verify_consumer_performance():
    """ Verify Consumer performance """

    conf = {'bootstrap.servers': bootstrap_servers,
            'group.id': uuid.uuid1(),
            'session.timeout.ms': 6000,
            'default.topic.config': {
                'auto.offset.reset': 'earliest'
            }}

    c = confluent_kafka.Consumer(**conf)

    def my_on_assign (consumer, partitions):
        print('on_assign:', len(partitions), 'partitions:')
        for p in partitions:
            print(' %s [%d] @ %d' % (p.topic, p.partition, p.offset))
        consumer.assign(partitions)

    def my_on_revoke (consumer, partitions):
        print('on_revoke:', len(partitions), 'partitions:')
        for p in partitions:
            print(' %s [%d] @ %d' % (p.topic, p.partition, p.offset))
        consumer.unassign()

    c.subscribe(["test"], on_assign=my_on_assign, on_revoke=my_on_revoke)

    max_msgcnt = 1000000
    bytecnt = 0
    msgcnt = 0

    print('Will now consume %d messages' % max_msgcnt)

    if with_progress:
        bar = Bar('Consuming', max=max_msgcnt,
                  suffix='%(index)d/%(max)d [%(eta_td)s]')
    else:
        bar = None

    while True:
        # Consume until EOF or error

        msg = c.poll(timeout=20.0)
        if msg is None:
            raise Exception('Stalled at %d/%d message, no new messages for 20s' %
                            (msgcnt, max_msgcnt))

        if msg.error():
            if msg.error().code() == confluent_kafka.KafkaError._PARTITION_EOF:
                # Reached EOF for a partition, ignore.
                continue
            else:
                raise confluent_kafka.KafkaException(msg.error())


        bytecnt += len(msg)
        msgcnt += 1

        if bar is not None and (msgcnt % 10000) == 0:
            bar.next(n=10000)

        if msgcnt == 1:
            t_first_msg = time.time()
        if msgcnt >= max_msgcnt:
            break

    if bar is not None:
        bar.finish()

    if msgcnt > 0:
        t_spent = time.time() - t_first_msg
        print('%d messages (%.2fMb) consumed in %.3fs: %d msgs/s, %.2f Mb/s' % \
              (msgcnt, bytecnt / (1024*1024), t_spent, msgcnt / t_spent,
               (bytecnt / t_spent) / (1024*1024)))

    print('closing consumer')
    c.close()



if __name__ == '__main__':

    if len(sys.argv) > 1:
        bootstrap_servers = sys.argv[1]

    print('Using confluent_kafka module version %s (0x%x)' % confluent_kafka.version())
    print('Using librdkafka version %s (0x%x)' % confluent_kafka.libversion())

    print('=' * 30, 'Verifying Producer', '=' * 30)
    verify_producer()

    print('=' * 30, 'Verifying Consumer', '=' * 30)
    verify_consumer()

    print('=' * 30, 'Verifying Producer performance (with dr_cb)', '=' * 30)
358+
verify_producer_performance(with_dr_cb=True)
359+
360+
print('=' * 30, 'Verifying Producer performance (without dr_cb)', '=' * 30)
361+
verify_producer_performance(with_dr_cb=False)
362+
363+
print('=' * 30, 'Verifying Consumer performance', '=' * 30)
364+
verify_consumer_performance()
365+
366+
print('=' * 30, 'Done', '=' * 30)
367+
368+
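The __main__ block above takes an optional broker list as its first argument, so the whole verification suite can be pointed at any cluster. A minimal invocation sketch (the script file name and the broker address here are illustrative, not taken from this diff):

    python integration_test.py mybroker:9092

With no argument the script falls back to the default bootstrap_servers value defined earlier in the file.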

‎setup.py

+19
@@ -0,0 +1,19 @@
1+
#!/usr/bin/env python
2+
3+
from setuptools import setup
4+
from distutils.core import Extension
5+
6+
7+
module = Extension('confluent_kafka',
8+
include_dirs = ['/usr/local/include'],
9+
libraries= ['rdkafka'],
10+
sources=['confluent_kafka.c', 'Producer.c', 'Consumer.c'])
11+
12+
setup (name='confluent-kafka',
13+
version='0.9.1',
14+
description='Confluent\'s Apache Kafka client for Python',
15+
author='Confluent Inc',
16+
author_email='support@confluent.io',
17+
url='https://github.com/confluentinc/confluent-kafka-python',
18+
ext_modules=[module])
19+
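The extension above links against librdkafka (libraries=['rdkafka'], with headers expected under /usr/local/include), so librdkafka must be installed before building. One way to build and install from this setup.py, shown as a sketch rather than the project's documented procedure:

    python setup.py build_ext --inplace   # builds confluent_kafka.so next to the sources
    python setup.py install               # or install into the current environment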

‎tests/README.md

+7
@@ -0,0 +1,7 @@
1+
Unit tests
2+
==========
3+
4+
Run from top-level directory with:
5+
6+
py.test
7+
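py.test also accepts individual test files and a verbosity flag, which is convenient when iterating on a single module; for example (path per the layout in this commit):

    py.test -v tests/test_Producer.py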

‎tests/test_Consumer.py

+57
@@ -0,0 +1,57 @@
1+
#!/usr/bin/env python
2+
3+
from confluent_kafka import Consumer, TopicPartition, KafkaError, KafkaException
4+
5+
6+
def test_basic_api():
7+
""" Basic API tests, these wont really do anything since there is no
8+
broker configured. """
9+
10+
try:
11+
kc = Consumer()
12+
except TypeError as e:
13+
assert str(e) == "expected configuration dict"
14+
15+
kc = Consumer({'group.id':'test', 'socket.timeout.ms':'100'})
16+
17+
kc.subscribe(["test"])
18+
kc.unsubscribe()
19+
20+
def dummy_assign_revoke (consumer, partitions):
21+
pass
22+
23+
kc.subscribe(["test"], on_assign=dummy_assign_revoke, on_revoke=dummy_assign_revoke)
24+
kc.unsubscribe()
25+
26+
msg = kc.poll(timeout=0.001)
27+
if msg is None:
28+
print('OK: poll() timeout')
29+
elif msg.error():
30+
print('OK: consumer error: %s' % msg.error().str())
31+
else:
32+
print('OK: consumed message')
33+
34+
partitions = list(map(lambda p: TopicPartition("test", p), range(0,100,3)))
35+
kc.assign(partitions)
36+
37+
kc.unassign()
38+
39+
kc.commit(async=True)
40+
41+
try:
42+
kc.commit(async=False)
43+
except KafkaException as e:
44+
assert e.args[0].code() in (KafkaError._TIMED_OUT, KafkaError._WAIT_COORD)
45+
46+
# Get current position, should all be invalid.
47+
kc.position(partitions)
48+
assert len([p for p in partitions if p.offset == -1001]) == len(partitions)
49+
50+
try:
51+
offsets = kc.committed(partitions, timeout=0.001)
52+
except KafkaException as e:
53+
assert e.args[0].code() == KafkaError._TIMED_OUT
54+
55+
56+
kc.close()
57+
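The -1001 checked above is librdkafka's RD_KAFKA_OFFSET_INVALID sentinel, which is also the default offset of a freshly created TopicPartition. A small sketch making the sentinel explicit (OFFSET_INVALID is a local name introduced here for illustration, not an attribute exported by the module):

    OFFSET_INVALID = -1001  # librdkafka's RD_KAFKA_OFFSET_INVALID
    kc.position(partitions)
    assert all(p.offset == OFFSET_INVALID for p in partitions)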

‎tests/test_Producer.py

+34
@@ -0,0 +1,34 @@
1+
#!/usr/bin/env python
2+
3+
from confluent_kafka import Producer, KafkaError, KafkaException
4+
5+
6+
def test_basic_api():
7+
""" Basic API tests, these wont really do anything since there is no
8+
broker configured. """
9+
10+
try:
11+
p = Producer()
12+
except TypeError as e:
13+
assert str(e) == "expected configuration dict"
14+
15+
16+
p = Producer({'socket.timeout.ms':10,
17+
'default.topic.config': {'message.timeout.ms': 10}})
18+
19+
p.produce('mytopic')
20+
p.produce('mytopic', value='somedata', key='a key')
21+
22+
def on_delivery(err,msg):
23+
print('delivery', err, msg)
24+
# Since there is no broker, produced messages should time out.
25+
assert err.code() == KafkaError._MSG_TIMED_OUT
26+
27+
p.produce(topic='another_topic', value='testing', partition=9,
28+
callback=on_delivery)
29+
30+
p.poll(0.001)
31+
32+
p.flush()
33+
34+
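The test above relies on the callback's assertion firing once the 10 ms message timeout expires. For reference, a small sketch that instead collects delivery reports into a list, using only the produce()/poll()/flush() calls exercised in this commit (topic name and timeouts are illustrative):

    from confluent_kafka import Producer

    results = []
    p = Producer({'socket.timeout.ms': 10,
                  'default.topic.config': {'message.timeout.ms': 10}})
    p.produce('mytopic', value='somedata',
              callback=lambda err, msg: results.append((err, msg)))
    p.flush()  # serves delivery callbacks until the local queue drains
    # with no broker configured, each collected err should be a timeout error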

‎tests/test_TopicPartition.py

+38
@@ -0,0 +1,38 @@
1+
#!/usr/bin/env python
2+
3+
from confluent_kafka import TopicPartition
4+
5+
def test_sort():
6+
""" TopicPartition sorting (rich comparator) """
7+
8+
# sorting uses the comparator
9+
correct = [TopicPartition('topic1', 3),
10+
TopicPartition('topic3', 0),
11+
TopicPartition('topicA', 5),
12+
TopicPartition('topicA', 5)]
13+
14+
tps = sorted([TopicPartition('topicA', 5),
15+
TopicPartition('topic3', 0),
16+
TopicPartition('topicA', 5),
17+
TopicPartition('topic1', 3)])
18+
19+
assert correct == tps
20+
21+
22+
def test_cmp():
23+
""" TopicPartition comparator """
24+
25+
assert TopicPartition('aa', 19002) > TopicPartition('aa', 0)
26+
assert TopicPartition('aa', 13) >= TopicPartition('aa', 12)
27+
assert TopicPartition('BaB', 9) != TopicPartition('Card', 9)
28+
assert TopicPartition('b3x', 4) == TopicPartition('b3x', 4)
29+
assert TopicPartition('ulv', 2) < TopicPartition('xy', 0)
30+
assert TopicPartition('ulv', 2) <= TopicPartition('ulv', 3)
31+
32+
33+
def test_hash():
34+
35+
tp1 = TopicPartition('test', 99)
36+
tp2 = TopicPartition('somethingelse', 12)
37+
assert hash(tp1) != hash(tp2)
38+

‎tests/test_docs.py

+28
@@ -0,0 +1,28 @@
1+
#!/usr/bin/env python
2+
3+
import confluent_kafka
4+
import re
5+
6+
7+
def test_verify_docs():
8+
""" Make sure all exported functions, classes, etc, have proper docstrings
9+
"""
10+
fails = 0
11+
12+
for n in dir(confluent_kafka):
13+
if n[0:2] == '__':
14+
# Skip internals
15+
continue
16+
17+
o = confluent_kafka.__dict__.get(n)
18+
d = o.__doc__
19+
if not d:
20+
print('Missing __doc__ for: %s (type %s)' % (n, type(o)))
21+
fails += 1
22+
elif not re.search(r':', d):
23+
print('Missing Doxygen tag for: %s (type %s)' % (n, type(o)))
24+
fails += 1
25+
26+
assert fails == 0
27+
28+

‎tests/test_enums.py

+9
@@ -0,0 +1,9 @@
1+
#!/usr/bin/env python
2+
3+
import confluent_kafka
4+
5+
def test_enums():
6+
""" Make sure librdkafka error enums are reachable directly from the
7+
KafkaError class without an instantiated object. """
8+
print(confluent_kafka.KafkaError._NO_OFFSET)
9+
print(confluent_kafka.KafkaError.REBALANCE_IN_PROGRESS)

‎tests/test_misc.py

+16
@@ -0,0 +1,16 @@
1+
#!/usr/bin/env python
2+
3+
import confluent_kafka
4+
5+
6+
def test_version():
7+
print('Using confluent_kafka module version %s (0x%x)' % confluent_kafka.version())
8+
sver, iver = confluent_kafka.version()
9+
assert len(sver) > 0
10+
assert iver > 0
11+
12+
print('Using librdkafka version %s (0x%x)' % confluent_kafka.libversion())
13+
sver, iver = confluent_kafka.libversion()
14+
assert len(sver) > 0
15+
assert iver > 0
16+
