Trying to decode a string containing 100,000 `[` characters (and nothing else) crashes Python #1
Comments
Here's the end of the backtrace:
Note: this may be considered a security issue, but I didn't think delaying disclosure made any sense, since the JSONTestSuite tests are public and have been for months, and were discussed in the press more than once.
Comparing to ujson: it implements a maximum recursion depth as a protection against this kind of problem:
- https://github.com/esnme/ultrajson/blob/master/lib/ultrajson.h#L72-L75
- https://github.com/esnme/ultrajson/blob/master/lib/ultrajsondec.c#L682-L685
- https://github.com/esnme/ultrajson/blob/master/lib/ultrajsondec.c#L747-L751
This implements a maximum recursion depth when parsing arrays and objects, to prevent the parser from being crashed by input with a huge amount of nesting. The issue was discovered using Nicolas Seriot's JSONTestSuite: https://github.com/nst/JSONTestSuite The fix is directly inspired by the similar protection in ultrajson / ujson: https://github.com/esnme/ultrajson

Signed-off-by: Adam Williamson <[email protected]>
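For illustration only, here is a minimal Python sketch of the kind of depth guard described above. This is not the project's actual C implementation; the names (`MAX_DEPTH`, `TooDeep`, `parse_array`) are hypothetical, and the parser only handles nested arrays:

```python
# Minimal sketch of a recursion-depth guard in a recursive-descent parser.
# Hypothetical names; the real fix is in python-cjson's C code.

MAX_DEPTH = 500  # illustrative cap; real parsers pick their own limit


class TooDeep(ValueError):
    """Raised instead of overflowing the stack on deeply nested input."""


def parse_array(text, pos=0, depth=0):
    """Parse nested arrays like '[[],[[]]]'; return (value, end_position)."""
    if depth > MAX_DEPTH:
        # The guard: bail out with a normal error before the C stack
        # (or Python's recursion limit) is exhausted.
        raise TooDeep("maximum recursion depth exceeded while parsing")
    if pos >= len(text) or text[pos] != "[":
        raise ValueError("expected '['")
    pos += 1
    items = []
    while pos < len(text) and text[pos] != "]":
        value, pos = parse_array(text, pos, depth + 1)
        items.append(value)
        if pos < len(text) and text[pos] == ",":
            pos += 1
    if pos >= len(text):
        raise ValueError("unterminated array")
    return items, pos + 1  # skip the closing ']'
```

With this guard, the JSONTestSuite input of 100,000 `[` characters produces a catchable `TooDeep` error instead of a crash.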
#3 fixes this, I tested it.
Hi Adam, the issue is fixed in version 1.2.0. Thanks for tracking this.
This is one of the tests from JSONTestSuite. It just throws a file containing 100,000 `[` characters at the parser. When trying to parse this, python-cjson crashes Python. The backtrace is gigantic, and I haven't actually got to the end of it in gdb, but it starts like this:

It then repeats that, over and over and over again; I suspect there are going to be 100,000 of 'em before we get to the end.