🔖 Release 2.8.905 #138

Merged (1 commit, Aug 4, 2024)
14 changes: 14 additions & 0 deletions CHANGES.rst
@@ -1,3 +1,17 @@
2.8.905 (2024-08-04)
====================

- Fixed a wrongful upgrade attempt to QUIC when using a SOCKS proxy. As documented, any usage of a proxy disables HTTP/3 over QUIC
  until proper support is implemented in a future minor version.
- Backported upstream urllib3 #3434: util/ssl: make code resilient to missing hash functions.
  In certain environments, such as a FIPS-enabled system, algorithms like md5 may be unavailable. Because importing
  such an algorithm fails at module load time on those systems, urllib3(-future) would crash and be unusable.
https://github.com/urllib3/urllib3/pull/3434
- Backported upstream urllib3 GHSA-34jh-p97f-mpxf: Strip Proxy-Authorization header on redirects.
Added the ``Proxy-Authorization`` header to the list of headers to strip from requests when redirecting to a different host.
As before, different headers can be set via ``Retry.remove_headers_on_redirect``.
- Fixed a state-machine desync in a rare scenario when uploading a body using HTTP/3 over QUIC.

2.8.904 (2024-07-18)
====================

25 changes: 14 additions & 11 deletions README.md
@@ -141,26 +141,29 @@ We agree that this solution isn't perfect and actually put a lot of pressure on

Here are some of the reasons (not exhaustive) we choose to work this way:

> A) Some major companies may not be able to touch the production code but can "change/swap" dependencies.

> B) urllib3-future's main purpose is to fuel Niquests, which is itself a drop-in replacement for Requests.
And more than 100 commonly used packages plug into Requests while their code invokes urllib3 directly.
So... We cannot fork those 100+ projects to patch their urllib3 usage; it is impossible at the moment, given our means.
Requests trapped us, and there should be a way to escape it without the nonsense of migrating to another HTTP client that reinvents
basic things and interactions.

> C) We don't have to reinvent the wheel.

> D) Some of our partners noticed that HTTP/1 is starting to be disabled by some web services in favor of HTTP/2+.
So, this fork can unblock them at (almost) zero cost.

- **OK... then what do I gain from this?**

1. It is faster than its counterpart; we measured gains of up to 2X in a multithreaded environment using an HTTP/2 endpoint.
2. It works well with gevent and does not conflict. We do not use the standard queue class from the stdlib as it does not fit HTTP/2+ constraints.
3. It leverages recent protocols like HTTP/2 and HTTP/3 transparently. Code and behavior do not change one bit.
4. You do not depend on the standard library to emit HTTP/1 requests, and that is actually good news. http.client
has numerous known flaws that cannot be fixed as we speak (e.g. urllib3 is based on http.client).
5. There are a ton of other improvements you may leverage, but for that you will need to migrate to Niquests or update your code
to enable specific capabilities, like but not limited to: "DNS over QUIC, HTTP" / "Happy Eyeballs" / "Native Asyncio" / "Advanced Multiplexing".
6. Non-blocking IO with concurrent streams/requests. And yes, transparently.

- **Is this funded?**

1 change: 1 addition & 0 deletions dev-requirements.txt
@@ -1,5 +1,6 @@
coverage>=7.2.7,<=7.4.1
tornado>=6.2,<=6.4
# 2.5 seems to break test_proxy_rejection by hanging forever
python-socks==2.4.4
pytest==7.4.4
pytest-timeout==2.3.1
1 change: 1 addition & 0 deletions pyproject.toml
@@ -126,6 +126,7 @@ filterwarnings = [
'''ignore:A plugin raised an exception during''',
'''ignore:Exception ignored in:pytest.PytestUnraisableExceptionWarning''',
'''ignore:Exception in thread:pytest.PytestUnhandledThreadExceptionWarning''',
'''ignore:The `hash` argument is deprecated in favor of `unsafe_hash`:DeprecationWarning''',
]

[tool.isort]
2 changes: 1 addition & 1 deletion src/urllib3/_version.py
@@ -1,4 +1,4 @@
# This file is protected via CODEOWNERS
from __future__ import annotations

__version__ = "2.8.904"
__version__ = "2.8.905"
15 changes: 14 additions & 1 deletion src/urllib3/backend/_async/hface.py
@@ -825,7 +825,12 @@ def putheader(self, header: str, *values: str) -> None:
# We assume it is passed as-is (meaning 'keep-alive' lower-cased)
# It may(should) break the connection.
if not support_te_chunked:
if encoded_header in {b"transfer-encoding", b"connection"}:
if encoded_header in {
b"transfer-encoding",
b"connection",
b"upgrade",
b"keep-alive",
}:
return

if self.__expected_body_length is None and encoded_header == b"content-length":
@@ -1106,6 +1111,14 @@ async def send(  # type: ignore[override]

raise EarlyResponse(promise=rp)

while True:
data_out = self._protocol.bytes_to_send()

if not data_out:
break

await self.sock.sendall(data_out)

if self.__remaining_body_length:
self.__remaining_body_length -= len(data)

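The `putheader` hunks above extend an existing filter: when the negotiated protocol does not support chunked transfer encoding (HTTP/2 and HTTP/3), hop-by-hop headers must not be forwarded, and this release adds `upgrade` and `keep-alive` to the blocked set. A minimal sketch of the idea, not the library's exact code path (`filter_hop_by_hop` is a hypothetical helper):

```python
# Connection-specific (hop-by-hop) headers are forbidden in HTTP/2+;
# forwarding them may(should) break the connection, hence the filter.
HOP_BY_HOP = {b"transfer-encoding", b"connection", b"upgrade", b"keep-alive"}

def filter_hop_by_hop(headers: list[tuple[bytes, bytes]]) -> list[tuple[bytes, bytes]]:
    """Drop hop-by-hop headers, keeping only end-to-end ones."""
    return [(name, value) for name, value in headers if name.lower() not in HOP_BY_HOP]
```

Silently dropping the header (rather than raising) mirrors the diff's early `return`, which keeps callers that blindly forward HTTP/1-style headers working.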
15 changes: 14 additions & 1 deletion src/urllib3/backend/hface.py
@@ -891,7 +891,12 @@ def putheader(self, header: str, *values: str) -> None:
# We assume it is passed as-is (meaning 'keep-alive' lower-cased)
# It may(should) break the connection.
if not support_te_chunked:
if encoded_header in {b"transfer-encoding", b"connection"}:
if encoded_header in {
b"transfer-encoding",
b"connection",
b"upgrade",
b"keep-alive",
}:
return

if self.__expected_body_length is None and encoded_header == b"content-length":
@@ -1178,6 +1183,14 @@ def send(

raise EarlyResponse(promise=rp)

while True:
data_out = self._protocol.bytes_to_send()

if not data_out:
break

self.sock.sendall(data_out)

if self.__remaining_body_length:
self.__remaining_body_length -= len(data)

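The `while True` loop added to both `send` implementations drains every pending byte from the HTTP state machine before accounting for the body chunk, which is what fixes the HTTP/3 desync mentioned in the changelog. A standalone sketch of the pattern, assuming only the `bytes_to_send()` / `sendall()` interface visible in the diff:

```python
def drain_protocol(protocol, sock) -> int:
    """Flush all pending protocol bytes to the socket; return bytes written.

    The state machine may buffer frames (e.g. QUIC packets) that must hit
    the wire before the next body chunk is fed in, otherwise the local and
    remote state machines drift apart.
    """
    total = 0
    while True:
        data_out = protocol.bytes_to_send()
        if not data_out:  # empty buffer: everything is on the wire
            break
        sock.sendall(data_out)
        total += len(data_out)
    return total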
26 changes: 18 additions & 8 deletions src/urllib3/contrib/socks.py
@@ -1,7 +1,7 @@
"""
This module contains provisional support for SOCKS proxies from within
urllib3. This module supports SOCKS4, SOCKS4A (an extension of SOCKS4), and
SOCKS5. To enable its functionality, either install PySocks or install this
SOCKS5. To enable its functionality, either install python-socks or install this
module with the ``socks`` extra.

The SOCKS implementation supports the full range of urllib3 features. It also
@@ -40,14 +40,13 @@

from __future__ import annotations


try:
from python_socks import ( # type: ignore[import-untyped]
ProxyConnectionError,
ProxyError,
ProxyTimeoutError,
ProxyType,
)
from python_socks.sync import Proxy # type: ignore[import-untyped]

from ._socks_override import AsyncioProxy
Expand Down Expand Up @@ -89,6 +88,7 @@
from .._async.connectionpool import AsyncHTTPConnectionPool, AsyncHTTPSConnectionPool
from .._async.poolmanager import AsyncPoolManager
from .._typing import _TYPE_SOCKS_OPTIONS
from ..backend import HttpVersion

# synchronous part
from ..connection import HTTPConnection, HTTPSConnection
@@ -257,6 +257,11 @@ def __init__(
}
connection_pool_kw["_socks_options"] = socks_options

if "disabled_svn" not in connection_pool_kw:
connection_pool_kw["disabled_svn"] = set()

connection_pool_kw["disabled_svn"].add(HttpVersion.h3)

super().__init__(num_pools, headers, **connection_pool_kw)

self.pool_classes_by_scheme = SOCKSProxyManager.pool_classes_by_scheme
@@ -415,6 +420,11 @@ def __init__(
}
connection_pool_kw["_socks_options"] = socks_options

if "disabled_svn" not in connection_pool_kw:
connection_pool_kw["disabled_svn"] = set()

connection_pool_kw["disabled_svn"].add(HttpVersion.h3)

super().__init__(num_pools, headers, **connection_pool_kw)

self.pool_classes_by_scheme = AsyncSOCKSProxyManager.pool_classes_by_scheme
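Both SOCKS proxy managers now force HTTP/3 into the pool's disabled protocol versions, implementing the changelog's "any usage of a proxy disables HTTP/3 over QUIC". A sketch of that opt-out logic; the `HttpVersion` enum here is a hypothetical stand-in for `urllib3.backend.HttpVersion`:

```python
from enum import Enum

class HttpVersion(Enum):  # stand-in for urllib3.backend.HttpVersion
    h11 = "HTTP/1.1"
    h2 = "HTTP/2"
    h3 = "HTTP/3"

def disable_h3(connection_pool_kw: dict) -> dict:
    """Mirror of the diff's logic: ensure 'disabled_svn' exists, add h3."""
    if "disabled_svn" not in connection_pool_kw:
        connection_pool_kw["disabled_svn"] = set()
    # QUIC cannot be tunneled through the proxy yet, so h3 is opted out
    # without clobbering any versions the caller already disabled.
    connection_pool_kw["disabled_svn"].add(HttpVersion.h3)
    return connection_pool_kw
```

Adding to an existing set, rather than assigning a fresh one, preserves caller-supplied entries such as an already-disabled HTTP/2.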
5 changes: 5 additions & 0 deletions src/urllib3/poolmanager.py
@@ -935,6 +935,11 @@ def __init__(
connection_pool_kw["_proxy_headers"] = self.proxy_headers
connection_pool_kw["_proxy_config"] = self.proxy_config

if "disabled_svn" not in connection_pool_kw:
connection_pool_kw["disabled_svn"] = set()

connection_pool_kw["disabled_svn"].add(HttpVersion.h3)

super().__init__(num_pools, headers, **connection_pool_kw)

def connection_from_host(
4 changes: 3 additions & 1 deletion src/urllib3/util/retry.py
@@ -190,7 +190,9 @@ class Retry:
RETRY_AFTER_STATUS_CODES = frozenset([413, 429, 503])

#: Default headers to be used for ``remove_headers_on_redirect``
DEFAULT_REMOVE_HEADERS_ON_REDIRECT = frozenset(["Cookie", "Authorization"])
DEFAULT_REMOVE_HEADERS_ON_REDIRECT = frozenset(
["Cookie", "Authorization", "Proxy-Authorization"]
)

#: Default maximum backoff time.
DEFAULT_BACKOFF_MAX = 120
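The security backport above only changes the default removal set; the stripping itself still happens on cross-host redirects. A simplified sketch of that behavior, not the library's actual code path (`sanitize_on_redirect` is a hypothetical helper):

```python
# Default set after the GHSA-34jh-p97f-mpxf backport; overridable per the
# changelog via Retry.remove_headers_on_redirect.
DEFAULT_REMOVE_HEADERS_ON_REDIRECT = frozenset(
    ["Cookie", "Authorization", "Proxy-Authorization"]
)

def sanitize_on_redirect(headers: dict, host_changed: bool) -> dict:
    """Drop credential-bearing headers only when the redirect changes host."""
    if not host_changed:
        return dict(headers)
    strip = {h.lower() for h in DEFAULT_REMOVE_HEADERS_ON_REDIRECT}
    return {k: v for k, v in headers.items() if k.lower() not in strip}
```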
22 changes: 18 additions & 4 deletions src/urllib3/util/ssl_.py
@@ -1,5 +1,6 @@
from __future__ import annotations

import hashlib
import hmac
import io
import os
@@ -9,7 +10,6 @@
import typing
import warnings
from binascii import unhexlify
from hashlib import md5, sha1, sha256

from .._constant import MOZ_INTERMEDIATE_CIPHERS
from ..contrib.imcc import load_cert_chain as _ctx_load_cert_chain
@@ -27,7 +27,14 @@
IS_PYOPENSSL = False # kept for BC reason

# Maps the length of a digest to a possible hash function producing this digest
HASHFUNC_MAP = {32: md5, 40: sha1, 64: sha256}
HASHFUNC_MAP = {
length: getattr(hashlib, algorithm, None)
for length, algorithm in (
(32, "md5"),
(40, "sha1"),
(64, "sha256"),
)
}


def _compute_key_ctx_build(
@@ -238,10 +245,17 @@ def assert_fingerprint(cert: bytes | None, fingerprint: str) -> None:

fingerprint = fingerprint.replace(":", "").lower()
digest_length = len(fingerprint)
hashfunc = HASHFUNC_MAP.get(digest_length)
if not hashfunc:
if digest_length not in HASHFUNC_MAP:
raise SSLError(f"Fingerprint of invalid length: {fingerprint}")

hashfunc = HASHFUNC_MAP[digest_length]

if hashfunc is None:
raise SSLError(
f"Hash function implementation unavailable for fingerprint length: {digest_length}. "
"Hint: your OpenSSL build may not include it for compliance issues."
)

# We need encode() here for py32; works on py2 and p33.
fingerprint_bytes = unhexlify(fingerprint.encode())

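The two `ssl_.py` hunks work together: `getattr(hashlib, name, None)` turns a missing algorithm (e.g. md5 on a FIPS build) into a `None` entry instead of an `ImportError` at import time, and `assert_fingerprint` then reports the gap only when a fingerprint of that length is actually checked. A condensed sketch of the combined behavior (`hashfunc_for` is a hypothetical helper; the library raises `SSLError` where this uses `ValueError`):

```python
import hashlib

# Maps digest length to a hash function, or None if this build lacks it.
HASHFUNC_MAP = {
    length: getattr(hashlib, algorithm, None)
    for length, algorithm in ((32, "md5"), (40, "sha1"), (64, "sha256"))
}

def hashfunc_for(fingerprint: str):
    """Resolve the hash function for a hex fingerprint, failing lazily."""
    digest_length = len(fingerprint.replace(":", ""))
    if digest_length not in HASHFUNC_MAP:
        raise ValueError(f"Fingerprint of invalid length: {fingerprint}")
    hashfunc = HASHFUNC_MAP[digest_length]
    if hashfunc is None:
        # Only reached when someone pins, say, an md5 fingerprint on a
        # build that does not ship md5.
        raise ValueError(f"Hash function unavailable for length {digest_length}")
    return hashfunc
```

Deferring the failure this way is what makes urllib3(-future) importable on FIPS-enabled systems while keeping fingerprint pinning functional for the algorithms that remain.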