diff --git a/CHANGES.rst b/CHANGES.rst index b2f0f4a2e6..cd05205056 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -1,3 +1,24 @@ +2.1.903 (2023-10-23) +==================== + +- Removed ``BaseHTTPConnection`` and ``BaseHTTPSConnection``. + Rationale: The initial idea, as far as I understand it, was to create one ``HTTPSConnection`` per protocol, e.g. + HTTP/2 and HTTP/3. From the point of view of ``urllib3.future`` this is already taken care of in ``contrib.hface``, + where the protocol state machines are handled. We plan to always expose a single unified ``Connection`` class that + covers all protocols for convenience. The private module ``urllib3._base_connection`` is renamed to ``urllib3._typing``. + This brings a welcome simplification. +- Reduced ``BaseHTTPResponse`` to a mere alias of ``HTTPResponse`` for the same reason as above. There is + no foreseeable need to ship urllib3.future with an alternative implementation of ``HTTPResponse``. + The alias will be removed in a future major. +- Removed ``RECENT_DATE`` and the associated logic: (i) maintaining it makes little sense, and (ii) the certificate verification + failure cannot be avoided anyway, so the warning merely precedes an unavoidable error. The warning class ``SystemTimeWarning`` + will be removed in a future major. +- Added support for stopping the body upload when the server responds early in HTTP/2 or HTTP/3. + This can happen, for example, when the server signals that a size limit was exceeded or that the previously sent + headers were rejected. This should save users a significant amount of time in those cases. +- Refactored the typing aliases scattered across the sources. ``urllib3._typing`` now contains all of our definitions. +- Avoided installing ``qh3`` on PyPy 3.11+ while pre-built wheels are unavailable. + 2.1.902 (2023-10-21) ==================== diff --git a/docs/reference/contrib/index.rst b/docs/reference/contrib/index.rst index e233241616..90101122bc 100644 --- a/docs/reference/contrib/index.rst +++ b/docs/reference/contrib/index.rst @@ -6,6 +6,4 @@ prime time or that require optional third-party dependencies. .. toctree:: - pyopenssl - securetransport socks diff --git a/docs/reference/contrib/pyopenssl.rst b/docs/reference/contrib/pyopenssl.rst deleted file mode 100644 index a7425b3ddf..0000000000 --- a/docs/reference/contrib/pyopenssl.rst +++ /dev/null @@ -1,10 +0,0 @@ -PyOpenSSL -========= -.. warning:: - DEPRECATED: This module is deprecated and will be removed in urllib3 v2.1.0. - Read more in this `issue `_. - -.. automodule:: urllib3.contrib.pyopenssl - :members: - :undoc-members: - :show-inheritance: diff --git a/docs/reference/contrib/securetransport.rst b/docs/reference/contrib/securetransport.rst deleted file mode 100644 index d4af1b8ba0..0000000000 --- a/docs/reference/contrib/securetransport.rst +++ /dev/null @@ -1,31 +0,0 @@ -macOS SecureTransport ===================== -.. warning:: - DEPRECATED: This module is deprecated and will be removed in urllib3 v2.1.0. - Read more in this `issue `_. - -`SecureTranport `_ -support for urllib3 via ctypes. - -This makes platform-native TLS available to urllib3 users on macOS without the -use of a compiler. This is an important feature because the Python Package -Index is moving to become a TLSv1.2-or-higher server, and the default OpenSSL -that ships with macOS is not capable of doing TLSv1.2. The only way to resolve -this is to give macOS users an alternative solution to the problem, and that -solution is to use SecureTransport.
- -We use ctypes here because this solution must not require a compiler. That's -because Pip is not allowed to require a compiler either. - -This code is a bastardised version of the code found in Will Bond's -`oscrypto `_ library. An enormous debt -is owed to him for blazing this trail for us. For that reason, this code -should be considered to be covered both by urllib3's license and by -`oscrypto's `_. - -To use this module, simply import and inject it: - -.. code-block:: python - - import urllib3.contrib.securetransport - urllib3.contrib.securetransport.inject_into_urllib3() diff --git a/docs/reference/urllib3.response.rst b/docs/reference/urllib3.response.rst index d00b8af65c..4f3e3f4529 100644 --- a/docs/reference/urllib3.response.rst +++ b/docs/reference/urllib3.response.rst @@ -4,11 +4,6 @@ Response and Decoders Response -------- -.. autoclass:: urllib3.response.BaseHTTPResponse - :members: - :undoc-members: - :show-inheritance: - .. autoclass:: urllib3.response.HTTPResponse :members: :undoc-members: diff --git a/docs/v2-migration-guide.rst b/docs/v2-migration-guide.rst index 06db4a0e18..0ad1381be0 100644 --- a/docs/v2-migration-guide.rst +++ b/docs/v2-migration-guide.rst @@ -43,7 +43,6 @@ What are the important changes? Here's a short summary of which changes in urllib3 v2.0 are most important: - Python version must be **3.7 or later** (previously supported Python 2.7, 3.5, and 3.6). -- Removed support for non-OpenSSL TLS libraries (like LibreSSL and wolfSSL). - Removed support for OpenSSL versions older than 1.1.1. - Removed support for Python implementations that aren't CPython or PyPy3 (previously supported Google App Engine, Jython). - Removed the ``urllib3.contrib.ntlmpool`` module. @@ -167,19 +166,6 @@ It's important to know that even if you don't upgrade all of your services to 2. immediately you will `receive security fixes on the 1.26.x release stream <#security-fixes-for-urllib3-v1-26-x>` for some time. -Security fixes for urllib3 v1.26.x -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Thanks to support from `Tidelift `_ -we're able to continue supporting the v1.26.x release stream with -security fixes for the foreseeable future 💖 - -However, upgrading is still recommended as **no new feature developments or non-critical -bug fixes will be shipped to the 1.26.x release stream**. - -If your organization relies on urllib3 and is interested in continuing support you can learn -more about the `Tidelift Subscription for Enterprise `_. - **🤔 Common upgrading issues** ------------------------------- @@ -341,11 +327,6 @@ on code using urllib3. This also means that for IDEs that support type hints you'll receive better suggestions from auto-complete. No more confusion with ``**kwargs``! -We've also added API interfaces like ``BaseHTTPResponse`` -and ``BaseHTTPConnection`` to ensure that when you're sub-classing -an interface you're only using supported public APIs to ensure -compatibility and minimize breakages down the road. - .. 
note:: If you're one of the rare few who is subclassing connections diff --git a/mypy-requirements.txt b/mypy-requirements.txt index 87b7a04be0..f5ef2203be 100644 --- a/mypy-requirements.txt +++ b/mypy-requirements.txt @@ -1,4 +1,5 @@ -mypy==1.4.1 +mypy==1.6.1; python_version >= '3.8' +mypy==1.4.1; python_version < '3.8' idna>=2.0.0 cryptography>=1.3.4 tornado>=6.1 diff --git a/notes/public-and-private-apis.md b/notes/public-and-private-apis.md index 6321696dbb..c4969811db 100644 --- a/notes/public-and-private-apis.md +++ b/notes/public-and-private-apis.md @@ -7,13 +7,13 @@ - `urllib3.ProxyManager` - `urllib3.HTTPConnectionPool` - `urllib3.HTTPSConnectionPool` -- `urllib3.BaseHTTPResponse` - `urllib3.HTTPResponse` - `urllib3.HTTPHeaderDict` - `urllib3.filepost` - `urllib3.fields` - `urllib3.exceptions` -- `urllib3.contrib.*` +- `urllib3.contrib.socks` +- `urllib3.contrib.pyopenssl` - `urllib3.util` Only public way to configure proxies is through `ProxyManager`? @@ -21,8 +21,6 @@ Only public way to configure proxies is through `ProxyManager`? ## Private APIs - `urllib3.connection` -- `urllib3.connection.BaseHTTPConnection` -- `urllib3.connection.BaseHTTPSConnection` - `urllib3.connection.HTTPConnection` - `urllib3.connection.HTTPSConnection` - `urllib3.util.*` (submodules) diff --git a/pyproject.toml b/pyproject.toml index 125b40c2aa..6074959089 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -37,7 +37,7 @@ classifiers = [ requires-python = ">=3.7" dynamic = ["version"] dependencies = [ - "qh3>=0.11.3,<1.0.0; (platform_system == 'Darwin' or platform_system == 'Windows' or platform_system == 'Linux') and (platform_python_implementation == 'CPython' or (platform_python_implementation == 'PyPy' and python_version >= '3.8'))", + "qh3>=0.11.3,<1.0.0; (platform_system == 'Darwin' or platform_system == 'Windows' or platform_system == 'Linux') and (platform_python_implementation == 'CPython' or (platform_python_implementation == 'PyPy' and python_version >= '3.8' and python_version < '3.11'))", "h11>=0.11.0,<1.0.0", "h2>=4.0.0,<5.0.0", ] diff --git a/src/urllib3/__init__.py b/src/urllib3/__init__.py index f0763f6e7f..9e434cade3 100644 --- a/src/urllib3/__init__.py +++ b/src/urllib3/__init__.py @@ -12,12 +12,12 @@ from os import environ from . 
import exceptions -from ._base_connection import _TYPE_BODY from ._collections import HTTPHeaderDict +from ._typing import _TYPE_BODY, _TYPE_FIELDS from ._version import __version__ from .backend import ConnectionInfo, HttpVersion from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool, connection_from_url -from .filepost import _TYPE_FIELDS, encode_multipart_formdata +from .filepost import encode_multipart_formdata from .poolmanager import PoolManager, ProxyManager, proxy_from_url from .response import BaseHTTPResponse, HTTPResponse from .util.request import make_headers diff --git a/src/urllib3/_base_connection.py b/src/urllib3/_base_connection.py deleted file mode 100644 index 3e2213825d..0000000000 --- a/src/urllib3/_base_connection.py +++ /dev/null @@ -1,184 +0,0 @@ -from __future__ import annotations - -import typing - -from .backend import ConnectionInfo, LowLevelResponse -from .util.connection import _TYPE_SOCKET_OPTIONS -from .util.timeout import _DEFAULT_TIMEOUT, _TYPE_TIMEOUT -from .util.url import Url - -_TYPE_BODY = typing.Union[ - bytes, - typing.IO[typing.Any], - typing.Iterable[bytes], - typing.Iterable[str], - str, - LowLevelResponse, -] - - -class ProxyConfig(typing.NamedTuple): - ssl_context: ssl.SSLContext | None - use_forwarding_for_https: bool - assert_hostname: None | str | Literal[False] - assert_fingerprint: str | None - - -class _ResponseOptions(typing.NamedTuple): - # TODO: Remove this in favor of a better - # HTTP request/response lifecycle tracking. - request_method: str - request_url: str - preload_content: bool - decode_content: bool - enforce_content_length: bool - - -if typing.TYPE_CHECKING: - import ssl - - from typing_extensions import Literal, Protocol - - from .response import BaseHTTPResponse - - class BaseHTTPConnection(Protocol): - scheme: typing.ClassVar[str] - default_port: typing.ClassVar[int] - default_socket_options: typing.ClassVar[_TYPE_SOCKET_OPTIONS] - - host: str - port: int | None - timeout: None | ( - float - ) # Instance doesn't store _DEFAULT_TIMEOUT, must be resolved. - blocksize: int - source_address: tuple[str, int] | None - socket_options: _TYPE_SOCKET_OPTIONS | None - - proxy: Url | None - proxy_config: ProxyConfig | None - - is_verified: bool - proxy_is_verified: bool | None - - conn_info: ConnectionInfo | None = None - - def __init__( - self, - host: str, - port: int | None = None, - *, - timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, - source_address: tuple[str, int] | None = None, - blocksize: int = 8192, - socket_options: _TYPE_SOCKET_OPTIONS | None = ..., - proxy: Url | None = None, - proxy_config: ProxyConfig | None = None, - ) -> None: - ... - - def set_tunnel( - self, - host: str, - port: int | None = None, - headers: typing.Mapping[str, str] | None = None, - scheme: str = "http", - ) -> None: - ... - - def connect(self) -> None: - ... - - def request( - self, - method: str, - url: str, - body: _TYPE_BODY | None = None, - headers: typing.Mapping[str, str] | None = None, - # We know *at least* botocore is depending on the order of the - # first 3 parameters so to be safe we only mark the later ones - # as keyword-only to ensure we have space to extend. - *, - chunked: bool = False, - preload_content: bool = True, - decode_content: bool = True, - enforce_content_length: bool = True, - ) -> None: - ... - - def getresponse(self) -> BaseHTTPResponse: - ... - - def close(self) -> None: - ... - - @property - def is_closed(self) -> bool: - """Whether the connection either is brand new or has been previously closed. 
- If this property is True then both ``is_connected`` and ``has_connected_to_proxy`` - properties must be False. - """ - - @property - def is_connected(self) -> bool: - """Whether the connection is actively connected to any origin (proxy or target)""" - - @property - def has_connected_to_proxy(self) -> bool: - """Whether the connection has successfully connected to its proxy. - This returns False if no proxy is in use. Used to determine whether - errors are coming from the proxy layer or from tunnelling to the target origin. - """ - - class BaseHTTPSConnection(BaseHTTPConnection, Protocol): - default_port: typing.ClassVar[int] - default_socket_options: typing.ClassVar[_TYPE_SOCKET_OPTIONS] - - # Certificate verification methods - cert_reqs: int | str | None - assert_hostname: None | str | Literal[False] - assert_fingerprint: str | None - ssl_context: ssl.SSLContext | None - - # Trusted CAs - ca_certs: str | None - ca_cert_dir: str | None - ca_cert_data: None | str | bytes - - # TLS version - ssl_minimum_version: int | None - ssl_maximum_version: int | None - ssl_version: int | str | None # Deprecated - - # Client certificates - cert_file: str | None - key_file: str | None - key_password: str | None - - def __init__( - self, - host: str, - port: int | None = None, - *, - timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, - source_address: tuple[str, int] | None = None, - blocksize: int = 8192, - socket_options: _TYPE_SOCKET_OPTIONS | None = ..., - proxy: Url | None = None, - proxy_config: ProxyConfig | None = None, - cert_reqs: int | str | None = None, - assert_hostname: None | str | Literal[False] = None, - assert_fingerprint: str | None = None, - server_hostname: str | None = None, - ssl_context: ssl.SSLContext | None = None, - ca_certs: str | None = None, - ca_cert_dir: str | None = None, - ca_cert_data: None | str | bytes = None, - ssl_minimum_version: int | None = None, - ssl_maximum_version: int | None = None, - ssl_version: int | str | None = None, # Deprecated - cert_file: str | None = None, - key_file: str | None = None, - key_password: str | None = None, - ) -> None: - ... diff --git a/src/urllib3/_request_methods.py b/src/urllib3/_request_methods.py index 1d0f3465ad..600bdcb52c 100644 --- a/src/urllib3/_request_methods.py +++ b/src/urllib3/_request_methods.py @@ -4,18 +4,13 @@ import typing from urllib.parse import urlencode -from ._base_connection import _TYPE_BODY from ._collections import HTTPHeaderDict -from .filepost import _TYPE_FIELDS, encode_multipart_formdata +from ._typing import _TYPE_BODY, _TYPE_ENCODE_URL_FIELDS, _TYPE_FIELDS +from .filepost import encode_multipart_formdata from .response import BaseHTTPResponse __all__ = ["RequestMethods"] -_TYPE_ENCODE_URL_FIELDS = typing.Union[ - typing.Sequence[typing.Tuple[str, typing.Union[str, bytes]]], - typing.Mapping[str, typing.Union[str, bytes]], -] - class RequestMethods: """ diff --git a/src/urllib3/_typing.py b/src/urllib3/_typing.py new file mode 100644 index 0000000000..0b0d088d82 --- /dev/null +++ b/src/urllib3/_typing.py @@ -0,0 +1,86 @@ +from __future__ import annotations + +import typing + +from .backend import LowLevelResponse +from .fields import RequestField +from .util.request import _TYPE_FAILEDTELL +from .util.timeout import _TYPE_DEFAULT, Timeout + +if typing.TYPE_CHECKING: + import ssl + + from typing_extensions import Literal, TypedDict + + class _TYPE_PEER_CERT_RET_DICT(TypedDict, total=False): + subjectAltName: tuple[tuple[str, str], ...] + subject: tuple[tuple[tuple[str, str], ...], ...] 
+ serialNumber: str + + +_TYPE_BODY: typing.TypeAlias = typing.Union[ + bytes, + typing.IO[typing.Any], + typing.Iterable[bytes], + typing.Iterable[str], + str, + LowLevelResponse, +] + +_TYPE_FIELD_VALUE: typing.TypeAlias = typing.Union[str, bytes] +_TYPE_FIELD_VALUE_TUPLE: typing.TypeAlias = typing.Union[ + _TYPE_FIELD_VALUE, + typing.Tuple[str, _TYPE_FIELD_VALUE], + typing.Tuple[str, _TYPE_FIELD_VALUE, str], +] + +_TYPE_FIELDS_SEQUENCE: typing.TypeAlias = typing.Sequence[ + typing.Union[typing.Tuple[str, _TYPE_FIELD_VALUE_TUPLE], RequestField] +] +_TYPE_FIELDS: typing.TypeAlias = typing.Union[ + _TYPE_FIELDS_SEQUENCE, + typing.Mapping[str, _TYPE_FIELD_VALUE_TUPLE], +] +_TYPE_ENCODE_URL_FIELDS: typing.TypeAlias = typing.Union[ + typing.Sequence[typing.Tuple[str, typing.Union[str, bytes]]], + typing.Mapping[str, typing.Union[str, bytes]], +] +_TYPE_SOCKET_OPTIONS: typing.TypeAlias = typing.Sequence[ + typing.Union[ + typing.Tuple[int, int, typing.Union[int, bytes]], + typing.Tuple[int, int, typing.Union[int, bytes], str], + ] +] +_TYPE_REDUCE_RESULT: typing.TypeAlias = typing.Tuple[ + typing.Callable[..., object], typing.Tuple[object, ...] +] + + +_TYPE_TIMEOUT: typing.TypeAlias = typing.Union[float, _TYPE_DEFAULT, Timeout, None] +_TYPE_TIMEOUT_INTERNAL: typing.TypeAlias = typing.Union[float, _TYPE_DEFAULT, None] +_TYPE_PEER_CERT_RET: typing.TypeAlias = typing.Union[ + "_TYPE_PEER_CERT_RET_DICT", bytes, None +] + +_TYPE_BODY_POSITION: typing.TypeAlias = typing.Union[int, _TYPE_FAILEDTELL] + +try: + from typing import TypedDict + + class _TYPE_SOCKS_OPTIONS(TypedDict): + socks_version: int + proxy_host: str | None + proxy_port: str | None + username: str | None + password: str | None + rdns: bool + +except ImportError: # Python 3.7 + _TYPE_SOCKS_OPTIONS = typing.Dict[str, typing.Any] # type: ignore[misc, assignment] + + +class ProxyConfig(typing.NamedTuple): + ssl_context: ssl.SSLContext | None + use_forwarding_for_https: bool + assert_hostname: None | str | Literal[False] + assert_fingerprint: str | None diff --git a/src/urllib3/_version.py b/src/urllib3/_version.py index 8b39682388..bd992feab3 100644 --- a/src/urllib3/_version.py +++ b/src/urllib3/_version.py @@ -1,4 +1,4 @@ # This file is protected via CODEOWNERS from __future__ import annotations -__version__ = "2.1.902" +__version__ = "2.1.903" diff --git a/src/urllib3/backend/_base.py b/src/urllib3/backend/_base.py index 22a45e7f42..a30e550726 100644 --- a/src/urllib3/backend/_base.py +++ b/src/urllib3/backend/_base.py @@ -6,9 +6,9 @@ if typing.TYPE_CHECKING: from ssl import SSLSocket, SSLContext, TLSVersion + from .._typing import _TYPE_SOCKET_OPTIONS from .._collections import HTTPHeaderDict -from ..util import connection class HttpVersion(str, enum.Enum): @@ -154,7 +154,7 @@ class BaseBackend: default_socket_kind: socket.SocketKind = socket.SOCK_STREAM #: Disable Nagle's algorithm by default. 
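# A minimal sketch of how the consolidated aliases from the new private
# ``urllib3._typing`` module (added in the hunk above) could be used to
# annotate downstream helpers. The helper name and its body are hypothetical;
# only the import path and the alias names come from this diff.
from __future__ import annotations

import typing

from urllib3.filepost import encode_multipart_formdata

if typing.TYPE_CHECKING:
    # Private aliases: import them only for static analysis.
    from urllib3._typing import _TYPE_BODY, _TYPE_FIELDS


def build_form_body(fields: _TYPE_FIELDS) -> tuple[_TYPE_BODY, str]:
    # Encode the fields as multipart/form-data and return (body, content_type).
    return encode_multipart_formdata(fields)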
- default_socket_options: typing.ClassVar[connection._TYPE_SOCKET_OPTIONS] = [ + default_socket_options: typing.ClassVar[_TYPE_SOCKET_OPTIONS] = [ (socket.IPPROTO_TCP, socket.TCP_NODELAY, 1, "tcp") ] @@ -175,8 +175,7 @@ def __init__( source_address: tuple[str, int] | None = None, blocksize: int = 8192, *, - socket_options: None - | (connection._TYPE_SOCKET_OPTIONS) = default_socket_options, + socket_options: None | _TYPE_SOCKET_OPTIONS = default_socket_options, disabled_svn: set[HttpVersion] | None = None, preemptive_quic_cache: QuicPreemptiveCacheType | None = None, ): diff --git a/src/urllib3/backend/hface.py b/src/urllib3/backend/hface.py index 636f2b3da9..137c9f9fd8 100644 --- a/src/urllib3/backend/hface.py +++ b/src/urllib3/backend/hface.py @@ -32,8 +32,8 @@ HeadersReceived, StreamResetReceived, ) -from ..exceptions import InvalidHeader, ProtocolError, SSLError -from ..util import connection, parse_alt_svc +from ..exceptions import EarlyResponse, InvalidHeader, ProtocolError, SSLError +from ..util import parse_alt_svc from ._base import ( BaseBackend, ConnectionInfo, @@ -42,6 +42,9 @@ QuicPreemptiveCacheType, ) +if typing.TYPE_CHECKING: + from .._typing import _TYPE_SOCKET_OPTIONS + _HAS_SYS_AUDIT = hasattr(sys, "audit") _HAS_QH3 = HTTPProtocolFactory.has(HTTP3Protocol) # type: ignore[type-abstract] @@ -58,7 +61,7 @@ def __init__( blocksize: int = 8192, *, socket_options: None - | (connection._TYPE_SOCKET_OPTIONS) = BaseBackend.default_socket_options, + | _TYPE_SOCKET_OPTIONS = BaseBackend.default_socket_options, disabled_svn: set[HttpVersion] | None = None, preemptive_quic_cache: QuicPreemptiveCacheType | None = None, ): @@ -74,7 +77,7 @@ def __init__( ) self._protocol: HTTPOverQUICProtocol | HTTPOverTCPProtocol | None = None - self._svn = None + self._svn: HttpVersion | None = None self._stream_id: int | None = None @@ -106,7 +109,7 @@ def _new_conn(self) -> socket.socket | None: if self.__alt_authority: self._svn = HttpVersion.h3 # we ignore alt-host as we do not trust cache security - self.port = self.__alt_authority[1] + self.port: int = self.__alt_authority[1] if self._svn == HttpVersion.h3: self.socket_kind = SOCK_DGRAM @@ -366,8 +369,8 @@ def set_tunnel( if HttpVersion.h3 not in self._disabled_svn: self._disabled_svn.add(HttpVersion.h3) - self._tunnel_host = host - self._tunnel_port = port + self._tunnel_host: str | None = host + self._tunnel_port: int | None = port if headers: self._tunnel_headers = headers @@ -777,7 +780,7 @@ def getresponse(self) -> LowLevelResponse: ) # keep last response - self._response = response + self._response: LowLevelResponse = response # save the quic ticket for session resumption if self._svn == HttpVersion.h3 and hasattr(self._protocol, "session_ticket"): @@ -828,6 +831,10 @@ def send( ): self._protocol.bytes_received(self.sock.recv(self.blocksize)) + # this is a bad sign. we should stop sending and instead retrieve the response. 
+ if self._protocol.has_pending_event(): + raise EarlyResponse() + if self.__remaining_body_length: self.__remaining_body_length -= len(data) diff --git a/src/urllib3/connection.py b/src/urllib3/connection.py index 7652b37c6f..f37f71ef41 100644 --- a/src/urllib3/connection.py +++ b/src/urllib3/connection.py @@ -1,12 +1,10 @@ from __future__ import annotations -import datetime import logging import os import re import socket import typing -import warnings from http.client import HTTPException as HTTPException # noqa: F401 from http.client import ResponseNotReady from socket import timeout as SocketTimeout @@ -15,11 +13,17 @@ from typing_extensions import Literal from .response import HTTPResponse - from .util.ssl_ import _TYPE_PEER_CERT_RET_DICT from .util.ssltransport import SSLTransport + from ._typing import ( + _TYPE_BODY, + _TYPE_PEER_CERT_RET_DICT, + _TYPE_SOCKET_OPTIONS, + _TYPE_TIMEOUT_INTERNAL, + ProxyConfig, + ) from ._collections import HTTPHeaderDict -from .util.timeout import _DEFAULT_TIMEOUT, _TYPE_TIMEOUT, Timeout +from .util.timeout import _DEFAULT_TIMEOUT, Timeout from .util.util import to_str from .util.wait import wait_for_read @@ -34,17 +38,14 @@ class BaseSSLError(BaseException): # type: ignore[no-redef] pass -from ._base_connection import _TYPE_BODY -from ._base_connection import ProxyConfig as ProxyConfig -from ._base_connection import _ResponseOptions as _ResponseOptions from ._version import __version__ from .backend import HfaceBackend, HttpVersion, QuicPreemptiveCacheType from .exceptions import ( ConnectTimeoutError, + EarlyResponse, NameResolutionError, NewConnectionError, ProxyError, - SystemTimeWarning, ) from .util import SKIP_HEADER, SKIPPABLE_HEADERS, connection, ssl_ from .util.request import body_to_chunks @@ -68,13 +69,19 @@ class BaseSSLError(BaseException): # type: ignore[no-redef] port_by_scheme = {"http": 80, "https": 443} -# When it comes time to update this value as a part of regular maintenance -# (ie test_recent_date is failing) update it to ~6 months before the current date. -RECENT_DATE = datetime.date(2022, 1, 1) - _CONTAINS_CONTROL_CHAR_RE = re.compile(r"[^-!#$%&'*+.^_`|~0-9a-zA-Z]") +class _ResponseOptions(typing.NamedTuple): + # TODO: Remove this in favor of a better + # HTTP request/response lifecycle tracking. 
+ request_method: str + request_url: str + preload_content: bool + decode_content: bool + enforce_content_length: bool + + class HTTPConnection(HfaceBackend): """ Based on :class:`urllib3.backend.BaseBackend` but provides an extra constructor @@ -105,7 +112,7 @@ class HTTPConnection(HfaceBackend): blocksize: int source_address: tuple[str, int] | None - socket_options: connection._TYPE_SOCKET_OPTIONS | None + socket_options: _TYPE_SOCKET_OPTIONS | None _has_connected_to_proxy: bool _response_options: _ResponseOptions | None @@ -118,11 +125,11 @@ def __init__( host: str, port: int | None = None, *, - timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + timeout: _TYPE_TIMEOUT_INTERNAL = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, blocksize: int = 8192, socket_options: None - | (connection._TYPE_SOCKET_OPTIONS) = HfaceBackend.default_socket_options, + | _TYPE_SOCKET_OPTIONS = HfaceBackend.default_socket_options, proxy: Url | None = None, proxy_config: ProxyConfig | None = None, disabled_svn: set[HttpVersion] | None = None, @@ -144,11 +151,6 @@ def __init__( self._has_connected_to_proxy = False self._response_options = None - # https://github.com/python/mypy/issues/4125 - # Mypy treats this as LSP violation, which is considered a bug. - # If `host` is made a property it violates LSP, because a writeable attribute is overridden with a read-only one. - # However, there is also a `host` setter so LSP is not violated. - # Potentially, a `@host.deleter` might be needed depending on how this issue will be fixed. @property def host(self) -> str: """ @@ -390,17 +392,20 @@ def request( self.putheader(header, value) self.endheaders(expect_body_afterward=chunks is not None) - # If we're given a body we start sending that in chunks. - if chunks is not None: - for chunk in chunks: - # Sending empty chunks isn't allowed for TE: chunked - # as it indicates the end of the body. - if not chunk: - continue - if isinstance(chunk, str): - chunk = chunk.encode("utf-8") - self.send(chunk) - self.send(b"", eot=True) + try: + # If we're given a body we start sending that in chunks. + if chunks is not None: + for chunk in chunks: + # Sending empty chunks isn't allowed for TE: chunked + # as it indicates the end of the body. + if not chunk: + continue + if isinstance(chunk, str): + chunk = chunk.encode("utf-8") + self.send(chunk) + self.send(b"", eot=True) + except EarlyResponse: + pass def getresponse( # type: ignore[override] self, @@ -472,11 +477,11 @@ def __init__( host: str, port: int | None = None, *, - timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + timeout: _TYPE_TIMEOUT_INTERNAL = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, blocksize: int = 8192, socket_options: None - | (connection._TYPE_SOCKET_OPTIONS) = HTTPConnection.default_socket_options, + | _TYPE_SOCKET_OPTIONS = HTTPConnection.default_socket_options, disabled_svn: set[HttpVersion] | None = None, preemptive_quic_cache: QuicPreemptiveCacheType | None = None, proxy: Url | None = None, @@ -617,16 +622,6 @@ def connect(self) -> None: if self.server_hostname is not None: server_hostname = self.server_hostname - is_time_off = datetime.date.today() < RECENT_DATE - if is_time_off: - warnings.warn( - ( - f"System time is way off (before {RECENT_DATE}). 
This will probably " - "lead to SSL verification errors" - ), - SystemTimeWarning, - ) - sock_and_verified = _ssl_wrap_socket_and_match_hostname( sock=sock, cert_reqs=self.cert_reqs, @@ -666,7 +661,8 @@ def _connect_tls_proxy( Establish a TLS connection to the proxy using the provided SSL context. """ # `_connect_tls_proxy` is called when self._tunnel_host is truthy. - proxy_config = typing.cast(ProxyConfig, self.proxy_config) + assert self.proxy_config is not None + proxy_config = self.proxy_config ssl_context = proxy_config.ssl_context sock_and_verified = _ssl_wrap_socket_and_match_hostname( sock, diff --git a/src/urllib3/connectionpool.py b/src/urllib3/connectionpool.py index 5427920dfe..afaa266bac 100644 --- a/src/urllib3/connectionpool.py +++ b/src/urllib3/connectionpool.py @@ -10,9 +10,9 @@ from socket import timeout as SocketTimeout from types import TracebackType -from ._base_connection import _TYPE_BODY from ._collections import HTTPHeaderDict from ._request_methods import RequestMethods +from ._typing import _TYPE_BODY, _TYPE_BODY_POSITION, _TYPE_TIMEOUT, ProxyConfig from .backend import ConnectionInfo from .connection import ( BaseSSLError, @@ -21,7 +21,6 @@ HTTPConnection, HTTPException, HTTPSConnection, - ProxyConfig, _wrap_proxy_error, ) from .connection import port_by_scheme as port_by_scheme @@ -43,14 +42,10 @@ from .response import BaseHTTPResponse from .util.connection import is_connection_dropped from .util.proxy import connection_requires_http_tunnel -from .util.request import ( - _TYPE_BODY_POSITION, - NOT_FORWARDABLE_HEADERS, - set_file_position, -) +from .util.request import NOT_FORWARDABLE_HEADERS, set_file_position from .util.retry import Retry from .util.ssl_match_hostname import CertificateError -from .util.timeout import _DEFAULT_TIMEOUT, _TYPE_DEFAULT, Timeout +from .util.timeout import _DEFAULT_TIMEOUT, Timeout from .util.url import Url, _encode_target from .util.url import _normalize_host as normalize_host from .util.url import parse_url @@ -61,12 +56,8 @@ from typing_extensions import Literal - from ._base_connection import BaseHTTPConnection, BaseHTTPSConnection - log = logging.getLogger(__name__) -_TYPE_TIMEOUT = typing.Union[Timeout, float, _TYPE_DEFAULT, None] - _SelfT = typing.TypeVar("_SelfT") @@ -177,9 +168,7 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods): """ scheme = "http" - ConnectionCls: ( - type[BaseHTTPConnection] | type[BaseHTTPSConnection] - ) = HTTPConnection + ConnectionCls: (type[HTTPConnection] | type[HTTPSConnection]) = HTTPConnection def __init__( self, @@ -242,7 +231,7 @@ def __init__( # HTTPConnectionPool object is garbage collected. weakref.finalize(self, _close_pool_connections, pool) - def _new_conn(self) -> BaseHTTPConnection: + def _new_conn(self) -> HTTPConnection: """ Return a fresh :class:`HTTPConnection`. """ @@ -262,7 +251,7 @@ def _new_conn(self) -> BaseHTTPConnection: ) return conn - def _get_conn(self, timeout: float | None = None) -> BaseHTTPConnection: + def _get_conn(self, timeout: float | None = None) -> HTTPConnection: """ Get a connection. Will return a pooled connection if one is available. @@ -300,7 +289,7 @@ def _get_conn(self, timeout: float | None = None) -> BaseHTTPConnection: return conn or self._new_conn() - def _put_conn(self, conn: BaseHTTPConnection | None) -> None: + def _put_conn(self, conn: HTTPConnection | None) -> None: """ Put a connection back into the pool. 
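# Hedged sketch of the early-response behaviour wired up above: ``send()``
# raises ``EarlyResponse`` when the protocol state machine already has a
# pending response, and ``HTTPConnection.request()`` swallows it so that the
# response the server sent early is returned to the caller. The host, path
# and the 413 scenario below are hypothetical; only ``EarlyResponse`` and the
# ``PoolManager.request()`` call come from urllib3.future itself.
import urllib3


def upload(data: bytes) -> None:
    with urllib3.PoolManager() as http:
        # Against an HTTP/2 or HTTP/3 capable server that replies early
        # (e.g. 413 once its size limit is hit), urllib3.future stops
        # streaming the remaining chunks instead of pushing the whole body,
        # and returns that early response.
        response = http.request("PUT", "https://example.org/upload", body=data)
        print(response.status)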
@@ -343,14 +332,14 @@ def _put_conn(self, conn: BaseHTTPConnection | None) -> None: if conn: conn.close() - def _validate_conn(self, conn: BaseHTTPConnection) -> None: + def _validate_conn(self, conn: HTTPConnection) -> None: """ Called right before a request is made, after the socket is created. """ if conn.is_closed: conn.connect() - def _prepare_proxy(self, conn: BaseHTTPConnection) -> None: + def _prepare_proxy(self, conn: HTTPConnection) -> None: # Nothing to do for HTTP connections. pass @@ -387,7 +376,7 @@ def _raise_timeout( def _make_request( self, - conn: BaseHTTPConnection, + conn: HTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, @@ -395,7 +384,7 @@ def _make_request( retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, - response_conn: BaseHTTPConnection | None = None, + response_conn: HTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, @@ -552,8 +541,8 @@ def _make_request( # Set properties that are used by the pooling layer. response.retries = retries - response._connection = response_conn # type: ignore[attr-defined] - response._pool = self # type: ignore[attr-defined] + response._connection = response_conn + response._pool = self log.debug( '%s://%s:%s "%s %s %s" %s %s', @@ -563,9 +552,9 @@ def _make_request( method, url, # HTTP version - conn._http_vsn_str, # type: ignore[attr-defined] + conn._http_vsn_str, response.status, - response.length_remaining, # type: ignore[attr-defined] + response.length_remaining, ) return response @@ -993,7 +982,7 @@ class HTTPSConnectionPool(HTTPConnectionPool): """ scheme = "https" - ConnectionCls: type[BaseHTTPSConnection] = HTTPSConnection + ConnectionCls: type[HTTPSConnection] = HTTPSConnection def __init__( self, @@ -1061,7 +1050,7 @@ def _prepare_proxy(self, conn: HTTPSConnection) -> None: # type: ignore[overrid ) conn.connect() - def _new_conn(self) -> BaseHTTPSConnection: + def _new_conn(self) -> HTTPSConnection: """ Return a fresh :class:`urllib3.connection.HTTPConnection`. """ @@ -1103,7 +1092,7 @@ def _new_conn(self) -> BaseHTTPSConnection: **self.conn_kw, ) - def _validate_conn(self, conn: BaseHTTPConnection) -> None: + def _validate_conn(self, conn: HTTPConnection) -> None: """ Called right before a request is made, after the socket is created. 
""" diff --git a/src/urllib3/contrib/socks.py b/src/urllib3/contrib/socks.py index 98e216fee2..422f9ef17f 100644 --- a/src/urllib3/contrib/socks.py +++ b/src/urllib3/contrib/socks.py @@ -41,7 +41,7 @@ from __future__ import annotations try: - import socks # type: ignore[import] + import socks # type: ignore[import-not-found] except ImportError: import warnings @@ -60,6 +60,7 @@ import typing from socket import timeout as SocketTimeout +from .._typing import _TYPE_SOCKS_OPTIONS from ..connection import HTTPConnection, HTTPSConnection from ..connectionpool import HTTPConnectionPool, HTTPSConnectionPool from ..exceptions import ConnectTimeoutError, NewConnectionError @@ -71,20 +72,6 @@ except ImportError: ssl = None # type: ignore[assignment] -try: - from typing import TypedDict - - class _TYPE_SOCKS_OPTIONS(TypedDict): - socks_version: int - proxy_host: str | None - proxy_port: str | None - username: str | None - password: str | None - rdns: bool - -except ImportError: # Python 3.7 - _TYPE_SOCKS_OPTIONS = typing.Dict[str, typing.Any] # type: ignore[misc, assignment] - class SOCKSConnection(HTTPConnection): """ diff --git a/src/urllib3/exceptions.py b/src/urllib3/exceptions.py index 8afe761065..6698be6ab7 100644 --- a/src/urllib3/exceptions.py +++ b/src/urllib3/exceptions.py @@ -6,6 +6,7 @@ from http.client import IncompleteRead as httplib_IncompleteRead if typing.TYPE_CHECKING: + from ._typing import _TYPE_REDUCE_RESULT from .connection import HTTPConnection from .connectionpool import ConnectionPool from .response import HTTPResponse @@ -22,11 +23,6 @@ class HTTPWarning(Warning): """Base warning used by this module.""" -_TYPE_REDUCE_RESULT = typing.Tuple[ - typing.Callable[..., object], typing.Tuple[object, ...] -] - - class PoolError(HTTPError): """Base exception for errors caused within a pool.""" @@ -304,3 +300,7 @@ def __init__( class UnrewindableBodyError(HTTPError): """urllib3 encountered an error when trying to rewind a body""" + + +class EarlyResponse(HTTPError): + """urllib3 received a response prior to sending the whole body""" diff --git a/src/urllib3/fields.py b/src/urllib3/fields.py index 282205482c..7d1e4dc827 100644 --- a/src/urllib3/fields.py +++ b/src/urllib3/fields.py @@ -4,12 +4,8 @@ import mimetypes import typing -_TYPE_FIELD_VALUE = typing.Union[str, bytes] -_TYPE_FIELD_VALUE_TUPLE = typing.Union[ - _TYPE_FIELD_VALUE, - typing.Tuple[str, _TYPE_FIELD_VALUE], - typing.Tuple[str, _TYPE_FIELD_VALUE, str], -] +if typing.TYPE_CHECKING: + from ._typing import _TYPE_FIELD_VALUE, _TYPE_FIELD_VALUE_TUPLE def guess_content_type( @@ -171,13 +167,9 @@ def from_tuples( if isinstance(value, tuple): if len(value) == 3: - filename, data, content_type = typing.cast( - typing.Tuple[str, _TYPE_FIELD_VALUE, str], value - ) + filename, data, content_type = value # type: ignore[misc] else: - filename, data = typing.cast( - typing.Tuple[str, _TYPE_FIELD_VALUE], value - ) + filename, data = value # type: ignore[misc] content_type = guess_content_type(filename) else: filename = None diff --git a/src/urllib3/filepost.py b/src/urllib3/filepost.py index 1c90a211fb..0cc45f8e4c 100644 --- a/src/urllib3/filepost.py +++ b/src/urllib3/filepost.py @@ -6,18 +6,11 @@ import typing from io import BytesIO -from .fields import _TYPE_FIELD_VALUE_TUPLE, RequestField +from ._typing import _TYPE_FIELD_VALUE_TUPLE, _TYPE_FIELDS +from .fields import RequestField writer = codecs.lookup("utf-8")[3] -_TYPE_FIELDS_SEQUENCE = typing.Sequence[ - typing.Union[typing.Tuple[str, _TYPE_FIELD_VALUE_TUPLE], RequestField] -] 
-_TYPE_FIELDS = typing.Union[ - _TYPE_FIELDS_SEQUENCE, - typing.Mapping[str, _TYPE_FIELD_VALUE_TUPLE], -] - def choose_boundary() -> str: """ diff --git a/src/urllib3/poolmanager.py b/src/urllib3/poolmanager.py index 0525e1a099..bdab1f2e5d 100644 --- a/src/urllib3/poolmanager.py +++ b/src/urllib3/poolmanager.py @@ -9,8 +9,8 @@ from ._collections import HTTPHeaderDict, RecentlyUsedContainer from ._request_methods import RequestMethods +from ._typing import _TYPE_SOCKET_OPTIONS, ProxyConfig from .backend import HttpVersion, QuicPreemptiveCacheType -from .connection import ProxyConfig from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool, port_by_scheme from .exceptions import ( LocationValueError, @@ -19,7 +19,6 @@ URLSchemeUnknown, ) from .response import BaseHTTPResponse -from .util.connection import _TYPE_SOCKET_OPTIONS from .util.proxy import connection_requires_http_tunnel from .util.request import NOT_FORWARDABLE_HEADERS from .util.retry import Retry diff --git a/src/urllib3/response.py b/src/urllib3/response.py index 31153f4d72..fdee05c604 100644 --- a/src/urllib3/response.py +++ b/src/urllib3/response.py @@ -14,14 +14,14 @@ try: try: - import brotlicffi as brotli # type: ignore[import] + import brotlicffi as brotli # type: ignore[import-not-found] except ImportError: - import brotli # type: ignore[import] + import brotli # type: ignore[import-not-found] except ImportError: brotli = None try: - import zstandard as zstd # type: ignore[import] + import zstandard as zstd # type: ignore[import-not-found] # The package 'zstandard' added the 'eof' property starting # in v0.18.0 which we require to ensure a complete and @@ -36,8 +36,8 @@ except (AttributeError, ImportError, ValueError): # Defensive: zstd = None -from ._base_connection import _TYPE_BODY from ._collections import HTTPHeaderDict +from ._typing import _TYPE_BODY from .backend import LowLevelResponse from .connection import BaseSSLError, HTTPConnection, HTTPException from .exceptions import ( @@ -275,7 +275,39 @@ def get(self, n: int) -> bytes: return ret.getvalue() -class BaseHTTPResponse(io.IOBase): +class HTTPResponse(io.IOBase): + """ + HTTP Response container. + + Backwards-compatible with :class:`http.client.HTTPResponse` but the response ``body`` is + loaded and decoded on-demand when the ``data`` property is accessed. This + class is also compatible with the Python standard library's :mod:`io` + module, and can hence be treated as a readable object in the context of that + framework. + + Extra parameters for behaviour not present in :class:`http.client.HTTPResponse`: + + :param preload_content: + If True, the response's body will be preloaded during construction. + + :param decode_content: + If True, will attempt to decode the body based on the + 'content-encoding' header. + + :param original_response: + When this HTTPResponse wrapper is generated from an :class:`http.client.HTTPResponse` + object, it's convenient to include the original for debug purposes. It's + otherwise unused. + + :param retries: + The retries contains the last :class:`~urllib3.util.retry.Retry` that + was used during the request. + + :param enforce_content_length: + Enforce content length checking. Body returned by server must match + value of Content-Length header, if present. Otherwise, raise error. 
+ """ + CONTENT_DECODERS = ["gzip", "deflate"] if brotli is not None: CONTENT_DECODERS += ["br"] @@ -292,14 +324,22 @@ class BaseHTTPResponse(io.IOBase): def __init__( self, - *, + body: _TYPE_BODY = "", headers: typing.Mapping[str, str] | typing.Mapping[bytes, bytes] | None = None, - status: int, - version: int, - reason: str | None, - decode_content: bool, - request_url: str | None, + status: int = 0, + version: int = 0, + reason: str | None = None, + preload_content: bool = True, + decode_content: bool = True, + original_response: LowLevelResponse | None = None, + pool: HTTPConnectionPool | None = None, + connection: HTTPConnection | None = None, + msg: _HttplibHTTPMessage | None = None, retries: Retry | None = None, + enforce_content_length: bool = True, + request_method: str | None = None, + request_url: str | None = None, + auto_close: bool = True, ) -> None: if isinstance(headers, HTTPHeaderDict): self.headers = headers @@ -311,17 +351,51 @@ def __init__( self.decode_content = decode_content self._has_decoded_content = False self._request_url: str | None = request_url + self._retries: Retry | None = None + self.retries = retries self.chunked = False tr_enc = self.headers.get("transfer-encoding", "").lower() # Don't incur the penalty of creating a list and then discarding it encodings = (enc.strip() for enc in tr_enc.split(",")) + if "chunked" in encodings: self.chunked = True self._decoder: ContentDecoder | None = None + self.enforce_content_length = enforce_content_length + self.auto_close = auto_close + + self._body = None + self._fp: LowLevelResponse | typing.IO[typing.Any] | None = None + self._original_response = original_response + self._fp_bytes_read = 0 + self.msg = msg + + if body and isinstance(body, (str, bytes)): + self._body = body + + self._pool = pool + self._connection = connection + + if hasattr(body, "read"): + self._fp = body # type: ignore[assignment] + + # Are we using the chunked-style of transfer encoding? + self.chunk_left: int | None = None + + # Determine length of response + self.length_remaining: int | None = self._init_length(request_method) + + # Used to return the correct amount of bytes for partial read()s + self._decoded_buffer = BytesQueueBuffer() + + # If requested, preload the body. + if preload_content and not self._body: + self._body = self.read(decode_content=decode_content) + def get_redirect_location(self) -> str | None | Literal[False]: """ Should we redirect and where to? @@ -334,10 +408,6 @@ def get_redirect_location(self) -> str | None | Literal[False]: return self.headers.get("location") return False - @property - def data(self) -> bytes: - raise NotImplementedError() - def json(self) -> typing.Any: """ Parses the body of the HTTP response as JSON. 
@@ -351,18 +421,6 @@ def json(self) -> typing.Any: data = self.data.decode("utf-8") return _json.loads(data) - @property - def url(self) -> str | None: - raise NotImplementedError() - - @url.setter - def url(self, url: str | None) -> None: - raise NotImplementedError() - - @property - def connection(self) -> HTTPConnection | None: - raise NotImplementedError() - @property def retries(self) -> Retry | None: return self._retries @@ -374,35 +432,6 @@ def retries(self, retries: Retry | None) -> None: self.url = retries.history[-1].redirect_location self._retries = retries - def stream( - self, amt: int | None = 2**16, decode_content: bool | None = None - ) -> typing.Iterator[bytes]: - raise NotImplementedError() - - def read( - self, - amt: int | None = None, - decode_content: bool | None = None, - cache_content: bool = False, - ) -> bytes: - raise NotImplementedError() - - def read_chunked( - self, - amt: int | None = None, - decode_content: bool | None = None, - ) -> typing.Iterator[bytes]: - raise NotImplementedError() - - def release_conn(self) -> None: - raise NotImplementedError() - - def drain_conn(self) -> None: - raise NotImplementedError() - - def close(self) -> None: - raise NotImplementedError() - def _init_decoder(self) -> None: """ Set-up the _decoder attribute if necessary. @@ -477,100 +506,6 @@ def info(self) -> HTTPHeaderDict: def geturl(self) -> str | None: return self.url - -class HTTPResponse(BaseHTTPResponse): - """ - HTTP Response container. - - Backwards-compatible with :class:`http.client.HTTPResponse` but the response ``body`` is - loaded and decoded on-demand when the ``data`` property is accessed. This - class is also compatible with the Python standard library's :mod:`io` - module, and can hence be treated as a readable object in the context of that - framework. - - Extra parameters for behaviour not present in :class:`http.client.HTTPResponse`: - - :param preload_content: - If True, the response's body will be preloaded during construction. - - :param decode_content: - If True, will attempt to decode the body based on the - 'content-encoding' header. - - :param original_response: - When this HTTPResponse wrapper is generated from an :class:`http.client.HTTPResponse` - object, it's convenient to include the original for debug purposes. It's - otherwise unused. - - :param retries: - The retries contains the last :class:`~urllib3.util.retry.Retry` that - was used during the request. - - :param enforce_content_length: - Enforce content length checking. Body returned by server must match - value of Content-Length header, if present. Otherwise, raise error. 
- """ - - def __init__( - self, - body: _TYPE_BODY = "", - headers: typing.Mapping[str, str] | typing.Mapping[bytes, bytes] | None = None, - status: int = 0, - version: int = 0, - reason: str | None = None, - preload_content: bool = True, - decode_content: bool = True, - original_response: LowLevelResponse | None = None, - pool: HTTPConnectionPool | None = None, - connection: HTTPConnection | None = None, - msg: _HttplibHTTPMessage | None = None, - retries: Retry | None = None, - enforce_content_length: bool = True, - request_method: str | None = None, - request_url: str | None = None, - auto_close: bool = True, - ) -> None: - super().__init__( - headers=headers, - status=status, - version=version, - reason=reason, - decode_content=decode_content, - request_url=request_url, - retries=retries, - ) - - self.enforce_content_length = enforce_content_length - self.auto_close = auto_close - - self._body = None - self._fp: LowLevelResponse | typing.IO[typing.Any] | None = None - self._original_response = original_response - self._fp_bytes_read = 0 - self.msg = msg - - if body and isinstance(body, (str, bytes)): - self._body = body - - self._pool = pool - self._connection = connection - - if hasattr(body, "read"): - self._fp = body # type: ignore[assignment] - - # Are we using the chunked-style of transfer encoding? - self.chunk_left: int | None = None - - # Determine length of response - self.length_remaining = self._init_length(request_method) - - # Used to return the correct amount of bytes for partial read()s - self._decoded_buffer = BytesQueueBuffer() - - # If requested, preload the body. - if preload_content and not self._body: - self._body = self.read(decode_content=decode_content) - def release_conn(self) -> None: if not self._pool or not self._connection: return None @@ -1035,3 +970,7 @@ def __iter__(self) -> typing.Iterator[bytes]: buffer.append(chunk) if buffer: yield b"".join(buffer) + + +# Kept for BC-purposes. +BaseHTTPResponse = HTTPResponse diff --git a/src/urllib3/util/connection.py b/src/urllib3/util/connection.py index 392d907d0f..3c38db8c02 100644 --- a/src/urllib3/util/connection.py +++ b/src/urllib3/util/connection.py @@ -4,20 +4,14 @@ import typing from ..exceptions import LocationParseError -from .timeout import _DEFAULT_TIMEOUT, _TYPE_TIMEOUT - -_TYPE_SOCKET_OPTIONS = typing.Sequence[ - typing.Union[ - typing.Tuple[int, int, typing.Union[int, bytes]], - typing.Tuple[int, int, typing.Union[int, bytes], str], - ] -] +from .timeout import _DEFAULT_TIMEOUT if typing.TYPE_CHECKING: - from .._base_connection import BaseHTTPConnection + from .._typing import _TYPE_SOCKET_OPTIONS, _TYPE_TIMEOUT_INTERNAL + from ..connection import HTTPConnection -def is_connection_dropped(conn: BaseHTTPConnection) -> bool: # Platform-specific +def is_connection_dropped(conn: HTTPConnection) -> bool: # Platform-specific """ Returns True if the connection is dropped and should be closed. :param conn: :class:`urllib3.connection.HTTPConnection` object. @@ -31,7 +25,7 @@ def is_connection_dropped(conn: BaseHTTPConnection) -> bool: # Platform-specifi # discovered in DNS if the system doesn't have IPv6 functionality. 
def create_connection( address: tuple[str, int], - timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + timeout: _TYPE_TIMEOUT_INTERNAL = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, socket_kind: socket.SocketKind = socket.SOCK_STREAM, @@ -124,7 +118,7 @@ def allowed_gai_family() -> socket.AddressFamily: return family -def _has_ipv6(host: str) -> bool: +def _has_ipv6() -> bool: """Returns True if the system can bind an IPv6 address.""" sock = None has_ipv6 = False @@ -137,7 +131,7 @@ def _has_ipv6(host: str) -> bool: # https://bugs.python.org/issue658327 try: sock = socket.socket(socket.AF_INET6) - sock.bind((host, 0)) + sock.bind(("::1", 0)) has_ipv6 = True except Exception: pass @@ -147,4 +141,4 @@ def _has_ipv6(host: str) -> bool: return has_ipv6 -HAS_IPV6 = _has_ipv6("::1") +HAS_IPV6 = _has_ipv6() diff --git a/src/urllib3/util/proxy.py b/src/urllib3/util/proxy.py index 908fc6621d..f26abd9af6 100644 --- a/src/urllib3/util/proxy.py +++ b/src/urllib3/util/proxy.py @@ -5,7 +5,7 @@ from .url import Url if typing.TYPE_CHECKING: - from ..connection import ProxyConfig + from .._typing import ProxyConfig def connection_requires_http_tunnel( diff --git a/src/urllib3/util/request.py b/src/urllib3/util/request.py index d702bafff5..7e52bfb6b9 100644 --- a/src/urllib3/util/request.py +++ b/src/urllib3/util/request.py @@ -11,6 +11,8 @@ if typing.TYPE_CHECKING: from typing_extensions import Final + from .._typing import _TYPE_BODY_POSITION + # Pass as a value within ``headers`` to skip # emitting some HTTP headers that are added automatically. # The only headers that are supported are ``Accept-Encoding``, @@ -32,15 +34,15 @@ ACCEPT_ENCODING = "gzip,deflate" try: try: - import brotlicffi as _unused_module_brotli # type: ignore[import] # noqa: F401 + import brotlicffi as _unused_module_brotli # type: ignore[import-not-found] # noqa: F401 except ImportError: - import brotli as _unused_module_brotli # type: ignore[import] # noqa: F401 + import brotli as _unused_module_brotli # type: ignore[import-not-found] # noqa: F401 except ImportError: pass else: ACCEPT_ENCODING += ",br" try: - import zstandard as _unused_module_zstd # type: ignore[import] # noqa: F401 + import zstandard as _unused_module_zstd # type: ignore[import-not-found] # noqa: F401 except ImportError: pass else: @@ -53,7 +55,6 @@ class _TYPE_FAILEDTELL(Enum): _FAILEDTELL: Final[_TYPE_FAILEDTELL] = _TYPE_FAILEDTELL.token -_TYPE_BODY_POSITION = typing.Union[int, _TYPE_FAILEDTELL] # When sending a request with these methods we aren't expecting # a body so don't need to set an explicit 'Content-Length: 0' diff --git a/src/urllib3/util/ssl_.py b/src/urllib3/util/ssl_.py index ae4b68aba6..b20c0ca519 100644 --- a/src/urllib3/util/ssl_.py +++ b/src/urllib3/util/ssl_.py @@ -75,15 +75,10 @@ def _is_has_never_check_common_name_reliable( if typing.TYPE_CHECKING: from ssl import VerifyMode - from typing_extensions import Literal, TypedDict + from typing_extensions import Literal from .ssltransport import SSLTransport as SSLTransportType - class _TYPE_PEER_CERT_RET_DICT(TypedDict, total=False): - subjectAltName: tuple[tuple[str, str], ...] - subject: tuple[tuple[tuple[str, str], ...], ...] 
- serialNumber: str - # Mapping from 'ssl.PROTOCOL_TLSX' to 'TLSVersion.X' _SSL_VERSION_TO_TLS_VERSION: dict[int, int] = {} @@ -137,9 +132,6 @@ class _TYPE_PEER_CERT_RET_DICT(TypedDict, total=False): PROTOCOL_TLS_CLIENT = 16 # type: ignore[assignment] -_TYPE_PEER_CERT_RET = typing.Union["_TYPE_PEER_CERT_RET_DICT", bytes, None] - - def assert_fingerprint(cert: bytes | None, fingerprint: str) -> None: """ Checks if given fingerprint matches the supplied certificate. diff --git a/src/urllib3/util/ssl_match_hostname.py b/src/urllib3/util/ssl_match_hostname.py index 453cfd420d..87816e5977 100644 --- a/src/urllib3/util/ssl_match_hostname.py +++ b/src/urllib3/util/ssl_match_hostname.py @@ -12,7 +12,7 @@ from ipaddress import IPv4Address, IPv6Address if typing.TYPE_CHECKING: - from .ssl_ import _TYPE_PEER_CERT_RET_DICT + from .._typing import _TYPE_PEER_CERT_RET_DICT __version__ = "3.5.0.1" diff --git a/src/urllib3/util/ssltransport.py b/src/urllib3/util/ssltransport.py index 5ec86473b4..14581b18a4 100644 --- a/src/urllib3/util/ssltransport.py +++ b/src/urllib3/util/ssltransport.py @@ -10,7 +10,7 @@ if typing.TYPE_CHECKING: from typing_extensions import Literal - from .ssl_ import _TYPE_PEER_CERT_RET, _TYPE_PEER_CERT_RET_DICT + from .._typing import _TYPE_PEER_CERT_RET, _TYPE_PEER_CERT_RET_DICT _SelfT = typing.TypeVar("_SelfT", bound="SSLTransport") diff --git a/src/urllib3/util/timeout.py b/src/urllib3/util/timeout.py index ec090f69cc..7a0600e638 100644 --- a/src/urllib3/util/timeout.py +++ b/src/urllib3/util/timeout.py @@ -10,6 +10,8 @@ if typing.TYPE_CHECKING: from typing_extensions import Final + from .._typing import _TYPE_TIMEOUT, _TYPE_TIMEOUT_INTERNAL + class _TYPE_DEFAULT(Enum): # This value should never be passed to socket.settimeout() so for safety we use a -1. @@ -19,8 +21,6 @@ class _TYPE_DEFAULT(Enum): _DEFAULT_TIMEOUT: Final[_TYPE_DEFAULT] = _TYPE_DEFAULT.token -_TYPE_TIMEOUT = typing.Optional[typing.Union[float, _TYPE_DEFAULT]] - class Timeout: """Timeout configuration. @@ -112,9 +112,9 @@ class Timeout: def __init__( self, - total: _TYPE_TIMEOUT = None, - connect: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, - read: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + total: _TYPE_TIMEOUT_INTERNAL = None, + connect: _TYPE_TIMEOUT_INTERNAL = _DEFAULT_TIMEOUT, + read: _TYPE_TIMEOUT_INTERNAL = _DEFAULT_TIMEOUT, ) -> None: self._connect = self._validate_timeout(connect, "connect") self._read = self._validate_timeout(read, "read") @@ -128,11 +128,13 @@ def __repr__(self) -> str: __str__ = __repr__ @staticmethod - def resolve_default_timeout(timeout: _TYPE_TIMEOUT) -> float | None: + def resolve_default_timeout(timeout: _TYPE_TIMEOUT_INTERNAL) -> float | None: return getdefaulttimeout() if timeout is _DEFAULT_TIMEOUT else timeout @classmethod - def _validate_timeout(cls, value: _TYPE_TIMEOUT, name: str) -> _TYPE_TIMEOUT: + def _validate_timeout( + cls, value: _TYPE_TIMEOUT, name: str + ) -> _TYPE_TIMEOUT_INTERNAL: """Check that a timeout attribute is valid. :param value: The timeout value to validate @@ -151,7 +153,7 @@ def _validate_timeout(cls, value: _TYPE_TIMEOUT, name: str) -> _TYPE_TIMEOUT: "be an int, float or None." 
) try: - float(value) + float(value) # type: ignore[arg-type] except (TypeError, ValueError): raise ValueError( "Timeout value %s was %s, but it must be an " @@ -159,7 +161,7 @@ def _validate_timeout(cls, value: _TYPE_TIMEOUT, name: str) -> _TYPE_TIMEOUT: ) from None try: - if value <= 0: + if value <= 0: # type: ignore[operator] raise ValueError( "Attempted to set %s timeout to %s, but the " "timeout cannot be set to a value less " @@ -171,10 +173,10 @@ def _validate_timeout(cls, value: _TYPE_TIMEOUT, name: str) -> _TYPE_TIMEOUT: "int, float or None." % (name, value) ) from None - return value + return value # type: ignore[return-value] @classmethod - def from_float(cls, timeout: _TYPE_TIMEOUT) -> Timeout: + def from_float(cls, timeout: _TYPE_TIMEOUT_INTERNAL) -> Timeout: """Create a new Timeout from a legacy timeout value. The timeout value used by httplib.py sets the same timeout on the @@ -229,7 +231,7 @@ def get_connect_duration(self) -> float: return time.monotonic() - self._start_connect @property - def connect_timeout(self) -> _TYPE_TIMEOUT: + def connect_timeout(self) -> _TYPE_TIMEOUT_INTERNAL: """Get the value to use when setting a connection timeout. This will be a positive float or integer, the value None diff --git a/test/__init__.py b/test/__init__.py index 4cf1dbbe7f..d24636adae 100644 --- a/test/__init__.py +++ b/test/__init__.py @@ -18,14 +18,14 @@ try: try: - import brotlicffi as brotli # type: ignore[import] + import brotlicffi as brotli # type: ignore[import-not-found] except ImportError: - import brotli # type: ignore[import] + import brotli # type: ignore[import-not-found] except ImportError: brotli = None try: - import zstandard as zstd # type: ignore[import] + import zstandard as zstd # type: ignore[import-not-found] except ImportError: zstd = None diff --git a/test/contrib/test_socks.py b/test/contrib/test_socks.py index 2878cc8d8b..e44c86b598 100644 --- a/test/contrib/test_socks.py +++ b/test/contrib/test_socks.py @@ -9,7 +9,7 @@ from unittest.mock import Mock, patch import pytest -import socks as py_socks # type: ignore[import] +import socks as py_socks # type: ignore[import-not-found] from dummyserver.server import DEFAULT_CA, DEFAULT_CERTS from dummyserver.testcase import IPV4SocketDummyServerTestCase diff --git a/test/test_connection.py b/test/test_connection.py index d26f36c306..65c317d84f 100644 --- a/test/test_connection.py +++ b/test/test_connection.py @@ -1,6 +1,5 @@ from __future__ import annotations -import datetime import socket import typing from http.client import ResponseNotReady @@ -9,7 +8,6 @@ import pytest from urllib3.connection import ( # type: ignore[attr-defined] - RECENT_DATE, CertificateError, HTTPConnection, HTTPSConnection, @@ -25,7 +23,7 @@ from urllib3.util.ssl_match_hostname import _dnsname_match, match_hostname if typing.TYPE_CHECKING: - from urllib3.util.ssl_ import _TYPE_PEER_CERT_RET_DICT + from urllib3._typing import _TYPE_PEER_CERT_RET_DICT class TestConnection: @@ -197,14 +195,6 @@ def test_match_hostname_ip_address_ipv6_brackets(self) -> None: # Assert no error is raised _match_hostname(cert, asserted_hostname) - def test_recent_date(self) -> None: - # This test is to make sure that the RECENT_DATE value - # doesn't get too far behind what the current date is. - # When this test fails update urllib3.connection.RECENT_DATE - # according to the rules defined in that file. 
-        two_years = datetime.timedelta(days=365 * 2)
-        assert RECENT_DATE > (datetime.datetime.today() - two_years).date()
-
     def test_HTTPSConnection_default_socket_options(self) -> None:
         conn = HTTPSConnection("not.a.real.host", port=443)
         assert conn.socket_options == [
diff --git a/test/test_filepost.py b/test/test_filepost.py
index b6da4b9447..56943f0ffa 100644
--- a/test/test_filepost.py
+++ b/test/test_filepost.py
@@ -2,8 +2,9 @@
 
 import pytest
 
+from urllib3._typing import _TYPE_FIELDS
 from urllib3.fields import RequestField
-from urllib3.filepost import _TYPE_FIELDS, encode_multipart_formdata
+from urllib3.filepost import encode_multipart_formdata
 
 BOUNDARY = "!! test boundary !!"
 BOUNDARY_BYTES = BOUNDARY.encode()
diff --git a/test/test_response.py b/test/test_response.py
index ddd695e486..64cc0eb3ba 100644
--- a/test/test_response.py
+++ b/test/test_response.py
@@ -24,7 +24,6 @@
     SSLError,
 )
 from urllib3.response import (  # type: ignore[attr-defined]
-    BaseHTTPResponse,
     BytesQueueBuffer,
     HTTPResponse,
     brotli,
@@ -416,24 +415,6 @@ def test_body_blob(self) -> None:
         assert resp.data == b"foo"
         assert resp.closed
 
-    def test_base_io(self) -> None:
-        resp = BaseHTTPResponse(
-            status=200,
-            version=11,
-            reason=None,
-            decode_content=False,
-            request_url=None,
-        )
-
-        assert not resp.closed
-        assert not resp.readable()
-        assert not resp.writable()
-
-        with pytest.raises(NotImplementedError):
-            resp.read()
-        with pytest.raises(NotImplementedError):
-            resp.close()
-
     def test_io(self, sock: socket.socket) -> None:
         fp = BytesIO(b"foo")
         resp = HTTPResponse(fp, preload_content=False)
diff --git a/test/test_util.py b/test/test_util.py
index 51aa20ffb6..08749423eb 100644
--- a/test/test_util.py
+++ b/test/test_util.py
@@ -16,7 +16,7 @@
 import pytest
 
 from urllib3 import add_stderr_logger, disable_warnings
-from urllib3.connection import ProxyConfig
+from urllib3._typing import ProxyConfig
 from urllib3.exceptions import (
     InsecureRequestWarning,
     LocationParseError,
@@ -777,21 +777,21 @@ class NotReallyAFile:
 
     def test_has_ipv6_disabled_on_compile(self) -> None:
         with patch("socket.has_ipv6", False):
-            assert not _has_ipv6("::1")
+            assert not _has_ipv6()
 
     def test_has_ipv6_enabled_but_fails(self) -> None:
         with patch("socket.has_ipv6", True):
             with patch("socket.socket") as mock:
                 instance = mock.return_value
                 instance.bind = Mock(side_effect=Exception("No IPv6 here!"))
-                assert not _has_ipv6("::1")
+                assert not _has_ipv6()
 
     def test_has_ipv6_enabled_and_working(self) -> None:
         with patch("socket.has_ipv6", True):
             with patch("socket.socket") as mock:
                 instance = mock.return_value
                 instance.bind.return_value = True
-                assert _has_ipv6("::1")
+                assert _has_ipv6()
 
     def test_ip_family_ipv6_enabled(self) -> None:
         with patch("urllib3.util.connection.HAS_IPV6", True):
diff --git a/test/with_dummyserver/test_connection.py b/test/with_dummyserver/test_connection.py
index d06a7551b5..d7d66b9462 100644
--- a/test/with_dummyserver/test_connection.py
+++ b/test/with_dummyserver/test_connection.py
@@ -60,8 +60,8 @@ def test_releases_conn(pool: HTTPConnectionPool) -> None:
     # If these variables are set by the pool
     # then the response can release the connection
    # back into the pool.
-    response._pool = pool  # type: ignore[attr-defined]
-    response._connection = conn  # type: ignore[attr-defined]
+    response._pool = pool
+    response._connection = conn
 
     response.release_conn()
     assert pool.pool.qsize() == 1  # type: ignore[union-attr]
@@ -123,15 +123,15 @@ def test_set_tunnel_is_reset(pool: HTTPConnectionPool) -> None:
 
     conn.set_tunnel(host="host", port=8080, scheme="http")
 
-    assert conn._tunnel_host == "host"  # type: ignore[attr-defined]
-    assert conn._tunnel_port == 8080  # type: ignore[attr-defined]
-    assert conn._tunnel_scheme == "http"  # type: ignore[attr-defined]
+    assert conn._tunnel_host == "host"
+    assert conn._tunnel_port == 8080
+    assert conn._tunnel_scheme == "http"
 
     conn.close()
 
-    assert conn._tunnel_host is None  # type: ignore[attr-defined]
-    assert conn._tunnel_port is None  # type: ignore[attr-defined]
-    assert conn._tunnel_scheme is None  # type: ignore[attr-defined]
+    assert conn._tunnel_host is None
+    assert conn._tunnel_port is None
+    assert conn._tunnel_scheme is None
 
 
 def test_invalid_tunnel_scheme(pool: HTTPConnectionPool) -> None:
diff --git a/test/with_dummyserver/test_connectionpool.py b/test/with_dummyserver/test_connectionpool.py
index 1ef9b40f55..0ab7d596ef 100644
--- a/test/with_dummyserver/test_connectionpool.py
+++ b/test/with_dummyserver/test_connectionpool.py
@@ -16,6 +16,7 @@
 from dummyserver.testcase import HTTPDummyServerTestCase, SocketDummyServerTestCase
 from urllib3 import HTTPConnectionPool, encode_multipart_formdata
 from urllib3._collections import HTTPHeaderDict
+from urllib3._typing import _TYPE_FIELD_VALUE_TUPLE, _TYPE_TIMEOUT
 from urllib3.connection import _get_default_user_agent
 from urllib3.exceptions import (
     ConnectTimeoutError,
@@ -27,10 +28,9 @@
     ReadTimeoutError,
     UnrewindableBodyError,
 )
-from urllib3.fields import _TYPE_FIELD_VALUE_TUPLE
 from urllib3.util import SKIP_HEADER, SKIPPABLE_HEADERS
 from urllib3.util.retry import RequestHistory, Retry
-from urllib3.util.timeout import _TYPE_TIMEOUT, Timeout
+from urllib3.util.timeout import Timeout
 
 from .. import INVALID_SOURCE_ADDRESSES, TARPIT_HOST, VALID_SOURCE_ADDRESSES
 from ..port_helpers import find_unused_port
@@ -71,7 +71,7 @@ def test_conn_closed(self) -> None:
                     pool.urlopen("GET", "/")
                 if not conn.is_closed:
                     with pytest.raises(socket.error):
-                        conn.sock.recv(1024)  # type: ignore[attr-defined]
+                        conn.sock.recv(1024)  # type: ignore[union-attr]
             finally:
                 pool._put_conn(conn)
 
@@ -283,7 +283,7 @@ def test_nagle(self) -> None:
             conn = pool._get_conn()
             try:
                 pool._make_request(conn, "GET", "/")
-                tcp_nodelay_setting = conn.sock.getsockopt(  # type: ignore[attr-defined]
+                tcp_nodelay_setting = conn.sock.getsockopt(  # type: ignore[union-attr]
                     socket.IPPROTO_TCP, socket.TCP_NODELAY
                 )
                 assert tcp_nodelay_setting
@@ -307,7 +307,7 @@ def test_socket_options(self, socket_options: tuple[int, int, int]) -> None:
             socket_options=socket_options,
         ) as pool:
             # Get the socket of a new connection.
-            s = pool._new_conn()._new_conn()  # type: ignore[attr-defined]
+            s = pool._new_conn()._new_conn()
             try:
                 using_keepalive = (
                     s.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE) > 0
@@ -326,7 +326,7 @@ def test_disable_default_socket_options(
         with HTTPConnectionPool(
             self.host, self.port, socket_options=socket_options
         ) as pool:
-            s = pool._new_conn()._new_conn()  # type: ignore[attr-defined]
+            s = pool._new_conn()._new_conn()
             try:
                 using_nagle = s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) == 0
                 assert using_nagle
@@ -344,7 +344,7 @@ def test_defaults_are_applied(self) -> None:
                 # Update the default socket options
                 assert conn.socket_options is not None
                 conn.socket_options += [(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)]  # type: ignore[operator]
-                s = conn._new_conn()  # type: ignore[attr-defined]
+                s = conn._new_conn()
                 nagle_disabled = (
                     s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) > 0
                 )
diff --git a/test/with_dummyserver/test_https.py b/test/with_dummyserver/test_https.py
index 28dfa70112..87a6cc93ea 100644
--- a/test/with_dummyserver/test_https.py
+++ b/test/with_dummyserver/test_https.py
@@ -1,6 +1,5 @@
 from __future__ import annotations
 
-import datetime
 import os.path
 import shutil
 import ssl
@@ -31,14 +30,13 @@
 )
 from dummyserver.testcase import HTTPSDummyServerTestCase
 from urllib3 import HTTPSConnectionPool
-from urllib3.connection import RECENT_DATE, HTTPSConnection, VerifiedHTTPSConnection
+from urllib3.connection import HTTPSConnection, VerifiedHTTPSConnection
 from urllib3.exceptions import (
     ConnectTimeoutError,
     InsecureRequestWarning,
     MaxRetryError,
     ProtocolError,
     SSLError,
-    SystemTimeWarning,
 )
 from urllib3.util.ssl_match_hostname import CertificateError
 from urllib3.util.timeout import Timeout
@@ -346,7 +344,7 @@ def test_wrap_socket_failure_resource_leak(self) -> None:
             with pytest.raises(ssl.SSLError):
                 conn.connect()
 
-            assert conn.sock is not None  # type: ignore[attr-defined]
+            assert conn.sock is not None
         finally:
             conn.close()
 
@@ -428,8 +426,7 @@ def test_ssl_unverified_with_ca_certs(self) -> None:
                 # warnings, which we want to ignore here.
                 calls = warn.call_args_list
-                category = calls[0][0][1]
-                assert category == InsecureRequestWarning
+                assert any(c[0][1] == InsecureRequestWarning for c in calls)
 
     def test_assert_hostname_false(self) -> None:
         with HTTPSConnectionPool(
@@ -469,8 +466,8 @@ def test_server_hostname(self) -> None:
             # pyopenssl doesn't let you pull the server_hostname back off the
            # socket, so only add this assertion if the attribute is there (i.e.
            # the python ssl module).
-            if hasattr(conn.sock, "server_hostname"):  # type: ignore[attr-defined]
-                assert conn.sock.server_hostname == "localhost"  # type: ignore[attr-defined]
+            if hasattr(conn.sock, "server_hostname"):
+                assert conn.sock.server_hostname == "localhost"  # type: ignore[union-attr]
 
     def test_assert_fingerprint_md5(self) -> None:
         with HTTPSConnectionPool(
@@ -730,38 +727,11 @@ def test_ssl_correct_system_time(self) -> None:
             w = self._request_without_resource_warnings("GET", "/")
             assert [] == w
 
-    def test_ssl_wrong_system_time(self) -> None:
-        # PyPy 3.10+ workaround raised warning about untrustworthy TLS protocols.
-        if sys.implementation.name == "pypy":
-            warnings.filterwarnings(
-                "ignore", r"ssl.* is deprecated", DeprecationWarning
-            )
-
-        with HTTPSConnectionPool(
-            self.host,
-            self.port,
-            ca_certs=DEFAULT_CA,
-            ssl_minimum_version=self.tls_version(),
-        ) as https_pool:
-            https_pool.cert_reqs = "CERT_REQUIRED"
-            https_pool.ca_certs = DEFAULT_CA
-            with mock.patch("urllib3.connection.datetime") as mock_date:
-                mock_date.date.today.return_value = datetime.date(1970, 1, 1)
-
-                w = self._request_without_resource_warnings("GET", "/")
-
-                assert len(w) == 1
-                warning = w[0]
-
-                assert SystemTimeWarning == warning.category
-                assert isinstance(warning.message, Warning)
-                assert str(RECENT_DATE) in warning.message.args[0]
-
     def _request_without_resource_warnings(
         self, method: str, url: str
     ) -> list[warnings.WarningMessage]:
         with warnings.catch_warnings(record=True) as w:
-            warnings.simplefilter("always")
+            # warnings.simplefilter("always")
             with HTTPSConnectionPool(
                 self.host,
                 self.port,
@@ -812,9 +782,9 @@ def test_tls_protocol_name_of_socket(self) -> None:
             conn = https_pool._get_conn()
             try:
                 conn.connect()
-                if not hasattr(conn.sock, "version"):  # type: ignore[attr-defined]
+                if not hasattr(conn.sock, "version"):
                     pytest.skip("SSLSocket.version() not available")
-                assert conn.sock.version() == self.tls_protocol_name  # type: ignore[attr-defined]
+                assert conn.sock.version() == self.tls_protocol_name  # type: ignore[union-attr]
             finally:
                 conn.close()
 
@@ -887,7 +857,7 @@ def test_tls_version_maximum_and_minimum(self) -> None:
             conn = https_pool._get_conn()
             try:
                 conn.connect()
-                assert conn.sock.version() == self.tls_protocol_name  # type: ignore[attr-defined]
+                assert conn.sock.version() == self.tls_protocol_name  # type: ignore[union-attr]
             finally:
                 conn.close()
 
diff --git a/test/with_dummyserver/test_proxy_poolmanager.py b/test/with_dummyserver/test_proxy_poolmanager.py
index b4fe78bc2d..a6da972d03 100644
--- a/test/with_dummyserver/test_proxy_poolmanager.py
+++ b/test/with_dummyserver/test_proxy_poolmanager.py
@@ -110,7 +110,7 @@ def test_nagle_proxy(self) -> None:
             conn = hc2._get_conn()
             try:
                 hc2._make_request(conn, "GET", "/")
-                tcp_nodelay_setting = conn.sock.getsockopt(  # type: ignore[attr-defined]
+                tcp_nodelay_setting = conn.sock.getsockopt(  # type: ignore[union-attr]
                     socket.IPPROTO_TCP, socket.TCP_NODELAY
                 )
                 assert tcp_nodelay_setting == 0, (