chore(deps): update dependency aiohttp to v3.11.16 - autoclosed #28


Closed
wants to merge 1 commit

Conversation

renovate[bot]
Contributor

@renovate renovate bot commented Dec 11, 2024

This PR contains the following updates:

Package: aiohttp
Change: ==3.11.9 -> ==3.11.16

Release Notes

aio-libs/aiohttp (aiohttp)

v3.11.16

Compare Source


Bug fixes

  • Replaced deprecated asyncio.iscoroutinefunction with its counterpart from inspect
    -- by :user:`layday`.

    Related issues and pull requests on GitHub:
    :issue:`10634`.
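
    The swap above is mechanical: inspect.iscoroutinefunction takes the same
    argument and gives the same answer as the deprecated asyncio variant. A
    stdlib-only sketch (the function names here are illustrative, not aiohttp
    call sites):

    ```python
    import inspect


    async def fetch() -> str:
        """Stand-in coroutine function (hypothetical, not from aiohttp)."""
        return "ok"


    def sync_fetch() -> str:
        """Plain function for contrast."""
        return "ok"


    # inspect.iscoroutinefunction is the non-deprecated replacement for
    # asyncio.iscoroutinefunction and behaves identically for these cases.
    print(inspect.iscoroutinefunction(fetch))       # True
    print(inspect.iscoroutinefunction(sync_fetch))  # False
    ```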

  • Fixed :class:`multidict.CIMultiDict` being mutated when passed to :class:`aiohttp.web.Response` -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10672`.
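
    The principle behind the fix is a defensive copy: a caller-supplied header
    mapping should never be mutated when defaults are filled in. A minimal
    sketch of that idea (a plain dict stands in for multidict.CIMultiDict, and
    build_response_headers is a hypothetical helper, not aiohttp's code):

    ```python
    def build_response_headers(user_headers: dict) -> dict:
        # Copy first, then add defaults, so the caller's mapping is untouched.
        headers = dict(user_headers)
        headers.setdefault("Content-Type", "text/plain; charset=utf-8")
        return headers


    mine = {"X-Request-Id": "abc"}
    resp_headers = build_response_headers(mine)
    print("Content-Type" in mine)  # False: caller's mapping was not mutated
    ```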


v3.11.15

Compare Source


Bug fixes

  • Reverted explicitly closing sockets if an exception is raised during create_connection -- by :user:`bdraco`.

    This change originally appeared in aiohttp 3.11.13.

    Related issues and pull requests on GitHub:
    :issue:`10464`, :issue:`10617`, :issue:`10656`.

Miscellaneous internal changes

  • Improved performance of WebSocket buffer handling -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10601`.

  • Improved performance of serializing headers -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10625`.


v3.11.14

Compare Source


Bug fixes

  • Fixed an issue where DNS queries were delayed indefinitely when an exception occurred in a trace.send_dns_cache_miss
    -- by :user:`logioniz`.

    Related issues and pull requests on GitHub:
    :issue:`10529`.

  • Fixed DNS resolution on platforms that don't support socket.AI_ADDRCONFIG -- by :user:`maxbachmann`.

    Related issues and pull requests on GitHub:
    :issue:`10542`.
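
    The portable pattern for an optional socket constant is to fall back to 0
    (no flag) where the platform does not define it. A sketch of that general
    technique, not aiohttp's resolver code:

    ```python
    import socket

    # Some platforms do not define AI_ADDRCONFIG; getattr with a default of 0
    # avoids an AttributeError and simply passes no flag in that case.
    ADDRCONFIG = getattr(socket, "AI_ADDRCONFIG", 0)


    def resolve(host: str, port: int):
        """Hypothetical resolver wrapper using the optional flag."""
        return socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM, flags=ADDRCONFIG
        )
    ```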

  • The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10551`.

  • Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.

    Special thanks to :user:`availov` for reporting the issue.

    Related issues and pull requests on GitHub:
    :issue:`10556`.

  • Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10569`.

Features

  • Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10564`.

Miscellaneous internal changes

  • Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10552`.


v3.11.13

Compare Source


Bug fixes

  • Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10434`.

  • Changed connection creation to explicitly close sockets if an exception is raised in the event loop's create_connection method -- by :user:`top-oai`.

    Related issues and pull requests on GitHub:
    :issue:`10464`.

Packaging updates and notes for downstreams

  • Fixed test test_write_large_payload_deflate_compression_data_in_eof_writelines failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10423`.

Miscellaneous internal changes

  • Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.

    Previously, the error messages were empty strings, which made it hard to determine what went wrong.

    Related issues and pull requests on GitHub:
    :issue:`10422`.


v3.11.12

Compare Source


Bug fixes

  • MultipartForm.decode() now follows RFC 1341 section 7.2.1 with a CRLF after the boundary
    -- by :user:`imnotjames`.

    Related issues and pull requests on GitHub:
    :issue:`10270`.

  • Restored the missing total_bytes attribute to EmptyStreamReader -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10387`.

Features

  • Updated :py:func:`~aiohttp.request` to make it accept _RequestOptions kwargs
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10300`.

  • Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10332`.

Improved documentation

  • Added aiohttp-openmetrics to the list of third-party libraries -- by :user:`jelmer`.

    Related issues and pull requests on GitHub:
    :issue:`10304`.

Packaging updates and notes for downstreams

  • Added missing files to the source distribution to fix Makefile targets.
    Added a cythonize-nodeps target to run Cython without invoking pip to install dependencies.

    Related issues and pull requests on GitHub:
    :issue:`10366`.

  • Started building armv7l musllinux wheels -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10404`.

Contributor-facing changes

  • The CI/CD workflow has been updated to use the upload-artifact v4 and download-artifact v4 GitHub Actions -- by :user:`silamon`.

    Related issues and pull requests on GitHub:
    :issue:`10281`.

Miscellaneous internal changes

  • Restored support for zero copy writes when using Python 3.12.9+ or Python 3.13.2+ -- by :user:`bdraco`.

    Zero copy writes were previously disabled due to :cve:`2024-12254`, which is resolved in these Python versions.

    Related issues and pull requests on GitHub:
    :issue:`10137`.


v3.11.11

Compare Source


Bug fixes

  • Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the quote_cookie setting from ClientSession._cookie_jar when processing the cookies parameter
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10093`.
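
    The quote_cookie setting governs whether outgoing cookie values are quoted
    on the wire. The quoting rules themselves come from the stdlib
    http.cookies machinery; a stdlib-only sketch of what quoting means (this
    is not aiohttp code):

    ```python
    from http.cookies import SimpleCookie

    # A value made only of "legal" cookie characters is emitted as-is.
    plain = SimpleCookie()
    plain["session"] = "abc123"
    print(plain["session"].OutputString())   # session=abc123

    # A value with a space (not a legal character) gets double-quoted.
    spaced = SimpleCookie()
    spaced["session"] = "a b"
    print(spaced["session"].OutputString())  # session="a b"
    ```

    aiohttp's CookieJar(quote_cookie=...) toggles whether this quoting is
    applied; the fix makes a per-request cookies= argument honor the same
    setting as the session's jar.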

  • Fixed the type of SSLContext for some static type checkers (e.g. pyright).

    Related issues and pull requests on GitHub:
    :issue:`10099`.

  • Updated the :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.

    Related issues and pull requests on GitHub:
    :issue:`10154`.

  • Fixed a hang where a connection previously used for a streaming
    download could be returned to the pool in a paused state
    -- by :user:`javitonino`.

    Related issues and pull requests on GitHub:
    :issue:`10169`.

Features

  • Enabled ALPN on default SSL contexts. This improves compatibility with some
    proxies which don't work without this extension.
    -- by :user:`Cycloctane`.

    Related issues and pull requests on GitHub:
    :issue:`10156`.
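
    Per the diff further down, the change amounts to calling
    set_alpn_protocols(("http/1.1",)) on the contexts aiohttp builds. The same
    one-liner on a stdlib context, as a sketch:

    ```python
    import ssl

    # Advertise HTTP/1.1 via the ALPN TLS extension on a default client
    # context; some proxies refuse connections that omit ALPN entirely.
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["http/1.1"])
    ```

    The selected protocol (if any) is available after the handshake via
    SSLSocket.selected_alpn_protocol().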

Miscellaneous internal changes

  • Fixed an infinite loop that can occur when using aiohttp in combination
    with `async-solipsism`_ -- by :user:`bmerry`.

    .. _async-solipsism: https://github.com/bmerry/async-solipsism

    Related issues and pull requests on GitHub:
    :issue:`10149`.


v3.11.10

Compare Source


Bug fixes

  • Fixed a race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during prepare -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10101`, :issue:`10113`.

  • Replaced the deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10102`.
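
    Since guess_file_type only exists on Python 3.13+, the fix version-gates
    the lookup (the same gate appears in the payload.py hunk of the diff
    below, but this sketch stands on its own):

    ```python
    import mimetypes
    import sys

    # On 3.13+, guess_file_type is the preferred API for filesystem paths;
    # older interpreters fall back to guess_type.
    if sys.version_info >= (3, 13):
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type

    content_type = guesser("report.pdf")[0]
    print(content_type)  # application/pdf
    ```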

  • Disabled zero copy writes in the StreamWriter -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:
    :issue:`10125`.



Configuration

📅 Schedule: Branch creation - "* * * * 2-4" (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.



This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.10 chore(deps): update dependency aiohttp to v3.11.11 Dec 21, 2024
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 52f07e5 to f0c973b Compare December 21, 2024 03:01
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.11 chore(deps): update dependency aiohttp to v3.11.10 Dec 21, 2024
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch 2 times, most recently from 2f0d6a1 to cdf822c Compare December 25, 2024 21:58
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.10 chore(deps): update dependency aiohttp to v3.11.11 Dec 25, 2024

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..d5e119b779d 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..b07cec6a093 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,114 @@
 
 .. towncrier release notes start
 
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..589784b29cb 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -170,6 +171,7 @@ Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +366,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..b9af3f829f7 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.11"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..3b1dc08544f 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7e0986df657 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..28b14f7a791 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -72,7 +72,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +90,14 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        transport.write(b"".join(chunks))
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..6126fb5695d 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # Ie a regular file to send
+    NOT_ACCEPTABLE = auto()  # Ie a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # Ie If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..3306b86bded 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..c4e10b44987 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -245,6 +245,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..740e3e2d559 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..72e49ed9edf 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..892ae6b164c 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..f7f98330e1f 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..05af9ae25ad 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..5f316fad2f7 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -104,16 +104,15 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     transport.write.reset_mock()  # type: ignore[attr-defined]
-    assert not transport.writelines.called  # type: ignore[attr-defined]
 
     # This payload compresses to 20447 bytes
     payload = b"".join(
         [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
     )
     await msg.write_eof(payload)
-    assert not transport.write.called  # type: ignore[attr-defined]
-    assert transport.writelines.called  # type: ignore[attr-defined]
-    chunks.extend(transport.writelines.mock_calls[0][1][0])  # type: ignore[attr-defined]
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
     content = b"".join(chunks)
     assert zlib.decompress(content) == (b"data" * 4096) + payload
 
@@ -180,7 +179,7 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
-    chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
     assert content == expected
@@ -216,7 +215,7 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
-    chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
     assert content == expected
@@ -235,16 +234,16 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
     await msg.write_eof(payload)
-    assert not transport.write.called  # type: ignore[attr-defined]
 
-    chunks = []
-    for write_lines_call in transport.writelines.mock_calls:  # type: ignore[attr-defined]
-        chunked_payload = list(write_lines_call[1][0])[1:]
-        chunked_payload.pop()
-        chunks.extend(chunked_payload)
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
 
-    assert all(chunks)
-    content = b"".join(chunks)
+    content = b"".join(compressed)
     assert zlib.decompress(content) == (b"data" * 4096) + payload
 
 
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))

Description

This pull request updates the aiohttp library with various improvements and bug fixes. It includes changes to the SSL context handling, file response optimizations, and several other enhancements across different modules.
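The SSL context handling mentioned here follows a standard pattern, visible in the web_runner.py and worker.py hunks below: type checkers always see the real `ssl` types, while runtime still degrades gracefully on Pythons built without `ssl`. A minimal standalone sketch of the idea (the `describe` helper is illustrative, not aiohttp code):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type checkers (mypy, pyright) always resolve the real SSLContext
    # type, even when analyzing a platform without the ssl module.
    from ssl import SSLContext
else:
    try:
        from ssl import SSLContext
    except ImportError:  # ssl support compiled out of this Python
        SSLContext = object

def describe(ctx_cls: type) -> str:
    # Illustrative helper: shows the name resolves either way.
    return ctx_cls.__name__

print(describe(SSLContext))
```

The previous bare `try/except ImportError` made `SSLContext` an alias of `object` from the type checker's point of view, which is why the `type: ignore` comments were needed and why annotations using it were effectively unchecked.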

Changes


  1. .github/workflows/ci-cd.yml:

    • Updated the actions/cache version to 4.2.0
    • Changed Python version for setup from 3.12 to 3.13
  2. CHANGES.rst:

    • Added changelog entries for versions 3.11.11 and 3.11.10, including bug fixes and features
  3. aiohttp/__init__.py:

    • Updated version to 3.11.11
  4. aiohttp/abc.py:

    • Added quote_cookie property to AbstractCookieJar
    • Updated write method signature in AbstractStreamWriter to accept Union[bytes, bytearray, memoryview]
  5. aiohttp/client.py:

    • Updated _request method to reuse quote_cookie setting from ClientSession._cookie_jar
  6. aiohttp/client_exceptions.py, aiohttp/client_reqrep.py, aiohttp/connector.py, aiohttp/web_runner.py, aiohttp/worker.py:

    • Refactored SSL context imports for better type checking
  7. aiohttp/connector.py:

    • Added ALPN protocol setting for SSL context
  8. aiohttp/cookiejar.py:

    • Implemented quote_cookie property
  9. aiohttp/http_writer.py:

    • Updated _write and write methods to accept Union[bytes, bytearray, memoryview]
  10. aiohttp/payload.py:

    • Updated mimetypes.guess_type usage for Python 3.13+ compatibility
  11. aiohttp/streams.py:

    • Added logic to resume reading when EOF is received
  12. aiohttp/web_fileresponse.py:

    • Refactored file response handling for better performance and security
  13. aiohttp/web_protocol.py:

    • Updated keepalive handling logic
  14. Various test files:

    • Added new tests and updated existing ones to cover the changes
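The mimetypes change in items 10 and 12 can be sketched as follows. `guess_content_type` is a hypothetical helper name, but the version switch mirrors the web_fileresponse.py hunk in the diff: `mimetypes.guess_file_type`, added in Python 3.13, treats its argument as a filesystem path, whereas the older `guess_type` parses it as a URL and can misfire on paths containing characters like `;` or `#`.

```python
import mimetypes
import sys

FALLBACK_CONTENT_TYPE = "application/octet-stream"

def guess_content_type(path: str) -> str:
    # Prefer the path-aware API on 3.13+, fall back to the URL-based
    # one on older interpreters; both return (type, encoding) tuples.
    if sys.version_info >= (3, 13):
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type
    return guesser(path)[0] or FALLBACK_CONTENT_TYPE

print(guess_content_type("index.html"))        # text/html
print(guess_content_type("file.unknownext"))   # falls back
```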
sequenceDiagram
    participant Client
    participant ClientSession
    participant Connector
    participant FileResponse
    participant StreamWriter

    Client->>ClientSession: request()
    ClientSession->>Connector: connect()
    Connector->>Connector: _make_ssl_context()
    Note over Connector: Set ALPN protocols
    Connector-->>ClientSession: Connection
    ClientSession->>FileResponse: prepare()
    FileResponse->>FileResponse: _make_response()
    FileResponse->>StreamWriter: write()
    StreamWriter-->>ClientSession: Response
    ClientSession-->>Client: Response

Possible Issues

  • The change from Python 3.12 to 3.13 in the CI/CD workflow might cause issues if Python 3.13 is not yet stable or widely available.

Security Hotspots

No significant security hotspots were identified in this change.

This sequence diagram illustrates the main flow of a client request, highlighting the areas where significant changes have been made, such as SSL context creation with ALPN support and the optimized file response handling.
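The one-character web_protocol.py fix (item 13) is easy to miss: with `now <= close_time`, a keepalive timer callback that fired exactly at the deadline rescheduled itself instead of closing the connection. A toy model of the decision, with hypothetical names:

```python
def keepalive_action(now: float, close_time: float) -> str:
    """Mirrors _process_keepalive's deadline check after the fix:
    reschedule only while strictly before the deadline."""
    if now < close_time:
        return "reschedule"
    return "close"

# Before the fix, the equality case rescheduled forever when the loop
# clock landed exactly on close_time, which is the scenario the new
# test_keepalive_expires_on_time functional test pins down.
assert keepalive_action(9.99, 10.0) == "reschedule"
assert keepalive_action(10.0, 10.0) == "close"
```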

@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from cdf822c to feaab5d Compare February 13, 2025 02:55
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.11 chore(deps): update dependency aiohttp to v3.11.12 Feb 13, 2025
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from feaab5d to 8a3a7ac Compare March 3, 2025 18:12
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.12 chore(deps): update dependency aiohttp to v3.11.13 Mar 3, 2025

github-actions bot commented Mar 3, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..39c45196c26 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,274 @@
 
 .. towncrier release notes start
 
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..1f0d1e7d2d7 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -41,6 +42,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +168,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +368,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..786eed63650 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.13"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
+        auth - aiohttp.helpers.BasicAuth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..14433ba37e1 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
@@ -1102,6 +1108,7 @@ async def _wrap_create_connection(
         client_error: Type[Exception] = ClientConnectorError,
         **kwargs: Any,
     ) -> Tuple[asyncio.Transport, ResponseHandler]:
+        sock: Union[socket.socket, None] = None
         try:
             async with ceil_timeout(
                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
@@ -1113,7 +1120,11 @@ async def _wrap_create_connection(
                     interleave=self._interleave,
                     loop=self._loop,
                 )
-                return await self._loop.create_connection(*args, **kwargs, sock=sock)
+                connection = await self._loop.create_connection(
+                    *args, **kwargs, sock=sock
+                )
+                sock = None
+                return connection
         except cert_errors as exc:
             raise ClientConnectorCertificateError(req.connection_key, exc) from exc
         except ssl_errors as exc:
@@ -1122,6 +1133,12 @@ async def _wrap_create_connection(
             if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                 raise
             raise client_error(req.connection_key, exc) from exc
+        finally:
+            if sock is not None:
+                # Will be hit if an exception is thrown before the event loop takes the socket.
+                # In that case, proactively close the socket to guard against event loop leaks.
+                # For example, see https://github.com/MagicStack/uvloop/issues/653.
+                sock.close()
 
     async def _wrap_existing_connection(
         self,
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
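The payload change above selects `mimetypes.guess_file_type()` on Python 3.13+, where passing filesystem paths to `guess_type()` is deprecated. A self-contained sketch of the same version gate:

```python
import mimetypes
import sys

def guess_content_type(filename: str,
                       default: str = "application/octet-stream") -> str:
    # mimetypes.guess_file_type() was added in Python 3.13; on older
    # versions guess_type() is the only API for filesystem paths.
    if sys.version_info >= (3, 13):
        guessed = mimetypes.guess_file_type(filename)[0]
    else:
        guessed = mimetypes.guess_type(filename)[0]
    return guessed or default
```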
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
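The `web.py` hunk above moves the `SSLContext` import under `TYPE_CHECKING` and changes the runtime fallback from `Any` to `object`: type checkers always see the real class, while on `ssl`-less builds the name stays a genuine class, so `isinstance()` checks keep working (`Any` is not a class at runtime). A sketch of the pattern:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type checkers always resolve the real class.
    from ssl import SSLContext
else:
    try:
        from ssl import SSLContext
    except ImportError:  # ssl may be absent in minimal builds
        SSLContext = object  # class placeholder: still valid in isinstance()

def is_ssl_context(value: object) -> bool:
    # Works whether or not the ssl module exists, unlike an Any fallback.
    return isinstance(value, SSLContext)
```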
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # Ie a regular file to send
+    NOT_ACCEPTABLE = auto()  # Ie a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # Ie If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
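Throughout the `web_fileresponse.py` refactor above, the ETag is derived from the stat result as hex `st_mtime_ns` and hex `st_size`, dash-separated; computing it in `_make_response` (inside the executor) lets all the conditional-request checks run off a single `stat()`. A small sketch of that ETag shape against a real file (`make_etag` is an illustrative helper, not aiohttp API):

```python
import os
import tempfile

def make_etag(st: os.stat_result) -> str:
    # Same shape as the diff: hex mtime (ns), dash, hex size.
    return f"{st.st_mtime_ns:x}-{st.st_size:x}"

# Example against a real temporary file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    path = f.name
etag = make_etag(os.stat(path))
os.unlink(path)
```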
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..e4c347e5a9e 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -608,26 +608,28 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -694,9 +696,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
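One subtle fix in the `web_protocol.py` hunk above is the keep-alive comparison changing from `now <= close_time` to `now < close_time`: when the timer fires exactly at the deadline, the connection now closes instead of being re-armed. The decision reduces to a tiny predicate (a hypothetical extraction, not aiohttp API):

```python
def should_reschedule(now: float, close_time: float) -> bool:
    """True if the keep-alive check fired early and must be re-armed.

    With the old ``now <= close_time`` test, a callback firing exactly at
    close_time rescheduled itself instead of closing the connection.
    """
    return now < close_time
```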
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..a448bca101e 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. If `compress` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                     :class:`multidict.MultiDict` or
+                     :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                     :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                     (**Warning:** content will not be encoded by *aiohttp*)
 
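The expanded ``params`` documentation above lists three accepted shapes: a mapping, an iterable of key/value pairs, or a pre-encoded string. The stdlib shows how the first two collapse to the same query string, while a raw ``str`` is passed through unencoded (this uses ``urllib.parse`` for illustration; aiohttp does its own encoding via ``yarl``):

```python
from urllib.parse import urlencode

# Mapping and iterable-of-pairs forms encode identically; a raw str
# (the third accepted form) would be appended as-is, unencoded.
as_mapping = urlencode({"q": "aiohttp", "page": 2})
as_pairs = urlencode([("q", "aiohttp"), ("page", 2)])
```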
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter skips
+      that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` can not be combined
+                         with *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-encoding* and *content-length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` to skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the *proxy* parameter has been provided.
+
+   :param trace_request_ctx: Object passed as a keyword argument to each new
+      :class:`TraceConfig` object instantiated,
+      used to give information to the
+      tracers that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
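The redirect, timeout, and SSL parameters documented above can be combined in a single call; a minimal sketch (the URL and coroutine name here are illustrative, not part of the diff):

```python
import aiohttp

# Defaults described above: 300 s total timeout, 30 s socket connect timeout.
timeout = aiohttp.ClientTimeout(total=300, connect=30)

async def fetch(url: str) -> str:
    async with aiohttp.ClientSession(timeout=timeout) as session:
        # allow_redirects=True (the default) follows up to max_redirects hops;
        # ssl=True enables the default certificate check via
        # ssl.create_default_context().
        async with session.get(
            url, allow_redirects=True, max_redirects=10, ssl=True
        ) as resp:
            return await resp.text()
```

Passing ``allow_redirects=False`` instead returns the original 3xx response, and ``max_redirects`` is then ignored.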
 
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..95a98cd4fc0 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -221,6 +221,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
 
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..ac3131e9750 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
diff --git tests/test_benchmarks_web_fileresponse.py tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..54cd5e92f80 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -902,6 +902,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..e79b36a673d 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -617,6 +617,29 @@ async def certificate_error(*args, **kwargs):
     await conn.close()
 
 
+async def test_tcp_connector_closes_socket_on_error(
+    loop: asyncio.AbstractEventLoop, start_connection: mock.AsyncMock
+) -> None:
+    req = ClientRequest("GET", URL("https://127.0.0.1:443"), loop=loop)
+
+    conn = aiohttp.TCPConnector()
+    with (
+        mock.patch.object(
+            conn._loop,
+            "create_connection",
+            autospec=True,
+            spec_set=True,
+            side_effect=ValueError,
+        ),
+        pytest.raises(ValueError),
+    ):
+        await conn.connect(req, [], ClientTimeout())
+
+    assert start_connection.return_value.close.called
+
+    await conn.close()
+
+
 async def test_tcp_connector_server_hostname_default(
     loop: Any, start_connection: mock.AsyncMock
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..c39fe3c7251 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,7 +2,7 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
@@ -14,7 +14,25 @@
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +110,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +119,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +225,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +281,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +316,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..0591426c57b 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..9098ef9e7bf 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This is a significant update to the aiohttp library that includes multiple bug fixes, security improvements, and performance optimizations. The changes span versions 3.11.10 through 3.11.14, with several key improvements to HTTP protocol handling, WebSocket messaging, and SSL/TLS configuration.

Possible Issues

  1. Zero-copy writes stay disabled on Python versions that predate the CVE-2024-12254 fix (3.12.x < 3.12.9, 3.13.x < 3.13.2), which could reduce write performance on those interpreters.
  2. The connector now explicitly closes the socket when the event loop's create_connection raises, which could affect edge cases in connection-error scenarios.
  3. The FileResponse race-condition fix changes file response handling and could impact applications that rely on specific file-serving behaviors.
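The version gate in the first item can be sketched as a small helper. This is a hypothetical function (the real check lives inside aiohttp's stream writer), shown only to make the affected version ranges concrete:

```python
import sys

# Hypothetical helper (not aiohttp's actual API) illustrating the gate
# described above: zero-copy writes are only considered safe on CPython
# builds that include the CVE-2024-12254 fix, i.e. 3.12.9+ or 3.13.2+.
def zero_copy_writes_enabled(version_info=sys.version_info) -> bool:
    if version_info >= (3, 13, 2):
        return True
    if version_info >= (3, 13):
        return False  # 3.13.0 / 3.13.1 predate the fix
    return version_info >= (3, 12, 9)
```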

Security Hotspots

  1. SSL/ALPN handling modifications in the TCP connector - changes to the default SSL context configuration need careful review
  2. File handling changes in FileResponse - review for potential path traversal issues
  3. Changes to socket handling during connection errors - verify sockets are always closed to avoid resource leaks
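For the second hotspot, the standard defense is to resolve the requested path and confirm it stays under the configured root. A minimal sketch with a hypothetical helper name (not aiohttp's implementation, which also handles symlink policy):

```python
import pathlib

def is_safe_static_path(root: pathlib.Path, requested: str) -> bool:
    # Resolve both sides so symlinks and ".." segments are normalized,
    # then require the candidate to remain inside the root directory.
    root = root.resolve()
    candidate = (root / requested).resolve()
    return candidate.is_relative_to(root)  # Path.is_relative_to: Python 3.9+
```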

Changes

  1. CI/CD Updates
  • Updated GitHub Actions to use newer versions
  • Added support for new architectures (armv7l musllinux)
  • Updated Python versions and testing configurations
  2. Core Changes
  • Improved error messages for WebSocket disconnects
  • Fixed race condition in FileResponse
  • Enhanced SSL/ALPN support
  • Updated memory management for file operations
  • Improved connection handling and error reporting
  3. Protocol Improvements
  • Fixed multipart form handling to follow RFC 1341
  • Enhanced keepalive connection management
  • Improved error logging with remote address information
  4. Documentation
  • Enhanced API documentation
  • Added new third-party libraries to documentation
  • Updated configuration for ReadTheDocs
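The RFC 1341 item concerns boundary framing: every boundary and header line must end with CRLF, never a bare LF, which is exactly what the updated test_web_response.py expectation in the diff checks. A minimal encoder sketch with a hypothetical function name (not aiohttp's MultipartWriter):

```python
def encode_multipart(boundary: str, parts) -> bytes:
    # RFC 1341 (and its successor RFC 2046) framing: boundary delimiter
    # lines and header lines are terminated by CRLF, not a bare LF.
    out = bytearray()
    for headers, body in parts:
        out += f"--{boundary}\r\n".encode()
        for name, value in headers.items():
            out += f"{name}: {value}\r\n".encode()
        out += b"\r\n" + body + b"\r\n"
    out += f"--{boundary}--\r\n".encode()  # closing delimiter
    return bytes(out)
```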
sequenceDiagram
    participant Client
    participant Server
    participant FileSystem
    participant SSLContext

    Client->>Server: Request Connection
    Server->>SSLContext: Configure ALPN
    SSLContext-->>Server: ALPN Configuration
    Server-->>Client: Establish Connection

    Client->>Server: HTTP Request
    alt File Request
        Server->>FileSystem: Check File Status
        FileSystem-->>Server: File Information
        Server->>Server: Validate Path & Permissions
        Server-->>Client: File Response
    else Normal Request
        Server->>Server: Process Request
        Server-->>Client: HTTP Response
    end

    alt Keep-Alive
        Server->>Server: Start Keep-Alive Timer
        Server-->>Client: Keep Connection
    else Close
        Server-->>Client: Close Connection
    end
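The keep-alive branch in the diagram matches the deadline pattern that the new test_keepalive_expires_on_time test exercises: the handler stores an absolute loop-time close deadline and clears its handle once it fires. A standalone sketch with a hypothetical class (not aiohttp's RequestHandler):

```python
import asyncio

class KeepAliveTimer:
    """Close an idle connection once loop.time() reaches the deadline."""

    def __init__(self, loop: asyncio.AbstractEventLoop, timeout: float, on_expire) -> None:
        # Absolute deadline, analogous to _next_keepalive_close_time.
        self._next_close_time = loop.time() + timeout
        self._handle = loop.call_at(self._next_close_time, self._expire, on_expire)

    def _expire(self, on_expire) -> None:
        self._handle = None  # mirrors _keepalive_handle being cleared
        on_expire()

    def cancel(self) -> None:
        # A new request arrived in time; keep the connection open.
        if self._handle is not None:
            self._handle.cancel()
            self._handle = None
```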

@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 8a3a7ac to 27e1aab Compare March 24, 2025 08:09
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.13 chore(deps): update dependency aiohttp to v3.11.14 Mar 24, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..3c8c12b8d95 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,351 @@
 
 .. towncrier release notes start
 
+3.11.14 (2025-03-16)
+====================
+
+Bug fixes
+---------
+
+- Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a ``trace.send_dns_cache_miss``
+  -- by :user:`logioniz`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10529`.
+
+
+
+- Fixed DNS resolution on platforms that don't support ``socket.AI_ADDRCONFIG`` -- by :user:`maxbachmann`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10542`.
+
+
+
+- The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10551`.
+
+
+
+- Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.
+
+  Special thanks to :user:`availov` for reporting the issue.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10556`.
+
+
+
+- Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10569`.
+
+
+
+
+Features
+--------
+
+- Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10564`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10552`.
+
+
+
+
+----
+
+
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..953af52498a 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -30,6 +31,7 @@ Alexandru Mihai
 Alexey Firsov
 Alexey Nikitin
 Alexey Popravka
+Alexey Stavrov
 Alexey Stepanov
 Amin Etesamian
 Amit Tulshyan
@@ -41,6 +43,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +169,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +369,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..0628433d35b 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.14"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_websocket/reader_py.py aiohttp/_websocket/reader_py.py
index 94d20010890..1645b3949b1 100644
--- aiohttp/_websocket/reader_py.py
+++ aiohttp/_websocket/reader_py.py
@@ -93,6 +93,7 @@ def _release_waiter(self) -> None:
     def feed_eof(self) -> None:
         self._eof = True
         self._release_waiter()
+        self._exception = None  # Break cyclic references
 
     def feed_data(self, data: "WSMessage", size: "int_") -> None:
         self._size += size
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+          string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+          send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+          the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) aiohttp.helpers.BasicAuth named tuple
+          representing HTTP Basic Auth
+        allow_redirects - (optional) If set to False, do not follow
+          redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+          with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+          connection pooling.
+        read_until_eof - Read response until eof if response
+          does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+          total timeout by default.
+        Usage::
+          >>> import aiohttp
+          >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+          ...    print(resp)
+          ...    data = await resp.read()
+          <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
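The new `request()` overload above types `**kwargs` with `Unpack[_RequestOptions]` on Python 3.11+ while keeping a permissive `Any` signature at runtime. A minimal standalone sketch of that pattern (the `Options` TypedDict and `fetch` function here are illustrative, not aiohttp's actual definitions):

```python
import sys
from typing import TYPE_CHECKING, Any, TypedDict

if TYPE_CHECKING and sys.version_info >= (3, 11):
    from typing import Unpack  # available since Python 3.11

class Options(TypedDict, total=False):
    timeout: float
    allow_redirects: bool

if sys.version_info >= (3, 11) and TYPE_CHECKING:
    # Type checkers see precise per-keyword types via Unpack...
    def fetch(url: str, **kwargs: Unpack[Options]) -> str: ...
else:
    # ...while the runtime definition stays permissive and pops
    # options out of kwargs, as the diff does for cookies/timeout.
    def fetch(url: str, **kwargs: Any) -> str:
        timeout = kwargs.pop("timeout", 300.0)
        return f"GET {url} timeout={timeout}"
```

Because `TYPE_CHECKING` is false at runtime, only the `else` branch ever executes; the typed overload exists purely for static analysis.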
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
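The restructured import above lets type checkers always resolve the real `ssl` module while the runtime still tolerates CPython builds without OpenSSL. A minimal sketch of that `TYPE_CHECKING`/`try`-import split:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type checkers always see the real module and class.
    import ssl
    SSLContext = ssl.SSLContext
else:
    try:
        import ssl
        SSLContext = ssl.SSLContext
    except ImportError:  # CPython built without OpenSSL
        ssl = SSLContext = None

def has_ssl() -> bool:
    # Callers can probe for TLS support at runtime.
    return ssl is not None
```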
diff --git aiohttp/client_proto.py aiohttp/client_proto.py
index 79f033e3e12..2d64b3f3644 100644
--- aiohttp/client_proto.py
+++ aiohttp/client_proto.py
@@ -64,6 +64,7 @@ def force_close(self) -> None:
         self._should_close = True
 
     def close(self) -> None:
+        self._exception = None  # Break cyclic references
         transport = self.transport
         if transport is not None:
             transport.close()
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..e5cf3674cba 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
@@ -1009,11 +1015,11 @@ async def _resolve_host_with_throttle(
         This method must be run in a task and shielded from cancellation
         to avoid cancelling the underlying lookup.
         """
-        if traces:
-            for trace in traces:
-                await trace.send_dns_cache_miss(host)
         try:
             if traces:
+                for trace in traces:
+                    await trace.send_dns_cache_miss(host)
+
                 for trace in traces:
                     await trace.send_dns_resolvehost_start(host)
 
@@ -1102,6 +1108,7 @@ async def _wrap_create_connection(
         client_error: Type[Exception] = ClientConnectorError,
         **kwargs: Any,
     ) -> Tuple[asyncio.Transport, ResponseHandler]:
+        sock: Union[socket.socket, None] = None
         try:
             async with ceil_timeout(
                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
@@ -1113,7 +1120,11 @@ async def _wrap_create_connection(
                     interleave=self._interleave,
                     loop=self._loop,
                 )
-                return await self._loop.create_connection(*args, **kwargs, sock=sock)
+                connection = await self._loop.create_connection(
+                    *args, **kwargs, sock=sock
+                )
+                sock = None
+                return connection
         except cert_errors as exc:
             raise ClientConnectorCertificateError(req.connection_key, exc) from exc
         except ssl_errors as exc:
@@ -1122,6 +1133,15 @@ async def _wrap_create_connection(
             if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                 raise
             raise client_error(req.connection_key, exc) from exc
+        finally:
+            if sock is not None:
+                # Will be hit if an exception is thrown before the event loop takes the socket.
+                # In that case, proactively close the socket to guard against event loop leaks.
+                # For example, see https://github.com/MagicStack/uvloop/issues/653.
+                try:
+                    sock.close()
+                except OSError as exc:
+                    raise client_error(req.connection_key, exc) from exc
 
     async def _wrap_existing_connection(
         self,
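The `finally` clause added to `_wrap_create_connection` closes the socket only when `create_connection` never took ownership of it; setting `sock = None` after a successful hand-off marks the transfer. A standalone sketch of that ownership-transfer pattern, with a generic `hand_off` callable standing in for the event loop:

```python
import socket
from typing import Callable, Optional, TypeVar

T = TypeVar("T")

def create_with_guard(hand_off: Callable[[socket.socket], T]) -> T:
    """Create a socket and pass it to hand_off; close it if ownership never transfers."""
    sock: Optional[socket.socket] = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        result = hand_off(sock)
        # hand_off (like loop.create_connection) now owns the fd:
        # drop our reference so the finally block skips the close.
        sock = None
        return result
    finally:
        if sock is not None:
            # Reached only when hand_off raised before taking ownership;
            # close proactively so the descriptor cannot leak.
            sock.close()
```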
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/helpers.py aiohttp/helpers.py
index 8038931ebec..ace4f0e9b53 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -21,7 +21,7 @@
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
-from types import TracebackType
+from types import MappingProxyType, TracebackType
 from typing import (
     Any,
     Callable,
@@ -357,6 +357,20 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+@functools.lru_cache(maxsize=56)
+def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
+    """Parse Content-Type header.
+
+    Returns a tuple of the parsed content type and a
+    MappingProxyType of parameters.
+    """
+    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    content_type = msg.get_content_type()
+    params = msg.get_params(())
+    content_dict = dict(params[1:])  # First element is content type again
+    return content_type, MappingProxyType(content_dict)
+
+
 def guess_filename(obj: Any, default: Optional[str] = None) -> Optional[str]:
     name = getattr(obj, "name", None)
     if name and isinstance(name, str) and name[0] != "<" and name[-1] != ">":
@@ -710,10 +724,10 @@ def _parse_content_type(self, raw: Optional[str]) -> None:
             self._content_type = "application/octet-stream"
             self._content_dict = {}
         else:
-            msg = HeaderParser().parsestr("Content-Type: " + raw)
-            self._content_type = msg.get_content_type()
-            params = msg.get_params(())
-            self._content_dict = dict(params[1:])  # First element is content type again
+            content_type, content_mapping_proxy = parse_content_type(raw)
+            self._content_type = content_type
+            # _content_dict needs to be mutable so we can update it
+            self._content_dict = content_mapping_proxy.copy()
 
     @property
     def content_type(self) -> str:
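The new `parse_content_type` helper memoizes parsed headers with `lru_cache` and returns a `MappingProxyType` so one caller cannot mutate a cached entry that later callers will see; code that needs a mutable dict takes a copy, as `_parse_content_type` does above. A self-contained sketch of that cache-safety idiom (the header parsing here is deliberately simplified):

```python
from functools import lru_cache
from types import MappingProxyType
from typing import Mapping, Tuple

@lru_cache(maxsize=56)
def parse_ct(raw: str) -> Tuple[str, Mapping[str, str]]:
    # Naive split: "text/html; charset=utf-8" -> ("text/html", {"charset": "utf-8"})
    ctype, _, rest = raw.partition(";")
    params = {}
    for item in rest.split(";"):
        if "=" in item:
            key, _, value = item.strip().partition("=")
            params[key] = value
    # Freeze the dict so a cache hit can never be mutated in place.
    return ctype.strip(), MappingProxyType(params)
```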
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it is not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
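Python 3.13 added `mimetypes.guess_file_type` for filesystem paths, and the version gate above selects it when available while falling back to `guess_type` on older interpreters. A standalone sketch of that gate:

```python
import mimetypes
import sys

def guess_content_type(filename: str,
                       default: str = "application/octet-stream") -> str:
    if sys.version_info >= (3, 13):
        # guess_file_type (new in 3.13) is the path-oriented API.
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type
    # Both return (type, encoding); fall back when the type is unknown.
    return guesser(filename)[0] or default
```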
diff --git aiohttp/resolver.py aiohttp/resolver.py
index 9c744514fae..e14179cc8a2 100644
--- aiohttp/resolver.py
+++ aiohttp/resolver.py
@@ -18,6 +18,9 @@
 
 _NUMERIC_SOCKET_FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV
 _NAME_SOCKET_FLAGS = socket.NI_NUMERICHOST | socket.NI_NUMERICSERV
+_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
+if hasattr(socket, "AI_MASK"):
+    _AI_ADDRCONFIG &= socket.AI_MASK
 
 
 class ThreadedResolver(AbstractResolver):
@@ -38,7 +41,7 @@ async def resolve(
             port,
             type=socket.SOCK_STREAM,
             family=family,
-            flags=socket.AI_ADDRCONFIG,
+            flags=_AI_ADDRCONFIG,
         )
 
         hosts: List[ResolveResult] = []
@@ -105,7 +108,7 @@ async def resolve(
                 port=port,
                 type=socket.SOCK_STREAM,
                 family=family,
-                flags=socket.AI_ADDRCONFIG,
+                flags=_AI_ADDRCONFIG,
             )
         except aiodns.error.DNSError as exc:
             msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed"
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # i.e. a regular file to send
+    NOT_ACCEPTABLE = auto()  # i.e. a socket or other non-regular file
+    PRE_CONDITION_FAILED = auto()  # i.e. If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
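The `web_fileresponse.py` hunk above gates the content-type guesser on the interpreter version: `mimetypes.MimeTypes.guess_file_type` (new in Python 3.13) treats its argument as a filesystem path, while the older `guess_type` parses it like a URL. A minimal stdlib sketch of the same version gate (the fallback constant is illustrative, not aiohttp's):

```python
import mimetypes
import sys

CONTENT_TYPES = mimetypes.MimeTypes()
FALLBACK_CONTENT_TYPE = "application/octet-stream"  # assumed fallback value


def guess_content_type(path: str) -> str:
    # guess_file_type() (Python 3.13+) interprets its argument as a
    # filesystem path; the older guess_type() parses it as a URL, which
    # can mis-handle paths containing characters such as ";".
    if sys.version_info >= (3, 13):
        guesser = CONTENT_TYPES.guess_file_type
    else:
        guesser = CONTENT_TYPES.guess_type
    return guesser(path)[0] or FALLBACK_CONTENT_TYPE
```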
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..1dba9606ea0 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -520,8 +520,6 @@ async def start(self) -> None:
         keep_alive(True) specified.
         """
         loop = self._loop
-        handler = asyncio.current_task(loop)
-        assert handler is not None
         manager = self._manager
         assert manager is not None
         keepalive_timeout = self._keepalive_timeout
@@ -551,7 +549,16 @@ async def start(self) -> None:
             else:
                 request_handler = self._request_handler
 
-            request = self._request_factory(message, payload, self, writer, handler)
+            # Important don't hold a reference to the current task
+            # as on traceback it will prevent the task from being
+            # collected and will cause a memory leak.
+            request = self._request_factory(
+                message,
+                payload,
+                self,
+                writer,
+                self._task_handler or asyncio.current_task(loop),  # type: ignore[arg-type]
+            )
             try:
                 # a new task is used for copy context vars (#3406)
                 coro = self._handle_request(request, start, request_handler)
@@ -608,26 +615,29 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
+                request._task = None  # type: ignore[assignment] # Break reference cycle in case of exception
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -694,9 +704,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
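The `_process_keepalive` change above tightens the early-fire check from `<=` to `<`: `loop.call_at` may run the callback exactly at the deadline, and with `<=` that run would reschedule itself instead of closing the idle connection. The predicate in isolation (a hypothetical helper, not aiohttp's API):

```python
def should_reschedule(now: float, close_time: float) -> bool:
    """Return True when the keep-alive close check fired too early."""
    # With the previous `now <= close_time` check, a callback invoked
    # exactly at close_time would reschedule itself rather than close.
    return now < close_time
```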
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..439b8049987 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
@@ -248,7 +252,8 @@ def _handshake(
             else:
                 # No overlap found: Return no protocol as per spec
                 ws_logger.warning(
-                    "Client protocols %r don’t overlap server-known ones %r",
+                    "%s: Client protocols %r don’t overlap server-known ones %r",
+                    request.remote,
                     req_protocols,
                     self._protocols,
                 )
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. If `compress` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                     :class:`multidict.MultiDict` or
+                     :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                     :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                     (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter allows skipping
+      that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. `compress` can not be combined
+                         with the *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-Encoding* and *Content-Length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` for skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the parameter proxy has been provided.
+
+   :param trace_request_ctx: Object given as a kw param to each new
+      :class:`TraceConfig` object instantiated, used to give the tracers
+      information that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
 
diff --git docs/contributing-admins.rst docs/contributing-admins.rst
index acfaebc0e97..b17cbe1019a 100644
--- docs/contributing-admins.rst
+++ docs/contributing-admins.rst
@@ -21,9 +21,9 @@ To create a new release:
 #. Run ``towncrier``.
 #. Check and cleanup the changes in ``CHANGES.rst``.
 #. Checkout a new branch: e.g. ``git checkout -b release/v3.8.6``
-#. Commit and create a PR. Once PR is merged, continue.
+#. Commit and create a PR. Verify the changelog and release notes look good on Read the Docs. Once PR is merged, continue.
 #. Go back to the release branch: e.g. ``git checkout 3.8 && git pull``
-#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6'``
+#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6' -s``
 #. Push the tag: e.g. ``git push origin v3.8.6``
 #. Monitor CI to ensure release process completes without errors.
 
@@ -49,6 +49,10 @@ first merge into the newer release branch (e.g. 3.8 into 3.9) and then to master
 
 Back on the original release branch, bump the version number and append ``.dev0`` in ``__init__.py``.
 
+Post the release announcement to social media:
+ - BlueSky: https://bsky.app/profile/aiohttp.org and re-post to https://bsky.app/profile/aio-libs.org
+ - Mastodon: https://fosstodon.org/@aiohttp and re-post to https://fosstodon.org/@aio_libs
+
 If doing a minor release:
 
 #. Create a new release branch for future features to go to: e.g. ``git checkout -b 3.10 3.9 && git push``
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..95a98cd4fc0 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -221,6 +221,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
 
diff --git tests/isolated/check_for_client_response_leak.py tests/isolated/check_for_client_response_leak.py
new file mode 100644
index 00000000000..67393c2c2d8
--- /dev/null
+++ tests/isolated/check_for_client_response_leak.py
@@ -0,0 +1,47 @@
+import asyncio
+import contextlib
+import gc
+import sys
+
+from aiohttp import ClientError, ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def stream_handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        request.transport.close()  # Forcefully closing connection
+        return web.Response()
+
+    app.router.add_get("/stream", stream_handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    session = ClientSession()
+
+    async def fetch_stream(url: str) -> None:
+        """Fetch a stream, suppressing client errors from the forced disconnect."""
+        with contextlib.suppress(ClientError):
+            await session.get(url)
+
+    client_task = asyncio.create_task(fetch_stream(f"http://localhost:{port}/stream"))
+    await client_task
+    gc.collect()
+    client_response_present = any(
+        type(obj).__name__ == "ClientResponse" for obj in gc.garbage
+    )
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if client_response_present else 0)
+
+
+asyncio.run(main())
diff --git tests/isolated/check_for_request_leak.py tests/isolated/check_for_request_leak.py
new file mode 100644
index 00000000000..6f340a05277
--- /dev/null
+++ tests/isolated/check_for_request_leak.py
@@ -0,0 +1,41 @@
+import asyncio
+import gc
+import sys
+from typing import NoReturn
+
+from aiohttp import ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def handler(request: web.Request) -> NoReturn:
+        await request.json()
+        assert False
+
+    app.router.add_route("GET", "/json", handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    async with ClientSession() as session:
+        async with session.get(f"http://127.0.0.1:{port}/json") as resp:
+            await resp.read()
+
+    # Give time for the cancelled task to be collected
+    await asyncio.sleep(0.5)
+    gc.collect()
+    request_present = any(type(obj).__name__ == "Request" for obj in gc.garbage)
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if request_present else 0)
+
+
+asyncio.run(main())
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..ae89bc1f667 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
@@ -289,3 +319,30 @@ async def run_client_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
+
+
+def test_one_hundred_json_post_requests(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 JSON POST requests that check the content-type."""
+    message_count = 100
+
+    async def handler(request: web.Request) -> web.Response:
+        _ = request.content_type
+        _ = request.charset
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("POST", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            await client.post("/", json={"key": "value"})
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
diff --git tests/test_benchmarks_web_fileresponse.py tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 web.FileResponse requests that return a 304 Not Modified."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..54cd5e92f80 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -902,6 +902,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..a86a2417423 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -617,6 +617,56 @@ async def certificate_error(*args, **kwargs):
     await conn.close()
 
 
+async def test_tcp_connector_closes_socket_on_error(
+    loop: asyncio.AbstractEventLoop, start_connection: mock.AsyncMock
+) -> None:
+    req = ClientRequest("GET", URL("https://127.0.0.1:443"), loop=loop)
+
+    conn = aiohttp.TCPConnector()
+    with (
+        mock.patch.object(
+            conn._loop,
+            "create_connection",
+            autospec=True,
+            spec_set=True,
+            side_effect=ValueError,
+        ),
+        pytest.raises(ValueError),
+    ):
+        await conn.connect(req, [], ClientTimeout())
+
+    assert start_connection.return_value.close.called
+
+    await conn.close()
+
+
+async def test_tcp_connector_closes_socket_on_error_results_in_another_error(
+    loop: asyncio.AbstractEventLoop, start_connection: mock.AsyncMock
+) -> None:
+    """Test that an error raised while closing the socket surfaces as ClientConnectionError."""
+    req = ClientRequest("GET", URL("https://127.0.0.1:443"), loop=loop)
+    start_connection.return_value.close.side_effect = OSError(
+        1, "error from closing socket"
+    )
+
+    conn = aiohttp.TCPConnector()
+    with (
+        mock.patch.object(
+            conn._loop,
+            "create_connection",
+            autospec=True,
+            spec_set=True,
+            side_effect=ValueError,
+        ),
+        pytest.raises(aiohttp.ClientConnectionError, match="error from closing socket"),
+    ):
+        await conn.connect(req, [], ClientTimeout())
+
+    assert start_connection.return_value.close.called
+
+    await conn.close()
+
+
 async def test_tcp_connector_server_hostname_default(
     loop: Any, start_connection: mock.AsyncMock
 ) -> None:
@@ -3474,6 +3524,61 @@ async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
     await connector.close()
 
 
+async def test_connector_resolve_in_case_of_trace_cache_miss_exception(
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    token: ResolveResult = {
+        "hostname": "localhost",
+        "host": "127.0.0.1",
+        "port": 80,
+        "family": socket.AF_INET,
+        "proto": 0,
+        "flags": socket.AI_NUMERICHOST,
+    }
+
+    request_count = 0
+
+    class DummyTracer(Trace):
+        def __init__(self) -> None:
+            """Dummy"""
+
+        async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
+            """Dummy send_dns_cache_hit"""
+
+        async def send_dns_resolvehost_start(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_start"""
+
+        async def send_dns_resolvehost_end(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_end"""
+
+        async def send_dns_cache_miss(self, *args: object, **kwargs: object) -> None:
+            nonlocal request_count
+            request_count += 1
+            if request_count <= 1:
+                raise Exception("first attempt")
+
+    async def resolve_response() -> List[ResolveResult]:
+        await asyncio.sleep(0)
+        return [token]
+
+    with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
+        m_resolver().resolve.return_value = resolve_response()
+
+        connector = TCPConnector()
+        traces = [DummyTracer()]
+
+        with pytest.raises(Exception):
+            await connector._resolve_host("", 0, traces)
+
+        assert await connector._resolve_host("", 0, traces) == [token]
+
+    await connector.close()
+
+
 async def test_connector_does_not_remove_needed_waiters(
     loop: asyncio.AbstractEventLoop, key: ConnectionKey
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_running_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..c39fe3c7251 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,7 +2,7 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
@@ -14,7 +14,25 @@
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +110,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +119,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +225,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +281,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +316,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git tests/test_leaks.py tests/test_leaks.py
new file mode 100644
index 00000000000..07b506bdb99
--- /dev/null
+++ tests/test_leaks.py
@@ -0,0 +1,37 @@
+import pathlib
+import platform
+import subprocess
+import sys
+
+import pytest
+
+IS_PYPY = platform.python_implementation() == "PyPy"
+
+
+@pytest.mark.skipif(IS_PYPY, reason="gc.DEBUG_LEAK not available on PyPy")
+@pytest.mark.parametrize(
+    ("script", "message"),
+    [
+        (
+            # Test that ClientResponse is collected after server disconnects.
+            # https://github.com/aio-libs/aiohttp/issues/10535
+            "check_for_client_response_leak.py",
+            "ClientResponse leaked",
+        ),
+        (
+            # Test that Request object is collected when the handler raises.
+            # https://github.com/aio-libs/aiohttp/issues/10548
+            "check_for_request_leak.py",
+            "Request leaked",
+        ),
+    ],
+)
+def test_leak(script: str, message: str) -> None:
+    """Run isolated leak test script and check for leaks."""
+    leak_test_script = pathlib.Path(__file__).parent.joinpath("isolated", script)
+
+    with subprocess.Popen(
+        [sys.executable, "-u", str(leak_test_script)],
+        stdout=subprocess.PIPE,
+    ) as proc:
+        assert proc.wait() == 0, message
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..0591426c57b 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..9098ef9e7bf 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tests/test_websocket_handshake.py tests/test_websocket_handshake.py
index bbfa1d9260d..53d5d9152bb 100644
--- tests/test_websocket_handshake.py
+++ tests/test_websocket_handshake.py
@@ -174,7 +174,7 @@ async def test_handshake_protocol_unsupported(caplog) -> None:
 
     assert (
         caplog.records[-1].msg
-        == "Client protocols %r don’t overlap server-known ones %r"
+        == "%s: Client protocols %r don’t overlap server-known ones %r"
     )
     assert ws.ws_protocol is None
 
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This PR includes several significant updates to the aiohttp library: bug fixes, performance improvements, security patches, and dependency updates. It advances the version from 3.11.9 to 3.11.14.

Possible Issues

  1. The changes to disable zero copy writes for certain Python versions (3.12 before 3.12.9 and 3.13 before 3.13.2) may impact performance on those versions.
  2. The changes to cycle reference breaking could potentially cause issues if applications were relying on the previous behavior.
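The version gate behind the first issue can be sketched as a plain interpreter check. This is an illustrative guard, not aiohttp's actual implementation; in particular, treating interpreters other than 3.12/3.13 as unaffected is an assumption made here for clarity:

```python
import sys


def zero_copy_supported(version=sys.version_info) -> bool:
    # Hypothetical guard mirroring the gate described above: zero copy
    # writes stay disabled on 3.12 before 3.12.9 and 3.13 before 3.13.2,
    # the releases that ship the CVE-2024-12254 fix.
    if version[:2] == (3, 12):
        return tuple(version) >= (3, 12, 9)
    if version[:2] == (3, 13):
        return tuple(version) >= (3, 13, 2)
    # Assumption for illustration: other versions are treated as unaffected.
    return True
```

Passing the version tuple as a parameter (defaulting to the running interpreter) keeps the guard easy to exercise against specific versions.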

Security Hotspots

  1. Disabling of zero copy writes for vulnerable Python versions (CVE-2024-12254) - Moderate risk but appropriately mitigated
  2. TLS/SSL context changes including ALPN protocol setting - Low risk but should be monitored
  3. Socket handling changes in connector error cases - Low risk but could impact error handling
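The ALPN change in hotspot 2 corresponds to a one-line addition in the connector diff (`sslcontext.set_alpn_protocols(("http/1.1",))`). The same setup can be reproduced with the standard library alone:

```python
import ssl

# Default verified client context, now advertising HTTP/1.1 via ALPN,
# mirroring the change made to aiohttp's _make_ssl_context helper.
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(("http/1.1",))
```

Advertising ALPN improves compatibility with proxies that refuse handshakes lacking the extension, which is the motivation given in the 3.11.11 changelog entry.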
Changes

  1. Client and Core Changes

    • Fixed DNS query delay issues
    • Added ALPN support for HTTP/1.1
    • Fixed cyclic references in connection handling
    • Improved error messages and logging
    • Added support for Python 3.13
  2. Web Response Changes

    • Improved FileResponse handling
    • Added better file content type detection
    • Fixed race conditions in file responses
  3. Test and CI Updates

    • Added new leak tests
    • Updated CI workflow to use newer GitHub actions
    • Added support for more platforms (armv7l musllinux)
  4. Documentation

    • Enhanced documentation for client methods
    • Added new third-party libraries
    • Updated contributor documentation
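One of the test updates in the diff pins down the multipart framing fix from 3.11.12: per RFC 1341 §7.2.1, the boundary line must be terminated by CRLF before the part headers. A stdlib-only sketch of the expected wire format (the boundary value `x` is simply the one used in the updated test):

```python
# Build the part the updated test_web_response assertion expects:
# boundary line, CRLF (the fix), headers, blank line, then the body.
boundary = "x"
part = (
    f"--{boundary}\r\n"
    "Content-Type: text/plain; charset=utf-8\r\n"
    "Content-Length: 4\r\n"
    "\r\n"
    "test"
)
```

Before the fix, `MultipartForm.decode()` emitted a bare `\n` after the boundary, which strict multipart parsers reject.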
sequenceDiagram
    participant Client
    participant ClientSession
    participant Connector
    participant FileResponse
    participant Server

    Client->>ClientSession: request()
    ClientSession->>Connector: connect()
    Note over Connector: New ALPN protocol setup
    Connector-->>Server: establish connection
    Server-->>Connector: connection established
    
    alt File Response
        Server->>FileResponse: prepare()
        FileResponse->>FileResponse: _make_response()
        FileResponse-->>Server: response
    end
    
    Server-->>ClientSession: response
    ClientSession-->>Client: response

    Note over ClientSession: Break cyclic refs
    Note over Connector: Handle socket cleanup

@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 27e1aab to dcfae70 on April 1, 2025 at 03:29
@renovate renovate bot changed the title from "chore(deps): update dependency aiohttp to v3.11.14" to "chore(deps): update dependency aiohttp to v3.11.12" on Apr 1, 2025

github-actions bot commented Apr 1, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..104dd7a746d 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,221 @@
 
 .. towncrier release notes start
 
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..fb1b87ccc9d 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -166,10 +167,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +367,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..4bafa848287 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.12"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
+        auth - aiohttp.helpers.BasicAuth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7e0986df657 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
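The refactor above applies `set_alpn_protocols(("http/1.1",))` on both the verified and unverified branches, which is the behavioral change of this hunk. A minimal standalone sketch of the same construction (the deprecated `OP_NO_SSLv2`/`OP_NO_SSLv3` flags are omitted here; `make_client_context` is an illustrative name, not aiohttp's API):

```python
import ssl

def make_client_context(verified: bool) -> ssl.SSLContext:
    """Build a client-side TLS context; mirrors the branch structure above."""
    if verified:
        ctx = ssl.create_default_context()
    else:
        # Unverified: PROTOCOL_TLS_CLIENT defaults to strict checking, so
        # check_hostname must be disabled before verify_mode can be relaxed.
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        ctx.options |= ssl.OP_NO_COMPRESSION
        ctx.set_default_verify_paths()
    # Advertise HTTP/1.1 via ALPN on both paths -- the point of the refactor.
    ctx.set_alpn_protocols(("http/1.1",))
    return ctx

verified = make_client_context(True)
unverified = make_client_context(False)
```

Note the ordering constraint in the unverified branch: assigning `verify_mode = ssl.CERT_NONE` while `check_hostname` is still `True` raises `ValueError`.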
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
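The new read-only `quote_cookie` property exposes whether the jar quotes cookie values when serializing them. The quoting behavior it controls can be seen with the stdlib `SimpleCookie`, which quotes any value containing characters outside the legal token set (this is an illustration of the quoting rule, not aiohttp's jar implementation):

```python
from http.cookies import SimpleCookie

# SimpleCookie quotes values containing non-token characters (here, a space),
# which is the on-the-wire behavior the ``quote_cookie`` flag toggles.
jar = SimpleCookie()
jar["session"] = "abc 123"
header = jar.output(header="").strip()
print(header)  # session="abc 123"
```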
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it is not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
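The one-character fix above switches the boundary delimiter in `decode()` from a bare LF to CRLF, matching RFC 2046 §5.1.1, which requires boundary lines to be terminated by CRLF. A hand-assembled two-part body illustrating the required line endings (boundary value and payloads are arbitrary):

```python
# RFC 2046 requires CRLF after each boundary line and header block.
boundary = "b1"
parts = [
    ("text/plain", "hello"),
    ("text/plain", "world"),
]
body = "".join(
    f"--{boundary}\r\n"            # boundary line must end with CRLF
    f"Content-Type: {ctype}\r\n"
    f"\r\n"                        # blank line separates headers from payload
    f"{payload}\r\n"
    for ctype, payload in parts
) + f"--{boundary}--\r\n"          # closing delimiter
print(body.count("\r\n"))  # 9
```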
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
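Python 3.13 adds `mimetypes.guess_file_type` for filesystem paths and soft-deprecates passing paths to `guess_type` (which remains for URLs); the hunk above selects the right guesser by interpreter version. The same dispatch in isolation (`guess_content_type` is an illustrative helper, not aiohttp's API):

```python
import mimetypes
import sys

def guess_content_type(filename: str,
                       default: str = "application/octet-stream") -> str:
    # guess_file_type exists only on 3.13+; fall back to guess_type earlier.
    if sys.version_info >= (3, 13):
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type
    return guesser(filename)[0] or default

print(guess_content_type("report.html"))  # text/html
print(guess_content_type("data.xyzabc"))  # application/octet-stream
```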
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # I.e. a regular file to send
+    NOT_ACCEPTABLE = auto()  # I.e. a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # I.e. If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
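The `finally` block above schedules `fobj.close` on the executor without awaiting it, and parks the resulting future in the module-level `_CLOSE_FUTURES` set so it is not garbage-collected mid-flight; the done-callback drops the reference on completion. A standalone sketch of that keep-alive pattern (`fire_and_forget` and `_PENDING` are illustrative names; the demo awaits the future only so it can assert on the result):

```python
import asyncio
from typing import Set

_PENDING: Set[asyncio.Future] = set()

def fire_and_forget(loop: asyncio.AbstractEventLoop, fn) -> asyncio.Future:
    """Run fn on the default executor without awaiting it.

    The strong reference held in _PENDING prevents the future from being
    garbage-collected before it completes; add_done_callback removes it.
    """
    fut = loop.run_in_executor(None, fn)
    _PENDING.add(fut)
    fut.add_done_callback(_PENDING.discard)
    return fut

async def main() -> list:
    loop = asyncio.get_running_loop()
    calls = []
    fut = fire_and_forget(loop, lambda: calls.append("closed"))
    await fut  # awaited here only so the demo can assert deterministically
    return calls

print(asyncio.run(main()))  # ['closed']
```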
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..32f503474a9 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -694,9 +694,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
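The comparison change in `_process_keepalive` (`<=` to `<`) makes the keep-alive check close the connection when the timer fires exactly at the deadline, instead of rescheduling one more callback for the same instant. The decision, distilled (function name is illustrative):

```python
def keepalive_action(now: float, close_time: float) -> str:
    # With the old ``now <= close_time`` test, a timer firing exactly at the
    # deadline was treated as "fired too early" and rescheduled for the same
    # time, wasting a callback. The fixed ``<`` closes at the deadline.
    if now < close_time:
        return "reschedule"
    return "close"

print(keepalive_action(9.9, 10.0))   # reschedule
print(keepalive_action(10.0, 10.0))  # close (was rescheduled before the fix)
print(keepalive_action(10.1, 10.0))  # close
```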
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. If `compress` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                    :class:`multidict.MultiDict` or
+                    :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                    :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                    (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter allows
+      skipping that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` cannot be combined
+                         with *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-Encoding* and *Content-Length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` to skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the *proxy* parameter has been provided.
+
+   :param trace_request_ctx: Object passed as a keyword argument to each new
+      :class:`TraceConfig` object instantiated, used to pass
+      information to the tracers that is only available at
+      request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             which means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
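For orientation (not part of this diff), here is a minimal, hypothetical sketch of how several of the parameters documented above combine in a single ``aiohttp.request()`` call; the URL and query values are placeholders:

```python
# Hypothetical usage sketch of aiohttp.request() with parameters from the
# reference above; the URL and values are placeholders, not part of this change.
import asyncio

import aiohttp


async def fetch() -> int:
    async with aiohttp.request(
        "GET",
        "https://example.com/search",
        params={"q": "aiohttp"},                  # encoded into the query string
        allow_redirects=True,                     # follow up to max_redirects hops
        max_redirects=10,
        timeout=aiohttp.ClientTimeout(total=30),  # overall request budget, seconds
        ssl=True,                                 # default certificate validation
    ) as resp:
        return resp.status


if __name__ == "__main__":
    print(asyncio.run(fetch()))
```

Note that ``params`` applies only to the initial request; redirected requests carry whatever query string the redirect target specifies.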
 
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..677b5bc9678 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,7 +2,7 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
@@ -14,7 +14,19 @@
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -100,6 +112,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +218,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +274,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +309,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..0591426c57b 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..910f074e90f 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -254,7 +266,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +292,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +318,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This PR includes several significant updates to the aiohttp library, focused on security, performance, and functionality: version bumps in CI/CD workflows, bug fixes, security hardening (including the CVE-2024-12254 mitigation), and documentation improvements.

Possible Issues

  1. Python 3.13 support may not be fully tested yet since it's a development version
  2. Potential performance impact from disabling zero copy writes due to CVE-2024-12254 on older Python versions

Security Hotspots

  1. SSL/ALPN improvements for better proxy compatibility and security
  2. CVE-2024-12254 mitigation by disabling zero copy writes on affected Python versions
  3. File response security improvements to prevent race conditions and handle permissions properly
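The zero-copy mitigation in hotspot 2 amounts to a version gate: zero-copy writes stay disabled unless the interpreter already ships the CVE-2024-12254 fix. A minimal sketch of such a gate, using the patched versions named in the release notes (3.12.9+ and 3.13.2+); the exact check inside aiohttp may differ:

```python
import sys

def zero_copy_writes_safe(version=sys.version_info):
    """Return True when the interpreter already contains the fix for
    CVE-2024-12254, so zero-copy writes may be re-enabled.

    Illustrative sketch only; the patched versions come from the
    release notes below (3.12.9+ and 3.13.2+).
    """
    major_minor = tuple(version[:2])
    if major_minor > (3, 13):
        return True  # newer series, assumed patched
    if major_minor == (3, 13):
        return tuple(version[:3]) >= (3, 13, 2)
    if major_minor == (3, 12):
        return tuple(version[:3]) >= (3, 12, 9)
    # Older interpreters: keep zero-copy writes disabled, matching
    # the gate described in the changelog.
    return False
```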
Changes

  • CI/CD Workflow

    • Updated GitHub Actions versions (cache v4.2.0, upload/download artifact v4)
    • Added musllinux wheel builds for various architectures
    • Updated Python version to 3.13.2
  • Client

    • Added ALPN support for improved proxy compatibility
    • Fixed cookie handling and quote settings
    • Updated request() function to accept _RequestOptions kwargs
  • Web

    • Fixed race condition in FileResponse
    • Improved HTTP protocol error logging
    • Fixed keepalive behavior
    • Updated multipart form handling for RFC compliance
  • Core

    • Disabled zero copy writes for affected Python versions (CVE-2024-12254)
    • Updated SSL context handling
    • Improved flow control and stream handling
  • Documentation

    • Added new third-party libraries
    • Enhanced request/response documentation
    • Updated SSL and proxy configuration docs
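The ALPN change in the Client bullets above can be illustrated with the standard-library `ssl` module. This is an independent sketch of the idea, not aiohttp's internal code:

```python
import ssl

# Default client-side context; verifies hostnames and certificates.
ctx = ssl.create_default_context()

# Advertise HTTP/1.1 through ALPN. Some proxies reject TLS handshakes
# that carry no ALPN extension, which is the compatibility problem the
# release notes describe enabling ALPN on default contexts to solve.
ctx.set_alpn_protocols(["http/1.1"])
```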
sequenceDiagram
    participant Client
    participant FileResponse
    participant Protocol
    participant Transport
    
    Client->>FileResponse: GET /file
    FileResponse->>Protocol: Check transport
    FileResponse->>FileResponse: _make_response()
    alt File is valid
        FileResponse->>Transport: Send file (with sendfile)
    else File is invalid or permission error
        FileResponse->>Client: Return error response
    end
    FileResponse->>Protocol: Resume reading
    FileResponse->>Client: Complete response
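The FileResponse race-condition fix shown in the diagram hinges on not letting the served bytes and the checked metadata come from different files. A common way to avoid that race with only the standard library is to open first and then `fstat()` the descriptor; a sketch of the idea, not aiohttp's actual implementation:

```python
import os

def open_and_stat(path):
    """Open the file first, then fstat() the descriptor, so the metadata
    and the bytes later sent describe the same inode even if the path is
    replaced on the file system concurrently."""
    fd = os.open(path, os.O_RDONLY)
    try:
        stat_result = os.fstat(fd)
    except OSError:
        os.close(fd)
        raise
    return fd, stat_result
```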

@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.12 chore(deps): update dependency aiohttp to v3.11.15 Apr 8, 2025
@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from dcfae70 to 549f427 Compare April 8, 2025 05:43

github-actions bot commented Apr 8, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..c2654b99214 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,391 @@
 
 .. towncrier release notes start
 
+3.11.15 (2025-03-31)
+====================
+
+Bug fixes
+---------
+
+- Reverted explicitly closing sockets if an exception is raised during ``create_connection`` -- by :user:`bdraco`.
+
+  This change originally appeared in aiohttp 3.11.13
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`, :issue:`10617`, :issue:`10656`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of WebSocket buffer handling -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10601`.
+
+
+
+- Improved performance of serializing headers -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10625`.
+
+
+
+
+----
+
+
+3.11.14 (2025-03-16)
+====================
+
+Bug fixes
+---------
+
+- Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a ``trace.send_dns_cache_miss``
+  -- by :user:`logioniz`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10529`.
+
+
+
+- Fixed DNS resolution on platforms that don't support ``socket.AI_ADDRCONFIG`` -- by :user:`maxbachmann`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10542`.
+
+
+
+- The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10551`.
+
+
+
+- Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.
+
+  Special thanks to :user:`availov` for reporting the issue.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10556`.
+
+
+
+- Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10569`.
+
+
+
+
+Features
+--------
+
+- Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10564`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10552`.
+
+
+
+
+----
+
+
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..953af52498a 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -30,6 +31,7 @@ Alexandru Mihai
 Alexey Firsov
 Alexey Nikitin
 Alexey Popravka
+Alexey Stavrov
 Alexey Stepanov
 Amin Etesamian
 Amit Tulshyan
@@ -41,6 +43,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +169,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +369,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..aba86dc3a32 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.15"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_http_writer.pyx aiohttp/_http_writer.pyx
index 287371334f8..4a3ae1f9e68 100644
--- aiohttp/_http_writer.pyx
+++ aiohttp/_http_writer.pyx
@@ -97,27 +97,34 @@ cdef inline int _write_str(Writer* writer, str s):
             return -1
 
 
-# --------------- _serialize_headers ----------------------
-
-cdef str to_str(object s):
+cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
+    cdef Py_UCS4 ch
+    cdef str out_str
     if type(s) is str:
-        return <str>s
+        out_str = <str>s
     elif type(s) is _istr:
-        return PyObject_Str(s)
+        out_str = PyObject_Str(s)
     elif not isinstance(s, str):
         raise TypeError("Cannot serialize non-str key {!r}".format(s))
     else:
-        return str(s)
+        out_str = str(s)
+
+    for ch in out_str:
+        if ch == 0x0D or ch == 0x0A:
+            raise ValueError(
+                "Newline or carriage return detected in headers. "
+                "Potential header injection attack."
+            )
+        if _write_utf8(writer, ch) < 0:
+            return -1
 
 
+# --------------- _serialize_headers ----------------------
 
 def _serialize_headers(str status_line, headers):
     cdef Writer writer
     cdef object key
     cdef object val
-    cdef bytes ret
-    cdef str key_str
-    cdef str val_str
 
     _init_writer(&writer)
 
@@ -130,22 +137,13 @@ def _serialize_headers(str status_line, headers):
             raise
 
         for key, val in headers.items():
-            key_str = to_str(key)
-            val_str = to_str(val)
-
-            if "\r" in key_str or "\n" in key_str or "\r" in val_str or "\n" in val_str:
-                raise ValueError(
-                    "Newline or carriage return character detected in HTTP status message or "
-                    "header. This is a potential security issue."
-                )
-
-            if _write_str(&writer, key_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, key) < 0:
                 raise
             if _write_byte(&writer, b':') < 0:
                 raise
             if _write_byte(&writer, b' ') < 0:
                 raise
-            if _write_str(&writer, val_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, val) < 0:
                 raise
             if _write_byte(&writer, b'\r') < 0:
                 raise
diff --git aiohttp/_websocket/reader_c.pxd aiohttp/_websocket/reader_c.pxd
index 461e658e116..f156a7ff704 100644
--- aiohttp/_websocket/reader_c.pxd
+++ aiohttp/_websocket/reader_c.pxd
@@ -93,6 +93,7 @@ cdef class WebSocketReader:
         chunk_size="unsigned int",
         chunk_len="unsigned int",
         buf_length="unsigned int",
+        buf_cstr="const unsigned char *",
         first_byte="unsigned char",
         second_byte="unsigned char",
         end_pos="unsigned int",
diff --git aiohttp/_websocket/reader_py.py aiohttp/_websocket/reader_py.py
index 94d20010890..92ad47a52f0 100644
--- aiohttp/_websocket/reader_py.py
+++ aiohttp/_websocket/reader_py.py
@@ -93,6 +93,7 @@ def _release_waiter(self) -> None:
     def feed_eof(self) -> None:
         self._eof = True
         self._release_waiter()
+        self._exception = None  # Break cyclic references
 
     def feed_data(self, data: "WSMessage", size: "int_") -> None:
         self._size += size
@@ -193,9 +194,8 @@ def _feed_data(self, data: bytes) -> None:
                     if self._max_msg_size and len(self._partial) >= self._max_msg_size:
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Message size {} exceeds limit {}".format(
-                                len(self._partial), self._max_msg_size
-                            ),
+                            f"Message size {len(self._partial)} "
+                            f"exceeds limit {self._max_msg_size}",
                         )
                     continue
 
@@ -214,7 +214,7 @@ def _feed_data(self, data: bytes) -> None:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
                         "The opcode in non-fin frame is expected "
-                        "to be zero, got {!r}".format(opcode),
+                        f"to be zero, got {opcode!r}",
                     )
 
                 assembled_payload: Union[bytes, bytearray]
@@ -227,9 +227,8 @@ def _feed_data(self, data: bytes) -> None:
                 if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
                     raise WebSocketError(
                         WSCloseCode.MESSAGE_TOO_BIG,
-                        "Message size {} exceeds limit {}".format(
-                            len(assembled_payload), self._max_msg_size
-                        ),
+                        f"Message size {len(assembled_payload)} "
+                        f"exceeds limit {self._max_msg_size}",
                     )
 
                 # Decompress process must to be done after all packets
@@ -246,9 +245,8 @@ def _feed_data(self, data: bytes) -> None:
                         left = len(self._decompressobj.unconsumed_tail)
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Decompressed message size {} exceeds limit {}".format(
-                                self._max_msg_size + left, self._max_msg_size
-                            ),
+                            f"Decompressed message size {self._max_msg_size + left}"
+                            f" exceeds limit {self._max_msg_size}",
                         )
                 elif type(assembled_payload) is bytes:
                     payload_merged = assembled_payload
@@ -327,14 +325,15 @@ def parse_frame(
 
         start_pos: int = 0
         buf_length = len(buf)
+        buf_cstr = buf
 
         while True:
             # read header
             if self._state == READ_HEADER:
                 if buf_length - start_pos < 2:
                     break
-                first_byte = buf[start_pos]
-                second_byte = buf[start_pos + 1]
+                first_byte = buf_cstr[start_pos]
+                second_byte = buf_cstr[start_pos + 1]
                 start_pos += 2
 
                 fin = (first_byte >> 7) & 1
@@ -399,14 +398,14 @@ def parse_frame(
                 if length_flag == 126:
                     if buf_length - start_pos < 2:
                         break
-                    first_byte = buf[start_pos]
-                    second_byte = buf[start_pos + 1]
+                    first_byte = buf_cstr[start_pos]
+                    second_byte = buf_cstr[start_pos + 1]
                     start_pos += 2
                     self._payload_length = first_byte << 8 | second_byte
                 elif length_flag > 126:
                     if buf_length - start_pos < 8:
                         break
-                    data = buf[start_pos : start_pos + 8]
+                    data = buf_cstr[start_pos : start_pos + 8]
                     start_pos += 8
                     self._payload_length = UNPACK_LEN3(data)[0]
                 else:
@@ -418,7 +417,7 @@ def parse_frame(
             if self._state == READ_PAYLOAD_MASK:
                 if buf_length - start_pos < 4:
                     break
-                self._frame_mask = buf[start_pos : start_pos + 4]
+                self._frame_mask = buf_cstr[start_pos : start_pos + 4]
                 start_pos += 4
                 self._state = READ_PAYLOAD
 
@@ -434,10 +433,10 @@ def parse_frame(
                 if self._frame_payload_len:
                     if type(self._frame_payload) is not bytearray:
                         self._frame_payload = bytearray(self._frame_payload)
-                    self._frame_payload += buf[start_pos:end_pos]
+                    self._frame_payload += buf_cstr[start_pos:end_pos]
                 else:
                     # Fast path for the first frame
-                    self._frame_payload = buf[start_pos:end_pos]
+                    self._frame_payload = buf_cstr[start_pos:end_pos]
 
                 self._frame_payload_len += end_pos - start_pos
                 start_pos = end_pos
@@ -463,6 +462,7 @@ def parse_frame(
                 self._frame_payload_len = 0
                 self._state = READ_HEADER
 
-        self._tail = buf[start_pos:] if start_pos < buf_length else b""
+        # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
+        self._tail = buf_cstr[start_pos:buf_length] if start_pos < buf_length else b""
 
         return frames
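The `buf_cstr` rename above exists only so Cython can treat the buffer as a C string; the header decoding it touches is unchanged. As a rough, self-contained sketch of what those byte manipulations do (function and variable names here are illustrative, not aiohttp's):

```python
import struct

def parse_ws_header(buf: bytes):
    """Decode the 2-byte WebSocket frame header plus any extended length.

    Returns (fin, opcode, masked, payload_length, header_size).
    """
    first_byte, second_byte = buf[0], buf[1]
    fin = (first_byte >> 7) & 1       # final-fragment flag
    opcode = first_byte & 0xF         # frame type (0x1 text, 0x2 binary, ...)
    masked = (second_byte >> 7) & 1   # client-to-server frames must be masked
    length_flag = second_byte & 0x7F
    if length_flag < 126:             # 7-bit length fits inline
        return fin, opcode, masked, length_flag, 2
    if length_flag == 126:            # 16-bit extended length follows
        return fin, opcode, masked, struct.unpack("!H", buf[2:4])[0], 4
    # length_flag == 127: 64-bit extended length follows
    return fin, opcode, masked, struct.unpack("!Q", buf[2:10])[0], 10

# A tiny unmasked text frame: FIN=1, opcode=1, length=5 ("hello")
frame = b"\x81\x05hello"
print(parse_ws_header(frame))  # (1, 1, 0, 5, 2)
```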
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
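For context on the new abstract `quote_cookie` property above: combining `@property` with `@abstractmethod` means third-party `AbstractCookieJar` subclasses must now implement the property before they can be instantiated. A minimal illustration with stand-in class names:

```python
from abc import ABC, abstractmethod

class CookieJarBase(ABC):
    """Stand-in for AbstractCookieJar with the new abstract property."""

    @property
    @abstractmethod
    def quote_cookie(self) -> bool:
        """Return True if cookies should be quoted."""

class PlainJar(CookieJarBase):
    # Custom jars now have to provide the property.
    @property
    def quote_cookie(self) -> bool:
        return False

print(PlainJar().quote_cookie)  # False

try:
    CookieJarBase()  # fails: abstract property not implemented
except TypeError:
    print("abstract")  # abstract
```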
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
+        auth - aiohttp.helpers.BasicAuth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
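The rewritten `request()` relies on the `TYPE_CHECKING` + `Unpack[TypedDict]` pattern: type checkers on Python 3.11+ see a precise keyword signature, while the runtime keeps permissive `**kwargs`. A reduced sketch of the same pattern (the `_Options` fields are hypothetical, not aiohttp's actual `_RequestOptions`):

```python
import sys
from typing import TYPE_CHECKING, Any, TypedDict

class _Options(TypedDict, total=False):
    timeout: float          # hypothetical option keys, standing in
    allow_redirects: bool   # for aiohttp's _RequestOptions fields

if sys.version_info >= (3, 11) and TYPE_CHECKING:
    from typing import Unpack

    # Stub seen only by type checkers: kwargs are validated per-key.
    def fetch(url: str, **kwargs: Unpack[_Options]) -> dict: ...

else:

    def fetch(url: str, **kwargs: Any) -> dict:
        # Runtime implementation stays permissive; the stub above is
        # what type checkers use on Python 3.11+.
        return {"url": url, **kwargs}

print(fetch("https://example.org", timeout=5.0))
```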
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
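The same `if TYPE_CHECKING:` import dance for `ssl` appears in several files in this diff; it lets type checkers always see the real `ssl.SSLContext` while the runtime still degrades gracefully on OpenSSL-free builds. In isolation:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type checkers always take this branch, so SSLContext is a real type.
    import ssl

    SSLContext = ssl.SSLContext
else:
    try:
        import ssl

        SSLContext = ssl.SSLContext
    except ImportError:  # CPython built without OpenSSL
        ssl = SSLContext = None

def ssl_available() -> bool:
    # Runtime code guards on the module object instead of re-importing.
    return ssl is not None

print(ssl_available())
```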
diff --git aiohttp/client_proto.py aiohttp/client_proto.py
index 79f033e3e12..2d64b3f3644 100644
--- aiohttp/client_proto.py
+++ aiohttp/client_proto.py
@@ -64,6 +64,7 @@ def force_close(self) -> None:
         self._should_close = True
 
     def close(self) -> None:
+        self._exception = None  # Break cyclic references
         transport = self.transport
         if transport is not None:
             transport.close()
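On the `close()` change: an exception stored on the protocol can participate in a reference cycle (exception → traceback → frame → protocol), which delays cleanup until the cyclic garbage collector runs. Dropping the reference on close breaks the cycle eagerly, as this toy sketch shows:

```python
class Proto:
    """Toy stand-in for ResponseHandler's exception bookkeeping."""

    def __init__(self) -> None:
        self._exception = None

    def set_exception(self, exc: BaseException) -> None:
        # In aiohttp, exc.__traceback__ can reference frames that in
        # turn reference this protocol, forming a reference cycle.
        self._exception = exc

    def close(self) -> None:
        self._exception = None  # break cyclic references eagerly

proto = Proto()
try:
    raise RuntimeError("boom")
except RuntimeError as exc:
    proto.set_exception(exc)

proto.close()
print(proto._exception)  # None
```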
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7420bd6070a 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
@@ -1009,11 +1015,11 @@ async def _resolve_host_with_throttle(
         This method must be run in a task and shielded from cancellation
         to avoid cancelling the underlying lookup.
         """
-        if traces:
-            for trace in traces:
-                await trace.send_dns_cache_miss(host)
         try:
             if traces:
+                for trace in traces:
+                    await trace.send_dns_cache_miss(host)
+
                 for trace in traces:
                     await trace.send_dns_resolvehost_start(host)
 
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/helpers.py aiohttp/helpers.py
index 8038931ebec..ace4f0e9b53 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -21,7 +21,7 @@
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
-from types import TracebackType
+from types import MappingProxyType, TracebackType
 from typing import (
     Any,
     Callable,
@@ -357,6 +357,20 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+@functools.lru_cache(maxsize=56)
+def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
+    """Parse Content-Type header.
+
+    Returns a tuple of the parsed content type and a
+    MappingProxyType of parameters.
+    """
+    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    content_type = msg.get_content_type()
+    params = msg.get_params(())
+    content_dict = dict(params[1:])  # First element is content type again
+    return content_type, MappingProxyType(content_dict)
+
+
 def guess_filename(obj: Any, default: Optional[str] = None) -> Optional[str]:
     name = getattr(obj, "name", None)
     if name and isinstance(name, str) and name[0] != "<" and name[-1] != ">":
@@ -710,10 +724,10 @@ def _parse_content_type(self, raw: Optional[str]) -> None:
             self._content_type = "application/octet-stream"
             self._content_dict = {}
         else:
-            msg = HeaderParser().parsestr("Content-Type: " + raw)
-            self._content_type = msg.get_content_type()
-            params = msg.get_params(())
-            self._content_dict = dict(params[1:])  # First element is content type again
+            content_type, content_mapping_proxy = parse_content_type(raw)
+            self._content_type = content_type
+            # _content_dict needs to be mutable so we can update it
+            self._content_dict = content_mapping_proxy.copy()
 
     @property
     def content_type(self) -> str:
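The new module-level `parse_content_type` is a cached version of the parsing previously done inline in `_parse_content_type`: caching pays off because most responses repeat a handful of Content-Type values, and `MappingProxyType` keeps the cached parameters safe from mutation. A self-contained version of the same idea:

```python
from email.parser import HeaderParser
from functools import lru_cache
from types import MappingProxyType

@lru_cache(maxsize=56)
def parse_content_type(raw: str):
    """Parse a Content-Type header value into (type, read-only params)."""
    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
    params = msg.get_params(())
    # First element of get_params() is the content type itself.
    return msg.get_content_type(), MappingProxyType(dict(params[1:]))

ctype, params = parse_content_type("text/html; charset=utf-8")
print(ctype, params["charset"])  # text/html utf-8
```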
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it is not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
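On the `payload.py` change: Python 3.13 adds `mimetypes.guess_file_type` and deprecates passing filesystem paths to `guess_type`, hence the version gate. The same selection in isolation:

```python
import mimetypes
import sys

def guess_content_type(filename: str) -> str:
    # guess_file_type is the 3.13+ replacement for passing paths
    # to guess_type; older interpreters keep the original API.
    if sys.version_info >= (3, 13):
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type
    return guesser(filename)[0] or "application/octet-stream"

print(guess_content_type("report.json"))  # application/json
print(guess_content_type("unknown.zzz"))  # application/octet-stream
```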
diff --git aiohttp/resolver.py aiohttp/resolver.py
index 9c744514fae..e14179cc8a2 100644
--- aiohttp/resolver.py
+++ aiohttp/resolver.py
@@ -18,6 +18,9 @@
 
 _NUMERIC_SOCKET_FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV
 _NAME_SOCKET_FLAGS = socket.NI_NUMERICHOST | socket.NI_NUMERICSERV
+_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
+if hasattr(socket, "AI_MASK"):
+    _AI_ADDRCONFIG &= socket.AI_MASK
 
 
 class ThreadedResolver(AbstractResolver):
@@ -38,7 +41,7 @@ async def resolve(
             port,
             type=socket.SOCK_STREAM,
             family=family,
-            flags=socket.AI_ADDRCONFIG,
+            flags=_AI_ADDRCONFIG,
         )
 
         hosts: List[ResolveResult] = []
@@ -105,7 +108,7 @@ async def resolve(
                 port=port,
                 type=socket.SOCK_STREAM,
                 family=family,
-                flags=socket.AI_ADDRCONFIG,
+                flags=_AI_ADDRCONFIG,
             )
         except aiodns.error.DNSError as exc:
             msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed"
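On the resolver change: some BSD-derived platforms define `socket.AI_MASK` and reject `getaddrinfo` flags outside it, so the supported subset of `AI_ADDRCONFIG` is computed once up front. Standalone:

```python
import socket

# Strip any bits the platform does not accept before passing
# AI_ADDRCONFIG to getaddrinfo (AI_MASK only exists on some BSDs).
_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
if hasattr(socket, "AI_MASK"):
    _AI_ADDRCONFIG &= socket.AI_MASK

infos = socket.getaddrinfo(
    "localhost", 80, type=socket.SOCK_STREAM, flags=_AI_ADDRCONFIG
)
print(len(infos) > 0)
```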
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # Ie a regular file to send
+    NOT_ACCEPTABLE = auto()  # Ie a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # Ie If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or 0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..1dba9606ea0 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -520,8 +520,6 @@ async def start(self) -> None:
         keep_alive(True) specified.
         """
         loop = self._loop
-        handler = asyncio.current_task(loop)
-        assert handler is not None
         manager = self._manager
         assert manager is not None
         keepalive_timeout = self._keepalive_timeout
@@ -551,7 +549,16 @@ async def start(self) -> None:
             else:
                 request_handler = self._request_handler
 
-            request = self._request_factory(message, payload, self, writer, handler)
+            # Important: don't hold a reference to the current task
+            # here, as a traceback that captures it will prevent the
+            # task from being collected and cause a memory leak.
+            request = self._request_factory(
+                message,
+                payload,
+                self,
+                writer,
+                self._task_handler or asyncio.current_task(loop),  # type: ignore[arg-type]
+            )
             try:
                 # a new task is used for copy context vars (#3406)
                 coro = self._handle_request(request, start, request_handler)
@@ -608,26 +615,29 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
+                request._task = None  # type: ignore[assignment] # Break reference cycle in case of exception
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -694,9 +704,13 @@ def handle_error(
             # or, encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
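The `_process_keepalive` change above tightens the comparison to a strict `<`, so a callback firing exactly at the deadline counts as expired rather than "too early". A self-contained sketch of that self-rescheduling `call_at` pattern (an illustrative class, not aiohttp's actual implementation):

```python
import asyncio

fired: "list[float]" = []

class KeepAliveTimer:
    """Sketch of the rescheduling pattern: one call_at callback that
    re-arms itself when it fires before the (possibly moved) deadline."""

    def __init__(self, loop, timeout, on_close):
        self._loop = loop
        self._timeout = timeout
        self._on_close = on_close
        self._close_time = loop.time() + timeout
        loop.call_at(self._close_time, self._check)

    def touch(self):
        # New request activity pushes the deadline forward; the pending
        # callback is left in place and will reschedule itself.
        self._close_time = self._loop.time() + self._timeout

    def _check(self):
        # Strict "<" as in the fix above: firing at the deadline expires.
        if self._loop.time() < self._close_time:
            self._loop.call_at(self._close_time, self._check)
            return
        self._on_close()

async def main():
    loop = asyncio.get_running_loop()
    done = asyncio.Event()
    KeepAliveTimer(loop, 0.05, lambda: (fired.append(loop.time()), done.set()))
    await asyncio.sleep(0.02)  # simulate a request arriving mid-window
    await asyncio.wait_for(done.wait(), timeout=1.0)

asyncio.run(main())
```

Moving the deadline instead of cancelling and recreating the timer avoids churning timer handles on every request.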
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..e498a905caf 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
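The widened `write()` signature above accepts any bytes-like object. A small sketch of why that helps callers, using a hypothetical helper (not part of aiohttp) to show the three accepted types:

```python
from typing import Union

BytesLike = Union[bytes, bytearray, memoryview]

def as_bytes(data: BytesLike) -> bytes:
    # The wider signature lets callers pass zero-copy views (memoryview)
    # or mutable buffers (bytearray) without converting at each call
    # site; conversion happens only where immutable bytes are required.
    if isinstance(data, memoryview):
        return data.tobytes()
    return bytes(data)

assert as_bytes(b"abc") == b"abc"
assert as_bytes(bytearray(b"abc")) == b"abc"
assert as_bytes(memoryview(b"abc")) == b"abc"
```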
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
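The `TYPE_CHECKING` split above lets type checkers always see the real `ssl.SSLContext` symbol, while at runtime a missing `ssl` module (Python built without OpenSSL) degrades to a placeholder. A minimal reproduction of the pattern:

```python
from typing import TYPE_CHECKING, Optional

if TYPE_CHECKING:
    # Always resolvable for static analysis.
    from ssl import SSLContext
else:
    try:
        from ssl import SSLContext
    except ImportError:  # runtime fallback when ssl is unavailable
        SSLContext = object

def wants_tls(ctx: "Optional[SSLContext]") -> bool:
    # Annotations stay accurate for type checkers either way.
    return ctx is not None

assert wants_tls(None) is False
```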
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..439b8049987 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
@@ -248,7 +252,8 @@ def _handshake(
             else:
                 # No overlap found: Return no protocol as per spec
                 ws_logger.warning(
-                    "Client protocols %r don’t overlap server-known ones %r",
+                    "%s: Client protocols %r don’t overlap server-known ones %r",
+                    request.remote,
                     req_protocols,
                     self._protocols,
                 )
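The `_pong_not_received` change above attaches a descriptive message to the `TimeoutError` instead of raising a bare one. A standalone sketch of that heartbeat timeout (an illustrative coroutine, not aiohttp's internals):

```python
import asyncio

async def heartbeat(pong_event: asyncio.Event, pong_heartbeat: float) -> None:
    # If no PONG arrives within the window, fail with a descriptive
    # TimeoutError rather than a bare one, so logs explain the cause.
    try:
        await asyncio.wait_for(pong_event.wait(), timeout=pong_heartbeat)
    except asyncio.TimeoutError:
        raise asyncio.TimeoutError(
            f"No PONG received after {pong_heartbeat} seconds"
        ) from None

async def main() -> str:
    try:
        await heartbeat(asyncio.Event(), 0.05)  # event never set: times out
    except asyncio.TimeoutError as exc:
        return str(exc)
    return ""

message = asyncio.run(main())
assert "No PONG received" in message
```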
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..8ed121ac955 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -6,7 +6,7 @@
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +17,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
diff --git docs/client_quickstart.rst docs/client_quickstart.rst
index f99339cf4a6..0e03f104e90 100644
--- docs/client_quickstart.rst
+++ docs/client_quickstart.rst
@@ -93,7 +93,7 @@ Passing Parameters In URLs
 You often want to send some sort of data in the URL's query string. If
 you were constructing the URL by hand, this data would be given as key/value
 pairs in the URL after a question mark, e.g. ``httpbin.org/get?key=val``.
-Requests allows you to provide these arguments as a :class:`dict`, using the
+aiohttp allows you to provide these arguments as a :class:`dict`, using the
 ``params`` keyword argument. As an example, if you wanted to pass
 ``key1=value1`` and ``key2=value2`` to ``httpbin.org/get``, you would use the
 following code::
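The query-string behaviour described above can be approximated without an event loop: aiohttp builds the query from the ``params`` dict much as `urllib.parse.urlencode` would (an illustration only; aiohttp actually delegates URL handling to yarl):

```python
from urllib.parse import urlencode

# The same key/value pairs the docs pass via ``params``:
params = {"key1": "value1", "key2": "value2"}
url = "http://httpbin.org/get?" + urlencode(params)
assert url == "http://httpbin.org/get?key1=value1&key2=value2"
```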
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. If `compress` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                     :class:`multidict.MultiDict` or
+                     :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                     :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                     (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter allows
+      that generation to be skipped.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` cannot be combined
+                         with the *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-encoding* and *content-length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` for skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the *proxy* parameter has been provided.
+
+   :param trace_request_ctx: Object used to give as a kw param for each new
+      :class:`TraceConfig` object instantiated,
+      used to give information to the
+      tracers that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
 
diff --git docs/contributing-admins.rst docs/contributing-admins.rst
index acfaebc0e97..b17cbe1019a 100644
--- docs/contributing-admins.rst
+++ docs/contributing-admins.rst
@@ -21,9 +21,9 @@ To create a new release:
 #. Run ``towncrier``.
 #. Check and cleanup the changes in ``CHANGES.rst``.
 #. Checkout a new branch: e.g. ``git checkout -b release/v3.8.6``
-#. Commit and create a PR. Once PR is merged, continue.
+#. Commit and create a PR. Verify the changelog and release notes look good on Read the Docs. Once PR is merged, continue.
 #. Go back to the release branch: e.g. ``git checkout 3.8 && git pull``
-#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6'``
+#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6' -s``
 #. Push the tag: e.g. ``git push origin v3.8.6``
 #. Monitor CI to ensure release process completes without errors.
 
@@ -49,6 +49,10 @@ first merge into the newer release branch (e.g. 3.8 into 3.9) and then to master
 
 Back on the original release branch, bump the version number and append ``.dev0`` in ``__init__.py``.
 
+Post the release announcement to social media:
+ - BlueSky: https://bsky.app/profile/aiohttp.org and re-post to https://bsky.app/profile/aio-libs.org
+ - Mastodon: https://fosstodon.org/@aiohttp and re-post to https://fosstodon.org/@aio_libs
+
 If doing a minor release:
 
 #. Create a new release branch for future features to go to: e.g. ``git checkout -b 3.10 3.9 && git push``
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..95a98cd4fc0 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -221,6 +221,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
 
diff --git a/tests/isolated/check_for_client_response_leak.py b/tests/isolated/check_for_client_response_leak.py
new file mode 100644
index 00000000000..67393c2c2d8
--- /dev/null
+++ tests/isolated/check_for_client_response_leak.py
@@ -0,0 +1,47 @@
+import asyncio
+import contextlib
+import gc
+import sys
+
+from aiohttp import ClientError, ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def stream_handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        request.transport.close()  # Forcefully closing connection
+        return web.Response()
+
+    app.router.add_get("/stream", stream_handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    session = ClientSession()
+
+    async def fetch_stream(url: str) -> None:
+        """Fetch a stream and read a few bytes from it."""
+        with contextlib.suppress(ClientError):
+            await session.get(url)
+
+    client_task = asyncio.create_task(fetch_stream(f"http://localhost:{port}/stream"))
+    await client_task
+    gc.collect()
+    client_response_present = any(
+        type(obj).__name__ == "ClientResponse" for obj in gc.garbage
+    )
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if client_response_present else 0)
+
+
+asyncio.run(main())
diff --git a/tests/isolated/check_for_request_leak.py b/tests/isolated/check_for_request_leak.py
new file mode 100644
index 00000000000..6f340a05277
--- /dev/null
+++ tests/isolated/check_for_request_leak.py
@@ -0,0 +1,41 @@
+import asyncio
+import gc
+import sys
+from typing import NoReturn
+
+from aiohttp import ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def handler(request: web.Request) -> NoReturn:
+        await request.json()
+        assert False
+
+    app.router.add_route("GET", "/json", handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    async with ClientSession() as session:
+        async with session.get(f"http://127.0.0.1:{port}/json") as resp:
+            await resp.read()
+
+    # Give time for the cancelled task to be collected
+    await asyncio.sleep(0.5)
+    gc.collect()
+    request_present = any(type(obj).__name__ == "Request" for obj in gc.garbage)
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if request_present else 0)
+
+
+asyncio.run(main())
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..aa3536be820 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
@@ -289,3 +319,158 @@ async def run_client_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
+
+
+def test_one_hundred_json_post_requests(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 JSON POST requests that check the content-type."""
+    message_count = 100
+
+    async def handler(request: web.Request) -> web.Response:
+        _ = request.content_type
+        _ = request.charset
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("POST", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            await client.post("/", json={"key": "value"})
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_any(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_any."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_any():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_4096(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 4096."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 4096 iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(4096):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_65536(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 65536."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 64 KiB iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(65536):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunks(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunks."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..54cd5e92f80 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -902,6 +902,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..a3fffc447ae 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -3474,6 +3474,61 @@ async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
     await connector.close()
 
 
+async def test_connector_resolve_in_case_of_trace_cache_miss_exception(
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    token: ResolveResult = {
+        "hostname": "localhost",
+        "host": "127.0.0.1",
+        "port": 80,
+        "family": socket.AF_INET,
+        "proto": 0,
+        "flags": socket.AI_NUMERICHOST,
+    }
+
+    request_count = 0
+
+    class DummyTracer(Trace):
+        def __init__(self) -> None:
+            """Dummy"""
+
+        async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
+            """Dummy send_dns_cache_hit"""
+
+        async def send_dns_resolvehost_start(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_start"""
+
+        async def send_dns_resolvehost_end(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_end"""
+
+        async def send_dns_cache_miss(self, *args: object, **kwargs: object) -> None:
+            nonlocal request_count
+            request_count += 1
+            if request_count <= 1:
+                raise Exception("first attempt")
+
+    async def resolve_response() -> List[ResolveResult]:
+        await asyncio.sleep(0)
+        return [token]
+
+    with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
+        m_resolver().resolve.return_value = resolve_response()
+
+        connector = TCPConnector()
+        traces = [DummyTracer()]
+
+        with pytest.raises(Exception):
+            await connector._resolve_host("", 0, traces)
+
+        assert await connector._resolve_host("", 0, traces) == [token]
+
+    await connector.close()
+
+
 async def test_connector_does_not_remove_needed_waiters(
     loop: asyncio.AbstractEventLoop, key: ConnectionKey
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..420816b3137 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,19 +2,38 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
 from multidict import CIMultiDict
 
-from aiohttp import ClientConnectionResetError, http
+from aiohttp import ClientConnectionResetError, hdrs, http
 from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_writer import _serialize_headers
 from aiohttp.test_utils import make_mocked_coro
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +111,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +120,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +226,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +282,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +317,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
@@ -421,3 +535,29 @@ async def test_set_eof_after_write_headers(
     msg.set_eof()
     await msg.write_eof()
     assert not transport.write.called
+
+
+@pytest.mark.parametrize(
+    "char",
+    [
+        "\n",
+        "\r",
+    ],
+)
+def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> None:
+    """Verify serialize_headers raises on cr or nl in the headers."""
+    status_line = "HTTP/1.1 200 OK"
+    headers = CIMultiDict(
+        {
+            hdrs.CONTENT_TYPE: f"text/plain{char}",
+        }
+    )
+
+    with pytest.raises(
+        ValueError,
+        match=(
+            "Newline or carriage return detected in headers. "
+            "Potential header injection attack."
+        ),
+    ):
+        _serialize_headers(status_line, headers)
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git a/tests/test_leaks.py b/tests/test_leaks.py
new file mode 100644
index 00000000000..07b506bdb99
--- /dev/null
+++ tests/test_leaks.py
@@ -0,0 +1,37 @@
+import pathlib
+import platform
+import subprocess
+import sys
+
+import pytest
+
+IS_PYPY = platform.python_implementation() == "PyPy"
+
+
+@pytest.mark.skipif(IS_PYPY, reason="gc.DEBUG_LEAK not available on PyPy")
+@pytest.mark.parametrize(
+    ("script", "message"),
+    [
+        (
+            # Test that ClientResponse is collected after server disconnects.
+            # https://github.com/aio-libs/aiohttp/issues/10535
+            "check_for_client_response_leak.py",
+            "ClientResponse leaked",
+        ),
+        (
+            # Test that Request object is collected when the handler raises.
+            # https://github.com/aio-libs/aiohttp/issues/10548
+            "check_for_request_leak.py",
+            "Request leaked",
+        ),
+    ],
+)
+def test_leak(script: str, message: str) -> None:
+    """Run isolated leak test script and check for leaks."""
+    leak_test_script = pathlib.Path(__file__).parent.joinpath("isolated", script)
+
+    with subprocess.Popen(
+        [sys.executable, "-u", str(leak_test_script)],
+        stdout=subprocess.PIPE,
+    ) as proc:
+        assert proc.wait() == 0, message
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: Any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..0591426c57b 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..9098ef9e7bf 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tests/test_websocket_handshake.py tests/test_websocket_handshake.py
index bbfa1d9260d..53d5d9152bb 100644
--- tests/test_websocket_handshake.py
+++ tests/test_websocket_handshake.py
@@ -174,7 +174,7 @@ async def test_handshake_protocol_unsupported(caplog) -> None:
 
     assert (
         caplog.records[-1].msg
-        == "Client protocols %r don't overlap server-known ones %r"
+        == "%s: Client protocols %r don't overlap server-known ones %r"
     )
     assert ws.ws_protocol is None
 
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This PR updates the aiohttp library from version 3.11.9 to 3.11.15, bringing in several significant improvements and bug fixes. The changes focus on memory-leak fixes, performance improvements, and enhanced error handling.

Possible Issues

  1. The disabled zero copy writes in StreamWriter could impact performance for some use cases
  2. Changing header serialization logic might cause issues with applications relying on specific behavior

Security Hotspots

  1. Added header injection protection in _http_writer.pyx by validating against newlines and carriage returns in headers
  2. Fixed potential socket leaks during connection handling for failed connections
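The header-injection protection flagged above is implemented in Cython in `_http_writer.pyx`, but the check itself is simple. A pure-Python sketch (the function name is illustrative; the error message matches what the new `test_serialize_headers_raises_on_new_line_or_carriage_return` test in this diff expects):

```python
def serialize_headers(status_line: str, headers: dict) -> bytes:
    """Sketch of header serialization with CR/LF injection validation."""
    lines = [status_line]
    for name, value in headers.items():
        # Reject any header name or value containing a newline or carriage
        # return, which an attacker could use to smuggle extra headers.
        if "\n" in name or "\r" in name or "\n" in value or "\r" in value:
            raise ValueError(
                "Newline or carriage return detected in headers. "
                "Potential header injection attack."
            )
        lines.append(f"{name}: {value}")
    return ("\r\n".join(lines) + "\r\n\r\n").encode("utf-8")
```

Validation happens at serialization time, so every response path is covered regardless of where the header value originated.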

Changes

Key changes by component:

CI/CD Updates

  • Updated GitHub Actions to latest versions
  • Added support for musllinux wheels
  • Added Python 3.13.2 support

Memory Management

  • Fixed memory leaks in ClientResponse and Request objects
  • Improved cyclic reference handling for connections and requests
  • Added better connection cleanup and resource management

Performance Improvements

  • Enhanced WebSocket buffer handling
  • Improved header serialization performance
  • Added content type parsing caching
  • Modified writelines usage based on payload size

Error Handling

  • Improved DNS resolution error handling
  • Better logging for WebSocket disconnects
  • Enhanced protocol error logging with remote address information

API Changes

  • Updated request() function to accept more options
  • Added new helper functions for content type parsing
  • Enhanced FileResponse handling
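The "content type parsing caching" item above mirrors the pattern aiohttp already uses for mime types: a memoizing cache in front of an otherwise pure parsing function, so repeated headers are parsed once. A hypothetical sketch (function name and cache size are assumptions, not aiohttp's actual internals):

```python
from functools import lru_cache


@lru_cache(maxsize=128)  # cache size chosen arbitrarily for this sketch
def parse_content_type(raw: str) -> tuple:
    """Parse 'type/subtype; key=value' once, then serve repeats from cache.

    Note: the returned params dict is shared across callers of the same
    input string, so treat it as read-only.
    """
    mimetype, _, rest = raw.partition(";")
    params = {}
    for item in rest.split(";"):
        key, sep, value = item.partition("=")
        if sep:
            params[key.strip().lower()] = value.strip().strip('"')
    return mimetype.strip().lower(), params
```

Since servers tend to send the same handful of `Content-Type` values, the cache hit rate in practice is very high.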
sequenceDiagram
    participant Client
    participant ClientSession
    participant TCPConnector
    participant StreamWriter
    participant Server

    Client->>ClientSession: request()
    ClientSession->>TCPConnector: _create_direct_connection()
    TCPConnector-->>ClientSession: connection
    ClientSession->>StreamWriter: write_headers()
    Note over StreamWriter: New header injection checks
    StreamWriter->>Server: send headers
    alt payload > MIN_PAYLOAD_FOR_WRITELINES
        StreamWriter->>Server: writelines()
    else
        StreamWriter->>Server: write()
    end
    Server-->>Client: response
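The branch in the diagram — `writelines()` for payloads above `MIN_PAYLOAD_FOR_WRITELINES`, a single `write()` otherwise — can be sketched as follows for a chunked-encoded frame (the threshold value here is illustrative, not aiohttp's actual constant):

```python
import asyncio

MIN_PAYLOAD_FOR_WRITELINES = 2048  # illustrative value only


def send_chunked(transport: asyncio.Transport, chunk: bytes) -> None:
    """Frame `chunk` for chunked transfer-encoding, batching large writes."""
    header = f"{len(chunk):x}\r\n".encode()
    if len(chunk) >= MIN_PAYLOAD_FOR_WRITELINES:
        # Large payload: hand the pieces to the transport without joining
        # them, avoiding a copy of the payload bytes.
        transport.writelines([header, chunk, b"\r\n"])
    else:
        # Small payload: one joined write is cheaper than the overhead of
        # writelines for a few bytes.
        transport.write(header + chunk + b"\r\n")
```

This matches the tests in the diff, which assert against `transport.write.mock_calls` for small payloads and `transport.writelines.mock_calls` when the `enable_writelines` / `force_writelines_small_payloads` fixtures are active.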

@renovate renovate bot force-pushed the renovate/aiohttp-3.x branch from 549f427 to 1ea2bfe Compare April 9, 2025 03:03
@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.15 chore(deps): update dependency aiohttp to v3.11.16 Apr 9, 2025

github-actions bot commented Apr 9, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..00d728e775d 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,418 @@
 
 .. towncrier release notes start
 
+3.11.16 (2025-04-01)
+====================
+
+Bug fixes
+---------
+
+- Replaced deprecated ``asyncio.iscoroutinefunction`` with its counterpart from ``inspect``
+  -- by :user:`layday`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10634`.
+
+
+
+- Fixed :class:`multidict.CIMultiDict` being mutated when passed to :class:`aiohttp.web.Response` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10672`.
+
+
+
+
+----
+
+
+3.11.15 (2025-03-31)
+====================
+
+Bug fixes
+---------
+
+- Reverted explicitly closing sockets if an exception is raised during ``create_connection`` -- by :user:`bdraco`.
+
+  This change originally appeared in aiohttp 3.11.13
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`, :issue:`10617`, :issue:`10656`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of WebSocket buffer handling -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10601`.
+
+
+
+- Improved performance of serializing headers -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10625`.
+
+
+
+
+----
+
+
+3.11.14 (2025-03-16)
+====================
+
+Bug fixes
+---------
+
+- Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a ``trace.send_dns_cache_miss``
+  -- by :user:`logioniz`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10529`.
+
+
+
+- Fixed DNS resolution on platforms that don't support ``socket.AI_ADDRCONFIG`` -- by :user:`maxbachmann`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10542`.
+
+
+
+- The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10551`.
+
+
+
+- Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.
+
+  Special thanks to :user:`availov` for reporting the issue.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10556`.
+
+
+
+- Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10569`.
+
+
+
+
+Features
+--------
+
+- Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10564`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10552`.
+
+
+
+
+----
+
+
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..953af52498a 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -30,6 +31,7 @@ Alexandru Mihai
 Alexey Firsov
 Alexey Nikitin
 Alexey Popravka
+Alexey Stavrov
 Alexey Stepanov
 Amin Etesamian
 Amit Tulshyan
@@ -41,6 +43,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +169,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +369,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..93b06c7367a 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.16"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_http_writer.pyx aiohttp/_http_writer.pyx
index 287371334f8..4a3ae1f9e68 100644
--- aiohttp/_http_writer.pyx
+++ aiohttp/_http_writer.pyx
@@ -97,27 +97,34 @@ cdef inline int _write_str(Writer* writer, str s):
             return -1
 
 
-# --------------- _serialize_headers ----------------------
-
-cdef str to_str(object s):
+cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
+    cdef Py_UCS4 ch
+    cdef str out_str
     if type(s) is str:
-        return <str>s
+        out_str = <str>s
     elif type(s) is _istr:
-        return PyObject_Str(s)
+        out_str = PyObject_Str(s)
     elif not isinstance(s, str):
         raise TypeError("Cannot serialize non-str key {!r}".format(s))
     else:
-        return str(s)
+        out_str = str(s)
+
+    for ch in out_str:
+        if ch == 0x0D or ch == 0x0A:
+            raise ValueError(
+                "Newline or carriage return detected in headers. "
+                "Potential header injection attack."
+            )
+        if _write_utf8(writer, ch) < 0:
+            return -1
 
 
+# --------------- _serialize_headers ----------------------
 
 def _serialize_headers(str status_line, headers):
     cdef Writer writer
     cdef object key
     cdef object val
-    cdef bytes ret
-    cdef str key_str
-    cdef str val_str
 
     _init_writer(&writer)
 
@@ -130,22 +137,13 @@ def _serialize_headers(str status_line, headers):
             raise
 
         for key, val in headers.items():
-            key_str = to_str(key)
-            val_str = to_str(val)
-
-            if "\r" in key_str or "\n" in key_str or "\r" in val_str or "\n" in val_str:
-                raise ValueError(
-                    "Newline or carriage return character detected in HTTP status message or "
-                    "header. This is a potential security issue."
-                )
-
-            if _write_str(&writer, key_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, key) < 0:
                 raise
             if _write_byte(&writer, b':') < 0:
                 raise
             if _write_byte(&writer, b' ') < 0:
                 raise
-            if _write_str(&writer, val_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, val) < 0:
                 raise
             if _write_byte(&writer, b'\r') < 0:
                 raise
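The new `_write_str_raise_on_nlcr` helper folds the CR/LF check into the per-character write loop instead of scanning the string twice. A pure-Python sketch of the validation it performs (the function name here is illustrative, not aiohttp API):

```python
def check_header_part(value: str) -> str:
    """Reject header names/values containing CR or LF (header injection)."""
    if "\r" in value or "\n" in value:
        raise ValueError(
            "Newline or carriage return detected in headers. "
            "Potential header injection attack."
        )
    return value
```

Without this check, an attacker-controlled value like `"x\r\nSet-Cookie: evil=1"` would smuggle an extra header line into the serialized request.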
diff --git aiohttp/_websocket/reader_c.pxd aiohttp/_websocket/reader_c.pxd
index 461e658e116..f156a7ff704 100644
--- aiohttp/_websocket/reader_c.pxd
+++ aiohttp/_websocket/reader_c.pxd
@@ -93,6 +93,7 @@ cdef class WebSocketReader:
         chunk_size="unsigned int",
         chunk_len="unsigned int",
         buf_length="unsigned int",
+        buf_cstr="const unsigned char *",
         first_byte="unsigned char",
         second_byte="unsigned char",
         end_pos="unsigned int",
diff --git aiohttp/_websocket/reader_py.py aiohttp/_websocket/reader_py.py
index 94d20010890..92ad47a52f0 100644
--- aiohttp/_websocket/reader_py.py
+++ aiohttp/_websocket/reader_py.py
@@ -93,6 +93,7 @@ def _release_waiter(self) -> None:
     def feed_eof(self) -> None:
         self._eof = True
         self._release_waiter()
+        self._exception = None  # Break cyclic references
 
     def feed_data(self, data: "WSMessage", size: "int_") -> None:
         self._size += size
@@ -193,9 +194,8 @@ def _feed_data(self, data: bytes) -> None:
                     if self._max_msg_size and len(self._partial) >= self._max_msg_size:
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Message size {} exceeds limit {}".format(
-                                len(self._partial), self._max_msg_size
-                            ),
+                            f"Message size {len(self._partial)} "
+                            f"exceeds limit {self._max_msg_size}",
                         )
                     continue
 
@@ -214,7 +214,7 @@ def _feed_data(self, data: bytes) -> None:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
                         "The opcode in non-fin frame is expected "
-                        "to be zero, got {!r}".format(opcode),
+                        f"to be zero, got {opcode!r}",
                     )
 
                 assembled_payload: Union[bytes, bytearray]
@@ -227,9 +227,8 @@ def _feed_data(self, data: bytes) -> None:
                 if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
                     raise WebSocketError(
                         WSCloseCode.MESSAGE_TOO_BIG,
-                        "Message size {} exceeds limit {}".format(
-                            len(assembled_payload), self._max_msg_size
-                        ),
+                        f"Message size {len(assembled_payload)} "
+                        f"exceeds limit {self._max_msg_size}",
                     )
 
                 # Decompress process must to be done after all packets
@@ -246,9 +245,8 @@ def _feed_data(self, data: bytes) -> None:
                         left = len(self._decompressobj.unconsumed_tail)
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Decompressed message size {} exceeds limit {}".format(
-                                self._max_msg_size + left, self._max_msg_size
-                            ),
+                            f"Decompressed message size {self._max_msg_size + left}"
+                            f" exceeds limit {self._max_msg_size}",
                         )
                 elif type(assembled_payload) is bytes:
                     payload_merged = assembled_payload
@@ -327,14 +325,15 @@ def parse_frame(
 
         start_pos: int = 0
         buf_length = len(buf)
+        buf_cstr = buf
 
         while True:
             # read header
             if self._state == READ_HEADER:
                 if buf_length - start_pos < 2:
                     break
-                first_byte = buf[start_pos]
-                second_byte = buf[start_pos + 1]
+                first_byte = buf_cstr[start_pos]
+                second_byte = buf_cstr[start_pos + 1]
                 start_pos += 2
 
                 fin = (first_byte >> 7) & 1
@@ -399,14 +398,14 @@ def parse_frame(
                 if length_flag == 126:
                     if buf_length - start_pos < 2:
                         break
-                    first_byte = buf[start_pos]
-                    second_byte = buf[start_pos + 1]
+                    first_byte = buf_cstr[start_pos]
+                    second_byte = buf_cstr[start_pos + 1]
                     start_pos += 2
                     self._payload_length = first_byte << 8 | second_byte
                 elif length_flag > 126:
                     if buf_length - start_pos < 8:
                         break
-                    data = buf[start_pos : start_pos + 8]
+                    data = buf_cstr[start_pos : start_pos + 8]
                     start_pos += 8
                     self._payload_length = UNPACK_LEN3(data)[0]
                 else:
@@ -418,7 +417,7 @@ def parse_frame(
             if self._state == READ_PAYLOAD_MASK:
                 if buf_length - start_pos < 4:
                     break
-                self._frame_mask = buf[start_pos : start_pos + 4]
+                self._frame_mask = buf_cstr[start_pos : start_pos + 4]
                 start_pos += 4
                 self._state = READ_PAYLOAD
 
@@ -434,10 +433,10 @@ def parse_frame(
                 if self._frame_payload_len:
                     if type(self._frame_payload) is not bytearray:
                         self._frame_payload = bytearray(self._frame_payload)
-                    self._frame_payload += buf[start_pos:end_pos]
+                    self._frame_payload += buf_cstr[start_pos:end_pos]
                 else:
                     # Fast path for the first frame
-                    self._frame_payload = buf[start_pos:end_pos]
+                    self._frame_payload = buf_cstr[start_pos:end_pos]
 
                 self._frame_payload_len += end_pos - start_pos
                 start_pos = end_pos
@@ -463,6 +462,7 @@ def parse_frame(
                 self._frame_payload_len = 0
                 self._state = READ_HEADER
 
-        self._tail = buf[start_pos:] if start_pos < buf_length else b""
+        # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
+        self._tail = buf_cstr[start_pos:buf_length] if start_pos < buf_length else b""
 
         return frames
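The reader above indexes `buf_cstr` for the two fixed header bytes; the bit layout being unpacked is the RFC 6455 frame header. A standalone sketch of that unpacking:

```python
def parse_ws_header(first_byte: int, second_byte: int):
    """Unpack the fixed two-byte WebSocket frame header (RFC 6455)."""
    fin = (first_byte >> 7) & 1          # final-fragment flag
    rsv = (first_byte >> 4) & 0x7        # reserved bits (extensions)
    opcode = first_byte & 0xF            # frame type (1 = text, 2 = binary, ...)
    has_mask = (second_byte >> 7) & 1    # client-to-server frames must be masked
    length_flag = second_byte & 0x7F     # 0-125 literal, 126/127 = extended length
    return fin, rsv, opcode, has_mask, length_flag

# A small unmasked text frame: FIN=1, opcode=1 (text), payload length 5.
print(parse_ws_header(0x81, 0x05))  # (1, 0, 1, 0, 5)
```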
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
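The fix above propagates `quote_cookie` from the session's jar into the temporary per-request jar, so per-request cookies honor the same quoting setting. The quoting itself is stdlib `SimpleCookie` behavior, which wraps values containing characters outside the legal set (values here are hypothetical):

```python
from http.cookies import SimpleCookie

# SimpleCookie quotes values with special characters (e.g. spaces);
# CookieJar(quote_cookie=False) exists to opt out of exactly this.
jar = SimpleCookie()
jar["plain"] = "abc"
jar["spaced"] = "a b"
print(jar["plain"].OutputString())   # plain=abc
print(jar["spaced"].OutputString())  # spaced="a b"
```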
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
+        auth - aiohttp.helpers.BasicAuth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
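Several files in this diff switch from a bare `try`/`except ImportError` to a `TYPE_CHECKING` split. The pattern, in isolation: type checkers always take the first branch and resolve annotations against the real `ssl` module, while runtime still degrades gracefully on interpreters built without OpenSSL.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Static analysis only: annotations see the real ssl module.
    import ssl

    SSLContext = ssl.SSLContext
else:
    try:
        import ssl

        SSLContext = ssl.SSLContext
    except ImportError:  # interpreter built without OpenSSL
        ssl = SSLContext = None
```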
diff --git aiohttp/client_proto.py aiohttp/client_proto.py
index 79f033e3e12..2d64b3f3644 100644
--- aiohttp/client_proto.py
+++ aiohttp/client_proto.py
@@ -64,6 +64,7 @@ def force_close(self) -> None:
         self._should_close = True
 
     def close(self) -> None:
+        self._exception = None  # Break cyclic references
         transport = self.transport
         if transport is not None:
             transport.close()
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7420bd6070a 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 

@@ -1009,11 +1015,11 @@ async def _resolve_host_with_throttle(
         This method must be run in a task and shielded from cancellation
         to avoid cancelling the underlying lookup.
         """
-        if traces:
-            for trace in traces:
-                await trace.send_dns_cache_miss(host)
         try:
             if traces:
+                for trace in traces:
+                    await trace.send_dns_cache_miss(host)
+
                 for trace in traces:
                     await trace.send_dns_resolvehost_start(host)
 
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/helpers.py aiohttp/helpers.py
index 8038931ebec..ace4f0e9b53 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -21,7 +21,7 @@
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
-from types import TracebackType
+from types import MappingProxyType, TracebackType
 from typing import (
     Any,
     Callable,
@@ -357,6 +357,20 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+@functools.lru_cache(maxsize=56)
+def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
+    """Parse Content-Type header.
+
+    Returns a tuple of the parsed content type and a
+    MappingProxyType of parameters.
+    """
+    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    content_type = msg.get_content_type()
+    params = msg.get_params(())
+    content_dict = dict(params[1:])  # First element is content type again
+    return content_type, MappingProxyType(content_dict)
+
+
 def guess_filename(obj: Any, default: Optional[str] = None) -> Optional[str]:
     name = getattr(obj, "name", None)
     if name and isinstance(name, str) and name[0] != "<" and name[-1] != ">":
@@ -710,10 +724,10 @@ def _parse_content_type(self, raw: Optional[str]) -> None:
             self._content_type = "application/octet-stream"
             self._content_dict = {}
         else:
-            msg = HeaderParser().parsestr("Content-Type: " + raw)
-            self._content_type = msg.get_content_type()
-            params = msg.get_params(())
-            self._content_dict = dict(params[1:])  # First element is content type again
+            content_type, content_mapping_proxy = parse_content_type(raw)
+            self._content_type = content_type
+            # _content_dict needs to be mutable so we can update it
+            self._content_dict = content_mapping_proxy.copy()
 
     @property
     def content_type(self) -> str:
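The new cached `parse_content_type` helper can be sketched with the stdlib email parser; returning a `MappingProxyType` keeps the cached value immutable, which is why the caller above copies it into a mutable `_content_dict`. The `maxsize` mirrors the value chosen in the diff:

```python
import functools
from email.parser import HeaderParser
from types import MappingProxyType

@functools.lru_cache(maxsize=56)
def parse_content_type(raw: str):
    """Parse a Content-Type header into (type, read-only params).

    Caching is safe because the MappingProxyType result cannot be
    mutated by callers; anyone needing a mutable dict copies it.
    """
    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
    params = msg.get_params(())
    return msg.get_content_type(), MappingProxyType(dict(params[1:]))

ctype, params = parse_content_type("text/html; charset=utf-8")
print(ctype, dict(params))  # text/html {'charset': 'utf-8'}
```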
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it is not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
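The `_writelines` change above gates `transport.writelines` on both payload size and interpreter version, falling back to a single joined `write`. A self-contained sketch of the selection logic (the `send_chunks` helper and `FakeTransport` are illustrative, not aiohttp API):

```python
import sys

MIN_PAYLOAD_FOR_WRITELINES = 2048
# CVE-2024-12254: writelines() had a flow-control bug on affected
# CPython releases, so a single write() is used there instead.
SKIP_WRITELINES = (
    sys.version_info < (3, 12, 9)
    or (3, 13, 0) <= sys.version_info < (3, 13, 2)
)

def send_chunks(transport, chunks):
    """Pick write() vs writelines() the way the StreamWriter above does."""
    size = sum(len(c) for c in chunks)
    if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
        transport.write(b"".join(chunks))   # one joined buffer
    else:
        transport.writelines(chunks)        # batched without joining
```

Small payloads always take the `write` path, since joining a couple of short chunks is cheaper than the `writelines` bookkeeping.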
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
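On Python 3.13+, `mimetypes.guess_file_type` is the non-deprecated way to guess a type from a filesystem path (`guess_type` treats its argument as a URL). The version gate above can be exercised like this:

```python
import mimetypes
import sys

# Both functions accept a plain filename, so a single gated alias works.
if sys.version_info >= (3, 13):
    guesser = mimetypes.guess_file_type
else:
    guesser = mimetypes.guess_type

print(guesser("report.json"))  # ('application/json', None)
```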
diff --git aiohttp/pytest_plugin.py aiohttp/pytest_plugin.py
index 7ce60faa4a4..21d6ea7bbcd 100644
--- aiohttp/pytest_plugin.py
+++ aiohttp/pytest_plugin.py
@@ -98,7 +98,7 @@ def pytest_fixture_setup(fixturedef):  # type: ignore[no-untyped-def]
     if inspect.isasyncgenfunction(func):
         # async generator fixture
         is_async_gen = True
-    elif asyncio.iscoroutinefunction(func):
+    elif inspect.iscoroutinefunction(func):
         # regular async fixture
         is_async_gen = False
     else:
@@ -200,14 +200,14 @@ def _passthrough_loop_context(loop, fast=False):  # type: ignore[no-untyped-def]
 
 def pytest_pycollect_makeitem(collector, name, obj):  # type: ignore[no-untyped-def]
     """Fix pytest collecting for coroutines."""
-    if collector.funcnamefilter(name) and asyncio.iscoroutinefunction(obj):
+    if collector.funcnamefilter(name) and inspect.iscoroutinefunction(obj):
         return list(collector._genfunctions(name, obj))
 
 
 def pytest_pyfunc_call(pyfuncitem):  # type: ignore[no-untyped-def]
     """Run coroutines in an event loop instead of a normal function call."""
     fast = pyfuncitem.config.getoption("--aiohttp-fast")
-    if asyncio.iscoroutinefunction(pyfuncitem.function):
+    if inspect.iscoroutinefunction(pyfuncitem.function):
         existing_loop = pyfuncitem.funcargs.get(
             "proactor_loop"
         ) or pyfuncitem.funcargs.get("loop", None)
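The three pytest-plugin hunks above swap `asyncio.iscoroutinefunction` for its `inspect` counterpart, which is the drop-in replacement now that the `asyncio` variant is deprecated in newer Python releases. The two agree for ordinary coroutine functions:

```python
import inspect

async def handler():  # a coroutine function, like an async test fixture
    return "ok"

def sync_handler():
    return "ok"

# inspect.iscoroutinefunction distinguishes async def from plain def.
assert inspect.iscoroutinefunction(handler)
assert not inspect.iscoroutinefunction(sync_handler)
```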
diff --git aiohttp/resolver.py aiohttp/resolver.py
index 9c744514fae..e14179cc8a2 100644
--- aiohttp/resolver.py
+++ aiohttp/resolver.py
@@ -18,6 +18,9 @@
 
 _NUMERIC_SOCKET_FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV
 _NAME_SOCKET_FLAGS = socket.NI_NUMERICHOST | socket.NI_NUMERICSERV
+_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
+if hasattr(socket, "AI_MASK"):
+    _AI_ADDRCONFIG &= socket.AI_MASK
 
 
 class ThreadedResolver(AbstractResolver):
@@ -38,7 +41,7 @@ async def resolve(
             port,
             type=socket.SOCK_STREAM,
             family=family,
-            flags=socket.AI_ADDRCONFIG,
+            flags=_AI_ADDRCONFIG,
         )
 
         hosts: List[ResolveResult] = []
@@ -105,7 +108,7 @@ async def resolve(
                 port=port,
                 type=socket.SOCK_STREAM,
                 family=family,
-                flags=socket.AI_ADDRCONFIG,
+                flags=_AI_ADDRCONFIG,
             )
         except aiodns.error.DNSError as exc:
             msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed"
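The resolver hunk above clamps `AI_ADDRCONFIG` against `AI_MASK` where the latter exists (some BSD-derived platforms reject `getaddrinfo()` flag bits outside the mask). A sketch of the same guard in isolation, with the invariant it establishes spelled out:

```python
import socket

# Mirror of the module-level constant introduced above: start from
# AI_ADDRCONFIG, then drop any bits the platform's AI_MASK disallows.
_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
if hasattr(socket, "AI_MASK"):
    _AI_ADDRCONFIG &= socket.AI_MASK
    # After clamping, no bit falls outside the permitted mask, so
    # getaddrinfo() will not reject the flags argument on such platforms.
    assert _AI_ADDRCONFIG & ~socket.AI_MASK == 0
```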
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # Ie a regular file to send
+    NOT_ACCEPTABLE = auto()  # Ie a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # Ie If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or int=0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
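The refactored `_make_response()` above derives the ETag from a single stat result: hexadecimal mtime in nanoseconds, a dash, then hexadecimal size. A sketch of that format with `make_etag()` as a hypothetical standalone helper (not part of aiohttp's API):

```python
import os


def make_etag(st: os.stat_result) -> str:
    # Same validator format as the diff above: "<mtime_ns:x>-<size:x>"
    return f"{st.st_mtime_ns:x}-{st.st_size:x}"


st = os.stat(__file__)
etag = make_etag(st)

# The value round-trips: both hex fields decode back to the stat fields.
mtime_hex, size_hex = etag.split("-")
assert int(mtime_hex, 16) == st.st_mtime_ns
assert int(size_hex, 16) == st.st_size
```

Because both fields come from one `os.stat_result`, the validator changes whenever the file's modification time or size changes, which is what the `If-Match`/`If-None-Match` checks in the hunk rely on.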
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..1dba9606ea0 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -520,8 +520,6 @@ async def start(self) -> None:
         keep_alive(True) specified.
         """
         loop = self._loop
-        handler = asyncio.current_task(loop)
-        assert handler is not None
         manager = self._manager
         assert manager is not None
         keepalive_timeout = self._keepalive_timeout
@@ -551,7 +549,16 @@ async def start(self) -> None:
             else:
                 request_handler = self._request_handler
 
-            request = self._request_factory(message, payload, self, writer, handler)
+            # Important don't hold a reference to the current task
+            # as on traceback it will prevent the task from being
+            # collected and will cause a memory leak.
+            request = self._request_factory(
+                message,
+                payload,
+                self,
+                writer,
+                self._task_handler or asyncio.current_task(loop),  # type: ignore[arg-type]
+            )
             try:
                 # a new task is used for copy context vars (#3406)
                 coro = self._handle_request(request, start, request_handler)
@@ -608,26 +615,29 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
+                request._task = None  # type: ignore[assignment] # Break reference cycle in case of exception
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -694,9 +704,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..367ac6e8c0a 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
@@ -629,10 +629,8 @@ def __init__(
 
         if headers is None:
             real_headers: CIMultiDict[str] = CIMultiDict()
-        elif not isinstance(headers, CIMultiDict):
-            real_headers = CIMultiDict(headers)
         else:
-            real_headers = headers  # = cast('CIMultiDict[str]', headers)
+            real_headers = CIMultiDict(headers)
 
         if content_type is not None and "charset" in content_type:
             raise ValueError("charset must not be in content_type argument")
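The `web_response.py` hunk above always copies caller-supplied headers rather than adopting a `CIMultiDict` by reference, which is the fix for the changelog item about `CIMultiDict` being mutated when passed to `aiohttp.web.Response`. The same defensive-copy pattern, sketched with a plain `dict` standing in for `multidict.CIMultiDict` and a hypothetical `build_response_headers()` helper:

```python
def build_response_headers(headers=None):
    # Always copy: the response owns its own mapping from here on.
    return dict(headers) if headers is not None else {}


caller_headers = {"X-Request-Id": "abc123"}
resp_headers = build_response_headers(caller_headers)

# Mutating the response's headers no longer leaks into the caller's mapping.
resp_headers["Content-Type"] = "text/plain"
assert "Content-Type" not in caller_headers
```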
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/web_urldispatcher.py aiohttp/web_urldispatcher.py
index 6443c500a33..28ae2518fec 100644
--- aiohttp/web_urldispatcher.py
+++ aiohttp/web_urldispatcher.py
@@ -180,8 +180,8 @@ def __init__(
         if expect_handler is None:
             expect_handler = _default_expect_handler
 
-        assert asyncio.iscoroutinefunction(
-            expect_handler
+        assert inspect.iscoroutinefunction(expect_handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(expect_handler)
         ), f"Coroutine is expected, got {expect_handler!r}"
 
         method = method.upper()
@@ -189,7 +189,9 @@ def __init__(
             raise ValueError(f"{method} is not allowed HTTP method")
 
         assert callable(handler), handler
-        if asyncio.iscoroutinefunction(handler):
+        if inspect.iscoroutinefunction(handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(handler)
+        ):
             pass
         elif inspect.isgeneratorfunction(handler):
             warnings.warn(
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..439b8049987 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
@@ -248,7 +252,8 @@ def _handshake(
             else:
                 # No overlap found: Return no protocol as per spec
                 ws_logger.warning(
-                    "Client protocols %r don’t overlap server-known ones %r",
+                    "%s: Client protocols %r don’t overlap server-known ones %r",
+                    request.remote,
                     req_protocols,
                     self._protocols,
                 )
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..f7281bfde75 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -1,12 +1,13 @@
 """Async gunicorn worker for aiohttp.web"""
 
 import asyncio
+import inspect
 import os
 import re
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +18,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
@@ -66,7 +72,9 @@ async def _run(self) -> None:
         runner = None
         if isinstance(self.wsgi, Application):
             app = self.wsgi
-        elif asyncio.iscoroutinefunction(self.wsgi):
+        elif inspect.iscoroutinefunction(self.wsgi) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(self.wsgi)
+        ):
             wsgi = await self.wsgi()
             if isinstance(wsgi, web.AppRunner):
                 runner = wsgi
diff --git docs/client_quickstart.rst docs/client_quickstart.rst
index f99339cf4a6..0e03f104e90 100644
--- docs/client_quickstart.rst
+++ docs/client_quickstart.rst
@@ -93,7 +93,7 @@ Passing Parameters In URLs
 You often want to send some sort of data in the URL's query string. If
 you were constructing the URL by hand, this data would be given as key/value
 pairs in the URL after a question mark, e.g. ``httpbin.org/get?key=val``.
-Requests allows you to provide these arguments as a :class:`dict`, using the
+aiohttp allows you to provide these arguments as a :class:`dict`, using the
 ``params`` keyword argument. As an example, if you wanted to pass
 ``key1=value1`` and ``key2=value2`` to ``httpbin.org/get``, you would use the
 following code::
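The docs hunk above describes passing query parameters to aiohttp as a :class:`dict` via the ``params`` keyword. Building the equivalent URL by hand with only the standard library shows what that argument expands to:

```python
from urllib.parse import urlencode

# Same key/value pairs as the docs example above.
params = {"key1": "value1", "key2": "value2"}
url = "http://httpbin.org/get?" + urlencode(params)
assert url == "http://httpbin.org/get?key1=value1&key2=value2"
```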
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. If `compress` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                     :class:`multidict.MultiDict` or
+                     :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                     :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                     (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. The ``skip_auto_headers`` parameter allows skipping
+      that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` cannot be combined
+                         with *Content-Encoding* or *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the ``Transfer-Encoding: chunked`` format.
+      If *chunked* is set, then the *Transfer-Encoding* and *Content-Length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` to skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the ``proxy`` parameter has been provided.
+
+   :param trace_request_ctx: Object passed as a keyword argument to each
+      new :class:`TraceConfig` instantiation; it carries information to
+      the tracers that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
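The accepted ``params`` shapes documented above all serialize to a query string. A stdlib-only sketch of the mapping and iterable forms (using ``urllib.parse.urlencode`` rather than aiohttp's own yarl-based encoding, so details may differ):

```python
from urllib.parse import urlencode

# Mapping form: keys and values are url-encoded for you.
print(urlencode({"q": "aio http", "page": 2}))  # q=aio+http&page=2

# Iterable-of-tuples form: permits repeated keys.
print(urlencode([("tag", "async"), ("tag", "http")]))  # tag=async&tag=http

# A plain str passed as ``params`` is sent as-is: aiohttp does not
# encode it, so the caller should url-encode it beforehand.
raw = "q=aio%20http"
```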
 
diff --git docs/contributing-admins.rst docs/contributing-admins.rst
index acfaebc0e97..b17cbe1019a 100644
--- docs/contributing-admins.rst
+++ docs/contributing-admins.rst
@@ -21,9 +21,9 @@ To create a new release:
 #. Run ``towncrier``.
 #. Check and cleanup the changes in ``CHANGES.rst``.
 #. Checkout a new branch: e.g. ``git checkout -b release/v3.8.6``
-#. Commit and create a PR. Once PR is merged, continue.
+#. Commit and create a PR. Verify the changelog and release notes look good on Read the Docs. Once PR is merged, continue.
 #. Go back to the release branch: e.g. ``git checkout 3.8 && git pull``
-#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6'``
+#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6' -s``
 #. Push the tag: e.g. ``git push origin v3.8.6``
 #. Monitor CI to ensure release process completes without errors.
 
@@ -49,6 +49,10 @@ first merge into the newer release branch (e.g. 3.8 into 3.9) and then to master
 
 Back on the original release branch, bump the version number and append ``.dev0`` in ``__init__.py``.
 
+Post the release announcement to social media:
+
+- Bluesky: https://bsky.app/profile/aiohttp.org and re-post to https://bsky.app/profile/aio-libs.org
+- Mastodon: https://fosstodon.org/@aiohttp and re-post to https://fosstodon.org/@aio_libs
+
 If doing a minor release:
 
 #. Create a new release branch for future features to go to: e.g. ``git checkout -b 3.10 3.9 && git push``
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..95a98cd4fc0 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -221,6 +221,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
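The fixture change above makes the patched ``start_connection`` return an autospec'd socket. ``unittest.mock.create_autospec`` builds a mock that passes ``isinstance`` checks against the real class and rejects attribute names the class lacks; a minimal sketch:

```python
import socket
from unittest import mock

# An autospec'd socket instance: same API surface as socket.socket.
fake_sock = mock.create_autospec(socket.socket, spec_set=True, instance=True)

print(isinstance(fake_sock, socket.socket))  # True
fake_sock.close()                            # real method name: allowed
print(fake_sock.close.called)                # True

try:
    fake_sock.not_a_real_method  # typo'd name: rejected by the spec
except AttributeError:
    print("AttributeError")
```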
 
diff --git a/tests/isolated/check_for_client_response_leak.py b/tests/isolated/check_for_client_response_leak.py
new file mode 100644
index 00000000000..67393c2c2d8
--- /dev/null
+++ tests/isolated/check_for_client_response_leak.py
@@ -0,0 +1,47 @@
+import asyncio
+import contextlib
+import gc
+import sys
+
+from aiohttp import ClientError, ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def stream_handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        request.transport.close()  # Forcefully closing connection
+        return web.Response()
+
+    app.router.add_get("/stream", stream_handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    session = ClientSession()
+
+    async def fetch_stream(url: str) -> None:
+        """Fetch a stream and read a few bytes from it."""
+        with contextlib.suppress(ClientError):
+            await session.get(url)
+
+    client_task = asyncio.create_task(fetch_stream(f"http://localhost:{port}/stream"))
+    await client_task
+    gc.collect()
+    client_response_present = any(
+        type(obj).__name__ == "ClientResponse" for obj in gc.garbage
+    )
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if client_response_present else 0)
+
+
+asyncio.run(main())
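The leak check in this isolated script relies on ``gc.DEBUG_LEAK`` (which includes ``DEBUG_SAVEALL``) keeping unreachable objects in ``gc.garbage`` so they can be scanned by type name after a collection. A standalone sketch of that pattern, with a hypothetical ``Leaky`` class standing in for ``ClientResponse``:

```python
import gc

class Leaky:
    """Stand-in for the object type being leak-checked."""

gc.set_debug(gc.DEBUG_LEAK)  # DEBUG_SAVEALL: unreachable objects land in gc.garbage

# Build an unreachable reference cycle, then drop the last reference.
obj = Leaky()
obj.self_ref = obj
del obj

gc.collect()
leaked = any(type(o).__name__ == "Leaky" for o in gc.garbage)
print(leaked)  # True

# Reset debug state so later collections behave normally.
gc.set_debug(0)
gc.garbage.clear()
```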
diff --git a/tests/isolated/check_for_request_leak.py b/tests/isolated/check_for_request_leak.py
new file mode 100644
index 00000000000..6f340a05277
--- /dev/null
+++ tests/isolated/check_for_request_leak.py
@@ -0,0 +1,41 @@
+import asyncio
+import gc
+import sys
+from typing import NoReturn
+
+from aiohttp import ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def handler(request: web.Request) -> NoReturn:
+        await request.json()
+        assert False
+
+    app.router.add_route("GET", "/json", handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    async with ClientSession() as session:
+        async with session.get(f"http://127.0.0.1:{port}/json") as resp:
+            await resp.read()
+
+    # Give time for the cancelled task to be collected
+    await asyncio.sleep(0.5)
+    gc.collect()
+    request_present = any(type(obj).__name__ == "Request" for obj in gc.garbage)
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if request_present else 0)
+
+
+asyncio.run(main())
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..aa3536be820 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
@@ -289,3 +319,158 @@ async def run_client_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
+
+
+def test_one_hundred_json_post_requests(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 JSON POST requests that check the content-type."""
+    message_count = 100
+
+    async def handler(request: web.Request) -> web.Response:
+        _ = request.content_type
+        _ = request.charset
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("POST", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            await client.post("/", json={"key": "value"})
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_any(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_any."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_any():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_4096(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 4096."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 4096 iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(4096):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_65536(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 65536."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 64 KiB iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(65536):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunks(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunks."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
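The 304 benchmark above round-trips the ``Last-Modified`` header back as ``If-Modified-Since``. The comparison a server performs can be sketched with stdlib HTTP-date parsing (``is_not_modified`` is a hypothetical helper, not aiohttp's implementation):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def is_not_modified(if_modified_since: str, file_mtime: datetime) -> bool:
    """Return True when the client's cached copy is still fresh."""
    cached = parsedate_to_datetime(if_modified_since)
    # HTTP dates have one-second resolution, so compare at that granularity.
    return file_mtime.replace(microsecond=0) <= cached

mtime = datetime(2024, 12, 11, 12, 0, 0, tzinfo=timezone.utc)
header = format_datetime(mtime, usegmt=True)  # 'Wed, 11 Dec 2024 12:00:00 GMT'
print(is_not_modified(header, mtime))  # True -> respond 304, not 200
```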
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..0ca57ab3ab2 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -315,7 +315,6 @@ async def test_concurrent_close(aiohttp_client) -> None:
     client_ws = None
 
     async def handler(request):
-        nonlocal client_ws
         ws = web.WebSocketResponse()
         await ws.prepare(request)
 
@@ -902,6 +901,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
@@ -935,7 +935,7 @@ async def delayed_send_frame(
         message: bytes, opcode: int, compress: Optional[int] = None
     ) -> None:
         assert opcode == WSMsgType.PING
-        nonlocal cancelled, ping_started
+        nonlocal cancelled
         ping_started.set_result(None)
         try:
             await asyncio.sleep(1)
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..a3fffc447ae 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -3474,6 +3474,61 @@ async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
     await connector.close()
 
 
+async def test_connector_resolve_in_case_of_trace_cache_miss_exception(
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    token: ResolveResult = {
+        "hostname": "localhost",
+        "host": "127.0.0.1",
+        "port": 80,
+        "family": socket.AF_INET,
+        "proto": 0,
+        "flags": socket.AI_NUMERICHOST,
+    }
+
+    request_count = 0
+
+    class DummyTracer(Trace):
+        def __init__(self) -> None:
+            """Dummy"""
+
+        async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
+            """Dummy send_dns_cache_hit"""
+
+        async def send_dns_resolvehost_start(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_start"""
+
+        async def send_dns_resolvehost_end(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_end"""
+
+        async def send_dns_cache_miss(self, *args: object, **kwargs: object) -> None:
+            nonlocal request_count
+            request_count += 1
+            if request_count <= 1:
+                raise Exception("first attempt")
+
+    async def resolve_response() -> List[ResolveResult]:
+        await asyncio.sleep(0)
+        return [token]
+
+    with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
+        m_resolver().resolve.return_value = resolve_response()
+
+        connector = TCPConnector()
+        traces = [DummyTracer()]
+
+        with pytest.raises(Exception):
+            await connector._resolve_host("", 0, traces)
+
+        assert await connector._resolve_host("", 0, traces) == [token]
+
+    await connector.close()
+
+
 async def test_connector_does_not_remove_needed_waiters(
     loop: asyncio.AbstractEventLoop, key: ConnectionKey
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_helpers.py tests/test_helpers.py
index 2a83032e557..a343cbdfedf 100644
--- tests/test_helpers.py
+++ tests/test_helpers.py
@@ -351,7 +351,6 @@ async def test_timer_context_timeout_does_swallow_cancellation() -> None:
     ctx = helpers.TimerContext(loop)
 
     async def task_with_timeout() -> None:
-        nonlocal ctx
         new_task = asyncio.current_task()
         assert new_task is not None
         with pytest.raises(asyncio.TimeoutError):
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..420816b3137 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,19 +2,38 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
 from multidict import CIMultiDict
 
-from aiohttp import ClientConnectionResetError, http
+from aiohttp import ClientConnectionResetError, hdrs, http
 from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_writer import _serialize_headers
 from aiohttp.test_utils import make_mocked_coro
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +111,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +120,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +226,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +282,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +317,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
@@ -421,3 +535,29 @@ async def test_set_eof_after_write_headers(
     msg.set_eof()
     await msg.write_eof()
     assert not transport.write.called
+
+
+@pytest.mark.parametrize(
+    "char",
+    [
+        "\n",
+        "\r",
+    ],
+)
+def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> None:
+    """Verify serialize_headers raises on cr or nl in the headers."""
+    status_line = "HTTP/1.1 200 OK"
+    headers = CIMultiDict(
+        {
+            hdrs.CONTENT_TYPE: f"text/plain{char}",
+        }
+    )
+
+    with pytest.raises(
+        ValueError,
+        match=(
+            "Newline or carriage return detected in headers. "
+            "Potential header injection attack."
+        ),
+    ):
+        _serialize_headers(status_line, headers)
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git a/tests/test_leaks.py b/tests/test_leaks.py
new file mode 100644
index 00000000000..07b506bdb99
--- /dev/null
+++ tests/test_leaks.py
@@ -0,0 +1,37 @@
+import pathlib
+import platform
+import subprocess
+import sys
+
+import pytest
+
+IS_PYPY = platform.python_implementation() == "PyPy"
+
+
+@pytest.mark.skipif(IS_PYPY, reason="gc.DEBUG_LEAK not available on PyPy")
+@pytest.mark.parametrize(
+    ("script", "message"),
+    [
+        (
+            # Test that ClientResponse is collected after server disconnects.
+            # https://github.com/aio-libs/aiohttp/issues/10535
+            "check_for_client_response_leak.py",
+            "ClientResponse leaked",
+        ),
+        (
+            # Test that Request object is collected when the handler raises.
+            # https://github.com/aio-libs/aiohttp/issues/10548
+            "check_for_request_leak.py",
+            "Request leaked",
+        ),
+    ],
+)
+def test_leak(script: str, message: str) -> None:
+    """Run isolated leak test script and check for leaks."""
+    leak_test_script = pathlib.Path(__file__).parent.joinpath("isolated", script)
+
+    with subprocess.Popen(
+        [sys.executable, "-u", str(leak_test_script)],
+        stdout=subprocess.PIPE,
+    ) as proc:
+        assert proc.wait() == 0, message
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..95769161804 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -10,7 +10,7 @@
 
 import aiosignal
 import pytest
-from multidict import CIMultiDict, CIMultiDictProxy
+from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
 from re_assert import Matches
 
 from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
@@ -1479,3 +1479,15 @@ def test_text_is_json_encoded(self) -> None:
     def test_content_type_is_overrideable(self) -> None:
         resp = json_response({"foo": 42}, content_type="application/vnd.json+api")
         assert "application/vnd.json+api" == resp.content_type
+
+
+@pytest.mark.parametrize("loose_header_type", (MultiDict, CIMultiDict, dict))
+async def test_passing_cimultidict_to_web_response_not_mutated(
+    loose_header_type: type,
+) -> None:
+    req = make_request("GET", "/")
+    headers = loose_header_type({})
+    resp = Response(body=b"answer", headers=headers)
+    await resp.prepare(req)
+    assert resp.content_length == 6
+    assert not headers
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..d2f1341afe0 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
@@ -311,7 +347,6 @@ async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
     port = sock.getsockname()[1]
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal event
         try:
             await asyncio.sleep(10)
         except asyncio.CancelledError:
@@ -353,7 +388,7 @@ async def test_no_handler_cancellation(unused_port_socket: socket.socket) -> Non
     started = False
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal done_event, started, timeout_event
+        nonlocal started
         started = True
         await asyncio.wait_for(timeout_event.wait(), timeout=5)
         done_event.set()
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tests/test_websocket_handshake.py tests/test_websocket_handshake.py
index bbfa1d9260d..53d5d9152bb 100644
--- tests/test_websocket_handshake.py
+++ tests/test_websocket_handshake.py
@@ -174,7 +174,7 @@ async def test_handshake_protocol_unsupported(caplog) -> None:
 
     assert (
         caplog.records[-1].msg
-        == "Client protocols %r don’t overlap server-known ones %r"
+        == "%s: Client protocols %r don’t overlap server-known ones %r"
     )
     assert ws.ws_protocol is None
 
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Here's my review of the PR:

Description

This is a large update PR that includes multiple bug fixes, performance improvements, and security fixes for aiohttp. The main changes span several versions from 3.11.10 to 3.11.16. Key updates include:

  1. Fixed memory leaks in client response and request handling
  2. Improved WebSocket buffer and header serialization performance
  3. Enhanced DNS resolution handling
  4. Security improvements around header injection prevention
  5. Better error logging and connection handling
  6. Updated GitHub Actions dependencies
  7. Added support for musllinux wheels
  8. Fixed file response race conditions
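The DNS fix above (v3.11.14) makes sure that an exception raised inside a `trace.send_dns_cache_miss` callback no longer wedges subsequent lookups, which is exactly what the new connector test exercises. The retry behavior can be sketched with plain asyncio (names here are illustrative toys, not aiohttp's internals):

```python
import asyncio

class ToyResolver:
    """Toy resolver: a raising trace hook must not poison later lookups."""

    def __init__(self, lookup, trace_miss):
        self._lookup = lookup          # async: host -> list of addresses
        self._trace_miss = trace_miss  # async trace hook; may raise
        self._cache = {}

    async def resolve(self, host):
        if host in self._cache:
            return self._cache[host]
        # If the trace hook raises, nothing is cached and no in-flight
        # state is left behind, so the next call can retry cleanly.
        await self._trace_miss(host)
        addrs = await self._lookup(host)
        self._cache[host] = addrs
        return addrs

calls = 0

async def flaky_trace_miss(host):
    # Fail only on the first attempt, like DummyTracer in the test.
    global calls
    calls += 1
    if calls == 1:
        raise Exception("first attempt")

async def lookup(host):
    await asyncio.sleep(0)
    return ["127.0.0.1"]

async def main():
    resolver = ToyResolver(lookup, flaky_trace_miss)
    try:
        await resolver.resolve("localhost")
    except Exception:
        pass  # first attempt fails inside the trace hook
    return await resolver.resolve("localhost")

assert asyncio.run(main()) == ["127.0.0.1"]
```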

Security Hotspots

  1. Header Injection Prevention: Validation was added that rejects newlines and carriage returns in headers, blocking header-injection attacks (medium-severity mitigation).

  2. Zero Copy Writes: Zero-copy writes were disabled on affected Python versions as a mitigation for CVE-2024-12254 (high-severity mitigation).
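The shape of the header-injection check can be sketched in plain Python. This is a hedged simplification, not aiohttp's `_serialize_headers` implementation; only the error message is taken verbatim from the diff:

```python
def serialize_headers(status_line: str, headers: dict) -> bytes:
    """Join a status line and headers, refusing CR/LF in any field."""
    for name, value in headers.items():
        # Reject CR or LF in either the name or the value before the
        # fields are joined into the wire format, so a malicious value
        # cannot smuggle in an extra header line.
        if "\r" in name or "\n" in name or "\r" in value or "\n" in value:
            raise ValueError(
                "Newline or carriage return detected in headers. "
                "Potential header injection attack."
            )
    joined = "".join(f"{name}: {value}\r\n" for name, value in headers.items())
    return f"{status_line}\r\n{joined}\r\n".encode("utf-8")
```

A value such as `"text/plain\n"` then raises `ValueError` instead of being written to the transport, which is the behavior the new `test_serialize_headers_raises_on_new_line_or_carriage_return` test pins down.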

Changes

By file category:

Core:

  • aiohttp/client*.py: Fixed memory leaks in client handling, improved error reporting
  • aiohttp/http_writer.py: Added protection against header injection, updated zero copy writes
  • aiohttp/web_fileresponse.py: Fixed race condition in file responses
  • aiohttp/web_protocol.py: Improved connection handling and error logging

Infrastructure:

  • .github/workflows/ci-cd.yml: Updated GitHub Actions versions, added musllinux builds
  • setup.py/requirements: Updated dependencies and build configurations

Tests:

  • Added new test cases for memory leaks
  • Added benchmarks for client operations
  • Improved test coverage for error cases
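The new leak tests (`tests/test_leaks.py`) run each check in a subprocess so the garbage-collector state of the test process cannot mask or cause a leak. A minimal stdlib-only sketch of that pattern (the snippet and class name are illustrative, not aiohttp's actual check scripts):

```python
import subprocess
import sys

def run_isolated_check(code: str) -> int:
    # Launch a fresh interpreter so the check's GC state is not
    # polluted by the parent process — the same isolation that
    # test_leak gets by running scripts from tests/isolated/.
    proc = subprocess.run([sys.executable, "-u", "-c", code],
                          stdout=subprocess.PIPE)
    return proc.returncode

# Illustrative snippet: exit 0 only if no instance of the class
# survives a full garbage collection.
SNIPPET = """
import gc

class Resp:
    pass

r = Resp()
del r
gc.collect()
leaked = any(type(o).__name__ == "Resp" for o in gc.get_objects())
raise SystemExit(1 if leaked else 0)
"""

assert run_isolated_check(SNIPPET) == 0, "Resp leaked"
```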
```mermaid
sequenceDiagram
    participant Client
    participant HttpWriter
    participant WebProtocol
    participant FileResponse
    participant Transport

    Client->>HttpWriter: write(data)
    HttpWriter->>HttpWriter: validate headers for injection
    alt has injection characters
        HttpWriter-->>Client: raise ValueError
    end

    HttpWriter->>Transport: write data

    Client->>FileResponse: GET file
    FileResponse->>FileResponse: check file state
    alt file changed during response
        FileResponse-->>Client: return error
    else file OK
        FileResponse->>Transport: sendfile
    end

    Client->>WebProtocol: close connection
    WebProtocol->>WebProtocol: cleanup references
    WebProtocol-->>Client: connection closed
```

The key changes improve security and stability while reducing resource leaks. The PR appears well-tested and includes important fixes for production deployments.

Let me know if you would like me to expand on any part of the review.

@renovate renovate bot changed the title chore(deps): update dependency aiohttp to v3.11.16 chore(deps): update dependency aiohttp to v3.11.16 - autoclosed Apr 15, 2025
@renovate renovate bot closed this Apr 15, 2025
@renovate renovate bot deleted the renovate/aiohttp-3.x branch April 15, 2025 05:45