diff --git a/.gitignore b/.gitignore
index 15642ec..8cf6848 100644
--- a/.gitignore
+++ b/.gitignore
@@ -12,3 +12,5 @@ envs/
scrap/
.coverage
.DS_Store
+.benchmarks/
+.codspeed/
\ No newline at end of file
diff --git a/README.md b/README.md
index d364ec3..893c70c 100644
--- a/README.md
+++ b/README.md
@@ -4,114 +4,102 @@
# API Reference
-## Ok
-
-Frozen instance of an Ok monad used to wrap successful operations.
-
-### `Ok.and_then`
-```python
-Ok.and_then(self, func: collections.abc.Callable[[~T], danom._result.Result], **kwargs: dict) -> danom._result.Result
-```
-Pipe another function that returns a monad.
+## Stream
-```python
->>> Ok(1).and_then(add_one) == Ok(2)
->>> Ok(1).and_then(raise_err) == Err(error=TypeError())
-```
+An immutable lazy iterator with functional operations.
+#### Why bother?
+Readability counts; abstracting common operations helps reduce cognitive complexity when reading code.
-### `Ok.is_ok`
-```python
-Ok.is_ok(self) -> Literal[True]
-```
-Returns True if the result type is Ok.
+#### Comparison
+Take this imperative pipeline of operations; it iterates once over the data:
```python
->>> Ok().is_ok() == True
-```
-
-
-### `Ok.match`
-```python
-Ok.match(self, if_ok_func: collections.abc.Callable[[~T], danom._result.Result], _if_err_func: collections.abc.Callable[[~T], danom._result.Result]) -> danom._result.Result
-```
-Map Ok func to Ok and Err func to Err
-
-```python
->>> Ok(1).match(add_one, mock_get_error_type) == Ok(inner=2)
->>> Ok("ok").match(double, mock_get_error_type) == Ok(inner='okok')
->>> Err(error=TypeError()).match(double, mock_get_error_type) == Ok(inner='TypeError')
+>>> res = []
+...
+>>> for x in range(1_000_000):
+... item = triple(x)
+...
+... if not is_gt_ten(item):
+... continue
+...
+... item = min_two(item)
+...
+... if not is_even_num(item):
+... continue
+...
+... item = square(item)
+...
+... if not is_lt_400(item):
+... continue
+...
+... res.append(item)
+>>> [100, 256]
```
+number of tokens: `90`
+number of keywords: `11`
-### `Ok.unwrap`
-```python
-Ok.unwrap(self) -> ~T
-```
-Unwrap the Ok monad and get the inner value.
+keyword breakdown: `{'for': 1, 'in': 1, 'if': 3, 'not': 3, 'continue': 3}`
+After a bit of experience with Python you might reach for list comprehensions; however, this is arguably _less_ clear and iterates multiple times over the same data:
```python
->>> Ok().unwrap() == None
->>> Ok(1).unwrap() == 1
->>> Ok("ok").unwrap() == 'ok'
+>>> mul_three = [triple(x) for x in range(1_000_000)]
+>>> gt_ten = [x for x in mul_three if is_gt_ten(x)]
+>>> sub_two = [min_two(x) for x in gt_ten]
+>>> is_even = [x for x in sub_two if is_even_num(x)]
+>>> squared = [square(x) for x in is_even]
+>>> lt_400 = [x for x in squared if is_lt_400(x)]
+>>> [100, 256]
```
+number of tokens: `92`
+number of keywords: `15`
-## Err
-
-Frozen instance of an Err monad used to wrap failed operations.
+keyword breakdown: `{'for': 6, 'in': 6, 'if': 3}`
-### `Err.and_then`
-```python
-Err.and_then(self, _: 'Callable[[T], Result]', **_kwargs: 'dict') -> 'Self'
-```
-Pipe another function that returns a monad. For Err will return original error.
+This still has a lot of tokens that the developer has to read to understand the code, and the extra keywords add noise that clouds the actual transformations.
+Using a `Stream` results in this:
```python
->>> Err(error=TypeError()).and_then(add_one) == Err(error=TypeError())
->>> Err(error=TypeError()).and_then(raise_value_err) == Err(error=TypeError())
+>>> (
+... Stream.from_iterable(range(1_000_000))
+... .map(triple)
+... .filter(is_gt_ten)
+... .map(min_two)
+... .filter(is_even_num)
+... .map(square)
+... .filter(is_lt_400)
+... .collect()
+... )
+>>> (100, 256)
```
+number of tokens: `60`
+number of keywords: `0`
-### `Err.is_ok`
-```python
-Err.is_ok(self) -> 'Literal[False]'
-```
-Returns False if the result type is Err.
+keyword breakdown: `{}`
-```python
-Err().is_ok() == False
-```
+The business logic is arguably much clearer this way.
-### `Err.match`
+### `Stream.async_collect`
```python
-Err.match(self, _if_ok_func: 'Callable[[T], Result]', if_err_func: 'Callable[[T], Result]') -> 'Result'
+Stream.async_collect(self) -> 'tuple'
```
-Map Ok func to Ok and Err func to Err
+Async version of `collect`. Note that all functions in the stream should be async (their results are awaited).
```python
->>> Ok(1).match(add_one, mock_get_error_type) == Ok(inner=2)
->>> Ok("ok").match(double, mock_get_error_type) == Ok(inner='okok')
->>> Err(error=TypeError()).match(double, mock_get_error_type) == Ok(inner='TypeError')
+>>> Stream.from_iterable(file_paths).map(async_read_files).async_collect()
```
-
-### `Err.unwrap`
-```python
-Err.unwrap(self) -> 'None'
-```
-Unwrap the Err monad will raise the inner error.
+If there are no operations in the `Stream` then this will act as a normal collect.
```python
->>> Err(error=TypeError()).unwrap() raise TypeError(...)
+>>> Stream.from_iterable(file_paths).async_collect()
```
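+
+Since `async_collect` is itself a coroutine, it has to be awaited; here is a minimal sketch of driving it with `asyncio.run`, reusing the `file_paths` and `async_read_files` placeholders from the example above:
+
+```python
+>>> import asyncio
+>>> async def main() -> tuple:
+...     return await Stream.from_iterable(file_paths).map(async_read_files).async_collect()
+>>> asyncio.run(main())
+```
+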
-## Stream
-
-A lazy iterator with functional operations.
-
### `Stream.collect`
```python
Stream.collect(self) -> 'tuple'
@@ -222,6 +210,8 @@ If False the processing will use `ProcessPoolExecutor` else it will use `ThreadP
>>> stream.par_collect(use_threads=True) == (1, 2, 3, 4)
```
+Note that all operations must be picklable; for that reason `Stream` does not support lambdas or closures.
+
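+For example, a stream that maps the module-level `add_one` helper can be collected in parallel, whereas the same pipeline built from a lambda cannot be pickled (a sketch of the failure mode):
+
+```python
+>>> Stream.from_iterable([0, 1, 2, 3]).map(add_one).par_collect()  # fine: add_one is picklable
+>>> Stream.from_iterable([0, 1, 2, 3]).map(lambda x: x + 1).par_collect()  # not supported: a lambda cannot be pickled
+```
+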
### `Stream.partition`
```python
@@ -244,11 +234,115 @@ As `partition` triggers an action, the parameters will be forwarded to the `par_
```
+## Ok
+
+Frozen instance of an Ok monad used to wrap successful operations.
+
+### `Ok.and_then`
+```python
+Ok.and_then(self, func: collections.abc.Callable[[~T], danom._result.Result], **kwargs: dict) -> danom._result.Result
+```
+Pipe another function that returns a monad.
+
+```python
+>>> Ok(1).and_then(add_one) == Ok(2)
+>>> Ok(1).and_then(raise_err) == Err(error=TypeError())
+```
+
+
+### `Ok.is_ok`
+```python
+Ok.is_ok(self) -> Literal[True]
+```
+Returns True if the result type is Ok.
+
+```python
+>>> Ok().is_ok() == True
+```
+
+
+### `Ok.match`
+```python
+Ok.match(self, if_ok_func: collections.abc.Callable[[~T], danom._result.Result], _if_err_func: collections.abc.Callable[[~T], danom._result.Result]) -> danom._result.Result
+```
+Apply the Ok function to an Ok and the Err function to an Err.
+
+```python
+>>> Ok(1).match(add_one, mock_get_error_type) == Ok(inner=2)
+>>> Ok("ok").match(double, mock_get_error_type) == Ok(inner='okok')
+>>> Err(error=TypeError()).match(double, mock_get_error_type) == Ok(inner='TypeError')
+```
+
+
+### `Ok.unwrap`
+```python
+Ok.unwrap(self) -> ~T
+```
+Unwrap the Ok monad and get the inner value.
+
+```python
+>>> Ok().unwrap() == None
+>>> Ok(1).unwrap() == 1
+>>> Ok("ok").unwrap() == 'ok'
+```
+
+
+## Err
+
+Frozen instance of an Err monad used to wrap failed operations.
+
+### `Err.and_then`
+```python
+Err.and_then(self, _: 'Callable[[T], Result]', **_kwargs: 'dict') -> 'Self'
+```
+Pipe another function that returns a monad. For an Err this will return the original error.
+
+```python
+>>> Err(error=TypeError()).and_then(add_one) == Err(error=TypeError())
+>>> Err(error=TypeError()).and_then(raise_value_err) == Err(error=TypeError())
+```
+
+
+### `Err.is_ok`
+```python
+Err.is_ok(self) -> 'Literal[False]'
+```
+Returns False if the result type is Err.
+
+```python
+Err().is_ok() == False
+```
+
+
+### `Err.match`
+```python
+Err.match(self, _if_ok_func: 'Callable[[T], Result]', if_err_func: 'Callable[[T], Result]') -> 'Result'
+```
+Apply the Ok function to an Ok and the Err function to an Err.
+
+```python
+>>> Ok(1).match(add_one, mock_get_error_type) == Ok(inner=2)
+>>> Ok("ok").match(double, mock_get_error_type) == Ok(inner='okok')
+>>> Err(error=TypeError()).match(double, mock_get_error_type) == Ok(inner='TypeError')
+```
+
+
+### `Err.unwrap`
+```python
+Err.unwrap(self) -> 'None'
+```
+Unwrapping the Err monad will raise the inner error.
+
+```python
+>>> Err(error=TypeError()).unwrap()  # raises TypeError(...)
+```
+
+
## safe
### `safe`
```python
-safe(func: collections.abc.Callable[~P, ~T]) -> collections.abc.Callable[~P, danom._result.Result]
+safe(func: collections.abc.Callable[[T], U]) -> collections.abc.Callable[[T], danom._result.Result]
```
Decorator for functions that wraps the function in a try except returns `Ok` on success else `Err`.
@@ -265,7 +359,7 @@ Decorator for functions that wraps the function in a try except returns `Ok` on
### `safe_method`
```python
-safe_method(func: collections.abc.Callable[~P, ~T]) -> collections.abc.Callable[~P, danom._result.Result]
+safe_method(func: collections.abc.Callable[[T], U]) -> collections.abc.Callable[[T], danom._result.Result]
```
The same as `safe` except it forwards on the `self` of the class instance to the wrapped function.
diff --git a/coverage.svg b/coverage.svg
index 11e2b54..ae5b368 100644
--- a/coverage.svg
+++ b/coverage.svg
@@ -11,6 +11,6 @@
font-family="DejaVu Sans,Verdana,Geneva,sans-serif"
font-size="11">
coverage
- 100.0%
+ 100.00%
\ No newline at end of file
diff --git a/dev_tools/update_cov.py b/dev_tools/update_cov.py
index f6563e7..96620da 100644
--- a/dev_tools/update_cov.py
+++ b/dev_tools/update_cov.py
@@ -25,10 +25,10 @@
def update_cov_badge(root: str) -> int:
cov_report = json.loads(Path(f"{root}/coverage.json").read_text())
- new_pct = cov_report["totals"]["percent_covered"]
+ new_pct = to_2dp_float_str(cov_report["totals"]["percent_covered"])
curr_badge = Path(f"{root}/coverage.svg").read_text()
- curr_pct = float(
+ curr_pct = to_2dp_float_str(
re.findall(r'([0-9]+(?:\.[0-9]+)?)%', curr_badge)[0]
)
@@ -40,6 +40,7 @@ def update_cov_badge(root: str) -> int:
def make_badge(badge_str: str, pct: int) -> str:
+ pct = float(pct)
colour = (
"red"
if pct < 50 # noqa: PLR2004
@@ -51,7 +52,11 @@ def make_badge(badge_str: str, pct: int) -> str:
if pct < 90 # noqa: PLR2004
else "green"
)
- return badge_str.format(colour=colour, pct=pct)
+ return badge_str.format(colour=colour, pct=f"{pct:.2f}")
+
+
+def to_2dp_float_str(pct: float) -> str:
+ return f"{float(pct):.2f}"
if __name__ == "__main__":
diff --git a/dev_tools/update_readme.py b/dev_tools/update_readme.py
index c442e9b..ee0b3cb 100644
--- a/dev_tools/update_readme.py
+++ b/dev_tools/update_readme.py
@@ -27,16 +27,16 @@ class ReadmeDoc:
doc: str
def to_readme(self) -> str:
- docs = "\n".join([line.strip() for line in self.doc.splitlines()])
+ docs = strip_doc(self.doc)
return "\n".join([f"### `{self.name}`", f"```python\n{self.name}{self.sig}\n```", docs])
def create_readme_lines() -> str:
readme_lines = []
- for ent in [Ok, Err, Stream]:
+ for ent in [Stream, Ok, Err]:
readme_lines.append(f"## {ent.__name__}")
- readme_lines.append(ent.__doc__)
+ readme_lines.append(strip_doc(ent.__doc__))
readme_docs = [
ReadmeDoc(f"{ent.__name__}.{k}", inspect.signature(v), v.__doc__)
for k, v in inspect.getmembers(ent, inspect.isroutine)
@@ -62,5 +62,9 @@ def update_readme(new_docs: str, readme_path: str = "./README.md") -> None:
return 0
+def strip_doc(doc: str) -> str:
+ return "\n".join([line.strip() for line in doc.splitlines()])
+
+
if __name__ == "__main__":
sys.exit(update_readme(create_readme_lines()))
diff --git a/pyproject.toml b/pyproject.toml
index 2da537a..d35b3e0 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "danom"
-version = "0.6.0"
+version = "0.7.0"
description = "Functional streams and monads"
readme = "README.md"
license = "MIT"
@@ -26,6 +26,8 @@ dev = [
"ipykernel>=7.1.0",
"pre-commit>=4.5.0",
"pytest>=9.0.1",
+ "pytest-asyncio>=1.3.0",
+ "pytest-codspeed>=4.2.0",
"pytest-cov>=7.0.0",
"repo-mapper-rs>=0.3.0",
"ruff>=0.14.6",
@@ -41,3 +43,6 @@ source = ["src"]
show_missing = true
skip_covered = true
fail_under = 95
+
+[tool.pytest.ini_options]
+# addopts = ["--codspeed"]
\ No newline at end of file
diff --git a/src/danom/__init__.py b/src/danom/__init__.py
index 35819e3..3166a1f 100644
--- a/src/danom/__init__.py
+++ b/src/danom/__init__.py
@@ -4,7 +4,7 @@
from danom._result import Result
from danom._safe import safe, safe_method
from danom._stream import Stream
-from danom._utils import all_of, any_of, compose, identity, invert
+from danom._utils import all_of, any_of, compose, identity, invert, none_of
__all__ = [
"Err",
@@ -17,6 +17,7 @@
"identity",
"invert",
"new_type",
+ "none_of",
"safe",
"safe_method",
]
diff --git a/src/danom/_err.py b/src/danom/_err.py
index b193275..017500a 100644
--- a/src/danom/_err.py
+++ b/src/danom/_err.py
@@ -17,18 +17,14 @@
class Err(Result):
"""Frozen instance of an Err monad used to wrap failed operations."""
+ inner: Any = attrs.field(default=None)
input_args: tuple[T] = attrs.field(default=None, repr=False)
error: Exception | None = attrs.field(default=None)
- err_type: BaseException = attrs.field(init=False, repr=False)
- err_msg: str = attrs.field(init=False, repr=False)
details: list[dict[str, Any]] = attrs.field(factory=list, init=False, repr=False)
def __attrs_post_init__(self) -> None:
- # little hack explained here: https://www.attrs.org/en/stable/init.html#post-init
- object.__setattr__(self, "err_type", type(self.error))
- object.__setattr__(self, "err_msg", str(self.error))
-
if isinstance(self.error, Exception):
+ # little hack explained here: https://www.attrs.org/en/stable/init.html#post-init
object.__setattr__(self, "details", self._extract_details(self.error.__traceback__))
def _extract_details(self, tb: TracebackType | None) -> list[dict[str, Any]]:
@@ -72,7 +68,9 @@ def unwrap(self) -> None:
>>> Err(error=TypeError()).unwrap() raise TypeError(...)
```
"""
- raise self.error
+ if self.error is not None:
+ raise self.error
+ raise ValueError(f"Err does not have a caught error to raise: {self.inner = }")
def match(
self, _if_ok_func: Callable[[T], Result], if_err_func: Callable[[T], Result]
@@ -86,9 +84,3 @@ def match(
```
"""
return if_err_func(self.error)
-
- def __getattr__(self, _name: str) -> Self:
- def _(*_args: tuple, **_kwargs: dict) -> Self:
- return self
-
- return _
diff --git a/src/danom/_ok.py b/src/danom/_ok.py
index 8a8c911..7f97229 100644
--- a/src/danom/_ok.py
+++ b/src/danom/_ok.py
@@ -57,6 +57,3 @@ def match(
```
"""
return if_ok_func(self.inner)
-
- def __getattr__(self, name: str) -> Callable:
- return getattr(self.inner, name)
diff --git a/src/danom/_safe.py b/src/danom/_safe.py
index 205195f..92cbe02 100644
--- a/src/danom/_safe.py
+++ b/src/danom/_safe.py
@@ -6,10 +6,10 @@
from danom._err import Err
from danom._ok import Ok
-from danom._result import P, Result, T
+from danom._result import P, Result
-def safe(func: Callable[[P], T]) -> Callable[[P], Result]:
+def safe[T, U](func: Callable[[T], U]) -> Callable[[T], Result]:
"""Decorator for functions that wraps the function in a try except returns `Ok` on success else `Err`.
```python
@@ -26,12 +26,12 @@ def wrapper(*args: P.args, **kwargs: P.kwargs) -> Result:
try:
return Ok(func(*args, **kwargs))
except Exception as e: # noqa: BLE001
- return Err((args, kwargs), e)
+ return Err(input_args=(args, kwargs), error=e)
return wrapper
-def safe_method(func: Callable[[P], T]) -> Callable[[P], Result]:
+def safe_method[T, U, E](func: Callable[[T], U]) -> Callable[[T], Result[U, E]]:
"""The same as `safe` except it forwards on the `self` of the class instance to the wrapped function.
```python
@@ -48,10 +48,10 @@ def safe_method(func: Callable[[P], T]) -> Callable[[P], Result]:
"""
@functools.wraps(func)
- def wrapper(self: Self, *args: P.args, **kwargs: P.kwargs) -> Result:
+ def wrapper(self: Self, *args: P.args, **kwargs: P.kwargs) -> Result[U, E]:
try:
return Ok(func(self, *args, **kwargs))
except Exception as e: # noqa: BLE001
- return Err((self, args, kwargs), e)
+ return Err(input_args=(self, args, kwargs), error=e)
return wrapper
diff --git a/src/danom/_stream.py b/src/danom/_stream.py
index ac586e3..2e7aed0 100644
--- a/src/danom/_stream.py
+++ b/src/danom/_stream.py
@@ -1,5 +1,6 @@
from __future__ import annotations
+import asyncio
import os
from abc import ABC, abstractmethod
from collections.abc import Callable, Iterable
@@ -38,12 +39,90 @@ def fold[T, U](
def collect(self) -> tuple: ...
@abstractmethod
- def par_collect(self) -> tuple: ...
+ def par_collect(self, workers: int = 4, *, use_threads: bool = False) -> tuple: ...
+
+ @abstractmethod
+ async def async_collect(self) -> tuple: ...
@attrs.define(frozen=True)
class Stream(_BaseStream):
- """A lazy iterator with functional operations."""
+ """An immutable lazy iterator with functional operations.
+
+ #### Why bother?
+    Readability counts; abstracting common operations helps reduce cognitive complexity when reading code.
+
+ #### Comparison
+    Take this imperative pipeline of operations; it iterates once over the data:
+
+ ```python
+ >>> res = []
+ ...
+ >>> for x in range(1_000_000):
+ ... item = triple(x)
+ ...
+ ... if not is_gt_ten(item):
+ ... continue
+ ...
+ ... item = min_two(item)
+ ...
+ ... if not is_even_num(item):
+ ... continue
+ ...
+ ... item = square(item)
+ ...
+ ... if not is_lt_400(item):
+ ... continue
+ ...
+ ... res.append(item)
+ >>> [100, 256]
+ ```
+ number of tokens: `90`
+
+ number of keywords: `11`
+
+ keyword breakdown: `{'for': 1, 'in': 1, 'if': 3, 'not': 3, 'continue': 3}`
+
+    After a bit of experience with Python you might reach for list comprehensions; however, this is arguably _less_ clear and iterates multiple times over the same data:
+ ```python
+ >>> mul_three = [triple(x) for x in range(1_000_000)]
+ >>> gt_ten = [x for x in mul_three if is_gt_ten(x)]
+ >>> sub_two = [min_two(x) for x in gt_ten]
+ >>> is_even = [x for x in sub_two if is_even_num(x)]
+ >>> squared = [square(x) for x in is_even]
+ >>> lt_400 = [x for x in squared if is_lt_400(x)]
+ >>> [100, 256]
+ ```
+ number of tokens: `92`
+
+ number of keywords: `15`
+
+ keyword breakdown: `{'for': 6, 'in': 6, 'if': 3}`
+
+    This still has a lot of tokens that the developer has to read to understand the code, and the extra keywords add noise that clouds the actual transformations.
+
+ Using a `Stream` results in this:
+ ```python
+ >>> (
+ ... Stream.from_iterable(range(1_000_000))
+ ... .map(triple)
+ ... .filter(is_gt_ten)
+ ... .map(min_two)
+ ... .filter(is_even_num)
+ ... .map(square)
+ ... .filter(is_lt_400)
+ ... .collect()
+ ... )
+ >>> (100, 256)
+ ```
+ number of tokens: `60`
+
+ number of keywords: `0`
+
+ keyword breakdown: `{}`
+
+    The business logic is arguably much clearer this way.
+ """
@classmethod
def from_iterable(cls, it: Iterable) -> Self:
@@ -181,6 +260,8 @@ def par_collect(self, workers: int = 4, *, use_threads: bool = False) -> tuple:
>>> stream = Stream.from_iterable([0, 1, 2, 3]).map(add_one)
>>> stream.par_collect(use_threads=True) == (1, 2, 3, 4)
```
+
+        Note that all operations must be picklable; for that reason `Stream` does not support lambdas or closures.
"""
if workers == -1:
workers = (os.cpu_count() - 1) or 4
@@ -192,6 +273,25 @@ def par_collect(self, workers: int = 4, *, use_threads: bool = False) -> tuple:
return tuple(elem for elem in res if elem != _Nothing.NOTHING)
+ async def async_collect(self) -> tuple:
+        """Async version of `collect`. Note that all functions in the stream should be async (their results are awaited).
+
+ ```python
+ >>> Stream.from_iterable(file_paths).map(async_read_files).async_collect()
+ ```
+
+ If there are no operations in the `Stream` then this will act as a normal collect.
+
+ ```python
+ >>> Stream.from_iterable(file_paths).async_collect()
+ ```
+ """
+ if not self.ops:
+ return self.collect()
+
+ res = await asyncio.gather(*(_async_apply_fns(x, self.ops) for x in self.seq))
+ return tuple(elem for elem in res if elem != _Nothing.NOTHING)
+
@unique
class _OpType(Enum):
@@ -216,3 +316,13 @@ def _apply_fns[T, U](elem: T, ops: tuple[tuple[_OpType, Callable], ...]) -> U |
elif op == _OpType.FILTER and not op_fn(res):
return _Nothing.NOTHING
return res
+
+
+async def _async_apply_fns[T, U](elem: T, ops: tuple[tuple[_OpType, Callable], ...]) -> U | None:
+ res = elem
+ for op, op_fn in ops:
+ if op == _OpType.MAP:
+ res = await op_fn(res)
+ elif op == _OpType.FILTER and not await op_fn(res):
+ return _Nothing.NOTHING
+ return res
diff --git a/src/danom/_utils.py b/src/danom/_utils.py
index a6f99a3..bc6fd65 100644
--- a/src/danom/_utils.py
+++ b/src/danom/_utils.py
@@ -36,10 +36,10 @@ def compose[T, U](*fns: Callable[[T], U]) -> Callable[[T], U]:
@attrs.define(frozen=True, hash=True, eq=True)
-class _AllOf[T, U]:
- fns: Sequence[Callable[[T], U]]
+class _AllOf[T]:
+ fns: Sequence[Callable[[T], bool]]
- def __call__(self, initial: T) -> U:
+ def __call__(self, initial: T) -> bool:
return all(fn(initial) for fn in self.fns)
@@ -55,10 +55,10 @@ def all_of[T](*fns: Callable[[T], bool]) -> Callable[[T], bool]:
@attrs.define(frozen=True, hash=True, eq=True)
-class _AnyOf[T, U]:
- fns: Sequence[Callable[[T], U]]
+class _AnyOf[T]:
+ fns: Sequence[Callable[[T], bool]]
- def __call__(self, initial: T) -> U:
+ def __call__(self, initial: T) -> bool:
return any(fn(initial) for fn in self.fns)
@@ -73,6 +73,17 @@ def any_of[T](*fns: Callable[[T], bool]) -> Callable[[T], bool]:
return _AnyOf(fns)
+def none_of[T](*fns: Callable[[T], bool]) -> Callable[[T], bool]:
+ """True if none of the given functions return True.
+
+ ```python
+ >>> is_valid = none_of(is_empty, exceeds_size_limit, contains_unsupported_format)
+ >>> is_valid(submission) == True
+ ```
+ """
+ return compose(_AnyOf(fns), not_)
+
+
def identity[T](x: T) -> T:
"""Basic identity function.
diff --git a/tests/conftest.py b/tests/conftest.py
index 5e36d26..1daf68e 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -1,8 +1,12 @@
+import asyncio
+from pathlib import Path
from typing import Any, Self
from src.danom import safe, safe_method
from src.danom._result import Result
+REPO_ROOT = Path(__file__).parents[1]
+
def add[T](a: T, b: T) -> T:
return a + b
@@ -12,18 +16,35 @@ def has_len(value: str) -> bool:
return len(value) > 0
-def add_one(x: int) -> int:
+def add_one[T](x: T) -> T:
return x + 1
-def divisible_by_3(x: int) -> bool:
+def double[T](x: T) -> T:
+ return x * 2
+
+
+def divisible_by_3(x: float) -> bool:
return x % 3 == 0
-def divisible_by_5(x: int) -> bool:
+def divisible_by_5(x: float) -> bool:
return x % 5 == 0
+def lt_10(x: float) -> bool:
+ return x < 10 # noqa: PLR2004
+
+
+async def async_is_file(path: Path) -> bool:
+ return path.is_file()
+
+
+async def async_read_text(path: str) -> str:
+ loop = asyncio.get_running_loop()
+ return await loop.run_in_executor(None, Path(path).read_text)
+
+
@safe
def safe_add(a: int, b: int) -> Result[int, Exception]:
return a + b
diff --git a/tests/mock_data/__init__.py b/tests/mock_data/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/tests/mock_data/dir_should_skip/__init__.py b/tests/mock_data/dir_should_skip/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/tests/mock_data/file_a.py b/tests/mock_data/file_a.py
new file mode 100644
index 0000000..7d4290a
--- /dev/null
+++ b/tests/mock_data/file_a.py
@@ -0,0 +1 @@
+x = 1
diff --git a/tests/mock_data/file_b.py b/tests/mock_data/file_b.py
new file mode 100644
index 0000000..47643d4
--- /dev/null
+++ b/tests/mock_data/file_b.py
@@ -0,0 +1 @@
+y = 2
diff --git a/tests/mock_data/file_c.py b/tests/mock_data/file_c.py
new file mode 100644
index 0000000..b67ca6a
--- /dev/null
+++ b/tests/mock_data/file_c.py
@@ -0,0 +1 @@
+z = 3
diff --git a/tests/test_err.py b/tests/test_err.py
index e572740..2369178 100644
--- a/tests/test_err.py
+++ b/tests/test_err.py
@@ -7,12 +7,17 @@
("monad", "expected_result", "expected_context"),
[
pytest.param(
- Err((), TypeError("should raise this")),
+ Err(input_args=(), error=TypeError("should raise this")),
None,
pytest.raises(TypeError),
),
pytest.param(
- Err((), ValueError("should raise this")),
+ Err(input_args=(), error=ValueError("should raise this")),
+ None,
+ pytest.raises(ValueError),
+ ),
+ pytest.param(
+ Err("some other err representation"),
None,
pytest.raises(ValueError),
),
@@ -26,7 +31,7 @@ def test_err_unwrap(monad, expected_result, expected_context):
@pytest.mark.parametrize(
("monad", "expected_result"),
[
- pytest.param(Err((), TypeError("should raise this")), False),
+ pytest.param(Err(input_args=(), error=TypeError("should raise this")), False),
],
)
def test_err_is_ok(monad, expected_result):
diff --git a/tests/test_safe.py b/tests/test_safe.py
index 913fbf2..c37298e 100644
--- a/tests/test_safe.py
+++ b/tests/test_safe.py
@@ -37,14 +37,14 @@ def test_invalid_safe_pipeline_with_match():
def test_valid_safe_method_pipeline():
cls = Adder()
- res = cls.add(2, 2).add(2, 2).add(2, 2)
+ res = cls.add(2, 2).and_then(lambda cls: cls.add(2, 2)).and_then(lambda cls: cls.add(2, 2))
assert res.is_ok()
assert res.unwrap().result == 12
def test_invalid_safe_method_pipeline():
cls = Adder()
- res = cls.add(2, 2).cls_raises().add(2, 2)
+ res = cls.add(2, 2).and_then(lambda cls: cls.cls_raises()).and_then(lambda cls: cls.add(2, 2))
assert not res.is_ok()
with pytest.raises(ValueError):
res.unwrap()
diff --git a/tests/test_stream.py b/tests/test_stream.py
index 3a2f514..cb93532 100644
--- a/tests/test_stream.py
+++ b/tests/test_stream.py
@@ -1,7 +1,17 @@
+from pathlib import Path
+
import pytest
from src.danom import Stream
-from tests.conftest import add, add_one, divisible_by_3, divisible_by_5
+from tests.conftest import (
+ REPO_ROOT,
+ add,
+ add_one,
+ async_is_file,
+ async_read_text,
+ divisible_by_3,
+ divisible_by_5,
+)
@pytest.mark.parametrize(
@@ -18,28 +28,28 @@ def test_stream_pipeline(it, expected_part1, expected_part2):
assert part2.collect() == expected_part2
-def test_stream_with_multiple_fns():
- assert (
- Stream.from_iterable(range(10))
- .map(lambda x: x * 2, lambda x: x + 1)
- .filter(lambda x: x % 5 == 0, lambda x: x < 10)
- .collect()
- ) == (5,)
-
-
@pytest.mark.parametrize(
- ("it", "n_workers", "expected_result"),
+ ("it", "expected_result"),
[
- pytest.param(range(15), 4, (15,)),
- pytest.param(13, -1, (15,)),
+ pytest.param(range(30), (15, 30), id="works with iterator"),
+ pytest.param(28, (30,), id="works with single value"),
],
)
-def test_par_collect(it, n_workers, expected_result):
+@pytest.mark.parametrize(
+ ("collect_fn", "kwargs"),
+ [
+ pytest.param("collect", {}, id="simple `collect`"),
+ pytest.param("par_collect", {"workers": 4}, id="`par_collect` with workers passed in"),
+ pytest.param("par_collect", {"workers": -1}, id="`par_collect` with n-1 workers"),
+ pytest.param("par_collect", {"use_threads": True}, id="`par_collect` with threads True"),
+ ],
+)
+def test_collect_methods(it, collect_fn, kwargs, expected_result):
assert (
- Stream.from_iterable(it)
- .map(add_one, add_one)
- .filter(divisible_by_3, divisible_by_5)
- .par_collect(workers=n_workers)
+ getattr(
+ Stream.from_iterable(it).map(add_one, add_one).filter(divisible_by_3, divisible_by_5),
+ collect_fn,
+ )(**kwargs)
== expected_result
)
@@ -57,7 +67,33 @@ def test_stream_to_par_stream():
[
pytest.param(range(10), 0, add, 1, 45),
pytest.param(range(10), 0, add, 4, 45),
+ pytest.param(range(10), 5, add, 4, 50),
],
)
def test_fold(starting, initial, fn, workers, expected_result):
assert Stream.from_iterable(starting).fold(initial, fn, workers=workers) == expected_result
+
+
+@pytest.mark.asyncio
+async def test_async_collect():
+ assert await Stream.from_iterable(
+ sorted(Path(f"{REPO_ROOT}/tests/mock_data").glob("*"))
+ ).filter(async_is_file).map(async_read_text).async_collect() == (
+ "",
+ "x = 1\n",
+ "y = 2\n",
+ "z = 3\n",
+ )
+
+
+@pytest.mark.asyncio
+async def test_async_collect_no_fns():
+ assert await Stream.from_iterable(
+ sorted(Path(f"{REPO_ROOT}/tests/mock_data").glob("*"))
+ ).async_collect() == (
+ Path(f"{REPO_ROOT}/tests/mock_data/__init__.py"),
+ Path(f"{REPO_ROOT}/tests/mock_data/dir_should_skip"),
+ Path(f"{REPO_ROOT}/tests/mock_data/file_a.py"),
+ Path(f"{REPO_ROOT}/tests/mock_data/file_b.py"),
+ Path(f"{REPO_ROOT}/tests/mock_data/file_c.py"),
+ )
diff --git a/tests/test_utils.py b/tests/test_utils.py
index 38baeee..4191193 100644
--- a/tests/test_utils.py
+++ b/tests/test_utils.py
@@ -1,7 +1,7 @@
import pytest
from src.danom import Ok, compose, identity, invert
-from src.danom._utils import all_of, any_of
+from src.danom._utils import all_of, any_of, none_of
from tests.conftest import add_one, divisible_by_3, divisible_by_5, has_len
@@ -41,6 +41,20 @@ def test_any_of(inp_args, fns, expected_result):
assert any_of(*fns)(inp_args) == expected_result
+@pytest.mark.parametrize(
+ ("inp_args", "fns", "expected_result"),
+ [
+ pytest.param(3, (divisible_by_3, divisible_by_5), False),
+ pytest.param(5, (divisible_by_3, divisible_by_5), False),
+ pytest.param(15, (divisible_by_3, divisible_by_5), False),
+ pytest.param(7, (divisible_by_3, divisible_by_5), True),
+ pytest.param(13, (divisible_by_3, divisible_by_5), True),
+ ],
+)
+def test_none_of(inp_args, fns, expected_result):
+ assert none_of(*fns)(inp_args) == expected_result
+
+
@pytest.mark.parametrize(
"x",
[
diff --git a/uv.lock b/uv.lock
index 45d5f0b..f2ee067 100644
--- a/uv.lock
+++ b/uv.lock
@@ -189,7 +189,7 @@ wheels = [
[[package]]
name = "danom"
-version = "0.6.0"
+version = "0.7.0"
source = { editable = "." }
dependencies = [
{ name = "attrs" },
@@ -201,6 +201,8 @@ dev = [
{ name = "ipykernel" },
{ name = "pre-commit" },
{ name = "pytest" },
+ { name = "pytest-asyncio" },
+ { name = "pytest-codspeed" },
{ name = "pytest-cov" },
{ name = "repo-mapper-rs" },
{ name = "ruff" },
@@ -215,6 +217,8 @@ dev = [
{ name = "ipykernel", specifier = ">=7.1.0" },
{ name = "pre-commit", specifier = ">=4.5.0" },
{ name = "pytest", specifier = ">=9.0.1" },
+ { name = "pytest-asyncio", specifier = ">=1.3.0" },
+ { name = "pytest-codspeed", specifier = ">=4.2.0" },
{ name = "pytest-cov", specifier = ">=7.0.0" },
{ name = "repo-mapper-rs", specifier = ">=0.3.0" },
{ name = "ruff", specifier = ">=0.14.6" },
@@ -405,6 +409,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e7/e7/80988e32bf6f73919a113473a604f5a8f09094de312b9d52b79c2df7612b/jupyter_core-5.9.1-py3-none-any.whl", hash = "sha256:ebf87fdc6073d142e114c72c9e29a9d7ca03fad818c5d300ce2adc1fb0743407", size = 29032, upload-time = "2025-10-16T19:19:16.783Z" },
]
+[[package]]
+name = "markdown-it-py"
+version = "4.0.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "mdurl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" },
+]
+
[[package]]
name = "matplotlib-inline"
version = "0.2.1"
@@ -417,6 +433,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/af/33/ee4519fa02ed11a94aef9559552f3b17bb863f2ecfe1a35dc7f548cde231/matplotlib_inline-0.2.1-py3-none-any.whl", hash = "sha256:d56ce5156ba6085e00a9d54fead6ed29a9c47e215cd1bba2e976ef39f5710a76", size = 9516, upload-time = "2025-10-23T09:00:20.675Z" },
]
+[[package]]
+name = "mdurl"
+version = "0.1.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
+]
+
[[package]]
name = "nest-asyncio"
version = "1.6.0"
@@ -589,6 +614,41 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/0b/8b/6300fb80f858cda1c51ffa17075df5d846757081d11ab4aa35cef9e6258b/pytest-9.0.1-py3-none-any.whl", hash = "sha256:67be0030d194df2dfa7b556f2e56fb3c3315bd5c8822c6951162b92b32ce7dad", size = 373668, upload-time = "2025-11-12T13:05:07.379Z" },
]
+[[package]]
+name = "pytest-asyncio"
+version = "1.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pytest" },
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/90/2c/8af215c0f776415f3590cac4f9086ccefd6fd463befeae41cd4d3f193e5a/pytest_asyncio-1.3.0.tar.gz", hash = "sha256:d7f52f36d231b80ee124cd216ffb19369aa168fc10095013c6b014a34d3ee9e5", size = 50087, upload-time = "2025-11-10T16:07:47.256Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e5/35/f8b19922b6a25bc0880171a2f1a003eaeb93657475193ab516fd87cac9da/pytest_asyncio-1.3.0-py3-none-any.whl", hash = "sha256:611e26147c7f77640e6d0a92a38ed17c3e9848063698d5c93d5aa7aa11cebff5", size = 15075, upload-time = "2025-11-10T16:07:45.537Z" },
+]
+
+[[package]]
+name = "pytest-codspeed"
+version = "4.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cffi" },
+ { name = "pytest" },
+ { name = "rich" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/e2/e8/27fcbe6516a1c956614a4b61a7fccbf3791ea0b992e07416e8948184327d/pytest_codspeed-4.2.0.tar.gz", hash = "sha256:04b5d0bc5a1851ba1504d46bf9d7dbb355222a69f2cd440d54295db721b331f7", size = 113263, upload-time = "2025-10-24T09:02:55.704Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/81/04/479905bd6653bc981c0554fcce6df52d7ae1594e1eefd53e6cf31810ec7f/pytest_codspeed-4.2.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7d4fefbd4ae401e2c60f6be920a0be50eef0c3e4a1f0a1c83962efd45be38b39", size = 262084, upload-time = "2025-10-24T09:02:43.155Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/46/d6f345d7907bac6cbb6224bd697ecbc11cf7427acc9e843c3618f19e3476/pytest_codspeed-4.2.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:309b4227f57fcbb9df21e889ea1ae191d0d1cd8b903b698fdb9ea0461dbf1dfe", size = 251100, upload-time = "2025-10-24T09:02:44.168Z" },
+ { url = "https://files.pythonhosted.org/packages/de/dc/e864f45e994a50390ff49792256f1bdcbf42f170e3bc0470ee1a7d2403f3/pytest_codspeed-4.2.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:72aab8278452a6d020798b9e4f82780966adb00f80d27a25d1274272c54630d5", size = 262057, upload-time = "2025-10-24T09:02:45.791Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/1c/f1d2599784486879cf6579d8d94a3e22108f0e1f130033dab8feefd29249/pytest_codspeed-4.2.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:684fcd9491d810ded653a8d38de4835daa2d001645f4a23942862950664273f8", size = 251013, upload-time = "2025-10-24T09:02:46.937Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/fd/eafd24db5652a94b4d00fe9b309b607de81add0f55f073afb68a378a24b6/pytest_codspeed-4.2.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:50794dabea6ec90d4288904452051e2febace93e7edf4ca9f2bce8019dd8cd37", size = 262065, upload-time = "2025-10-24T09:02:48.018Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/14/8d9340d7dc0ae647991b28a396e16b3403e10def883cde90d6b663d3f7ec/pytest_codspeed-4.2.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a0ebd87f2a99467a1cfd8e83492c4712976e43d353ee0b5f71cbb057f1393aca", size = 251057, upload-time = "2025-10-24T09:02:49.102Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/39/48cf6afbca55bc7c8c93c3d4ae926a1068bcce3f0241709db19b078d5418/pytest_codspeed-4.2.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dbbb2d61b85bef8fc7e2193f723f9ac2db388a48259d981bbce96319043e9830", size = 267983, upload-time = "2025-10-24T09:02:50.558Z" },
+ { url = "https://files.pythonhosted.org/packages/33/86/4407341efb5dceb3e389635749ce1d670542d6ca148bd34f9d5334295faf/pytest_codspeed-4.2.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:748411c832147bfc85f805af78a1ab1684f52d08e14aabe22932bbe46c079a5f", size = 256732, upload-time = "2025-10-24T09:02:51.603Z" },
+ { url = "https://files.pythonhosted.org/packages/25/0e/8cb71fd3ed4ed08c07aec1245aea7bc1b661ba55fd9c392db76f1978d453/pytest_codspeed-4.2.0-py3-none-any.whl", hash = "sha256:e81bbb45c130874ef99aca97929d72682733527a49f84239ba575b5cb843bab0", size = 113726, upload-time = "2025-10-24T09:02:54.785Z" },
+]
+
[[package]]
name = "pytest-cov"
version = "7.0.0"
@@ -718,6 +778,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/1b/14/dedd1b243fdc2a16b291d9b3a664ee593bff997f1dcb5eaffedf5b776ade/repo_mapper_rs-0.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:c0815eff543c46a051b93247e98bad42213333d47415367b4d0716b4a01e88ef", size = 787923, upload-time = "2025-08-13T20:28:37.508Z" },
]
+[[package]]
+name = "rich"
+version = "14.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markdown-it-py" },
+ { name = "pygments" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" },
+]
+
[[package]]
name = "ruff"
version = "0.14.6"
@@ -804,6 +877,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/00/c0/8f5d070730d7836adc9c9b6408dec68c6ced86b304a9b26a14df072a6e8c/traitlets-5.14.3-py3-none-any.whl", hash = "sha256:b74e89e397b1ed28cc831db7aea759ba6640cb3de13090ca145426688ff1ac4f", size = 85359, upload-time = "2024-04-19T11:11:46.763Z" },
]
+[[package]]
+name = "typing-extensions"
+version = "4.15.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
+]
+
[[package]]
name = "virtualenv"
version = "20.35.4"