
Wrap signatures onto several lines when function length is over a threshold #831

Open

wants to merge 42 commits into base: master

Changes from 32 commits (42 commits total)
dea121c
Introduce ParsedDocstring.with_linker()/.with_tag()/.combine(). The w…
tristanlatr Oct 25, 2024
95f8f3d
Colorize the signature ourself.
tristanlatr Oct 25, 2024
0b7c76e
Some more adjustments. Make the self param always inline. Try to opti…
tristanlatr Oct 25, 2024
188c410
Add comment
tristanlatr Oct 25, 2024
83d47f7
Fix usage of cache
tristanlatr Oct 25, 2024
ab1cdd1
Fix usages of cache
tristanlatr Oct 25, 2024
eca5ced
Simplify with_linker() and with_tag(). These do not create new parsed…
tristanlatr Oct 25, 2024
ff4269f
Revert "Simplify with_linker() and with_tag(). These do not create ne…
tristanlatr Oct 25, 2024
914e01c
Minor changes not to use lru_cache too much
tristanlatr Oct 25, 2024
cdef965
Try to optimize what I can
tristanlatr Oct 25, 2024
2033d65
Fix mypy
tristanlatr Oct 26, 2024
5396396
Merge branch 'master' into 801-signature-spans
tristanlatr Oct 26, 2024
282250b
Remove unused imports
tristanlatr Oct 26, 2024
6a4de9f
Better implementation of with_linker and with_tag inside a single sub…
tristanlatr Oct 29, 2024
c0f93dc
First attempt to implement relatively smart Expand/Collapse signature…
tristanlatr Oct 29, 2024
da89d7c
Simplify things: don't try to wrap overload signatures. Sphinx doesn'…
tristanlatr Nov 14, 2024
141b211
Get rid of the ParsedStanOnly by using parsed_text_with_css instead.
tristanlatr Nov 14, 2024
a46a3a3
Few simplifications here and there.
tristanlatr Nov 14, 2024
40ac0a6
Use the CSS class 'decorator' for all decorators.
tristanlatr Nov 14, 2024
4172485
Fix various bugs in the implementation.
tristanlatr Nov 14, 2024
7103ce5
Fix pyflakes
tristanlatr Nov 14, 2024
eae961a
Fix format_undocumented_summary returning a tuple of strings instead …
tristanlatr Nov 14, 2024
7c6c6eb
increase the threshold for a function to be rendered in several lines.
tristanlatr Nov 14, 2024
19400ff
Avoid an empty div for decorators when there are no decorators.
tristanlatr Nov 14, 2024
a3ebbdf
Use non-breaking spaces in signature defs.
tristanlatr Nov 14, 2024
cd257eb
Improve a little bit the rendering of parameter tables that uses very…
tristanlatr Nov 15, 2024
907792a
Get rid of the AnnotationLinker - drop the verbose messages when an a…
tristanlatr Nov 16, 2024
977e5b5
Merge branch 'master' into 801-signature-spans
tristanlatr Nov 16, 2024
91edc51
Change comment
tristanlatr Nov 18, 2024
b504c21
Merge branch '801-signature-spans' of github.com:twisted/pydoctor int…
tristanlatr Nov 18, 2024
25b5e62
Add an environment to build temporalio docs
tristanlatr Nov 21, 2024
bd2de92
Add a big overload in the google demo
tristanlatr Nov 21, 2024
cc82f10
Apply suggestions from code review
tristanlatr Dec 13, 2024
07fc41d
Merge branch 'master' into 801-signature-spans
tristanlatr Dec 13, 2024
668f4d0
Fix the NotFoundLinker
tristanlatr Dec 13, 2024
7fc2b10
Do not mark overloaded functions with css class .long-signature
tristanlatr Dec 13, 2024
80de043
Remove unused imports
tristanlatr Dec 13, 2024
a9c5bf2
Add readme entries
tristanlatr Dec 13, 2024
6784e4c
Update docs tests
tristanlatr Dec 13, 2024
78f73b9
As before, consider a function long from 88 chars.
tristanlatr Dec 13, 2024
bf7045f
Adjust test again
tristanlatr Dec 13, 2024
a37b028
Update README.rst
tristanlatr Dec 13, 2024
2 changes: 1 addition & 1 deletion .github/workflows/system.yaml
@@ -12,7 +12,7 @@ jobs:

strategy:
matrix:
tox_target: [twisted-apidoc, cpython-summary, python-igraph-apidocs, cpython-apidocs, numpy-apidocs, git-buildpackage-apidocs, pytype-apidocs]
tox_target: [twisted-apidoc, cpython-summary, python-igraph-apidocs, cpython-apidocs, numpy-apidocs, git-buildpackage-apidocs, pytype-apidocs, temporalio-apidocs]

steps:
- uses: actions/checkout@v4
212 changes: 211 additions & 1 deletion docs/google_demo/__init__.py
@@ -34,7 +34,8 @@
https://google.github.io/styleguide/pyguide.html

"""
from typing import List, Union # NOQA
from datetime import timedelta
from typing import Any, Awaitable, Callable, Concatenate, List, Mapping, Optional, Sequence, Union, overload # NOQA

module_level_variable1 = 12345

@@ -298,3 +299,212 @@ class ExamplePEP526Class:

attr1: str
attr2: int


_not_in_the_demo = object()

@overload
async def overwhelming_overload(
workflow: tuple[Any, Any],
*,
id: str,
task_queue: str,
execution_timeout: Optional[timedelta] = None,
run_timeout: Optional[timedelta] = None,
task_timeout: Optional[timedelta] = None,
id_reuse_policy: _not_in_the_demo = '_not_in_the_demo.WorkflowIDReusePolicy.ALLOW_DUPLICATE',
id_conflict_policy: _not_in_the_demo = '_not_in_the_demo.WorkflowIDConflictPolicy.UNSPECIFIED',
retry_policy: Optional[_not_in_the_demo.RetryPolicy] = None,
cron_schedule: str = "",
memo: Optional[Mapping[str, Any]] = None,
search_attributes: Optional[
Union[
_not_in_the_demo.TypedSearchAttributes,
_not_in_the_demo.SearchAttributes,
]
] = None,
start_delay: Optional[timedelta] = None,
start_signal: Optional[str] = None,
start_signal_args: Sequence[Any] = [],
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
request_eager_start: bool = False,
) -> tuple[Any, Any]: ...

# Overload for single-param workflow
@overload
async def overwhelming_overload(
workflow: tuple[Any, Any, Any],
arg: Any,
*,
id: str,
task_queue: str,
execution_timeout: Optional[timedelta] = None,
run_timeout: Optional[timedelta] = None,
task_timeout: Optional[timedelta] = None,
id_reuse_policy: _not_in_the_demo.WorkflowIDReusePolicy = '_not_in_the_demo.WorkflowIDReusePolicy.ALLOW_DUPLICATE',
id_conflict_policy: _not_in_the_demo.WorkflowIDConflictPolicy = '_not_in_the_demo.WorkflowIDConflictPolicy.UNSPECIFIED',
retry_policy: Optional[_not_in_the_demo.RetryPolicy] = None,
cron_schedule: str = "",
memo: Optional[Mapping[str, Any]] = None,
search_attributes: Optional[
Union[
_not_in_the_demo.TypedSearchAttributes,
_not_in_the_demo.SearchAttributes,
]
] = None,
start_delay: Optional[timedelta] = None,
start_signal: Optional[str] = None,
start_signal_args: Sequence[Any] = [],
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
request_eager_start: bool = False,
) -> tuple[Any, Any]: ...

# Overload for multi-param workflow
@overload
async def overwhelming_overload(
workflow: Callable[
Concatenate[Any, Any], Awaitable[Any]
],
*,
args: Sequence[Any],
id: str,
task_queue: str,
execution_timeout: Optional[timedelta] = None,
run_timeout: Optional[timedelta] = None,
task_timeout: Optional[timedelta] = None,
id_reuse_policy: _not_in_the_demo.WorkflowIDReusePolicy = '_not_in_the_demo.WorkflowIDReusePolicy.ALLOW_DUPLICATE',
id_conflict_policy: _not_in_the_demo.WorkflowIDConflictPolicy = '_not_in_the_demo.WorkflowIDConflictPolicy.UNSPECIFIED',
retry_policy: Optional[_not_in_the_demo.RetryPolicy] = None,
cron_schedule: str = "",
memo: Optional[Mapping[str, Any]] = None,
search_attributes: Optional[
Union[
_not_in_the_demo.TypedSearchAttributes,
_not_in_the_demo.SearchAttributes,
]
] = None,
start_delay: Optional[timedelta] = None,
start_signal: Optional[str] = None,
start_signal_args: Sequence[Any] = [],
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
request_eager_start: bool = False,
) -> tuple[Any, Any]: ...

# Overload for string-name workflow
@overload
async def overwhelming_overload(
workflow: str,
arg: Any = _not_in_the_demo._arg_unset,
*,
args: Sequence[Any] = [],
id: str,
task_queue: str,
result_type: Optional[type] = None,
execution_timeout: Optional[timedelta] = None,
run_timeout: Optional[timedelta] = None,
task_timeout: Optional[timedelta] = None,
id_reuse_policy: _not_in_the_demo.WorkflowIDReusePolicy = '_not_in_the_demo.WorkflowIDReusePolicy.ALLOW_DUPLICATE',
id_conflict_policy: _not_in_the_demo.WorkflowIDConflictPolicy = '_not_in_the_demo.WorkflowIDConflictPolicy.UNSPECIFIED',
retry_policy: Optional[_not_in_the_demo.RetryPolicy] = None,
cron_schedule: str = "",
memo: Optional[Mapping[str, Any]] = None,
search_attributes: Optional[
Union[
_not_in_the_demo.TypedSearchAttributes,
_not_in_the_demo.SearchAttributes,
]
] = None,
start_delay: Optional[timedelta] = None,
start_signal: Optional[str] = None,
start_signal_args: Sequence[Any] = [],
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
request_eager_start: bool = False,
) -> tuple[Any, Any]: ...

async def overwhelming_overload(
workflow: Union[str, Callable[..., Awaitable[Any]]],
arg: Any = _not_in_the_demo,
*,
args: Sequence[Any] = [],
id: str,
task_queue: str,
result_type: Optional[type] = None,
execution_timeout: Optional[timedelta] = None,
run_timeout: Optional[timedelta] = None,
task_timeout: Optional[timedelta] = None,
id_reuse_policy: _not_in_the_demo.WorkflowIDReusePolicy = '_not_in_the_demo.WorkflowIDReusePolicy.ALLOW_DUPLICATE',
id_conflict_policy: _not_in_the_demo.WorkflowIDConflictPolicy = '_not_in_the_demo.WorkflowIDConflictPolicy.UNSPECIFIED',
retry_policy: Optional[_not_in_the_demo.RetryPolicy] = None,
cron_schedule: str = "",
memo: Optional[Mapping[str, Any]] = None,
search_attributes: Optional[
Union[
_not_in_the_demo.TypedSearchAttributes,
_not_in_the_demo.SearchAttributes,
]
] = None,
start_delay: Optional[timedelta] = None,
start_signal: Optional[str] = None,
start_signal_args: Sequence[Any] = [],
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
request_eager_start: bool = False,
stack_level: int = 2,
) -> tuple[Any, Any]:
"""
This is a big overload taken from the source code of the temporalio SDK for Python.
The types don't make sense: it's only here to showcase a bigger overload.

Start a workflow and return its handle.

Args:
workflow: String name or class method decorated with
``@workflow.run`` for the workflow to start.
arg: Single argument to the workflow.
args: Multiple arguments to the workflow. Cannot be set if arg is.
id: Unique identifier for the workflow execution.
task_queue: Task queue to run the workflow on.
result_type: For string workflows, this can set the specific result
type hint to deserialize into.
execution_timeout: Total workflow execution timeout including
retries and continue as new.
run_timeout: Timeout of a single workflow run.
task_timeout: Timeout of a single workflow task.
id_reuse_policy: How already-existing IDs are treated.
id_conflict_policy: How already-running workflows of the same ID are
treated. Default is unspecified which effectively means fail the
start attempt. This cannot be set if ``id_reuse_policy`` is set
to terminate if running.
retry_policy: Retry policy for the workflow.
cron_schedule: See https://docs.temporal.io/docs/content/what-is-a-temporal-cron-job/
memo: Memo for the workflow.
search_attributes: Search attributes for the workflow. The
dictionary form of this is deprecated, use
:py:class:`_not_in_the_demo.TypedSearchAttributes`.
start_delay: Amount of time to wait before starting the workflow.
This does not work with ``cron_schedule``.
start_signal: If present, this signal is sent as signal-with-start
instead of traditional workflow start.
start_signal_args: Arguments for start_signal if start_signal
present.
rpc_metadata: Headers used on the RPC call. Keys here override
client-level RPC metadata keys.
rpc_timeout: Optional RPC deadline to set for the RPC call.
request_eager_start: Potentially reduce the latency to start this workflow by
encouraging the server to start it on a local worker running with
this same client.
This is currently experimental.

Returns:
A workflow handle to the started workflow.

Raises:
temporalio.exceptions.WorkflowAlreadyStartedError: Workflow has
already been started.
RPCError: Workflow could not be started for some other reason.
"""
...
59 changes: 12 additions & 47 deletions pydoctor/astbuilder.py
@@ -13,12 +13,14 @@
Type, TypeVar, Union, Set, cast
)

from pydoctor import epydoc2stan, model, node2stan, extensions, linker
from pydoctor.epydoc.markup._pyval_repr import colorize_inline_pyval
from pydoctor import epydoc2stan, model, extensions
from pydoctor.astutils import (is_none_literal, is_typing_annotation, is_using_annotations, is_using_typing_final, node2dottedname, node2fullname,
is__name__equals__main__, unstring_annotation, upgrade_annotation, iterassign, extract_docstring_linenum, infer_type, get_parents,
get_docstring_node, get_assign_docstring_node, unparse, NodeVisitor, Parentage, Str)

class InvalidSignatureParamName(str):
def isidentifier(self) -> bool:
return True

def parseFile(path: Path) -> ast.Module:
"""Parse the contents of a Python source file."""
@@ -1032,9 +1034,9 @@ def get_default(index: int) -> Optional[ast.expr]:

parameters: List[Parameter] = []
def add_arg(name: str, kind: Any, default: Optional[ast.expr]) -> None:
default_val = Parameter.empty if default is None else _ValueFormatter(default, ctx=func)
default_val = Parameter.empty if default is None else default
# this cast() is safe since we're checking if annotations.get(name) is None first
annotation = Parameter.empty if annotations.get(name) is None else _AnnotationValueFormatter(cast(ast.expr, annotations[name]), ctx=func)
annotation = Parameter.empty if annotations.get(name) is None else cast(ast.expr, annotations[name])
parameters.append(Parameter(name, kind, default=default_val, annotation=annotation))

for index, arg in enumerate(posonlyargs):
@@ -1056,12 +1058,15 @@ def add_arg(name: str, kind: Any, default: Optional[ast.expr]) -> None:
add_arg(kwarg.arg, Parameter.VAR_KEYWORD, None)

return_type = annotations.get('return')
return_annotation = Parameter.empty if return_type is None or is_none_literal(return_type) else _AnnotationValueFormatter(return_type, ctx=func)
return_annotation = Parameter.empty if return_type is None or is_none_literal(return_type) else return_type
try:
signature = Signature(parameters, return_annotation=return_annotation)
except ValueError as ex:
func.report(f'{func.fullName()} has invalid parameters: {ex}')
signature = Signature()
# Craft an invalid signature that does not look like a function with zero arguments.
signature = Signature(
[Parameter(InvalidSignatureParamName('...'),
kind=Parameter.POSITIONAL_OR_KEYWORD)])

func.annotations = annotations

@@ -1120,7 +1125,7 @@ def _annotations_from_function(
@param func: The function definition's AST.
@return: Mapping from argument name to annotation.
The name C{return} is used for the return type.
Unannotated arguments are omitted.
Unannotated arguments are still included with a None value.
"""
def _get_all_args() -> Iterator[ast.arg]:
base_args = func.args
@@ -1153,47 +1158,7 @@ def _get_all_ast_annotations() -> Iterator[Tuple[str, Optional[ast.expr]]]:
value, self.builder.current), self.builder.current)
for name, value in _get_all_ast_annotations()
}

class _ValueFormatter:
"""
Class to encapsulate a python value and translate it to HTML when calling L{repr()} on the L{_ValueFormatter}.
Used for presenting default values of parameters.
"""

def __init__(self, value: ast.expr, ctx: model.Documentable):
self._colorized = colorize_inline_pyval(value)
"""
The colorized value as L{ParsedDocstring}.
"""

self._linker = ctx.docstring_linker
"""
Linker.
"""

def __repr__(self) -> str:
"""
Present the python value as HTML.
Without the englobing <code> tags.
"""
# Using node2stan.node2html instead of flatten(to_stan()).
# This avoids calling flatten() twice,
# but potential XML parser errors caused by XMLString needs to be handled later.
return ''.join(node2stan.node2html(self._colorized.to_node(), self._linker))

class _AnnotationValueFormatter(_ValueFormatter):
"""
Special L{_ValueFormatter} for function annotations.
"""
def __init__(self, value: ast.expr, ctx: model.Function):
super().__init__(value, ctx)
self._linker = linker._AnnotationLinker(ctx)

def __repr__(self) -> str:
"""
Present the annotation wrapped inside <code> tags.
"""
return '<code>%s</code>' % super().__repr__()

DocumentableT = TypeVar('DocumentableT', bound=model.Documentable)
