Conversation

@MelleKoning (Contributor) commented Aug 30, 2025

ChatWithTree plugin addition.

As this is my first MR for a Gramps plugin, please advise.

I will add a topic on the Gramps development forum describing the features:
https://gramps.discourse.group/t/chatwithtree-plugin-gramplet/8251

@Nick-Hall (Member) commented:

You only need to include the ChatWithTree directory and the three files under it in a single commit.

The po sub-directory will be created for you and the template.pot merged into the Weblate translations.

@PQYPLZXHGF commented Aug 30, 2025

Looking forward to testing. Thank you for this addon 👍

@Nick-Hall I attempted to test on Windows with GrampsAIO (All-in-One), but the GrampsAIO install appears to be missing required files (see the error message below).

ModuleNotFoundError: No module named 'ctypes.wintypes'
....
ImportError: cannot import name 'wintypes' from 'ctypes' (C:\Program Files\GrampsAIO64-6.0.4\lib\library.zip\ctypes\__init__.pyc)
....
Message: "Disabling truststore because platform isn't supported"
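In other words, the bundled stdlib in this frozen Windows build seems to lack ctypes.wintypes, which pip's vendored truststore and rich both import, so an in-app "pip install" cannot succeed there. A hedged sketch of how the gramplet could detect this up front (the helper name and structure are illustrative, not from the plugin):

```python
# Sketch (hypothetical helper, not from the ChatWithTree code): check
# whether an in-app "pip install" has a chance of working before trying it.
import importlib.util
import sys

def pip_install_likely_to_work(platform=None):
    """Return False when pip's Windows-only stdlib dependency is missing."""
    platform = platform or sys.platform
    if platform != "win32":
        return True  # the ctypes.wintypes issue is Windows-specific
    # pip's vendored truststore and rich both import ctypes.wintypes;
    # frozen builds sometimes omit it from the bundled library.zip.
    return importlib.util.find_spec("ctypes.wintypes") is not None
```

With a check like this, the gramplet could skip the doomed install attempt and show an explanation instead.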

FYI, on Windows I had to select the option below before getting the error. Could you make the gramplet fail more gracefully, so that it still displays, with a message explaining that the Python module litellm needs to be installed?

[Screenshots: Addon Manager - Gramps, 2025-08-31]
Module installation failed.
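On the graceful-degradation request, a minimal sketch of the usual pattern (names such as HAVE_LITELLM and build_message are illustrative, not the plugin's actual API): guard the optional import so the gramplet still loads and shows an instructional message instead of crashing.

```python
# Sketch (hypothetical names, not the plugin's actual code): import the
# optional dependency defensively so a missing module degrades to a message.
try:
    import litellm  # noqa: F401  # optional; absent on stock GrampsAIO
    HAVE_LITELLM = True
except ImportError:
    HAVE_LITELLM = False

def build_message(have_dep=None):
    """Text the gramplet would display in its main label."""
    if have_dep is None:
        have_dep = HAVE_LITELLM
    if not have_dep:
        return ("The Python module 'litellm' is not installed.\n"
                "Install it (e.g. 'pip install litellm') and restart Gramps.")
    return "ChatWithTree is ready."
```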

--- Logging error ---
Traceback (most recent call last):
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\index_command.py", line 44, in _create_truststore_ssl_context
    from pip._vendor import truststore
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\truststore\__init__.py", line 31, in <module>
    from ._api import SSLContext, extract_from_ssl, inject_into_ssl  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\truststore\_api.py", line 18, in <module>
    from ._windows import _configure_context, _verify_peercerts_impl
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\truststore\_windows.py", line 18, in <module>
    from ctypes.wintypes import (
ModuleNotFoundError: No module named 'ctypes.wintypes'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 198, in emit
    self.console.print(renderable, overflow="ignore", crop=False, style=style)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 1697, in print
    with self:
         ^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 870, in __exit__
    self._exit_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 826, in _exit_buffer
    self._check_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2038, in _check_buffer
    self._write_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2067, in _write_buffer
    from pip._vendor.rich._win32_console import LegacyWindowsTerm
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\_win32_console.py", line 17, in <module>
    from ctypes import Structure, byref, wintypes
ImportError: cannot import name 'wintypes' from 'ctypes' (C:\Program Files\GrampsAIO64-6.0.4\lib\library.zip\ctypes\__init__.pyc)
Call stack:
  File "AIO/__startup__.py", line 133, in run
  File "AIO/console.py", line 25, in run
  File "AIO/__main__.py", line 24, in <module>
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\main.py", line 80, in main
    return command.main(cmd_args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 159, in main
    return self._main(args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 238, in _main
    return self._run_wrapper(level_number, options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 107, in _run_wrapper
    status = _inner_run()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 98, in _inner_run
    return self.run(options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\req_command.py", line 71, in wrapper
    return func(self, options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\commands\install.py", line 339, in run
    session = self.get_default_session(options)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\index_command.py", line 80, in get_default_session
    self._session = self.enter_context(self._build_session(options))
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\index_command.py", line 99, in _build_session
    ssl_context = _create_truststore_ssl_context()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\index_command.py", line 46, in _create_truststore_ssl_context
    logger.warning("Disabling truststore because platform isn't supported")
  File "AIO/logging/__init__.py", line 1551, in warning
  File "AIO/logging/__init__.py", line 1684, in _log
  File "AIO/logging/__init__.py", line 1700, in handle
  File "AIO/logging/__init__.py", line 1762, in callHandlers
  File "AIO/logging/__init__.py", line 1028, in handle
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 200, in emit
    self.handleError(record)
Message: "Disabling truststore because platform isn't supported"
Arguments: ()
--- Logging error ---
Traceback (most recent call last):
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 198, in emit
    self.console.print(renderable, overflow="ignore", crop=False, style=style)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 1697, in print
    with self:
         ^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 870, in __exit__
    self._exit_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 826, in _exit_buffer
    self._check_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2038, in _check_buffer
    self._write_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2067, in _write_buffer
    from pip._vendor.rich._win32_console import LegacyWindowsTerm
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\_win32_console.py", line 17, in <module>
    from ctypes import Structure, byref, wintypes
ImportError: cannot import name 'wintypes' from 'ctypes' (C:\Program Files\GrampsAIO64-6.0.4\lib\library.zip\ctypes\__init__.pyc)
Call stack:
  File "AIO/__startup__.py", line 133, in run
  File "AIO/console.py", line 25, in run
  File "AIO/__main__.py", line 24, in <module>
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\main.py", line 80, in main
    return command.main(cmd_args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 159, in main
    return self._main(args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 238, in _main
    return self._run_wrapper(level_number, options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 107, in _run_wrapper
    status = _inner_run()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 98, in _inner_run
    return self.run(options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\req_command.py", line 71, in wrapper
    return func(self, options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\commands\install.py", line 393, in run
    requirement_set = resolver.resolve(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\resolver.py", line 98, in resolve
    result = self._result = resolver.resolve(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 596, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 429, in resolve
    self._add_to_criteria(self.state.criteria, r, parent=None)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 150, in _add_to_criteria
    if not criterion.candidates:
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\structs.py", line 194, in __bool__
    return bool(self._sequence)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 165, in __bool__
    self._bool = any(self)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 149, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 39, in _iter_built
    candidate = func()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\factory.py", line 180, in _make_candidate_from_link
    base: BaseCandidate | None = self._make_base_candidate_from_link(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\factory.py", line 226, in _make_base_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 309, in __init__
    super().__init__(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 162, in __init__
    self.dist = self._prepare()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 239, in _prepare
    dist = self._prepare_distribution()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 320, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 506, in prepare_linked_requirement
    self._log_preparing_link(req)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 310, in _log_preparing_link
    logger.info(message, information)
  File "AIO/logging/__init__.py", line 1539, in info
  File "AIO/logging/__init__.py", line 1684, in _log
  File "AIO/logging/__init__.py", line 1700, in handle
  File "AIO/logging/__init__.py", line 1762, in callHandlers
  File "AIO/logging/__init__.py", line 1028, in handle
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 200, in emit
    self.handleError(record)
Message: 'Collecting %s'
Arguments: ('litellm',)
The same "--- Logging error ---" traceback (ImportError: cannot import name 'wintypes' from 'ctypes' via pip's vendored rich console) then repeats for every log record pip emits, including:

Message: 'Downloading %s'
Arguments: ('litellm-1.76.1-py3-none-any.whl.metadata (41 kB)',)

Message: 'Collecting %s'
Arguments: ('aiohttp>=3.10 (from litellm)',)

Message: 'Downloading %s'
Arguments: ('aiohttp-3.12.15.tar.gz (7.8 MB)',)
--- Logging error ---
Traceback (most recent call last):
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 107, in _run_wrapper
    status = _inner_run()
             ^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 98, in _inner_run
    return self.run(options, args)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\req_command.py", line 71, in wrapper
    return func(self, options, args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\commands\install.py", line 393, in run
    requirement_set = resolver.resolve(
                      ^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\resolver.py", line 98, in resolve
    result = self._result = resolver.resolve(
                            ^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 596, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 508, in resolve
    failure_criterion = self._attempt_to_pin_criterion(name)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 220, in _attempt_to_pin_criterion
    criteria = self._get_updated_criteria(candidate)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 211, in _get_updated_criteria
    self._add_to_criteria(criteria, requirement, parent=candidate)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\resolvers\resolution.py", line 150, in _add_to_criteria
    if not criterion.candidates:
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\resolvelib\structs.py", line 194, in __bool__
    return bool(self._sequence)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 165, in __bool__
    self._bool = any(self)
                 ^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 149, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
                       ^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\found_candidates.py", line 39, in _iter_built
    candidate = func()
                ^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\factory.py", line 180, in _make_candidate_from_link
    base: BaseCandidate | None = self._make_base_candidate_from_link(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\factory.py", line 226, in _make_base_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
                                       ^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 309, in __init__
    super().__init__(
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 162, in __init__
    self.dist = self._prepare()
                ^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 239, in _prepare
    dist = self._prepare_distribution()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\resolution\resolvelib\candidates.py", line 320, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 537, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 608, in _prepare_linked_requirement
    local_file = unpack_url(
                 ^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 180, in unpack_url
    file = get_http_url(
           ^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\operations\prepare.py", line 121, in get_http_url
    from_path, content_type = download(link, temp_dir.path)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\network\download.py", line 195, in __call__
    self._process_response(download, resp)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\network\download.py", line 212, in _process_response
    for chunk in chunks:
                 ^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\progress_bars.py", line 66, in _rich_download_progress_bar
    with progress:
         ^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\progress.py", line 1189, in __exit__
    self.stop()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\progress.py", line 1175, in stop
    self.live.stop()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\live.py", line 162, in stop
    with self.console:
         ^^^^^^^^^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 870, in __exit__
    self._exit_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 826, in _exit_buffer
    self._check_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2038, in _check_buffer
    self._write_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2067, in _write_buffer
    from pip._vendor.rich._win32_console import LegacyWindowsTerm
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\_win32_console.py", line 17, in <module>
    from ctypes import Structure, byref, wintypes
ImportError: cannot import name 'wintypes' from 'ctypes' (C:\Program Files\GrampsAIO64-6.0.4\lib\library.zip\ctypes\__init__.pyc)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 198, in emit
    self.console.print(renderable, overflow="ignore", crop=False, style=style)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 1697, in print
    with self:
         ^^^^
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 870, in __exit__
    self._exit_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 826, in _exit_buffer
    self._check_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2038, in _check_buffer
    self._write_buffer()
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\console.py", line 2067, in _write_buffer
    from pip._vendor.rich._win32_console import LegacyWindowsTerm
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_vendor\rich\_win32_console.py", line 17, in <module>
    from ctypes import Structure, byref, wintypes
ImportError: cannot import name 'wintypes' from 'ctypes' (C:\Program Files\GrampsAIO64-6.0.4\lib\library.zip\ctypes\__init__.pyc)
Call stack:
  File "AIO/__startup__.py", line 133, in run
  File "AIO/console.py", line 25, in run
  File "AIO/__main__.py", line 24, in <module>
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\main.py", line 80, in main
    return command.main(cmd_args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 159, in main
    return self._main(args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 238, in _main
    return self._run_wrapper(level_number, options, args)
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\cli\base_command.py", line 148, in _run_wrapper
    logger.critical("Exception:", exc_info=True)
  File "AIO/logging/__init__.py", line 1586, in critical
  File "AIO/logging/__init__.py", line 1684, in _log
  File "AIO/logging/__init__.py", line 1700, in handle
  File "AIO/logging/__init__.py", line 1762, in callHandlers
  File "AIO/logging/__init__.py", line 1028, in handle
  File "C:\Program Files\GrampsAIO64-6.0.4\lib\pip\_internal\utils\logging.py", line 200, in emit
    self.handleError(record)
Message: 'Exception:'
Arguments: ()

@GaryGriffin
Copy link
Member

Attempted to test on MacOS, which is also a bundled gramps distro. The litellm dependency has about 45 other dependencies. Even after installing all of these into the gramps bundle, I still had issues with the pydantic_core wheel. I will attempt again once Linux testing (by others) is successful.

Open issues:

  1. These commits need to be rebased in order to merge.
  2. Do you really need the make.py change? As @Nick-Hall stated, this PR should probably be just the 5 py files and the template.pot . Weblate is used for all translation work.
  3. There is no header in the python files with the owner and licensing info.
  4. Some of the commits mention that they were created by AI. I am not sure what additional statements/assurances in the header may be needed for these.
  5. Status=UNSTABLE has been deprecated. https://gramps-project.org/wiki/index.php/Gramplets_development#Register_Options
    For v 6.0 on MacOS, if you use UNSTABLE, the gramplet is not visible.

@MelleKoning MelleKoning force-pushed the myaddon60 branch 3 times, most recently from 245242e to 7cfc554 on August 31, 2025 10:24
@MelleKoning
Copy link
Contributor Author

MelleKoning commented Aug 31, 2025

Thanks for the swift and extensive comments and the test on Windows.

I have adjusted the commit by:

  1. rebasing and squashing the changes
  2. the 'make.py' change was very useful for understanding which module was missing! So yes, extending the output when an exception occurs during development is helpful (I can comment on this a bit more later, explaining how I've worked via a Python virtual environment)
  3. I was not aware that license headers were mandatory for the Python files, but I have added GNU license headers, copied from the examples of other Gramps addons
  4. I definitely used AI to help me understand mainly the GTK code, with examples for creating the listbox and addmessagebox functions, but all code is manually written; no AI-generated code was copied blindly and unchanged. I have tested and updated it as well as I can.
  5. I have updated the status to EXPERIMENTAL

For the required LLM module, "litellm" is declared in the ChatWithTree.gpr.py registration file; in my testing the module is installed and working. Maybe litellm depends on other modules that have to be accessible to Python?

Locally, I installed and ran the plugin with the gramps version 6.0.4, the development version.

Locally, I have python 3.12.3 (python3 --version)

As I work in a Python virtual environment, this gives me the opportunity to generate a requirements.txt. These are the modules visible in that requirements.txt file:

orjson==3.11.1
pycairo==1.28.0
PyGObject==3.52.3
pyicu==2.15.2
setuptools==80.9.0

This does not show as much as I thought, so I used another method with pipreqs, and ran that within the ./ChatWithTree subfolder. pipreqs scans the code to find all dependencies.

I now get a huge list that also includes litellm. This requirements.txt is in the main addons-source folder, so I think it includes ALL packages needed by all the addons and Gramps:

aiohappyeyeballs==2.6.1
aiohttp==3.12.15
aiosignal==1.4.0
annotated-types==0.7.0
anyio==4.10.0
attrs==25.3.0
certifi==2025.8.3
charset-normalizer==3.4.3
click==8.2.1
distro==1.9.0
filelock==3.19.1
frozenlist==1.7.0
fsspec==2025.7.0
-e git+ssh://git@github.com/gramps-project/gramps@285de0745885eee69daec80317fbeadb21c83525#egg=gramps
h11==0.16.0
hf-xet==1.1.7
httpcore==1.0.9
httpx==0.28.1
huggingface-hub==0.34.4
idna==3.10
importlib_metadata==8.7.0
Jinja2==3.1.6
jiter==0.10.0
jsonschema==4.25.0
jsonschema-specifications==2025.4.1
litellm==1.75.7
MarkupSafe==3.0.2
multidict==6.6.4
openai==1.99.9
orjson==3.11.1
packaging==25.0
propcache==0.3.2
pycairo==1.28.0
pydantic==2.11.7
pydantic_core==2.33.2
PyGObject==3.52.3
pyicu==2.15.2
python-dotenv==1.1.1
PyYAML==6.0.2
referencing==0.36.2
regex==2025.7.34
requests==2.32.4
rpds-py==0.27.0
setuptools==80.9.0
sniffio==1.3.1
tiktoken==0.11.0
tokenizers==0.21.4
tqdm==4.67.1
typing-inspection==0.4.1
typing_extensions==4.14.1
urllib3==2.5.0
yarl==1.20.1
zipp==3.23.0

There is also a requirements.txt generated in the /ChatWithTree subfolder only:

gramps==6.0.4
litellm==1.76.1

I do not understand why the litellm version here is different from the one above (1.76.1 vs 1.75.7), but it has far fewer dependencies.

About Windows: maybe the installed python3 or litellm version, or one of its dependencies, does not match the above?

Comment on lines 23 to 34
try:
    from typing import Dict, Any, List, Optional, Tuple, Pattern, Iterator
    import os
    import json
    import sys
    import time
    import re
    import inspect
    import litellm
except ImportError as e:
    LOG.warning(e)
    raise Exception("ChatWithTree requires litellm")
Copy link
Member


Putting a try/except around such large blocks is not a good practice, and can hide issues.

Copy link
Contributor Author


I tend to agree - most of the imports seem to be standard-library modules (os, json, sys, time), but I am not entirely sure about inspect and re.

As litellm is a requirement, just like the other imports, which alternative is best:

  • Put a try..except around each import
  • Only have the try..except around the litellm module (as that is what we want to test)
  • Or maybe - don't have the try..except at all

Copy link
Member


Everything but "litellm" here is included with python.

Copy link
Contributor Author


I have now put the try/except only around the litellm import.
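As a minimal sketch of the narrowed pattern (the `require` helper is illustrative, not the addon's actual code), the guard can be isolated so only the optional dependency is wrapped and standard-library imports stay unguarded:

```python
import importlib
import logging

LOG = logging.getLogger(__name__)


def require(module_name, hint):
    """Import an optional dependency, raising a clear error if it is missing."""
    try:
        return importlib.import_module(module_name)
    except ImportError as err:
        LOG.warning(err)
        raise ImportError(hint) from err


# Standard-library imports stay plain; only the optional dependency is guarded:
# litellm = require("litellm", "ChatWithTree requires litellm")
json_mod = require("json", "the standard library should always be present")
```

This way an ImportError from an unrelated module is not masked by the "requires litellm" message.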

def _llm_loop(self, seed: int) -> Iterator[Tuple[YieldType, str]]:
    # Tool-calling loop
    final_response = "I was unable to find the desired information."
    limit_loop = 6
Copy link
Member


Not sure that will be enough. Maybe make this a parameter, or config setting?

Copy link
Contributor Author


I found two different issues with a higher number

  1. The original number, 10, could cause the same questions to be asked to the LLM again, especially with very dumb LLMs (small local Ollama models are not that smart, although they do work with litellm). It wasted time without making any progress across the round trips.
  2. Rate limits of remote LLMs. With 10 quick round trips to a remote LLM you often hit a rate limit quickly, while if we limit it to about 5 calls and then let the user type a new question, we rarely hit the rate limit of the remote cloud LLMs.

However, limiting the original 10 back to a maximum of 6 seems good enough, especially with models that can request multiple local tool calls at once; that is, the number of tool calls on the local system is not limited, which is important.

As an example - here is a question where the chat had already retrieved plenty of people, and then I asked: can you show me all birth and death dates for all these family members?

This is the tool-call balloon in yellow:

(screenshot of the tool-call balloon)

So the 6 limits the number of round trips to the (remote) language model, but not the number of tool calls to the local database, which can be pretty large as seen above. Does that make sense?

Nevertheless, yes:

  • I do agree that making this "6" configurable is a good enhancement. I could probably best use the existing "/commands" pattern for that and add this configurable option to the /help page. Would love to do that in an upcoming MR.
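The round-trip versus tool-call distinction described above can be sketched roughly like this (a hedged illustration with made-up function names, not the addon's actual `_llm_loop`):

```python
def llm_loop(ask_llm, run_tool, question, limit_loop=6):
    """Bound the number of remote LLM round trips, but not the number of
    local tool calls executed within each round trip."""
    messages = [{"role": "user", "content": question}]
    for _ in range(limit_loop):  # at most limit_loop remote round trips
        reply = ask_llm(messages)
        tool_calls = reply.get("tool_calls", [])
        if not tool_calls:  # the model produced a final answer
            return reply["content"]
        for call in tool_calls:  # any number of local database lookups
            messages.append({"role": "tool", "content": run_tool(call)})
    return "I was unable to find the desired information."
```

Each iteration costs one remote request, but a single iteration may execute many local tool calls, which is why a small round-trip limit does not cap database work.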

Copy link
Contributor Author


/setlimit is now implemented
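A hypothetical sketch of how such a /setlimit command could adjust the round-trip limit (class and method names here are illustrative, not the addon's actual implementation):

```python
class ChatCommands:
    """Tiny dispatcher for slash commands typed into the chat input."""

    def __init__(self, default_limit=6):
        self.limit_loop = default_limit

    def handle(self, text):
        if text.startswith("/setlimit"):
            parts = text.split()
            if len(parts) == 2 and parts[1].isdigit() and int(parts[1]) > 0:
                self.limit_loop = int(parts[1])
                return "LLM round-trip limit set to %d" % self.limit_loop
            return "Usage: /setlimit <positive number>"
        return None  # not a recognized command; treat as a normal question
```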

Comment on lines +382 to +385
person_obj = self.db.get_person_from_handle(person_handle)
obj = self.sa.mother(person_obj)
data = dict(self.db.get_raw_person_data(obj.handle))
return data
Copy link
Member


The SimpleAccess is easy, but there are better ways to do what you want more directly. There may be more than one mother, for example.


if family_handle_list:
    family_id = family_handle_list[0]
    family = self.db.get_family_from_handle(family_id)
Copy link
Member


Using the raw functions will be much faster for such uses.

#search_pattern = re.compile(re.escape(search_string), re.IGNORECASE)
search_pattern = self.create_search_pattern(search_string)

for person_obj in self.sa.all_people():
Copy link
Member


This is really slow.

Copy link
Contributor Author


For personal use, asking for all people with the name "John" is quick enough; the performance concerns probably depend highly on the size of the database. For now I have only used the SimpleAccess database entry point.

Enhancing that to use other patterns, so that Gramps users with much more data in the database can also use this tool, is definitely on the to-do list.

What areas within Gramps should I look at to get a better understanding of how to use raw data access as opposed to SimpleAccess?
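For reference, the commented-out line in the snippet above amounts to a case-insensitive literal match; a minimal, self-contained sketch (assuming create_search_pattern simply wraps re.compile):

```python
import re


def create_search_pattern(search_string):
    """Compile a case-insensitive pattern for a literal search string."""
    return re.compile(re.escape(search_string), re.IGNORECASE)


pattern = create_search_pattern("john")
names = ["John Smith", "Johnson", "Mary"]
matches = [name for name in names if pattern.search(name)]
# matches contains "John Smith" and "Johnson"
```

The pattern compilation is cheap; the cost the reviewer points at is unwrapping every person object via SimpleAccess rather than iterating raw records.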

Copy link
Member


I'm going to work on a set of functions tonight/tomorrow.

Comment on lines 41 to 46
try:
    from ChatWithTreeBot import ChatBot
except ImportError as e:
    LOG.warning(e)
    raise ImportError("Failed to import ChatBot from chatbot module: " + str(e))

Copy link
Member


This try/except doesn't do much more than without it.

@MelleKoning
Copy link
Contributor Author

@PQYPLZXHGF

Looks indeed to be a missing library: ctypes.wintypes, a standard .py file that should ship with the Windows AIO installer.

The Windows GrampsAIO64-6.0.4 installer's Python environment is missing this standard-library file in its library.zip on your machine.

I would not recommend adding the Python file to that library.zip yourself, but it could be a workaround.

This goes beyond this particular addon. If there is an existing script that creates the Windows installer, we can inspect why this library is missing.
https://docs.python.org/3/library/ctypes.html

@GaryGriffin
Copy link
Member

@jralls

I installed litellm (and its 45 dependencies) to the ARM MacOS bundle of my 6.0.4 app so I could test this addon. When I ran the ChatWithTree addon, I got the error below. Can you provide any insight on this?

BTW, I had a similar issue [not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs] when I installed pygraphviz and tried to run the NetworkChart addon.

2025-09-05 10:22:34.309: WARNING: ChatWithTreeBot.py: line 33: dlopen(/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so, 0x0002): tried: '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs), '/System/Volumes/Preboot/Cryptexes/OS/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (no such file), '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs)
Traceback (most recent call last):
  File "/Users/gary/Library/Application Support/gramps/gramps60/plugins/ChatWithTree/ChatWithTreeBot.py", line 31, in <module>
    import litellm
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/litellm/__init__.py", line 19, in <module>
    from litellm.types.integrations.datadog_llm_obs import DatadogLLMObsInitParams
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/litellm/types/integrations/datadog_llm_obs.py", line 8, in <module>
    from litellm.types.integrations.custom_logger import StandardCustomLoggerInitParams
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/litellm/types/integrations/custom_logger.py", line 3, in <module>
    from pydantic import BaseModel
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic/__init__.py", line 5, in <module>
    from ._migration import getattr_migration
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic/_migration.py", line 4, in <module>
    from .version import version_short
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic/version.py", line 5, in <module>
    from pydantic_core import __version__ as pydantic_core_version
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/__init__.py", line 6, in <module>
    from ._pydantic_core import (
    ...<22 lines>...
    )
ImportError: dlopen(/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so, 0x0002): tried: '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs), '/System/Volumes/Preboot/Cryptexes/OS/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (no such file), '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/gramps/gen/plug/_manager.py", line 271, in load_plugin
    _module = self.import_plugin(pdata)
  File "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/gramps/gen/plug/_manager.py", line 305, in import_plugin
    module = __import__(pdata.mod_name)
  File "/Users/gary/Library/Application Support/gramps/gramps60/plugins/ChatWithTree/ChatWithTree.py", line 42, in <module>
    from ChatWithTreeBot import ChatBot
  File "/Users/gary/Library/Application Support/gramps/gramps60/plugins/ChatWithTree/ChatWithTreeBot.py", line 34, in <module>
    raise Exception("ChatWithTree requires litellm")
Exception: ChatWithTree requires litellm
2025-09-05 10:22:34.311: WARNING: grampletpane.py: line 192: Error loading gramplet 'ChatWithTree': skipping content

@jralls
Copy link
Member

jralls commented Sep 5, 2025

@GaryGriffin: It looks like it added libraries to the bundle that somebody else signed. I can disable that check with an entitlement. The description of that entitlement says that it turns on extra Gatekeeper checks so I'll have to make a bundle to test-notarize and then for you to try out.

@MelleKoning
Copy link
Contributor Author

MelleKoning commented Sep 5, 2025

I installed litellm (and its 45 dependencies) to the ARM MacOS bundle of my 6.0.4 app so I could test this addon. When I ran the ChatWithTree addon, I got the error below. Can you provide any insight on this?

...

pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in ... not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs),

Although I don't have a Mac, this indicates a trust issue: the library you added does not match the signature of the installed Gramps 6.0.4. When Gramps tries to load the pydantic_core extension module, it does not trust it.

Apparently you had a similar issue with another addon. I guess that is a good safety measure on a Mac: only *.so compiled libraries signed by the same application (Gramps) are allowed, rather than self-installed compiled binaries downloaded from the web.

Maybe it is possible to override this?

Removing a quarantine flag, ref:
https://stackoverflow.com/questions/4833052/how-do-i-remove-the-extended-attributes-on-a-file-in-mac-os-x

Maybe this helps:
xattr -cr "/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages"

This goes a bit beyond the MR at hand, but it is clear that the litellm package is not installed by default with the Gramps installation, and is also not easily installed from within the addon manager while Gramps is running; this has now been seen both on Mac and previously on Windows by @PQYPLZXHGF.

@jralls
Copy link
Member

jralls commented Sep 5, 2025

@GaryGriffin: https://sourceforge.net/projects/gramps/files/Testing/Gramps-Arm-6.0.4-NLV.dmg/download

NLV stands for no library verification but I gather from the docs that it really means that libraries must still be signed, just not necessarily by me. @Nick-Hall said something yesterday about maybe releasing 6.0.5 this weekend so if you can verify that this works I'll do the same for the release.

@GaryGriffin
Copy link
Member

@jralls
No luck. I installed the NLV app and tried to run it (before I loaded any of the litellm pip packages) and I got:

 gary$ ./Gramps 
 zsh: killed     ./Gramps

No core file created.

@jralls
Copy link
Member

jralls commented Sep 6, 2025

Rats. It seems there's an error in the Apple Docs and disable-library-validation requires a provisioning profile--but what capability is needed isn't documented so I've asked for help.

@MelleKoning
Copy link
Contributor Author

@Nick-Hall do you know where on Windows the site-packages are installed when a plugin/addon wants to install packages?

(tag @PQYPLZXHGF )

On a Windows 10 machine, I was manually able to add a file wintypes.py (source: https://sources.debian.org/src/python3.14/3.14.0~rc2-1/Lib/ctypes/wintypes.py) into the library.zip located at C:\Program files\GrampsAIO64 6.0.4\lib\library.zip, by placing that file in the relevant ctypes folder.

Within Gramps, the addon installer then goes a bit further until it spits out another error about pip not being able to install the addon.

Then we can still manually install litellm and its dependent packages from the command line in Windows (Run as administrator) within the Gramps folder:

python -m pip install litellm --no-build-isolation --target="C:\Program Files\GrampsAIO64-6.0.4\lib\python3.12\site-packages"

I used this manual command because I saw in the code that pip is used for Windows, but apparently that does not succeed, while from the command line it is at least possible to install all litellm dependencies into a subfolder. To be honest, I also tried to update the code at gui\plug\_windows.py to use this command, but that was not picked up by Gramps.

The above command installs litellm and its dependent packages into the Gramps \lib\python3.12\site-packages folder, but when running Gramps and looking at the prerequisites, litellm is still not seen by Gramps (showing the red cross) and shows as to-be-installed.

So question:

  • into what folder should dependencies like litellm be installed so that Gramps on Windows sees them? If not the created \lib\python3.12\site-packages, what folder should it be instead?

I hope that by manually installing the litellm dependency outside of Gramps, installing this addon will succeed, because litellm will already be available to the Python environment Gramps runs in on Windows.

Let me know if the above made any sense, or none at all - otherwise, how can we update the Gramps AIO installer to already include certain dependencies that addons might need, like litellm for this addon? And what about the missing wintypes.py?

@Nick-Hall
Copy link
Member

do you know where on windows the site-packages get installed when a plugin/addon wants to install site-packages?

We install python libraries for addons into the LIB_PATH directory which is called "lib" and located under the user plugins directory. We add this to the python path.
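A hedged sketch of that mechanism (the paths here are illustrative; the real LIB_PATH is computed inside Gramps, not hard-coded like this):

```python
import os
import sys

# Illustrative only: a per-user "lib" directory under the plugins folder
# is added to sys.path so addon dependencies installed there can be imported.
user_plugins = os.path.expanduser("~/.local/share/gramps/gramps60/plugins")
lib_path = os.path.join(user_plugins, "lib")

if lib_path not in sys.path:
    sys.path.insert(0, lib_path)
```

With the directory on sys.path, `pip install --target <lib_path> litellm` style installs become importable by addons without touching the bundled interpreter.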

Running pip to install packages that are not pure python within our MSYS2 environment doesn't work. This is a known limitation.

I'll create a Windows AIO that contains litellm for testing.

@jralls
Copy link
Member

jralls commented Sep 18, 2025

Rats. It seems there's an error in the Apple Docs and disable-library-validation requires a provisioning profile--but what capability is needed isn't documented so I've asked for help.

@GaryGriffin It turned out I'd made a typo in the entitlements file. I fixed that and made a 6.0.5-NLV for you to try.

@GaryGriffin
Copy link
Member

GaryGriffin commented Sep 21, 2025

@jralls The 6.0.5NLV sorta works. It allows me to load the gramplet after I manually add all the dependencies in the MacOS bundle. So the MacOS install problem seems resolved. Thanks.

BUT the ChatWithTree gramplet does nothing. I see the tab in the lower window area where I loaded the gramplet, but cannot interact with it. I added a print statement in the __init__() function of ChatWithTree.py and it printed, so I know it executed. But there is no drawing area.

Screenshot 2025-09-20 at 8 20 28 PM

Not sure how to get anything to display at this point.

@jralls
Copy link
Member

jralls commented Sep 21, 2025

@GaryGriffin Note that I also added ~/Library/Application Support/gramps/lib to DYLD_FALLBACK_LIBRARY_PATH so (I hope) you won't have to compromise the bundle.

@MelleKoning
Copy link
Contributor Author

@GaryGriffin that's an interesting screenshot! The ChatWithTree addon should be displayed on your Dashboard, the area where you can also place addons like "tasks" and "top names" etc. It should also be possible to expand it into its own popup window there.

@GaryGriffin
Copy link
Member

@jralls Thanks for the DYLD_FALLBACK_LIBRARY_PATH info. That is much better than compromising the bundle.

ChatWithTree -

  1. If I add the gramplet in People view, the title in the Add Gramplet rollover menu is 'Chat With Tree'. If I add to the Dashboard, the title in the Add Gramplet rollover menu is 'Chat With Tree Interactive Addon'
  2. When I add to Dashboard, I still get no drawing area
Screenshot 2025-09-21 at 9 02 07 AM

@MelleKoning
Copy link
Contributor Author

And what happens when you click the icon that makes the window pop out, and/or restart Gramps?

I'm asking because I set a parameter for the height the addon should use, and maybe that height parameter is not picked up. What you see in the dashboard looks familiar to me from right after installing.

@GaryGriffin
Copy link
Member

When I opened the Detached window, it worked but there was no database attached for it. When I entered /help it told me to restart or select a db. When I selected a different db, it worked as expected.

Should you emit the db_changed event if there is a connection at load?

When I restart gramps, the ChatWithTree attached window lists 3 messages, initialized, db_changed, db_changed. Not sure why I got the 2 db_changed messages.

@GaryGriffin
Member

help_url in the gpr would be very useful.

@GaryGriffin
Member

If this gramplet is restricted to the Dashboard, then you should add the following to the GPR file:

navtypes=["Dashboard"],
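For context, a minimal sketch of how that would look in the .gpr.py registration (field values other than navtypes are illustrative):

```python
register(
    GRAMPLET,
    id="ChatWithTree",
    name=_("Chat With Tree"),
    fname="ChatWithTree.py",
    gramplet="ChatWithTree",
    navtypes=["Dashboard"],  # only offer the gramplet on the Dashboard
    help_url="Addon:ChatWithTree",
    # remaining fields (version, gramps_target_version, ...) omitted
)
```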

@MelleKoning
Contributor Author

When I opened the Detached window, it worked but there was no database attached for it. When I entered /help it told me to restart or select a db. When I selected a different db, it worked as expected.

Should you emit the db_changed event if there is a connection at load?

When I restart gramps, the ChatWithTree attached window lists 3 messages, initialized, db_changed, db_changed. Not sure why I got the 2 db_changed messages.

I had the same experience, seeing 2 times the db_changed message.

I did ensure that the addon itself:

  • Attaches only 1 time to the db_changed event that it receives from gramps

So why it receives that event twice, I don't know; that might be something in Gramps itself. Of course we can remove the message, as it does not help the user that much.

  • I still need to create better documentation on the Gramps wiki. For now /help provides rudimentary steps, but indeed extending that on a linked Gramps wiki URL is better
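That attach-only-once guard can be sketched like this (FakeDb is a hypothetical stand-in for the Gramps database signal API; names are illustrative):

```python
class FakeDb:
    """Hypothetical stand-in for the Gramps database signal API."""
    def __init__(self):
        self.handlers = []

    def connect(self, signal_name, handler):
        self.handlers.append((signal_name, handler))


class ChatGramplet:
    """Sketch: connect the handler at most once, even if Gramps delivers
    several database-changed notifications at startup."""
    def __init__(self):
        self._connected = False

    def attach(self, db):
        if self._connected:
            return
        db.connect("database-changed", self.on_db_changed)
        self._connected = True

    def on_db_changed(self, *args):
        pass


db = FakeDb()
gramplet = ChatGramplet()
gramplet.attach(db)
gramplet.attach(db)  # second call is a no-op
```

Even if Gramps delivers the notification twice, only one handler ends up registered.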

Great that you got the addon working on a Mac 👍

Question: I was not aware that the addon would also appear outside of the Dashboard. Do you think it makes sense to have the addon available in other views, like the Persons view, or would it be better to make it Dashboard-only via that navtypes=["Dashboard"] flag?

@GaryGriffin
Member

Question: I was not aware that the addon would also appear outside of the Dashboard. Do you think it makes sense to have the addon available in other views, like the Persons view, or would it be better to make it Dashboard-only via that navtypes=["Dashboard"] flag?

I think it should be available in any view that makes sense to a user. I doubt that I will be a normal user of this gramplet, so my opinion is not particularly relevant.

@MelleKoning
Contributor Author

MelleKoning commented Sep 22, 2025

@GaryGriffin Given the space a chat needs, the Dashboard seems the right place.

A few updates

  • Added "Dashboard" to the .gpr navtypes
  • Added a /setlimit command. If the LLM model supports it, up to 20 tool calls can be made in the LLM interaction for a single user sentence instead of the default 6
  • Notes are now also retrieved by existing tools: get_person and the two "family" tools
  • The try...catch for the LLM is now limited to just the import of the Python litellm module
  • Whenever a new remote model is selected via /setmodel, the chat history is reset
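Limiting the try...catch to the import alone might look like this (a minimal sketch; the message text is illustrative):

```python
# Guard only the optional dependency's import, so the gramplet still
# loads and can show a helpful message when litellm is missing.
try:
    import litellm  # noqa: F401
    HAVE_LITELLM = True
except ImportError:
    HAVE_LITELLM = False


def missing_dependency_message():
    """Return an install hint, or None when litellm is available."""
    if HAVE_LITELLM:
        return None
    return ("ChatWithTree needs the 'litellm' Python module. "
            "Install it with: pip install litellm")
```

This also addresses the Windows report above: the gramplet can register and then display the hint instead of failing at import time.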

@MelleKoning
Contributor Author

Added

  • AsyncChatService.py: decouples the LLM communication from the main thread, unblocking Gramps for UI updates
  • ChatWithTree: uses AsyncChatService to send messages on an async queue and read from that queue; on database change it signals that the async worker thread should open its own database for the async session
  • Added an initial README.md
  • Added a pre-commit linter to adhere to certain coding styles
  • Added the option to show content_reasoning messages if the model supports it; as an example, try the following model: /setmodel openrouter/z-ai/glm-4.5-air:free - you will need an OpenRouter API key to use that though.
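The queue-and-worker pattern described above can be sketched as follows (ask_llm is a hypothetical stand-in for the real litellm call; the actual AsyncChatService also handles opening its own database on the worker thread, which is omitted here):

```python
import queue
import threading


class AsyncChatService:
    """Minimal sketch of the worker-thread pattern: LLM calls run off
    the main thread so the Gramps UI stays responsive."""

    def __init__(self, ask_llm):
        self._ask_llm = ask_llm
        self._requests = queue.Queue()
        self._responses = queue.Queue()
        worker = threading.Thread(target=self._run, daemon=True)
        worker.start()

    def send(self, message):
        """Called from the UI thread; returns immediately."""
        self._requests.put(message)

    def poll_response(self, timeout=None):
        """Called from the UI thread (e.g. on a periodic timeout)
        to fetch finished answers."""
        return self._responses.get(timeout=timeout)

    def stop(self):
        self._requests.put(None)  # sentinel: end the worker loop

    def _run(self):
        while True:
            message = self._requests.get()
            if message is None:
                break
            self._responses.put(self._ask_llm(message))


service = AsyncChatService(lambda message: "echo: " + message)
service.send("hello tree")
print(service.poll_response(timeout=5))  # prints "echo: hello tree"
service.stop()
```

The UI thread never blocks on the LLM; it only enqueues requests and polls for completed responses.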

Have fun, and I'm curious about your findings!

@GaryGriffin
Member

Is this now stable for publishing or are you still tweaking? The bookkeeping changes needed are:

  1. Add help_url
  2. Add copyright to 2 files (AsyncChatService.py and litellm_utils.py)
  3. Do you need the yaml files? They do not (by default) go into the user install bundle.
  4. README.md does not go into the user install bundle. If this is desired, you need to use a MANIFEST (with ChatWithTree/README.md in it)

@Nick-Hall @dsblank : do you approve this for publishing? I have tested on Mac only. Probably needs a more experienced developer than me to do the code review.

@MelleKoning
Contributor Author

Small updates:

  • When a non-existing model was set via "/setmodel model-typo", an exception was thrown that was not caught by the code and bubbled up into Gramps; it is now caught by the addon itself and displayed in the chat
  • added the license header to the "AsyncChatService.py" file
  • added help_url="Addon:ChatWithTree"
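That /setmodel error handling can be sketched like this (probe is a hypothetical one-shot test call to the provider; real code would catch the provider's exceptions, e.g. from litellm, the same way):

```python
def set_model(model_name, probe):
    """Sketch of a /setmodel handler: try the new model once and turn
    any provider error into a chat message instead of letting it
    bubble up into Gramps."""
    try:
        probe(model_name)
    except Exception as exc:
        return f"Could not switch to '{model_name}': {exc}"
    return f"Model set to '{model_name}'"


def probe(model_name):
    """Illustrative probe that only knows one model."""
    if model_name != "known-model":
        raise ValueError("unknown model: " + model_name)


print(set_model("known-model", probe))  # Model set to 'known-model'
print(set_model("model-typo", probe))   # Could not switch to 'model-typo': ...
```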

Have requested an account for the gramps-project.org wiki so that I can later start adding a page there.

@GaryGriffin
Member

Looked for Addon:ChatWithTree but don't see it yet. Suggest that you include a privacy section in the documentation to address:

  1. Does the LLM 'learn' from the tree info? That is, do the tree details become part of the training data?
  2. What tree info goes to the LLM?
  3. Are items tagged with Privacy sent to the LLM?
  4. Are living people data sent to the LLM?

@MelleKoning
Contributor Author

MelleKoning commented Oct 18, 2025

Looked for Addon:ChatWithTree but don't see it yet. Suggest that you include a privacy section in the documentation to address.

I requested access to the gramps-project wiki and confirmed my email address from the confirmation email, which gave me a message last week that my account is on the waiting list. So far nothing has happened; so yes, I would like to start, but I don't have access and can't log in to the wiki yet. I just tried to fill in the form again with the same username, but indeed that username is already taken (by myself).

1. Does the LLM 'learn' from the tree info? That is, do the tree details become part of the training data?
2. What tree info goes to the LLM?
3. Are items tagged with Privacy sent to the LLM?
4. Are living people data sent to the LLM?

Hm, good questions:

  1. The configured LLM, whether running on the local machine or remotely, is static; the tree details do not become part of its training data.
  2. Everything that the tools read from the tree, driven by the LLM's interaction with those tools, goes to the LLM as data to produce the final answer it returns
  3. There is no logic in the created LLM tools that checks for Gramps privacy flags, so yes
  4. Also yes

Regarding 3 and 4, it would be a nice enhancement not to send that kind of data to the LLM, by ensuring the tools that read the database do not return that information. We could think of a "command" to enable or disable that...
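Such a pre-filter on tool results could be sketched like this (the record shape and flags are illustrative only; real code would inspect the Gramps object's privacy flag and a probably-alive check instead):

```python
def visible_to_llm(record, share_private=False, share_living=False):
    """Sketch: decide whether a tool result may be sent to the LLM.

    `record` is an illustrative dict; in the addon this would inspect
    the Gramps privacy flag and a living-person check.
    """
    if record.get("private") and not share_private:
        return False
    if record.get("probably_alive") and not share_living:
        return False
    return True


# A chat command like /privacy on|off could simply flip the two flags
# that every tool passes before returning data.
```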

All of the above should be part of the documentation (and or interaction within the chat) indeed.

@GaryGriffin
Member

I requested access to the gramps-project wiki and confirmed my email address from the confirmation email, which gave me a message last week that my account is on the waiting list. So far nothing has happened; so yes, I would like to start, but I don't have access and can't log in to the wiki yet. I just tried to fill in the form again with the same username, but indeed that username is already taken (by myself).
@emyoulation Brian, who approves new users to wiki? This author needs to add help file for new plugin but has not been granted access yet.

@emyoulation
Contributor

Sam does the approvals. That there's a waiting list is strange. There wasn't one a decade ago... approval was immediate. But we've had a series of DoS attacks. So the process obviously must have changed.

@emyoulation
Contributor

emyoulation commented Oct 28, 2025

Just looked. Gioto and Patsy are also admins. And it looks like the Hace Gramps-project.org (MediaWiki) account was created on 11 October.

Posted an inquiry in the Discourse private messaging.

@sam-m888
Member

sam-m888 commented Oct 28, 2025

Hi @MelleKoning ,

The account was created, as you can see, on the 11th of October; you only need to log in with the password you initially created after being approved.

If you have forgotten the password you only need to reset it using the following link:

https://gramps-project.org/wiki/index.php/Special:PasswordReset

The username is:

Hace

and you need to supply the email address you registered with.

If there is any issue, please contact me directly via email at [email protected].

Resolved in private message.

@MelleKoning
Contributor Author

Hi @sam-m888 thank you!

I've set up initial documentation: https://gramps-project.org/wiki/index.php/Addon:ChatWithTree

@GaryGriffin GaryGriffin merged commit f3009a7 into gramps-project:maintenance/gramps60 Nov 2, 2025
GaryGriffin added a commit that referenced this pull request Nov 2, 2025
@GaryGriffin
Member

I've set up initial documentation: https://gramps-project.org/wiki/index.php/Addon:ChatWithTree

The gramplet is available through the standard Gramps project and does not need the MelleKoning project. In fact, version numbering may get confusing if you use multiple projects. Suggest you remove the top of the Installation section of the documentation, or change it to use the Gramps project with All Audiences and All Status.
