ChatWithTree gramplet addition #762
base: maintenance/gramps60
Conversation
Looking forward to testing, thank you for this addon 👍 @Nick-Hall

Attempted to test on Windows with the GrampsAIO (All In One), but the GrampsAIO install looks to be missing the required files (see error message below):

ModuleNotFoundError: No module named 'ctypes.wintypes'

FYI, on Windows I had to select the option below first before getting the error. Can you make the gramplet more graceful, so that it still displays, with a message explaining that the Python module litellm needs to be installed? (Screenshots omitted.)
Attempted to test on macOS, which is also a bundled Gramps distro. The litellm dependency has about 45 other dependencies. Even after installing all of these into the Gramps bundle, I still had issues with the pydantic_core wheel. Will attempt again when Linux testing is successful (by others). Open issues:
Force-pushed 245242e to 7cfc554
Thanks for the swift and extensive comments and the test on Windows. I have adjusted the commit by:

For the required LLM module: "litellm" is part of the ChatWithTree.gpr.py registration file; in my testing the module is installed and working. Maybe litellm depends on other modules that have to be accessible to Python? Locally, I installed and ran the plugin with Gramps 6.0.4, the development version, under Python 3.12.3 (python3 --version). As I work in a Python virtual environment, this gives me the opportunity to generate a requirements.txt. These are the modules that can be seen in that requirements.txt file:

This does not show as much as I thought, so I used another method, with which I now get a huge list that also includes litellm. This requirements.txt is in the main addons-source folder, so I think it includes ALL packages needed by all the addons and Gramps:

There is also a requirements.txt generated in the /ChatWithTree subfolder only:

I do not understand why the litellm version here differs from the one above (1.76.1 vs 1.75.7), but it has far fewer dependencies. About Windows: maybe the installed python3 or litellm version, or one of its dependencies, does not match the above?
```python
try:
    from typing import Dict, Any, List, Optional, Tuple, Pattern, Iterator
    import os
    import json
    import sys
    import time
    import re
    import inspect
    import litellm
except ImportError as e:
    LOG.warning(e)
    raise Exception("ChatWithTree requires litellm")
```
Putting a try/except around such large blocks is not a good practice, and can hide issues.
I tend to agree. Most of the imports seem to be standard modules (os, json, sys, time), but I am not entirely sure about inspect and re.
As litellm is a requirement, just like the other imports, which is the best alternative:
- Put a try..except around each import
- Only have the try..except around the litellm module (as that is what we want to test)
- Or maybe - don't have the try..except at all
Everything but "litellm" here is included with python.
Have put the try/except now only right above the litellm import.
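As a sketch of that narrower guard, assuming the standard-library imports stay unguarded and only the third-party dependency is probed (`load_litellm`, the logger name, and the message text are illustrative assumptions, not the addon's actual code):

```python
import importlib.util
import logging

LOG = logging.getLogger("ChatWithTree")

# Standard-library modules need no guard; import them unconditionally.
import os
import json
import re

def load_litellm():
    """Return the litellm module, or None if it is not installed."""
    if importlib.util.find_spec("litellm") is None:
        LOG.warning("ChatWithTree requires the 'litellm' package "
                    "(try: pip install litellm)")
        return None
    import litellm
    return litellm
```

Returning None instead of raising would also let the gramplet still draw itself with a "please install litellm" message, as requested in the Windows test report above.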
ChatWithTree/ChatWithTreeBot.py
```python
def _llm_loop(self, seed: int) -> Iterator[Tuple[YieldType, str]]:
    # Tool-calling loop
    final_response = "I was unable to find the desired information."
    limit_loop = 6
```
Not sure that will be enough. Maybe make this a parameter, or config setting?
I found two different issues with a higher number:
- The original number, 10, could cause the same questions to be asked of the LLM again, especially with very dumb LLMs (small local Ollama models are not that smart, although they do work with litellm). It was wasting time without making any progress across the round trips.
- Rate limits of remote LLMs. When doing 10 quick round trips to a remote LLM you often hit a rate limit quickly, while limiting it to about 5 calls, and then letting the user type a new question, rarely hits the rate limit of the remote cloud LLMs.

However: reducing the original 10 to a maximum of 6 seems good enough, especially with models that can be instructed to do multiple local tool calls at once; that is, the number of tool calls on the local system is not limited, which is important.
As an example: in a chat where plenty of people had already been retrieved, I asked "can you show me all birth and death dates for all these family members?"
(Screenshot: the yellow tool-call balloon, showing the batched local tool calls.)
So the 6 limits the number of round trips to the (remote) language model, but not the number of tool calls to the local database, which can be pretty large, as the screenshot showed. Does that make sense?
Nevertheless, yes:
- I do agree that making this "6" configurable is a good enhancement. I could probably best use the existing "/commands" pattern for that and add this configurable option to the /help page. Would love to do that in an upcoming MR.
/setlimit is now implemented
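A self-contained sketch of how such a /setlimit chat command could adjust the round-trip cap. The command name comes from the comment above; the parsing logic and the config dict are assumptions for illustration, not the actual addon code:

```python
def handle_command(text, config):
    """Parse a chat '/command'; return a reply string, or None for normal chat."""
    if not text.startswith("/"):
        return None
    parts = text.split()
    if parts[0] == "/setlimit" and len(parts) == 2 and parts[1].isdigit():
        # Clamp to at least 1 so the tool-calling loop always runs once.
        config["limit_loop"] = max(1, int(parts[1]))
        return "LLM round-trip limit set to %d" % config["limit_loop"]
    if parts[0] == "/help":
        return "Commands: /setlimit <n>, /help"
    return "Unknown command, type /help"

config = {"limit_loop": 6}  # the current default from _llm_loop
print(handle_command("/setlimit 10", config))  # LLM round-trip limit set to 10
```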
```python
person_obj = self.db.get_person_from_handle(person_handle)
obj = self.sa.mother(person_obj)
data = dict(self.db.get_raw_person_data(obj.handle))
return data
```
The SimpleAccess is easy, but there are better ways to do what you want more directly. There may be more than one mother, for example.
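To illustrate the reviewer's point, a self-contained sketch of collecting every mother across all parent families instead of only the first. The plain stand-in objects below are not the real Gramps API; they only model the shape of the data:

```python
from types import SimpleNamespace

def get_mothers(db, person_handle):
    """Collect the mother handle from every parent family a person has.

    A person can belong to more than one parent family (e.g. birth and
    adoptive), so returning a single sa.mother() silently drops the rest.
    """
    person = db.persons[person_handle]
    mothers = []
    for fam_handle in person.parent_family_list:
        family = db.families[fam_handle]
        if family.mother_handle is not None:
            mothers.append(family.mother_handle)
    return mothers

# Toy data: one person with a birth family and an adoptive family.
db = SimpleNamespace(
    persons={"P1": SimpleNamespace(parent_family_list=["F1", "F2"])},
    families={
        "F1": SimpleNamespace(mother_handle="M1"),
        "F2": SimpleNamespace(mother_handle="M2"),
    },
)
print(get_mothers(db, "P1"))  # ['M1', 'M2']
```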
```python
if family_handle_list:
    family_id = family_handle_list[0]
    family = self.db.get_family_from_handle(family_id)
```
Using the raw functions will be much faster for such uses.
```python
#search_pattern = re.compile(re.escape(search_string), re.IGNORECASE)
search_pattern = self.create_search_pattern(search_string)

for person_obj in self.sa.all_people():
```
This is really slow.
For personal use, asking for all people with the name "John", it's quick enough. The performance concerns probably depend heavily on the size of the database? For now I have only used the SimpleAccess database entry point.

Enhancing that to use other patterns, so that Gramps users with much larger databases can also use this tool, is definitely on the to-do list.

What areas within Gramps should I look at to get a better understanding of how to use raw data access as opposed to SimpleAccess?
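As a rough illustration of the pattern being discussed (precompile the pattern once, then scan a flat iterable of names rather than constructing full person objects per match), here is a self-contained sketch; the (handle, name) pairs are a stand-in for whatever the raw-access functions would return:

```python
import re

def create_search_pattern(search_string):
    """Case-insensitive literal match, as in the snippet above."""
    return re.compile(re.escape(search_string), re.IGNORECASE)

def find_people(people, search_string):
    """Scan (handle, display_name) pairs with one precompiled pattern."""
    pattern = create_search_pattern(search_string)
    return [handle for handle, name in people if pattern.search(name)]

people = [("H1", "John Smith"), ("H2", "Mary Jones"), ("H3", "john doe")]
print(find_people(people, "john"))  # ['H1', 'H3']
```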
I'm going to work on a set of functions tonight/tomorrow.
```python
try:
    from ChatWithTreeBot import ChatBot
except ImportError as e:
    LOG.warning(e)
    raise ImportError("Failed to import ChatBot from chatbot module: " + str(e))
```
This try/except doesn't do much more than without it.
It does indeed look to be a missing library: ctypes.wintypes, a standard .py file, is missing from the Windows GrampsAIO64-6.0.4 installer's Python environment (its library.zip) on your machine. I would not recommend adding the Python file to that library.zip yourself, though it could serve as a workaround. This goes beyond this particular addon. If there is an existing script that creates the Windows installer, we can inspect why this library is missing.
I installed litellm (and its 45 dependencies) into the ARM macOS bundle of my 6.0.4 app so I could test this addon. When I ran the ChatWithTree addon, I got the error below. Can you provide any insights on this? BTW, I had a similar issue [not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs] when I installed pygraphviz and tried to run the NetworkChart addon.

```
2025-09-05 10:22:34.309: WARNING: ChatWithTreeBot.py: line 33: dlopen(/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so, 0x0002): tried: '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs), '/System/Volumes/Preboot/Cryptexes/OS/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (no such file), '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' (code signature in <2A10A0C5-9ABF-35D2-B314-198B0E3F4A16> '/Volumes/Storage/Applications/Gramps6.0.4.app/Contents/Resources/lib/python3.13/site-packages/pydantic_core/_pydantic_core.cpython-313-darwin.so' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
```
@GaryGriffin: It looks like it added libraries to the bundle that somebody else signed. I can disable that check with an entitlement. The description of that entitlement says that it turns on extra Gatekeeper checks, so I'll have to make a bundle to test-notarize and then for you to try out.
Although I don't have a Mac, this indicates a trust issue between the library you added to Gramps and the signature of the installed Gramps 6.0.4. When Gramps tries to load this module, pydantic_core.cpython, it does not trust it. Apparently you had a similar issue with another addon. I guess that is a good safety measure of a Mac: only allow *.so compiled libraries installed by the same application (Gramps), instead of allowing access to self-installed compiled binaries downloaded from the web. Maybe it is possible to override this? Removing a quarantine flag, ref:

Maybe this helps:

This goes a bit beyond the MR at hand, but it is clear that the litellm package is not installed by default with the Gramps installation, and also not easily installed from within the addon manager while Gramps is running; this has now been seen both on Mac and previously on Windows by @PQYPLZXHGF
@GaryGriffin: https://sourceforge.net/projects/gramps/files/Testing/Gramps-Arm-6.0.4-NLV.dmg/download

NLV stands for "no library verification", but I gather from the docs that it really means that libraries must still be signed, just not necessarily by me. @Nick-Hall said something yesterday about maybe releasing 6.0.5 this weekend, so if you can verify that this works I'll do the same for the release.
@jralls
No core file created.
Rats. It seems there's an error in the Apple docs: disable-library-validation requires a provisioning profile, but what capability is needed isn't documented, so I've asked for help.
@Nick-Hall do you know where on Windows the site-packages get installed when a plugin/addon wants to install site-packages? (tag @PQYPLZXHGF)

On a Windows 10 machine, I was manually able to add a file
Within Gramps, the addon installer then goes a bit further until it spits out another error about pip not being able to install the addon. Then, manually, we can still install the
I used this manual command because I saw in the code that
The above command installs litellm and dependent packages into the
So, the question:
I hope that by manually installing the litellm dependency outside of Gramps, installation of this addon will succeed, because the dependency
Let me know if the above made any sense, or none at all. Otherwise, how can we update the Gramps AIO installer to already include certain dependencies that addons might need, like this
We install Python libraries for addons into the LIB_PATH directory, which is called "lib" and located under the user plugins directory. We add this to the Python path. Running
I'll create a Windows AIO that contains
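Based on the description above (a "lib" directory under the user plugins directory that gets added to the Python path), the mechanism presumably boils down to something like this; the exact path is an assumption here and varies per platform:

```python
import os
import sys

# Assumed example location; on Windows this lives under the user's
# Gramps profile directory instead.
USER_PLUGINS = os.path.expanduser("~/.local/share/gramps/gramps60/plugins")
LIB_PATH = os.path.join(USER_PLUGINS, "lib")

# Prepend so packages installed for addons take precedence over
# any stale copies elsewhere on the path.
if LIB_PATH not in sys.path:
    sys.path.insert(0, LIB_PATH)
```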
@GaryGriffin It turned out I'd made a typo in the entitlements file. I fixed that and made a 6.0.5-NLV for you to try.
@jralls The 6.0.5-NLV sorta works. It allows me to load the gramplet after I manually add all the dependencies to the macOS bundle, so the macOS install problem seems resolved. Thanks. BUT the ChatWithTree gramplet does nothing. I see the tab in the lower window area where I loaded the gramplet, but cannot interact with it. I added a print statement in the init() function of ChatWithTree.py and it printed, so I know it executed. But there is no drawing area. (Screenshot omitted.) Not sure how to get anything to display at this point.
@GaryGriffin Note that I also added ~/Library/Application Support/gramps/lib to DYLD_FALLBACK_LIBRARY_PATH so (I hope) you won't have to compromise the bundle.
@GaryGriffin that's an interesting screenshot! The ChatWithTree addon should be displayed on your Dashboard, the area where you can also place addons like "tasks" and "top names". It should also be possible to expand it into its own popup window there.
@jralls Thanks for the DYLD_FALLBACK_LIBRARY_PATH info. That is much better than compromising the bundle. ChatWithTree -
And when you click the icon that makes the window pop out, and/or restart Gramps? Asking because I had set a parameter for the height the addon should use, and maybe that height parameter is not picked up. What you see in the dashboard looks familiar to me from just after installing.
When I opened the detached window, it worked, but no database was attached to it. When I entered /help, it told me to restart or select a db. When I selected a different db, it worked as expected. Should you emit the db_changed event if there is a connection at load? When I restart Gramps, the ChatWithTree attached window lists 3 messages: initialized, db_changed, db_changed. Not sure why I got the 2 db_changed messages.
A help_url in the gpr would be very useful.
If this gramplet is restricted to the Dashboard, then you should add the following to the GPR file:
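The snippet itself did not survive the page extraction. As a sketch, the usual way to restrict a gramplet to particular views in a `.gpr.py` registration file is the `navtypes` option; the surrounding fields below are placeholders, not the addon's real registration:

```python
# In ChatWithTree.gpr.py (sketch; most registration fields abbreviated):
register(
    GRAMPLET,
    id="ChatWithTree",
    name=_("ChatWithTree"),
    navtypes=["Dashboard"],  # only offer this gramplet on the Dashboard
    # ... remaining registration fields unchanged ...
)
```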
I had the same experience, seeing the db_changed message twice. I did ensure that the addon itself:

So why it receives that event twice, I don't know; that might be something related to Gramps itself. Of course we can remove the message, as it does not help the user much.

Great that you got the addon working on a Mac 👍 Question: I was not aware that the addon would also appear outside of the Dashboard. Do you think it makes sense to have the addon available in other views, like the Persons view, or would it be better to make it Dashboard-only via that
I think it should be available in any view that makes sense to a user. I doubt that I will be a normal user of this gramplet, so my opinion is not particularly relevant.
@GaryGriffin Given the space a chat needs, the Dashboard seems the right place. A few updates:
ChatWithTree plugin addition.
As this is my first MR for a Gramps plugin, please advise.
Will add a topic in the gramps development forum with the features:
https://gramps.discourse.group/t/chatwithtree-plugin-gramplet/8251