HTTP endpoints so OLM client can be in state for outsourcing GUI development #40
curious-debug started this conversation in Ideas
If the HTTP service exposed at least the endpoints below, the olm client would be in a robust enough state that development of an OLM GUI could be outsourced.
To help this project, the olm client really needs a GUI, and I think these changes would make that possible. (At a minimum, they would let me build a Windows system-tray app to manage the client once it is installed. Others could do the same, and you can pick the best.)
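To make the idea concrete, here is a minimal sketch of the kind of endpoint a tray app or GUI could poll. Everything here is an assumption for illustration: the `/status` path, the JSON shape, and the stub server itself are not part of olm today; only the port (9452) comes from the `--http-addr` default discussed below.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StubHandler(BaseHTTPRequestHandler):
    """Stand-in for the olm HTTP service; /status and its fields are hypothetical."""

    def do_GET(self):
        if self.path == "/status":
            # Example payload a GUI could render (connection state, peer count).
            body = json.dumps({"connected": True, "peers": 2}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass

# 9452 matches the --http-addr default mentioned in this thread.
server = HTTPServer(("127.0.0.1", 9452), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# What the tray app's polling loop would do: fetch and parse the status.
with urlopen("http://127.0.0.1:9452/status") as resp:
    status = json.loads(resp.read())
print(status["connected"])  # → True

server.shutdown()
```

A GUI only needs stable, documented JSON responses like this to be developed independently of the client itself.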
BTW: the curl examples in the readme -- https://github.com/fosrl/olm -- use port 8080, but the --http-addr flag says the default port is 9452. Please update the examples to remove the confusion.