Update project for enhanced LLM API integration with streaming support #1

Draft
wants to merge 1 commit into base: main
Conversation

@araray (Owner) commented Mar 14, 2024

- Update `.gitignore` to include `.clj-kondo/` and `.lsp/` for a better developer experience.
- Increment the project version to 0.2.0-RC1 in `project.clj`, indicating significant updates.
- Introduce a `core.async` dependency for asynchronous communication with AI APIs.
- Refactor the Anthropic and OpenAI chat API implementations to support streaming responses via `core.async` channels.
- Extend the `ChatAPI` protocol with a `set-streaming!` method to toggle streaming behavior dynamically.
- Modify the `api.core` namespace to define a constant for the stream-end signal.
- Implement streaming logic in the `utils` namespace, with API request handling that supports both immediate and streamed responses.
- Adjust the main application logic to handle streamed chat responses, improving interaction flow.
- Adapt the API request utility function to manage concurrent requests and handle streaming responses properly.
- Ensure error handling and a retry mechanism are in place for API requests, enhancing robustness.
- Miscellaneous code cleanups and improvements for readability and efficiency.
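Concretely, the protocol extension and channel-based streaming described above could be sketched as follows. This is not the actual diff: only `ChatAPI`, `set-streaming!`, the `core.async` channel usage, and the existence of a stream-end constant come from this PR; every other name (the record, its `opts` field, the stub fetch functions, the sentinel's name) is a hypothetical placeholder.

```clojure
(ns example.api.core
  (:require [clojure.core.async :as async]))

;; Sentinel marking the end of a streamed response; the PR adds such a
;; constant to api.core, but this particular name is an assumption.
(def stream-end ::stream-end)

(defprotocol ChatAPI
  (send-message [this messages]
    "Send a chat request; returns the full response, or a channel when streaming.")
  (set-streaming! [this enabled?]
    "Toggle streaming behavior dynamically."))

;; Hypothetical stand-ins for the real HTTP calls to the provider.
(defn- fetch-chunks [_messages] ["Hel" "lo"])
(defn- fetch-response [_messages] "Hello")

;; Sketch of one implementation; mutable state lives in an atom of options.
(defrecord AnthropicChat [opts]
  ChatAPI
  (set-streaming! [_ enabled?]
    (swap! opts assoc :stream? enabled?))
  (send-message [_ messages]
    (if (:stream? @opts)
      ;; Streaming: hand back a channel onto which chunks are put as they
      ;; arrive, terminated by the stream-end sentinel.
      (let [out (async/chan 16)]
        (async/go
          (doseq [chunk (fetch-chunks messages)]
            (async/>! out chunk))
          (async/>! out stream-end)
          (async/close! out))
        out)
      ;; Non-streaming: return the complete response directly.
      (fetch-response messages))))
```

A caller would toggle behavior at runtime with `(set-streaming! api true)` and then treat the return value of `send-message` as a channel rather than a finished string.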

This update brings significant enhancements to the LLM API integration, introducing streaming support to handle real-time chat interactions more effectively. The changes include both structural adjustments and feature enhancements to accommodate streaming data from AI APIs, improving the overall responsiveness and user experience of the chatbot.
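On the consuming side, the adjusted main application logic amounts to draining the response channel until the end-of-stream signal arrives. A minimal sketch, assuming the same sentinel as in `api.core` (the function name and sentinel name are placeholders; only the channel-draining pattern comes from the PR description):

```clojure
(ns example.main
  (:require [clojure.core.async :as async]))

;; Same end-of-stream sentinel as defined in api.core (name assumed).
(def stream-end ::stream-end)

;; Drain a streamed chat response: print each chunk as it arrives,
;; stopping at the sentinel (or when the channel closes), and return
;; the accumulated full text.
(defn print-streamed-response [ch]
  (loop [acc []]
    (let [chunk (async/<!! ch)]
      (if (or (nil? chunk) (= chunk stream-end))
        (apply str acc)
        (do (print chunk)
            (flush)
            (recur (conj acc chunk)))))))
```

For example, feeding the loop a pre-filled channel:

```clojure
(let [ch (async/chan 8)]
  (async/put! ch "Hi ")
  (async/put! ch "there")
  (async/put! ch stream-end)
  (print-streamed-response ch))
;; => "Hi there"
```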
