- use chat_autogen_based.ipynb to run a chat between two LLMs and save the chat result into a txt file
- use chat_evaluation.ipynb to analyze the chat history
- TO BE DONE: use chat_langchain_based to analyze the chat history
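The "save chat result into txt" step above can be sketched as plain Python. This is a minimal illustration, not the notebook's actual code: the helper name, file name, and the assumption that the chat history is a list of role/content dicts (the message structure AutoGen produces) are all hypothetical.

```python
# Hypothetical helper: write a chat history (list of {"role", "content"} dicts)
# to a plain-text file, one "role: content" line per message.
def save_chat_to_txt(messages, path="chat_history.txt"):
    with open(path, "w", encoding="utf-8") as f:
        for msg in messages:
            f.write(f"{msg['role']}: {msg['content']}\n")

# Example chat history in the assumed format.
demo_history = [
    {"role": "assistant_a", "content": "Hello, let's discuss the task."},
    {"role": "assistant_b", "content": "Sure, here is my proposal."},
]
save_chat_to_txt(demo_history)
```

The resulting txt file can then be loaded back line by line by the evaluation notebook.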
- chat_autogen_based.ipynb works in the pyautogen env; to install pyautogen, see https://microsoft.github.io/autogen/docs/installation/
- chat_evaluation.ipynb works in the evaluation env; to create it:
  conda create -n evaluation python=3.9 -y
  pip install langchain
- install FastChat in the fschat env: https://github.com/lm-sys/FastChat
- run FastChat as a local server for LangChain: https://github.com/lm-sys/FastChat/blob/main/docs/langchain_integration.md
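The linked guide boils down to launching three FastChat processes. The commands below are a sketch of that setup, not copied from this repo: the model path (lmsys/vicuna-7b-v1.5), the alias passed to --model-names, and the port are example values, so adjust them to your own setup.

```shell
# In the fschat env, run each process in its own terminal.
# 1) Start the controller that coordinates model workers.
python3 -m fastchat.serve.controller
# 2) Serve a model; --model-names registers it under an OpenAI-style
#    model name so LangChain's OpenAI client can address it.
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5 --model-names gpt-3.5-turbo
# 3) Expose an OpenAI-compatible REST API on localhost:8000.
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
```

With the server up, point LangChain's OpenAI integration at http://localhost:8000/v1 instead of the real OpenAI endpoint.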