[Help Wanted] Supporting the `chat_inner` API for existing VLMs #323
Labels
Feature Request
Extra attention is needed
Since we now support the multi-turn benchmark MMDU, we would like to implement the `chat_inner` function for existing VLMs in VLMEvalKit to add support for multi-turn chatting. Currently, we have already supported the method for:

- GPT series
- Claude series
- QwenVL APIs
- qwen_chat
- MiniCPM-v2.5
- Idefics2_8b
- llava-v1.5
- deepseek-vl series

We need help from the community to support it for more models. If you would like to help, you can refer to the development doc: https://github.com/open-compass/VLMEvalKit/blob/main/docs/en/Development.md.
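To illustrate the general shape of a multi-turn implementation, here is a minimal, hypothetical sketch in plain Python. It assumes a message format commonly used for multi-turn VLM chat (a list of dicts with `role` and a `content` list of typed items); the helper names `build_prompt`, `chat_inner`, and `generate_fn` are illustrative and not part of the actual VLMEvalKit API — consult the development doc above for the real interface.

```python
# Hypothetical sketch of a chat_inner-style implementation. The message
# structure assumed here (role + typed content items) is an illustration,
# not the exact VLMEvalKit format.

def build_prompt(messages):
    """Flatten a multi-turn message list into a single text prompt.

    Each message is {'role': 'user' | 'assistant',
                     'content': [{'type': 'text' | 'image', 'value': ...}]}.
    Images become a placeholder token, so model-specific code can later
    substitute pixel inputs at those positions.
    """
    parts = []
    for msg in messages:
        segs = []
        for item in msg['content']:
            if item['type'] == 'text':
                segs.append(item['value'])
            elif item['type'] == 'image':
                segs.append('<image>')  # placeholder for the image slot
        parts.append(f"{msg['role'].upper()}: {' '.join(segs)}")
    parts.append('ASSISTANT:')  # cue the model to produce the next reply
    return '\n'.join(parts)


def chat_inner(messages, generate_fn):
    """Run one multi-turn step: build the prompt, then call the model.

    `generate_fn` stands in for the model-specific generation call.
    """
    return generate_fn(build_prompt(messages))
```

The key point the sketch tries to capture is that `chat_inner` must consume the *entire* conversation history, not just the latest user turn, so the underlying VLM sees prior questions, answers, and image positions in order.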
Implementing three `chat_inner` APIs can be viewed as a major contribution. However, only implement this function if you are sure the VLM has the capability to handle multi-turn input.