
Deepseek API from Azure AI Foundry Version Has Compatibility Issues with Dify.ai #13122

Open · 5 tasks done
kelunyang opened this issue Jan 31, 2025 · 2 comments
Labels
🐞 bug Something isn't working

Comments

@kelunyang

Self Checks

  • This is only for bug report, if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English; otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.2

Cloud or Self Hosted

Cloud

Steps to reproduce

In chatbot mode, the Deepseek API created from Azure AI Foundry works perfectly. However, after switching to chatflow mode, an error occurs in the second, newly added LLM block.

The error message is as follows:

Run failed: 1 validation error for LLMNodeData  
model.mode  
Field required [type=missing, input_value={'provider': 'azure_ai_st... 1, 'max_tokens': 50}}, input_type=dict]  

After exporting the DSL file, I noticed that in the automatically converted chatflow DSL, the LLM block contains:

mode: completion

However, this attribute is not present in manually created LLM blocks (even those created from a blank template).
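The shape of the failure can be illustrated with a minimal pure-Python sketch. This is not Dify's actual validator; the required-field set and the placeholder provider/name values are assumptions based on the error message above:

```python
# Minimal sketch of the chatflow LLM node validation that fails here.
# NOT Dify's actual code; the required-field set and the sample values
# are assumptions inferred from the error message in this issue.

REQUIRED_MODEL_FIELDS = {"provider", "name", "mode"}  # hypothetical

def validate_llm_node(data: dict) -> list:
    """Return pydantic-style 'Field required' errors for missing model fields."""
    model = data.get("model", {})
    missing = REQUIRED_MODEL_FIELDS - set(model)
    return [f"model.{field}: Field required" for field in sorted(missing)]

# Manually added LLM block: no `mode` attribute at all -> validation fails
new_block = {"model": {"provider": "example_provider",   # placeholder
                       "name": "example_model",          # placeholder
                       "completion_params": {"temperature": 1, "max_tokens": 50}}}

# Auto-converted block: `mode` is present (as `completion`) -> field check passes
converted_block = {"model": {"provider": "example_provider",
                             "name": "example_model",
                             "mode": "completion",
                             "completion_params": {"temperature": 1, "max_tokens": 50}}}

print(validate_llm_node(new_block))        # ['model.mode: Field required']
print(validate_llm_node(converted_block))  # []
```

This matches the observed behavior: the auto-converted block carries a mode attribute and passes, while the freshly added block does not and fails validation.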

✔️ Expected Behavior

No response

❌ Actual Behavior

No response


dosubot bot commented Jan 31, 2025

Hey @kelunyang! I'm here to help you with your bug report and any questions you have. I can also assist you in becoming a contributor.

The issue you're encountering seems to be related to the mode attribute in the LLM block within the chatflow DSL. The error message indicates that the model.mode field is missing, which is required for the LLMNodeData object. In chatflow mode, the mode attribute is crucial as it specifies the operational mode of the language model. When set to "chat," it enables tool calling by invoking assistant tools, whereas "completion" mode focuses on generating text completions without invoking tools [1].

To resolve this issue, ensure that the mode attribute is correctly set in your chatflow DSL. If the mode is set to "completion" and you need tool calling capabilities, consider changing it to "chat." Additionally, verify that all required fields for the LLMNodeData object are properly configured, as missing fields can lead to validation errors [2].
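Concretely, the change in the exported DSL would look roughly like the fragment below. This is an illustrative sketch: only the `mode` value and the `completion_params` numbers come from the issue; every other key and placeholder is an assumption about the DSL layout.

```yaml
# Illustrative chatflow DSL fragment -- keys other than `mode` and
# `completion_params` are placeholders, not the actual exported values.
data:
  type: llm
  model:
    provider: <your Azure AI Foundry provider id>
    name: <your deployment name>
    mode: chat            # was `completion` after the automatic conversion
    completion_params:
      temperature: 1
      max_tokens: 50
```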




@dosubot dosubot bot added the 🐞 bug Something isn't working label Jan 31, 2025

zyu commented Jan 31, 2025

I want to purchase Dify.ai’s services and have a few questions. Is there a business contact for inquiries?
