fix for LlmChain OutputKey. The result of CallAsync always contained only the ["text"] key, ignoring OutputKey. This was causing an error when used with SequentialChain.
TesAnti committed Oct 17, 2023
1 parent fdebe46 commit 9ea137c
Showing 1 changed file with 3 additions and 2 deletions.
5 changes: 3 additions & 2 deletions src/libs/LangChain.Core/Chains/LLM/LLMChain.cs
@@ -58,8 +58,9 @@ public override async Task<IChainValues> CallAsync(IChainValues values)
 
     BasePromptValue promptValue = await Prompt.FormatPromptValue(new InputValues(values.Value));
     var response = await Llm.GenerateAsync(new ChatRequest(promptValue.ToChatMessages(), stop));
-
-    return new ChainValues(response.Messages.Last().Content);
+    if(string.IsNullOrEmpty(OutputKey))
+        return new ChainValues(response.Messages.Last().Content);
+    return new ChainValues(OutputKey,response.Messages.Last().Content);
 }
 
 public async Task<object> Predict(ChainValues values)
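For context, here is a minimal sketch of why the old behavior broke SequentialChain. The types below are hypothetical simplifications, not the real LangChain.Core classes: ChainValues is modeled as a string-keyed dictionary, and a sequential chain feeds one chain's output to the next by looking it up under the producing chain's OutputKey. When the result was always stored under the hard-coded "text" key, that lookup failed.

```csharp
using System;
using System.Collections.Generic;

// Simplified stand-in for ChainValues: a string-keyed bag of chain outputs.
class ChainValues
{
    public Dictionary<string, object> Value { get; } = new Dictionary<string, object>();

    // Old behavior: the result was always stored under the "text" key.
    public ChainValues(object text) => Value["text"] = text;

    // Fixed behavior: the result is stored under the chain's configured OutputKey.
    public ChainValues(string outputKey, object text) => Value[outputKey] = text;
}

class Program
{
    static void Main()
    {
        // A sequential chain wires chain A's output into chain B's input by key.
        // Suppose chain A was configured with OutputKey = "summary".
        var oldStyle = new ChainValues("some LLM output");
        var fixedStyle = new ChainValues("summary", "some LLM output");

        // The next chain looks up "summary"; only the fixed ctor makes it visible.
        Console.WriteLine(oldStyle.Value.ContainsKey("summary"));   // False
        Console.WriteLine(fixedStyle.Value.ContainsKey("summary")); // True
    }
}
```

The fix keeps the old single-argument path as a fallback when OutputKey is empty, so existing callers that rely on the default "text" key are unaffected.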