
Conversation


@devin-ai-integration devin-ai-integration bot commented Nov 23, 2025

Document Ask Fern LLM provider options

Summary

Adds documentation for the three LLM providers supported by Ask Fern: Anthropic, AWS Bedrock, and Cohere. This feature was added in fern-platform PR #5400 (Cohere support) but was not documented in the user-facing docs.

The new page explains:

  • The three supported providers and their models
  • How the fallback system works for high availability
  • Configuration requirements for each provider
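To make the fallback behavior concrete, here is a minimal illustrative sketch. The preference order (Bedrock → Anthropic → Cohere) and the `ANTHROPIC_API_KEY`/`COHERE_API_KEY` variable names come from this PR's description and checklist; the AWS credential variable names and all function names below are hypothetical placeholders, not the actual fern-platform API.

```python
# Illustrative sketch only — not the fern-platform implementation.
# Provider order is from the PR description; the AWS credential names
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) are an assumption.
PROVIDER_PREFERENCE = ["bedrock", "anthropic", "cohere"]

REQUIRED_CREDENTIALS = {
    "bedrock": ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
    "anthropic": ["ANTHROPIC_API_KEY"],
    "cohere": ["COHERE_API_KEY"],
}

def available_providers(env: dict) -> list:
    """Return providers whose credentials are present, in preference order."""
    return [
        provider
        for provider in PROVIDER_PREFERENCE
        if all(key in env for key in REQUIRED_CREDENTIALS[provider])
    ]

def pick_provider(env: dict) -> str:
    """Select the highest-preference provider; later entries act as fallbacks."""
    providers = available_providers(env)
    if not providers:
        raise RuntimeError("no LLM provider credentials configured")
    return providers[0]
```

For example, with only `COHERE_API_KEY` set, the sketch falls through Bedrock and Anthropic and selects Cohere as the last-resort provider.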

Review & Testing Checklist for Human

  • Verify technical accuracy: Confirm model names (Claude 3.7, Claude 4 Sonnet, etc.) match customer-facing names and that "Command A 03 2025" is the correct Cohere model name
  • Validate environment variables: Confirm ANTHROPIC_API_KEY and COHERE_API_KEY are the correct variable names used in production
  • Check fallback order: Verify the provider preference order (Bedrock → Anthropic → Cohere) matches actual production configuration
  • Review configuration section: Confirm whether LLM provider configuration is truly only available through Fern team contact, or if there's a self-service option I missed
  • Test page rendering: View the page at /learn/ask-fern/llm-providers to ensure it renders correctly and navigation works

Test Plan

  1. Navigate to the Ask Fern documentation section
  2. Verify "LLM providers" appears in the Configuration section navigation
  3. Click the link and confirm the page loads
  4. Review content for accuracy against actual Ask Fern implementation
  5. If possible, verify with someone who has configured Ask Fern that the information matches their experience

Notes

  • Based on code review of fern-platform PR #5400 and the LLM factory implementation
  • I documented all three providers (not just Cohere) for completeness, though the original request focused on Cohere
  • The page assumes this is enterprise/infrastructure-level configuration; if there's a self-service option, the "Configuration" section should be updated
  • Session: https://app.devin.ai/sessions/ef0744e80a96482a85c0f5cec53e4b4b
  • Requested by: [email protected] (@dannysheridan)

@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring


> **Configuration:** Requires `ANTHROPIC_API_KEY` environment variable.
>
> ### AWS Bedrock

📝 [vale] reported by reviewdog 🐶
[FernStyles.Headings] 'AWS Bedrock' should use sentence-style capitalization.


> ### AWS Bedrock
>
> Access Claude models through AWS Bedrock, which provides enterprise features like VPC endpoints, CloudWatch logging, and AWS IAM integration.

📝 [vale] reported by reviewdog 🐶
[FernStyles.Acronyms] 'IAM' has no definition.
