2 changes: 2 additions & 0 deletions fern/products/ask-fern/ask-fern.yml
@@ -12,6 +12,8 @@ navigation:
 - page: Setup
   path: ./pages/configuration/setup.mdx
   slug: setup
+- page: LLM providers
+  path: ./pages/configuration/llm-providers.mdx
 - page: Guidance
   path: ./pages/features/guidance.mdx
 - page: Additional content sources
57 changes: 57 additions & 0 deletions fern/products/ask-fern/pages/configuration/llm-providers.mdx
@@ -0,0 +1,57 @@
---
title: LLM providers
description: Configure which language model provider Ask Fern uses to generate responses.
---

Ask Fern supports multiple large language model (LLM) providers for generating AI responses. The system automatically selects an available provider based on your configuration and falls back to alternatives if the primary provider is unavailable.

## Supported providers

Ask Fern supports the following LLM providers:

### Anthropic

Direct integration with Anthropic's Claude models via the Anthropic API. This is the default provider when configured.

**Supported models:**
- Claude 3.7 Sonnet
- Claude 4 Sonnet
- Claude 4.5 Sonnet
- Claude 4.5 Haiku

**Configuration:** Requires `ANTHROPIC_API_KEY` environment variable.

### AWS Bedrock

📝 [vale] reported by reviewdog 🐶
[FernStyles.Headings] 'AWS Bedrock' should use sentence-style capitalization.


Access Claude models through AWS Bedrock, which provides enterprise features like VPC endpoints, CloudWatch logging, and AWS Identity and Access Management (IAM) integration.

📝 [vale] reported by reviewdog 🐶
[FernStyles.Acronyms] 'IAM' has no definition.


**Supported models:**
- Claude 3.7 Sonnet (via Bedrock)
- Claude 4 Sonnet (via Bedrock)
- Claude 4.5 Sonnet (via Bedrock)
- Claude 4.5 Haiku (via Bedrock)

**Configuration:** Requires AWS credentials with Bedrock access permissions.

### Cohere

Integration with Cohere's Command models for AI-powered responses.

**Supported models:**
- Command A (`command-a-03-2025`)

**Configuration:** Requires `COHERE_API_KEY` environment variable.

## Provider selection

Ask Fern uses a fallback system to ensure high availability:

1. The system attempts to use providers in order of preference: Bedrock, Anthropic, Cohere
2. If a provider is unavailable (missing credentials or API errors), the system automatically falls back to the next available provider
3. Within each provider, Ask Fern attempts multiple models in order of capability, falling back to smaller models if needed

This fallback system ensures Ask Fern remains available even if individual providers experience issues.
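
To make the selection order concrete, here is a minimal sketch of such a fallback chain. The provider order and the idea of falling back across models mirror the description above, but the type names, function names, and overall structure are illustrative assumptions, not Ask Fern's actual implementation.

```typescript
// Illustrative sketch only — not Ask Fern's actual code.
// Providers are tried in order of preference; each lists its models most capable first.
type Provider = {
  name: "bedrock" | "anthropic" | "cohere";
  models: string[]; // ordered by capability, largest first
  isAvailable: () => boolean; // e.g. required credentials are present
  generate: (model: string, prompt: string) => Promise<string>;
};

async function generateWithFallback(
  providers: Provider[],
  prompt: string,
): Promise<string> {
  for (const provider of providers) {
    if (!provider.isAvailable()) continue; // missing credentials: skip to the next provider
    for (const model of provider.models) {
      try {
        return await provider.generate(model, prompt); // first successful response wins
      } catch {
        // API error: try the next (smaller) model, then the next provider
      }
    }
  }
  throw new Error("No LLM provider is currently available");
}
```

Separating the availability check from generation errors means a provider with missing credentials is skipped immediately, while a transient API error still gets a chance on that provider's smaller models before the chain moves on.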

## Configuration

LLM providers are configured at the infrastructure level through environment variables. Contact your Fern account team to configure or change your LLM provider preferences.
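
For reference, the availability check described above roughly amounts to verifying that each provider's credentials are present in the environment. In the sketch below, `ANTHROPIC_API_KEY` and `COHERE_API_KEY` come from the provider sections above; the AWS variable names are assumed to be the standard AWS SDK credentials and may differ in your deployment.

```typescript
// Hypothetical mapping of providers to the environment variables that make them "available".
// Only ANTHROPIC_API_KEY and COHERE_API_KEY are documented above; the AWS names are assumptions.
const providerCredentials: Record<string, string[]> = {
  bedrock: ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION"],
  anthropic: ["ANTHROPIC_API_KEY"],
  cohere: ["COHERE_API_KEY"],
};

// A provider counts as available only if all of its variables are set.
const availableProviders = Object.entries(providerCredentials)
  .filter(([, vars]) => vars.every((name) => !!process.env[name]))
  .map(([provider]) => provider);

console.log("Available providers:", availableProviders);
```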