# evi/evi-flutter/README.md

This project features a sample implementation of Hume's [Empathic Voice Interface](https://dev.hume.ai/docs/empathic-voice-interface-evi/overview) using Flutter. It is lightly adapted from the starter project provided by `flutter create`.

**Targets:** The example supports iOS, Android, and Web.

**Dependencies:** It uses the [record](https://pub.dev/packages/record) Flutter package for audio recording and the [audioplayers](https://pub.dev/packages/audioplayers) package for playback.

## Instructions

1. Clone this examples repository:

   ```shell
   git clone https://github.com/humeai/hume-api-examples
   cd hume-api-examples/evi/evi-flutter
   ```

2. Install Flutter (if needed) following the [official guide](https://docs.flutter.dev/get-started/install).

3. Install dependencies:

   ```shell
   flutter pub get
   ```

4. Set up your API key:

   You must authenticate to use the EVI API. Your API key can be retrieved from the [Hume AI platform](https://platform.hume.ai/settings/keys). For detailed instructions, see our documentation on [getting your API keys](https://dev.hume.ai/docs/introduction/api-key).

   This example uses [flutter_dotenv](https://pub.dev/packages/flutter_dotenv). Place your API key in a `.env` file at the root of your project.

   ```shell
   echo "HUME_API_KEY=your_api_key_here" > .env
   ```

   You can copy the `.env.example` file to use as a template.

   **Note:** the `HUME_API_KEY` environment variable is for development only. In a production Flutter app you should avoid building your API key into the app; the client should instead fetch an access token from an endpoint on your server, as in the sketch below. Supply the `MY_SERVER_AUTH_URL` environment variable and uncomment the call to `fetchAccessToken` in `lib/main.dart`.
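   For illustration, here is a minimal sketch of such a server endpoint in TypeScript. It assumes the `hume` npm package's `fetchAccessToken` helper and an Express server; the route path is made up, so point `MY_SERVER_AUTH_URL` at whatever route you actually expose.

   ```ts
   // token-server.ts: a sketch under the assumptions above, not this
   // project's actual backend. Requires `npm install express hume`.
   import express from "express";
   import { fetchAccessToken } from "hume";

   const app = express();

   // Hypothetical route; the client fetches a short-lived token here,
   // so the API key and Secret key never ship inside the app binary.
   app.get("/auth/evi-token", async (_req, res) => {
     try {
       const accessToken = await fetchAccessToken({
         apiKey: process.env.HUME_API_KEY!,
         secretKey: process.env.HUME_SECRET_KEY!,
       });
       res.json({ accessToken });
     } catch {
       res.status(500).json({ error: "failed to mint access token" });
     }
   });

   app.listen(3001);
   ```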

5. Specify an EVI configuration (Optional):

   EVI is pre-configured with a set of default values, which are automatically applied if you do not specify a configuration. The default configuration includes a preset voice and language model, but does not include a system prompt or tools. To customize these options, you will need to create and specify your own EVI configuration. To learn more, see our [configuration guide](https://dev.hume.ai/docs/empathic-voice-interface-evi/configuration/build-a-configuration).

   ```shell
   echo "HUME_CONFIG_ID=your_config_id_here" >> .env
   ```

6. Run the app:
   ```shell
   flutter run
   ```

7. If you are using the Android emulator, make sure to send audio to the emulator from the host.

   ![Android Emulator setting for forwarding host audio](host-audio-screenshot.png)

## Notes

- **Echo cancellation**. Echo cancellation is important for a good user experience with EVI. Without it, EVI will detect its own speech as user interruptions, cut itself off, and become incoherent. This Flutter example _requests_ echo cancellation from the browser or the device's operating system, but echo cancellation is hardware-dependent and may not be provided in all environments.
  - Echo cancellation works consistently on physical iOS devices and on the web.
  - Echo cancellation works on some physical Android devices.
  - Echo cancellation doesn't seem to work in the iOS Simulator or Android Emulator when forwarding audio from the host.
  - If you need to test in a simulator or emulator, or in an environment where echo cancellation is not provided, use headphones, or enable the mute button while EVI is speaking.
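For reference, on the web target the echo-cancellation request boils down to `getUserMedia` constraints, as in this browser-side TypeScript sketch; the browser and hardware decide whether to honor them, which is why behavior varies across environments.

```ts
// Browser-side sketch: requesting (not guaranteeing) echo cancellation.
async function openMicWithEchoCancellation(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    audio: {
      echoCancellation: true, // request AEC; may be ignored
      noiseSuppression: true,
      autoGainControl: true,
    },
  });
}
```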
# evi/evi-next-js-app-router-quickstart/README.md

Below are the steps to complete deployment:
1. Create a Git Repository for your project.
2. Provide the required environment variables. To get your API key and Secret key, log into the Hume AI Platform and visit the [API keys page](https://platform.hume.ai/settings/keys).

## Modify the project

1. Clone this examples repository:

   ```shell
   git clone https://github.com/humeai/hume-api-examples
   cd hume-api-examples/evi/evi-next-js-app-router-quickstart
   ```

2. Install dependencies:
   ```shell
   npm install
   ```

3. Set up your API key and Secret key:

   To make an authenticated connection, you will first need to generate an access token, which requires your API key and Secret key. These keys can be obtained by logging into the Hume AI Platform and visiting the [API keys page](https://platform.hume.ai/settings/keys). For detailed instructions, see our documentation on [getting your API keys](https://dev.hume.ai/docs/introduction/api-key).

   Place your `HUME_API_KEY` and `HUME_SECRET_KEY` in a `.env` file at the root of your project.

   ```shell
   echo "HUME_API_KEY=your_api_key_here" > .env
   echo "HUME_SECRET_KEY=your_secret_key_here" >> .env
   ```

   You can copy the `.env.example` file to use as a template.
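   For context, here is a minimal sketch of how an App Router server component might mint the access token with these keys, assuming the `hume` package's `fetchAccessToken` helper; the file path and `Chat` prop are illustrative assumptions rather than this project's exact code.

   ```tsx
   // app/page.tsx: a sketch under the assumptions above.
   import { fetchAccessToken } from "hume";
   import Chat from "@/components/Chat"; // assumed component path

   export default async function Page() {
     // Runs on the server, so the keys never reach the browser.
     const accessToken = await fetchAccessToken({
       apiKey: process.env.HUME_API_KEY!,
       secretKey: process.env.HUME_SECRET_KEY!,
     });

     if (!accessToken) {
       throw new Error("Failed to fetch Hume access token");
     }

     return <Chat accessToken={accessToken} />;
   }
   ```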

4. Specify an EVI configuration (Optional):

   EVI is pre-configured with a set of default values, which are automatically applied if you do not specify a configuration. The default configuration includes a preset voice and language model, but does not include a system prompt or tools. To customize these options, you will need to create and specify your own EVI configuration. To learn more, see our [configuration guide](https://dev.hume.ai/docs/empathic-voice-interface-evi/configuration/build-a-configuration).

   You may pass in a configuration ID to the `VoiceProvider` object inside the [components/Chat.tsx file](https://github.com/HumeAI/hume-api-examples/blob/main/evi/evi-next-js-app-router-quickstart/components/Chat.tsx).

   Here's an example:

   ```tsx
   <VoiceProvider
     configId="YOUR_CONFIG_ID"
     auth={{ type: "accessToken", value: accessToken }}
   >
   ```

5. Run the project:

   ```shell
   npm run dev
   ```
# evi/evi-next-js-function-calling/README.md

See the [Tool Use guide](https://dev.hume.ai/docs/empathic-voice-interface-evi/features/tool-use) for a detailed explanation of function calling with EVI.

1. [Create a tool](https://dev.hume.ai/docs/empathic-voice-interface-evi/tool-use#create-a-tool) with the following payload:

   ```json
   {
     "name": "get_current_weather",
     "description": "This tool is for getting the current weather.",
     "parameters": "{ \"type\": \"object\", \"properties\": { \"location\": { \"type\": \"string\", \"description\": \"The city and state, e.g. San Francisco, CA\" }, \"format\": { \"type\": \"string\", \"enum\": [\"celsius\", \"fahrenheit\"], \"description\": \"The temperature unit to use. Infer this from the users location.\" } }, \"required\": [\"location\", \"format\"] }"
   }
   ```
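   Note that `parameters` is a stringified JSON Schema; when the model invokes the tool, EVI sends the arguments back as a JSON string matching that schema. The sketch below shows the shape of a handler for that message; the message fields and helper are assumptions for illustration, so check the Tool Use guide for the exact format.

   ```ts
   // A sketch of handling the resulting tool call; field names are
   // assumptions based on the EVI message format described in the docs.
   interface ToolCallMessage {
     name: string;
     parameters: string; // JSON-encoded arguments from EVI
   }

   async function handleWeatherToolCall(message: ToolCallMessage): Promise<string> {
     if (message.name !== "get_current_weather") {
       throw new Error(`Unexpected tool: ${message.name}`);
     }
     const { location, format } = JSON.parse(message.parameters) as {
       location: string;
       format: "celsius" | "fahrenheit";
     };
     return fetchWeather(location, format);
   }

   async function fetchWeather(location: string, format: string): Promise<string> {
     // Stand-in for your real weather lookup (this example uses a
     // geocoding step plus a forecast API; see the steps below).
     return `It is mild in ${location} (units: ${format}).`;
   }
   ```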

2. [Create a configuration](https://dev.hume.ai/docs/empathic-voice-interface-evi/tool-use#create-a-configuration) equipped with that tool:

   ```json
   {
     "name": "Weather Assistant Config",
     "language_model": {
       "model_provider": "ANTHROPIC",
       "model_resource": "claude-3-5-sonnet-20240620"
     },
     "tools": [
       {
         "id": "<YOUR_TOOL_ID>",
         "version": 0
       }
     ]
   }
   ```
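   You can create the tool and configuration in the Hume Platform UI or programmatically. Here is a TypeScript sketch of the programmatic route; the REST endpoint and `X-Hume-Api-Key` header are assumptions recalled from the EVI API reference, so verify them there before relying on this.

   ```ts
   // create-config.ts: a sketch; endpoint and header are assumptions.
   const payload = {
     name: "Weather Assistant Config",
     language_model: {
       model_provider: "ANTHROPIC",
       model_resource: "claude-3-5-sonnet-20240620",
     },
     tools: [{ id: "<YOUR_TOOL_ID>", version: 0 }],
   };

   async function createConfig(): Promise<void> {
     const res = await fetch("https://api.hume.ai/v0/evi/configs", {
       method: "POST",
       headers: {
         "Content-Type": "application/json",
         "X-Hume-Api-Key": process.env.HUME_API_KEY!,
       },
       body: JSON.stringify(payload),
     });
     if (!res.ok) throw new Error(`Config creation failed: ${res.status}`);
     console.log(await res.json()); // the response includes the new config's id
   }

   createConfig();
   ```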

## Instructions

1. Clone this examples repository:

   ```shell
   git clone https://github.com/humeai/hume-api-examples
   cd hume-api-examples/evi/evi-next-js-function-calling
   ```

2. Install dependencies:
   ```shell
   pnpm install
   ```

3. Set up your API key and Secret key:

   To make an authenticated connection, you will first need to generate an access token, which requires your API key and Secret key. These keys can be obtained by logging into the Hume AI Platform and visiting the [API keys page](https://platform.hume.ai/settings/keys). For detailed instructions, see our documentation on [getting your API keys](https://dev.hume.ai/docs/introduction/api-key).

   Place your `HUME_API_KEY` and `HUME_SECRET_KEY` in a `.env` file at the root of your project.

   ```shell
   echo "HUME_API_KEY=your_api_key_here" > .env
   echo "HUME_SECRET_KEY=your_secret_key_here" >> .env
   ```

   You can copy the `.env.example` file to use as a template.

4. Add your Config ID to the `.env` file. This ID is from the EVI configuration you created earlier that includes your weather tool.

   ```shell
   echo "NEXT_PUBLIC_HUME_CONFIG_ID=your_config_id_here" >> .env
   ```

5. Add the Geocoding API key to the `.env` file. You can obtain it for free from [geocode.maps.co](https://geocode.maps.co/).

   ```shell
   echo "GEOCODING_API_KEY=your_geocoding_api_key_here" >> .env
   ```
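   For reference, a lookup with that key might look like the following sketch; the endpoint and response shape are assumptions based on geocode.maps.co's search API, so confirm them against its documentation.

   ```ts
   // geocode.ts: a sketch; endpoint and response shape are assumptions.
   interface GeocodeHit {
     lat: string;
     lon: string;
   }

   async function geocode(query: string): Promise<{ lat: number; lon: number }> {
     const url =
       `https://geocode.maps.co/search?q=${encodeURIComponent(query)}` +
       `&api_key=${process.env.GEOCODING_API_KEY}`;
     const res = await fetch(url);
     if (!res.ok) throw new Error(`Geocoding failed: ${res.status}`);
     const hits = (await res.json()) as GeocodeHit[];
     if (hits.length === 0) throw new Error(`No results for "${query}"`);
     return { lat: Number(hits[0].lat), lon: Number(hits[0].lon) };
   }
   ```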

6. Run the project:
   ```shell
   pnpm run dev
   ```

   This will start the Next.js development server, and you can access the application at `http://localhost:3000`.

## Example Conversation

Here's an example of how you might interact with the EVI to get weather information:

_User: "What's the weather like in New York City?"_

_EVI: (Uses the get_current_weather tool to fetch data) "Currently in New York City, it's 72°F (22°C) and partly cloudy. The forecast calls for a high of 78°F (26°C) and a low of 65°F (18°C) today."_

## License

# evi/evi-next-js-pages-router-quickstart/README.md

Below are the steps to complete deployment:
1. Create a Git Repository for your project.
2. Provide the required environment variables. To get your API key and Secret key, log into the Hume AI Platform and visit the [API keys page](https://platform.hume.ai/settings/keys).

## Modify the project

1. Clone this examples repository:

   ```shell
   git clone https://github.com/humeai/hume-api-examples
   cd hume-api-examples/evi/evi-next-js-pages-router-quickstart
   ```

2. Install dependencies:
   ```shell
   pnpm install
   ```

3. Set up your API key and Secret key:

   To make an authenticated connection, you will first need to generate an access token, which requires your API key and Secret key. These keys can be obtained by logging into the Hume AI Platform and visiting the [API keys page](https://platform.hume.ai/settings/keys). For detailed instructions, see our documentation on [getting your API keys](https://dev.hume.ai/docs/introduction/api-key).

   Place your `HUME_API_KEY` and `HUME_SECRET_KEY` in a `.env` file at the root of your project.

   ```shell
   echo "HUME_API_KEY=your_api_key_here" > .env
   echo "HUME_SECRET_KEY=your_secret_key_here" >> .env
   ```

   You can copy the `.env.example` file to use as a template.
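   For context, in the Pages Router the token is typically minted server-side, for example in `getServerSideProps`. Here is a minimal sketch assuming the `hume` package's `fetchAccessToken` helper; the page file and prop names are illustrative assumptions rather than this project's exact code.

   ```tsx
   // pages/index.tsx: a sketch under the assumptions above.
   import type { GetServerSideProps } from "next";
   import { fetchAccessToken } from "hume";
   import Chat from "@/components/Chat"; // assumed component path

   interface PageProps {
     accessToken: string;
   }

   export const getServerSideProps: GetServerSideProps<PageProps> = async () => {
     // Runs on the server, so the keys never reach the browser.
     const accessToken = await fetchAccessToken({
       apiKey: process.env.HUME_API_KEY!,
       secretKey: process.env.HUME_SECRET_KEY!,
     });
     return { props: { accessToken } };
   };

   export default function Page({ accessToken }: PageProps) {
     // Hand the token to the chat UI (see components/Chat.tsx).
     return <Chat accessToken={accessToken} />;
   }
   ```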

4. Specify an EVI configuration (Optional):

   EVI is pre-configured with a set of default values, which are automatically applied if you do not specify a configuration. The default configuration includes a preset voice and language model, but does not include a system prompt or tools. To customize these options, you will need to create and specify your own EVI configuration. To learn more, see our [configuration guide](https://dev.hume.ai/docs/empathic-voice-interface-evi/configuration/build-a-configuration).

   You may pass in a configuration ID to the `VoiceProvider` object inside the [components/Chat.tsx file](https://github.com/HumeAI/hume-api-examples/blob/main/evi/evi-next-js-pages-router-quickstart/components/Chat.tsx).

   Here's an example:

   ```tsx
   <VoiceProvider
     configId="YOUR_CONFIG_ID"
     auth={{ type: "accessToken", value: accessToken }}
   >
   ```

5. Run the project:
   ```shell
   pnpm run dev
   ```