Switch by default to the openai demo account
arey committed Dec 26, 2024
1 parent 48ef273 commit 853a606
Showing 5 changed files with 47 additions and 34 deletions.
36 changes: 27 additions & 9 deletions README.md
@@ -79,6 +79,7 @@ This project consists of several microservices:
- **Customers Service**: Manages customer data.
- **Vets Service**: Handles information about veterinarians.
- **Visits Service**: Manages pet visit records.
- **GenAI Service**: Provides a chatbot interface to the application.
- **API Gateway**: Routes client requests to the appropriate services.
- **Config Server**: Centralized configuration management for all services.
- **Discovery Server**: Eureka-based service registry.
@@ -102,16 +103,33 @@ Spring Petclinic integrates a Chatbot that allows you to interact with the application
3. Is there an owner named Betty?
4. Which owners have dogs?
5. Add a dog for Betty. Its name is Moopsie.
6. Create a new owner
6. Create a new owner.

![Screenshot of the chat dialog](docs/spring-ai.png)

This `spring-petclinic-genai-service` microservice currently supports **OpenAI** (default) or **Azure OpenAI** as the LLM provider.
To start the microservice, perform the following steps (a minimal boot sketch follows the list):

1. Decide which provider you want to use. By default, the `spring-ai-openai-spring-boot-starter` dependency is enabled.
You can change it to `spring-ai-azure-openai-spring-boot-starter` in the `pom.xml`.
2. Create an OpenAI API key or an Azure OpenAI resource in the Azure Portal.
Refer to [OpenAI's quickstart](https://platform.openai.com/docs/quickstart) or [Azure's documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/) for further information on how to obtain these.
You only need to configure the provider you're using: either `openai` or `azure-openai`.
If you don't have your own OpenAI API key, don't worry!
You can temporarily use the `demo` key, which OpenAI provides free of charge for demonstration purposes.
This `demo` key has a quota, is limited to the `gpt-4o-mini` model, and is intended solely for demonstration use.
With your own OpenAI account, you can test the `gpt-4o` model by changing the `model` property (or, for Azure OpenAI, the `deployment-name` property) in the `application.yml` file.
3. Export your API keys and endpoint as environment variables:
* either OpenAI:
```bash
export OPENAI_API_KEY="your_api_key_here"
```
* or Azure OpenAI:
```bash
export AZURE_OPENAI_ENDPOINT="https://your_resource.openai.azure.com"
export AZURE_OPENAI_KEY="your_api_key_here"
```
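
As a quick sanity check, a minimal way to boot the GenAI service with the free `demo` key might look like the sketch below (assumptions: the Maven wrapper sits at the repository root, the `-pl` module selector matches the module name used elsewhere in this repository, and the Config and Discovery servers are already running):

```bash
# Sketch: run from the repository root; swap "demo" for your own key if you have one.
export OPENAI_API_KEY="demo"   # free, quota-limited, gpt-4o-mini only
./mvnw -pl spring-petclinic-genai-service spring-boot:run
```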

![alt text](docs/spring-ai.png)

This Microservice currently supports OpenAI or Azure's OpenAI as the LLM provider.
In order to enable Spring AI, perform the following steps:

1. Decide which provider you want to use. By default, the `spring-ai-azure-openai-spring-boot-starter` dependency is enabled. You can change it to `spring-ai-openai-spring-boot-starter` in `pom.xml`.
2. Copy `src/main/resources/creds-template.yaml` into `src/main/resources/creds.yaml`, and edit its contents with your API key and API endpoint. Refer to OpenAI's or Azure's documentation for further information on how to obtain these. You only need to populate the provider you're using - either openai, or azure-openai.
3. Boot the `spring-petclinic-genai-service` microservice.
## In case you find a bug/suggested improvement for Spring Petclinic Microservices

Our issue tracker is available here: https://github.com/spring-petclinic/spring-petclinic-microservices/issues
7 changes: 4 additions & 3 deletions spring-petclinic-genai-service/pom.xml
@@ -17,14 +17,15 @@
<properties>
<docker.image.exposed.port>8081</docker.image.exposed.port>
<docker.image.dockerfile.dir>${basedir}/../docker</docker.image.dockerfile.dir>
<spring-ai.version>1.0.0-M3</spring-ai.version>
<spring-ai.version>1.0.0-M4</spring-ai.version>
</properties>

<dependencies>
<!-- Spring Boot -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-azure-openai-spring-boot-starter</artifactId>
<artifactId>spring-ai-openai-spring-boot-starter</artifactId>
<!-- artifactId>spring-ai-azure-openai-spring-boot-starter</artifactId-->
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
@@ -168,7 +169,7 @@
</snapshots>
</repository>
</repositories>

<profiles>
<profile>
<id>buildDocker</id>
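
After switching the starter in `pom.xml`, a quick way to confirm which Spring AI artifacts actually end up on the classpath is Maven's dependency tree (the `-pl` module selector is an assumption based on the module path shown above):

```bash
# List only the org.springframework.ai artifacts resolved for the GenAI module.
./mvnw -pl spring-petclinic-genai-service dependency:tree -Dincludes=org.springframework.ai
```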
@@ -37,21 +37,22 @@ public PetclinicChatClient(ChatClient.Builder builder, ChatMemory chatMemory) {
you don't know the answer, then ask the user a followup question to try and clarify the question they are asking.
If you do know the answer, provide the answer but do not provide any additional followup questions.
When dealing with vets, if the user is unsure about the returned results, explain that there may be additional data that was not returned.
Only if the user is asking about the total number of all vets, answer that there are a lot and ask for some additional criteria.
Only if the user is asking about the total number of all vets, answer that there are a lot and ask for some additional criteria.
For owners, pets or visits - provide the correct data.
""")
.defaultAdvisors(
// Chat memory helps us keep context when using the chatbot for up to 10 previous messages.
new MessageChatMemoryAdvisor(chatMemory, DEFAULT_CHAT_MEMORY_CONVERSATION_ID, 10), // CHAT MEMORY
new SimpleLoggerAdvisor()
)
.defaultFunctions("listOwners", "addOwnerToPetclinic", "addPetToOwner", "listVets")
.build();
}

@PostMapping("/chatclient")
public String exchange(@RequestBody String query) {
try {
//All chatbot messages go through this endpoint
//All chatbot messages go through this endpoint
//and are passed to the LLM
return
this.chatClient
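
Since `exchange()` takes the raw request body as the user message, the `/chatclient` endpoint can be exercised directly once the service is running. A hypothetical local call (the port is only a guess taken from the `docker.image.exposed.port` property above and may differ for a local run):

```bash
# Send a plain-text question to the chat endpoint; adjust host/port for your setup.
curl -X POST http://localhost:8081/chatclient \
     -H "Content-Type: text/plain" \
     -d "Which owners have dogs?"
```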
16 changes: 13 additions & 3 deletions spring-petclinic-genai-service/src/main/resources/application.yml
@@ -7,17 +7,27 @@ spring:
active: production
config:
import: optional:configserver:${CONFIG_SERVER_URL:http://localhost:8888/},optional:classpath:/creds.yaml
#These apply when using spring-ai-azure-openai-spring-boot-starter
ai:
chat:
client:
enabled: true
# These apply when using spring-ai-azure-openai-spring-boot-starter
azure:
openai:
api-key: ${AZURE_OPENAI_KEY}
endpoint: ${AZURE_OPENAI_ENDPOINT}
chat:
options:
functions: listOwners,addOwnerToPetclinic,addPetToOwner,listVets
temperature: 0.7
deployment-name: gpt-4o
# These apply when using spring-ai-openai-spring-boot-starter
openai:
api-key: ${OPENAI_API_KEY:demo}
chat:
options:
temperature: 0.7
model: gpt-4o-mini


logging:
level:
@@ -40,4 +50,4 @@
eureka:
client:
serviceUrl:
defaultZone: http://discovery-server:8761/eureka/
defaultZone: http://discovery-server:8761/eureka/
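
The `openai` block above is bound from ordinary Spring Boot properties, so the model can also be swapped without editing the file by relying on relaxed binding of environment variables (a sketch, assuming you supply your own key, since the `demo` key is restricted to `gpt-4o-mini`):

```bash
# Override the default OpenAI model via relaxed property binding,
# then start the service as in the README sketch above.
export OPENAI_API_KEY="your_api_key_here"
export SPRING_AI_OPENAI_CHAT_OPTIONS_MODEL="gpt-4o"
```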

This file was deleted.
