Merge branch 'main' into feature-connectors-anthropic
RogerBarreto authored Oct 10, 2024
2 parents 2f47d9b + 900beca commit 86e169a
Showing 759 changed files with 61,780 additions and 5,512 deletions.
7 changes: 6 additions & 1 deletion .github/_typos.toml
@@ -16,8 +16,13 @@ extend-exclude = [
"test_code_tokenizer.py",
"*response.json",
"test_content.txt",
"google_what_is_the_semantic_kernel.json",
"what-is-semantic-kernel.json",
"serializedChatHistoryV1_15_1.json",
"MultipleFunctionsVsParameters.cs"
"MultipleFunctionsVsParameters.cs",
"PopulationByCountry.csv",
"PopulationByAdmin1.csv",
"WomensSuffrage.txt",
]

[default.extend-words]
Expand Down
2 changes: 1 addition & 1 deletion .github/workflows/dotnet-build-and-test.yml
@@ -141,7 +141,7 @@ jobs:

# Generate test reports and check coverage
- name: Generate test reports
uses: danielpalme/[email protected].9
uses: danielpalme/[email protected].10
with:
reports: "./TestResults/Coverage/**/coverage.cobertura.xml"
targetdir: "./TestResults/Reports"
Expand Down
31 changes: 24 additions & 7 deletions .github/workflows/python-integration-tests.yml
@@ -15,6 +15,7 @@ on:

permissions:
contents: read
id-token: "write"

env:
# Configure a constant location for the uv cache
@@ -46,7 +47,6 @@ jobs:
name: Python Pre-Merge Integration Tests
needs: paths-filter
if: github.event_name != 'pull_request' && github.event_name != 'schedule' && needs.paths-filter.outputs.pythonChanges == 'true'
runs-on: ${{ matrix.os }}
strategy:
max-parallel: 1
fail-fast: false
@@ -56,6 +56,8 @@
defaults:
run:
working-directory: python
runs-on: ${{ matrix.os }}
environment: "integration"
steps:
- uses: actions/checkout@v4
- name: Set up uv
@@ -109,6 +111,13 @@ jobs:
- name: Setup Redis Stack Server
if: matrix.os == 'ubuntu-latest'
run: docker run -d --name redis-stack-server -p 6379:6379 redis/redis-stack-server:latest
- name: Azure CLI Login
if: github.event_name != 'pull_request'
uses: azure/login@v2
with:
client-id: ${{ secrets.AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.AZURE_TENANT_ID }}
subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
- name: Run Integration Tests
id: run_tests
shell: bash
@@ -120,7 +129,6 @@
AZURE_OPENAI_TEXT_DEPLOYMENT_NAME: ${{ vars.AZURE_OPENAI_TEXT_DEPLOYMENT_NAME }}
AZURE_OPENAI_API_VERSION: ${{ vars.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
BING_API_KEY: ${{ secrets.BING_API_KEY }}
OPENAI_CHAT_MODEL_ID: ${{ vars.OPENAI_CHAT_MODEL_ID }}
OPENAI_TEXT_MODEL_ID: ${{ vars.OPENAI_TEXT_MODEL_ID }}
@@ -129,6 +137,7 @@ jobs:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
PINECONE_API_KEY: ${{ secrets.PINECONE__APIKEY }}
POSTGRES_CONNECTION_STRING: ${{secrets.POSTGRES__CONNECTIONSTR}}
POSTGRES_MAX_POOL: ${{ vars.POSTGRES_MAX_POOL }}
AZURE_AI_SEARCH_API_KEY: ${{secrets.AZURE_AI_SEARCH_API_KEY}}
AZURE_AI_SEARCH_ENDPOINT: ${{secrets.AZURE_AI_SEARCH_ENDPOINT}}
MONGODB_ATLAS_CONNECTION_STRING: ${{secrets.MONGODB_ATLAS_CONNECTION_STRING}}
@@ -149,7 +158,7 @@ jobs:
VERTEX_AI_GEMINI_MODEL_ID: ${{ vars.VERTEX_AI_GEMINI_MODEL_ID }}
VERTEX_AI_EMBEDDING_MODEL_ID: ${{ vars.VERTEX_AI_EMBEDDING_MODEL_ID }}
REDIS_CONNECTION_STRING: ${{ vars.REDIS_CONNECTION_STRING }}
run: |
run: |
uv run pytest -n logical --dist loadfile --dist worksteal ./tests/integration ./tests/samples -v --junitxml=pytest.xml
- name: Surface failing tests
if: always()
@@ -166,7 +175,6 @@ jobs:
python-integration-tests:
needs: paths-filter
if: (github.event_name == 'schedule' || github.event_name == 'workflow_dispatch') && needs.paths-filter.outputs.pythonChanges == 'true'
runs-on: ${{ matrix.os }}
strategy:
max-parallel: 1
fail-fast: false
@@ -176,6 +184,8 @@ jobs:
defaults:
run:
working-directory: python
runs-on: ${{ matrix.os }}
environment: "integration"
steps:
- uses: actions/checkout@v4
- name: Set up uv
@@ -225,22 +235,28 @@ jobs:
project_id: ${{ vars.VERTEX_AI_PROJECT_ID }}
credentials_json: ${{ secrets.VERTEX_AI_SERVICE_ACCOUNT_KEY }}
- name: Set up gcloud
uses: google-github-actions/setup-gcloud@v2
uses: google-github-actions/setup-gcloud@v2
- name: Setup Redis Stack Server
if: matrix.os == 'ubuntu-latest'
run: docker run -d --name redis-stack-server -p 6379:6379 redis/redis-stack-server:latest
- name: Azure CLI Login
if: github.event_name != 'pull_request'
uses: azure/login@v2
with:
client-id: ${{ secrets.AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.AZURE_TENANT_ID }}
subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
- name: Run Integration Tests
id: run_tests
shell: bash
env:
env:
HNSWLIB_NO_NATIVE: 1
Python_Integration_Tests: Python_Integration_Tests
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME: ${{ vars.AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME }} # azure-text-embedding-ada-002
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: ${{ vars.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME }}
AZURE_OPENAI_TEXT_DEPLOYMENT_NAME: ${{ vars.AZURE_OPENAI_TEXT_DEPLOYMENT_NAME }}
AZURE_OPENAI_API_VERSION: ${{ vars.AZURE_OPENAI_API_VERSION }}
AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
BING_API_KEY: ${{ secrets.BING_API_KEY }}
OPENAI_CHAT_MODEL_ID: ${{ vars.OPENAI_CHAT_MODEL_ID }}
OPENAI_TEXT_MODEL_ID: ${{ vars.OPENAI_TEXT_MODEL_ID }}
@@ -249,6 +265,7 @@ jobs:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
PINECONE_API_KEY: ${{ secrets.PINECONE__APIKEY }}
POSTGRES_CONNECTION_STRING: ${{secrets.POSTGRES__CONNECTIONSTR}}
POSTGRES_MAX_POOL: ${{ vars.POSTGRES_MAX_POOL }}
AZURE_AI_SEARCH_API_KEY: ${{secrets.AZURE_AI_SEARCH_API_KEY}}
AZURE_AI_SEARCH_ENDPOINT: ${{secrets.AZURE_AI_SEARCH_ENDPOINT}}
MONGODB_ATLAS_CONNECTION_STRING: ${{secrets.MONGODB_ATLAS_CONNECTION_STRING}}
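With `id-token: "write"` and the new `azure/login` step, these integration jobs can authenticate to Azure OpenAI through Microsoft Entra ID (OIDC federated credentials) rather than the removed `AZURE_OPENAI_API_KEY` secret. A minimal sketch of what key-less access can look like on the Python side, assuming the `azure-identity` and `openai` packages; the client setup is illustrative, not the repository's actual test code:

```python
# Illustrative only: token-based Azure OpenAI access once `azure/login`
# has exchanged the workflow's OIDC token for Azure credentials.
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AsyncAzureOpenAI

# DefaultAzureCredential picks up the credentials established by azure/login,
# so no AZURE_OPENAI_API_KEY secret is needed in the job environment.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AsyncAzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version=os.environ.get("AZURE_OPENAI_API_VERSION", "2024-06-01"),
    azure_ad_token_provider=token_provider,
)
```

The same token provider can be handed to whichever Azure OpenAI client the tests construct, so no static key has to live in the workflow environment.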
6 changes: 2 additions & 4 deletions README.md
@@ -6,8 +6,6 @@
[![Python package](https://img.shields.io/pypi/v/semantic-kernel)](https://pypi.org/project/semantic-kernel/)
- .NET <br/>
[![Nuget package](https://img.shields.io/nuget/vpre/Microsoft.SemanticKernel)](https://www.nuget.org/packages/Microsoft.SemanticKernel/)[![dotnet Docker](https://github.com/microsoft/semantic-kernel/actions/workflows/dotnet-ci-docker.yml/badge.svg?branch=main)](https://github.com/microsoft/semantic-kernel/actions/workflows/dotnet-ci-docker.yml)[![dotnet Windows](https://github.com/microsoft/semantic-kernel/actions/workflows/dotnet-ci-windows.yml/badge.svg?branch=main)](https://github.com/microsoft/semantic-kernel/actions/workflows/dotnet-ci-windows.yml)
- Java <br/>
[![Java CICD Builds](https://github.com/microsoft/semantic-kernel/actions/workflows/java-build.yml/badge.svg?branch=java-development)](https://github.com/microsoft/semantic-kernel/actions/workflows/java-build.yml)[![Maven Central](https://maven-badges.herokuapp.com/maven-central/com.microsoft.semantic-kernel/semantickernel-api/badge.svg)](https://maven-badges.herokuapp.com/maven-central/com.microsoft.semantic-kernel/semantickernel-api)

## Overview

@@ -113,7 +111,7 @@ on our Learn site. Each sample comes with a completed C# and Python project that

1. 📖 [Getting Started](https://learn.microsoft.com/en-us/semantic-kernel/get-started/quick-start-guide)
1. 🔌 [Detailed Samples](https://learn.microsoft.com/en-us/semantic-kernel/get-started/detailed-samples)
1. 💡 [Concepts](https://learn.microsoft.com/en-us/semantic-kernel/concepts/agents)
1. 💡 [Concepts](https://learn.microsoft.com/en-us/semantic-kernel/concepts/kernel)

Finally, refer to our API references for more details on the C# and Python APIs:

@@ -138,7 +136,7 @@ in a different direction, but also to consider the impact on the larger ecosyste
To learn more and get started:

- Read the [documentation](https://aka.ms/sk/learn)
- Learn how to [contribute](https://learn.microsoft.com/en-us/semantic-kernel/get-started/contributing) to the project
- Learn how to [contribute](https://learn.microsoft.com/en-us/semantic-kernel/support/contributing) to the project
- Ask questions in the [GitHub discussions](https://github.com/microsoft/semantic-kernel/discussions)
- Ask questions in the [Discord community](https://aka.ms/SKDiscord)

14 changes: 11 additions & 3 deletions TRANSPARENCY_FAQS.md
@@ -5,7 +5,6 @@ Microsoft Semantic Kernel is a lightweight, open-source development kit designed

It serves as efficient middleware that supports developers in building AI agents, automating business processes, and connecting their code with the latest AI technologies. Input to this system can range from text data to structured commands, and it produces various outputs, including natural language responses, function calls, and other actionable data.


## What can Microsoft Semantic Kernel do?
Building upon its foundational capabilities, Microsoft Semantic Kernel facilitates several functionalities:
- AI Agent Development: Users can create agents capable of performing specific tasks or interactions based on user input.
@@ -15,7 +14,6 @@ Building upon its foundational capabilities, Microsoft Semantic Kernel facilitat
- Filtering: Developers can use filters to monitor the application, control function invocation or implement Responsible AI.
- Prompt Templates: Developers can define their prompts using various template languages, including Handlebars and Liquid, or the built-in Semantic Kernel format.


## What is/are Microsoft Semantic Kernel’s intended use(s)?
The intended uses of Microsoft Semantic Kernel include:
- Production Ready Applications: Building small to large enterprise-scale solutions that can leverage advanced AI model capabilities.
@@ -51,7 +49,6 @@ Operational factors and settings for optimal use include:
- Real-Time Monitoring: System behavior should be regularly monitored to detect unexpected patterns or malfunctions promptly.
- Incorporate RAI and safety tools like Prompt Shield with filters to ensure responsible use.


### Plugins and Extensibility

#### What are plugins and how does Microsoft Semantic Kernel use them?
@@ -68,3 +65,14 @@ Potential issues that may arise include:
- Invocation Failures: Incorrectly triggered plugins can result in unexpected outputs.
- Output Misinformation: Errors in plugin handling can lead to generation of inaccurate or misleading results.
- Dependency Compatibility: Changes in external dependencies may affect plugin functionality. To prevent these issues, users are advised to keep plugins updated and to rigorously test their implementations for stability and accuracy.

#### When working with AI, the developer can enable content moderation in the AI platforms used, and has complete control over the prompts being used, including the ability to define responsible boundaries and guidelines. For instance:
- When using Azure OpenAI, the service includes by default a content filtering system that works alongside the core models. This system runs both the prompt and the completion through an ensemble of classification models aimed at detecting and preventing the output of harmful content. In addition to content filtering, the Azure OpenAI Service performs monitoring to detect content and/or behaviors that suggest use of the service in a manner that might violate applicable product terms. The filter configuration can be adjusted, for example to also block "low severity level" content. See here for more information.
- The developer can integrate Azure AI Content Safety to detect harmful user-generated and AI-generated content, including text and images. The service includes an interactive Studio online tool with templates and customized workflows. See here for more information.
- When using OpenAI, the developer can integrate the OpenAI Moderation endpoint to identify problematic content and take action, for instance by filtering it (see the sketch after this list). See here for more information.
- Other AI providers offer content moderation and moderation APIs, which developers can integrate with Node Engine.
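
A minimal sketch of the OpenAI Moderation flow referenced above, assuming the `openai` Python package; the handling of flagged content is illustrative only:

```python
# Illustrative only: screen text with the OpenAI Moderation endpoint before
# forwarding it to a model. Assumes the `openai` package and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

def is_flagged(text: str) -> bool:
    """Return True if the moderation endpoint flags the text."""
    response = client.moderations.create(input=text)
    return response.results[0].flagged

if is_flagged("example user input"):
    # Block, rewrite, or escalate to a human reviewer instead of calling the model.
    print("Content flagged by moderation; request not forwarded.")
```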

#### If a sequence of components is run, additional risks or failures may arise from non-deterministic behavior. To mitigate this, developers can (see the sketch below):
- Implement safety measures and bounds on each component to prevent undesired outcomes.
- Surface intermediate output to the user to maintain control and awareness of the system's state.
- In multi-agent scenarios, build in checkpoints that prompt the user for a response, ensuring user involvement and reducing the likelihood of undesired results due to multi-agent looping.
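
A minimal sketch of the bounded-loop mitigation above, in plain Python with no Semantic Kernel APIs assumed; names are illustrative:

```python
# Illustrative only: bound the number of steps and keep the user in the loop
# so a non-deterministic chain of components cannot run away unattended.
MAX_STEPS = 5

def run_pipeline(components, state):
    """Run callables in sequence with a step cap and a user checkpoint."""
    for step, component in enumerate(components[:MAX_STEPS], start=1):
        state = component(state)                        # bounded per-step work
        print(f"step {step}: {state}")                  # surface output to the user
        if input("Continue? [y/N] ").strip().lower() != "y":
            break                                       # user stays in control
    return state

# Example: run_pipeline([str.strip, str.upper], "  hello  ")
```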
2 changes: 1 addition & 1 deletion docs/PLANNERS.md
@@ -2,4 +2,4 @@

This document has been moved to the Semantic Kernel Documentation site. You can find it by navigating to the [Automatically orchestrate AI with planner](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/planner) page.

To make an update on the page, file a PR on the [docs repo.](https://github.com/MicrosoftDocs/semantic-kernel-docs/blob/main/semantic-kernel/ai-orchestration/planner.md)
To make an update on the page, file a PR on the [docs repo.](https://github.com/MicrosoftDocs/semantic-kernel-docs/blob/main/semantic-kernel/concepts/planning.md)
2 changes: 1 addition & 1 deletion docs/decisions/0034-rag-in-sk.md
@@ -143,7 +143,7 @@ var result = await kernel.InvokePromptAsync("{{budgetByYear}} What is my budget
This approach is similar to Option 1, but the data search step is part of the prompt rendering process. The following list contains possible plugins to use for data search:

- [ChatGPT Retrieval Plugin](https://github.com/openai/chatgpt-retrieval-plugin) - this plugin should be hosted as a separate service. It has integration with various [vector databases](https://github.com/openai/chatgpt-retrieval-plugin?tab=readme-ov-file#choosing-a-vector-database).
- [SemanticKernel.Plugins.Memory.TextMemoryPlugin](https://www.nuget.org/packages/Microsoft.SemanticKernel.Plugins.Memory) - Semantic Kernel solution, which supports various [vector databases](https://learn.microsoft.com/en-us/semantic-kernel/memories/vector-db#available-connectors-to-vector-databases).
- [SemanticKernel.Plugins.Memory.TextMemoryPlugin](https://www.nuget.org/packages/Microsoft.SemanticKernel.Plugins.Memory) - Semantic Kernel solution, which supports various vector databases.
- Custom user plugin.

ChatGPT Retrieval Plugin:
