AI/ML Online Endpoint Service Target #3718
Host is required. Project is optional; if set, it will be used as a base for any other paths referenced in the config section.
Some AI components like prompt flows can be deployed to ACA or AKS, but you wouldn't use this host for that. That would simply use a dockerfile and deploy like a normal containerized application. @vhvb1989 - I accidentally edited your post instead of replying to it... :(
Overall looking really good, @wbreza. Thanks for how cleanly this was inserted. A few questions that came to mind as I did my first pass.
Does seem a little strange that we don't include a …
My general thought here is that I'd drive the decision based on the UX we can give to users who have to edit this file by hand. Just my two cents...
When thinking about the sample azure.yaml file:

```yaml
config:
  # The name of the AI studio project / workspace
  workspace: ${AZUREML_AI_PROJECT_NAME}
  # Optional: Path to custom ML environment manifest
  environment:
    path: deployment/docker/environment.yml
  # Optional: Path to your prompt flow folder that contains the flow manifest
  flow:
    path: ./contoso-chat
  # Optional: Path to custom model manifest
  model:
    path: deployment/chat-model.yaml
    overrides:
      "properties.azureml.promptflow.source_flow_id": ${AZUREML_FLOW_NAME}
  # Required: Path to deployment manifest
  deployment:
    path: deployment/chat-deployment.yaml
    overrides:
      environment_variables.PRT_CONFIG_OVERRIDE: deployment.subscription_id=${AZURE_SUBSCRIPTION_ID},deployment.resource_group=${AZURE_RESOURCE_GROUP},deployment.workspace_name=${AZUREML_AI_PROJECT_NAME},deployment.endpoint_name=${AZUREML_ENDPOINT_NAME},deployment.deployment_name=${AZUREML_DEPLOYMENT_NAME}
```

I am wondering how much of this we could default? It seems like we "require" the overrides for both the model and deployment. Could we do work in … It feels like with some simple defaults we could get this to:

```yaml
config:
  # Optional: Path to custom ML environment manifest
  environment:
    path: deployment/docker/environment.yml
  # Optional: Path to your prompt flow folder that contains the flow manifest
  flow:
    path: ./contoso-chat
  # Optional: Path to custom model manifest
  model:
    path: deployment/chat-model.yaml
  # Required: Path to deployment manifest
  deployment:
    path: deployment/chat-deployment.yaml
```

And then if we promote the … I would like to make the …
I feel more confident about the long-term approach of using config as the general-purpose way to pass structured data (provided that, as we expand, we have enough information for the intelligence to not get bloated, confusing, or end up with missing info). The …
I actually would prefer if we moved to a generic …
@kristenwomack We will be iterating on the naming of the hosts, environment variables, etc. with the AI team. Seems like long term everything will be under the …
The overrides would only be used for advanced scenarios where there needs to be some dynamic configuration of the AI component that can't be statically set in the yaml definition. Much of the other configuration is purely optional if you are not using it. Workspace (aka project) is optional and will use the env var when not defined.
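To make the overrides idea concrete, here is a minimal sketch of how a dotted-path override like the one in the sample azure.yaml could be applied to a loaded manifest. This is not azd's actual implementation; the function name and manifest shape are hypothetical, assumed only for illustration:

```python
import os

def apply_override(manifest: dict, dotted_path: str, value: str) -> None:
    """Set a nested key in a manifest dict given a dotted path like
    'properties.azureml.promptflow.source_flow_id'.

    Illustrative only -- not azd's real override logic.
    """
    keys = dotted_path.split(".")
    node = manifest
    for key in keys[:-1]:
        # Create intermediate dicts as needed while walking the path
        node = node.setdefault(key, {})
    node[keys[-1]] = value

# Manifest loaded from yaml (shape is hypothetical)
manifest = {"properties": {"azureml": {}}}

# A ${AZUREML_FLOW_NAME}-style substitution would be resolved from the
# environment before the override is applied
flow_name = os.environ.get("AZUREML_FLOW_NAME", "my-flow")
apply_override(manifest, "properties.azureml.promptflow.source_flow_id", flow_name)
```

The key point is that an override patches a single nested value in an otherwise static yaml manifest, which is why it is only needed for dynamic, per-deployment configuration.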
Just wondering - I'm in a sort of bad state over here and ended up doing an …
Deployment Error Details: … Due to operator error, I am unsure if …
I confirmed that we are already purging these account types, so feel free to ignore the above.
My main comments are about implementation details, which can be revisited in the future.
Functionally speaking, LGTM
Adds the ability to quickly and easily deploy to an AI/ML studio online endpoint from `azd`.

- When a `config.flow` section is defined, `azd` will create a new prompt flow from the specified file path
- When a `config.environment` section is defined, `azd` will create a new environment version using the referenced yaml file definition
- When a `config.model` section is defined, `azd` will create a new model version using the referenced yaml file definition
- The `config.deployment` section is required and will create a new online deployment to the associated online endpoint from the referenced yaml file definition. `azd` waits for the deployment to enter a terminal provisioning state

Tip: Use the custom build of `azd` outlined in the comments below to try out this new feature!

Warning: This feature is a work in progress and is subject to change. Any and all feedback is welcome and appreciated.

New azd service host type: `ai.endpoint`

Important: Requires Python installed and available on `PATH`.
What can I deploy to AI online endpoints?

- Environments
- Models
- Prompt flows
- Endpoints
- Deployments

Requirements

The following resources are expected to be included within your deployed Azure resources:

- Resources tagged with the `azd-service-name` tag

Example azure.yaml
This example comes from a fork of the `contoso-chat` application. Reference the yaml schema from this branch for live IntelliSense within the `azure.yaml` file.

AI Endpoint Configuration

The configuration spec of the `config` section of services for `ai.endpoint`:

- workspace: The name of the AI studio project / workspace (supports env var substitutions)
- flow: Custom configuration for flows
- environment: Custom configuration for ML environments
- model: Custom configuration for ML models
- deployment: Custom configuration for online endpoint deployments
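As an illustrative sketch of how these fields fit together (the service name and paths here are hypothetical, modeled on the contoso-chat sample; the authoritative shape is the yaml schema referenced above):

```yaml
services:
  chat:
    # host selects the new AI/ML online endpoint service target
    host: ai.endpoint
    config:
      workspace: ${AZUREML_AI_PROJECT_NAME}
      flow:
        path: ./contoso-chat
      environment:
        path: deployment/docker/environment.yml
      model:
        path: deployment/chat-model.yaml
      deployment:
        path: deployment/chat-deployment.yaml
```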
Flow (flow)

Optional flow configuration section.

- name: Name of the flow (defaults to `<service-name>-flow-<timestamp>` if not specified)
- path: Relative path to a flow folder that contains the flow manifest
- overrides: Any custom overrides to apply to the flow

Note: Each call to `azd deploy` will create a new timestamped flow.

Environment (environment)
Optional environment configuration section.

- name: Name of the custom environment (defaults to `<service-name>-environment` if not specified)
- path: Relative path to a custom environment yaml manifest
- overrides: Any custom overrides to apply to the environment

Note: Each call to `azd deploy` will create a new environment version.

Model (model)
Optional model configuration section.

- name: Name of the custom model (defaults to `<service-name>-model` if not specified)
- path: Relative path to a custom model yaml manifest
- overrides: Any custom overrides to apply to the model

Note: Each call to `azd deploy` will create a new model version.

Deployment (deployment)
Required deployment configuration section.

- name: Name of the deployment (defaults to `<service-name>-deployment` if not specified)
- path: Relative path to a custom deployment yaml manifest
- environment: A map of key/value pairs to set environment variables for the deployment. Supports environment variable substitutions from OS/azd environment variables using `${VAR_NAME}` syntax.
- overrides: Any custom overrides to apply to the deployment

Note: Only managed online deployments are supported.
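For instance, a deployment section that injects environment variables via substitution might look like the following sketch (the variable names are illustrative, not required by azd):

```yaml
deployment:
  path: deployment/chat-deployment.yaml
  # Each entry becomes an environment variable on the online deployment;
  # ${VAR_NAME} values are substituted from OS/azd environment variables.
  environment:
    AZURE_OPENAI_ENDPOINT: ${AZURE_OPENAI_ENDPOINT}
    LOG_LEVEL: info
```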