The goal of this integration is to use the Lakehouse Monitoring tables as the source of the LLM requests/responses (instead of Jaeger), and then create a new custom monitor metric table (for all the custom metrics that Pythia produces) as defined here
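As a rough sketch of the metric-table idea, the snippet below derives per-row custom metrics from a request/response table. The column names (`request`, `response`) and the toy metrics are assumptions for illustration only; the real metrics would come from Pythia's validators, and the source would be the Lakehouse Monitoring tables rather than an in-memory DataFrame.

```python
import pandas as pd

# Stand-in for rows read from a Lakehouse Monitoring / inference table.
# Column names are assumptions, not the actual monitoring schema.
rows = pd.DataFrame({
    "request": ["What is 2+2?", "Summarize the doc."],
    "response": ["4", "I cannot help with that."],
})

def compute_metrics(df: pd.DataFrame) -> pd.DataFrame:
    """Produce one custom-metric row per request/response pair."""
    out = pd.DataFrame(index=df.index)
    # Toy metrics standing in for Pythia's real validator outputs.
    out["response_chars"] = df["response"].str.len()
    out["refusal_flag"] = df["response"].str.contains("cannot", case=False)
    return out

metrics = compute_metrics(rows)
print(metrics["refusal_flag"].tolist())  # → [False, True]
```

In the actual integration this table would be written back to the catalog as the custom monitor metric table that the requirements below refer to.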
Requirements
2. For Pythia's LLM we should use a Databricks-hosted LLM for our purposes, not OpenAI
3. This should, in theory, allow users to create SQL alerts within Databricks for the custom Pythia metrics
4. Finally, we also need to package this solution so it can be installed from the marketplace, and build an example notebook that demonstrates how to use it, like this listing:
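For requirement 3, the alert a user would configure is just a SQL condition over the custom metric table. The sketch below exercises such a condition locally, with SQLite standing in for the SQL warehouse; the table name (`pythia_custom_metrics`) and columns are assumptions about the eventual schema.

```python
import sqlite3

# SQLite stands in for the Databricks SQL warehouse so the alert
# condition can be exercised locally. Schema names are assumptions.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE pythia_custom_metrics (ts TEXT, hallucination_score REAL)"
)
con.executemany(
    "INSERT INTO pythia_custom_metrics VALUES (?, ?)",
    [("2024-01-01", 0.2), ("2024-01-01", 0.9), ("2024-01-02", 0.1)],
)

# The alert condition: flag any window whose average score breaches 0.5.
breached = con.execute(
    """
    SELECT ts, AVG(hallucination_score) AS avg_score
    FROM pythia_custom_metrics
    GROUP BY ts
    HAVING avg_score > 0.5
    """
).fetchall()
print(breached)  # one breached window: 2024-01-01
```

In Databricks, the same SELECT would back a SQL alert that fires whenever the query returns rows.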
Open Questions
Do we need to use the inference tables for the LLMs being served?
Which validators do we need to enable for this effort?