feat: add notebook for BQML remote endpoint Blog #1704

Merged · 12 commits · Feb 7, 2025
2 changes: 2 additions & 0 deletions .github/actions/spelling/allow.txt
@@ -829,6 +829,7 @@ loghub
logparser
logprobs
lolcat
loras
lparam
lru
lsb
@@ -927,6 +928,7 @@ openai
openfda
opsz
osm
oss
osx
outdir
outro
1 change: 1 addition & 0 deletions open-models/README.md
@@ -21,5 +21,6 @@ This repository contains examples for deploying and fine-tuning open source mode

### Use cases

- [use-cases/bigquery_ml_llama_inference.ipynb](./use-cases/bigquery_ml_llama_inference.ipynb) - This notebook showcases a simple end-to-end process for extracting entities and performing data analytics using BigQuery in conjunction with an open-source text-generation Large Language Model (LLM). We use Meta's Llama 3.3 70B model as an example.
- [use-cases/cloud_run_ollama_gemma2_rag_qa.ipynb](./use-cases/cloud_run_ollama_gemma2_rag_qa.ipynb) - This notebook provides steps and code to deploy an open-source RAG pipeline to Cloud Run using Ollama and the Gemma 2 model.
- [use-cases/guess_app.ipynb](./use-cases/guess_app.ipynb) - This notebook shows how to build a "Guess Who or What" app using FLUX and Gemini.
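As a rough illustration of the BigQuery ML inference pattern the first notebook covers, the query below shows the general shape of an `ML.GENERATE_TEXT` call against a remote model. This is a minimal sketch, not the notebook's actual code: the project, dataset, model, table, and column names are placeholders, and the option values are examples.

```python
# Sketch of building an ML.GENERATE_TEXT query for a BigQuery ML remote model.
# All identifiers below (my_dataset.llama_model, my_dataset.reviews, review_text)
# are hypothetical placeholders, not names from the notebook.
def generate_text_sql(model: str, source_table: str, prompt_column: str) -> str:
    """Build a query that sends each row's prompt column to the remote model."""
    return f"""
SELECT ml_generate_text_result
FROM ML.GENERATE_TEXT(
  MODEL `{model}`,
  (SELECT {prompt_column} AS prompt FROM `{source_table}`),
  STRUCT(256 AS max_output_tokens, 0.2 AS temperature)
)""".strip()

sql = generate_text_sql("my_dataset.llama_model", "my_dataset.reviews", "review_text")
print(sql)
```

In the notebook itself a query like this would be executed with the `google-cloud-bigquery` client (e.g. `client.query(sql).result()`) after the remote model has been created over a deployed endpoint.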