Incorporating the Docker Compose stack formerly known as Everything Bagel.
This sample repository contains a collection of notebooks, Dockerized applications, and code snippets that demonstrate how to use lakeFS.
lakeFS is a popular open-source data version control system. It provides a consistent and scalable data management layer on top of cloud storage, such as Amazon S3, Azure Blob Storage, or Google Cloud Storage. It lets users create and manage data in a version-controlled and immutable manner, and offers features such as data governance, data lineage, and data access controls. lakeFS is compatible with a wide range of data processing frameworks and tools.
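For a taste of the Git-like workflow the sample notebooks walk through, here is a minimal sketch using the high-level `lakefs` Python SDK. Everything in it is illustrative: the `example-repo` repository and branch names are hypothetical, and credentials are assumed to be configured already (for example via `~/.lakectl.yaml` or `LAKECTL_*` environment variables).

```python
import lakefs

# A minimal, illustrative sketch with the high-level `lakefs` Python SDK.
# Assumes credentials are already configured (e.g. ~/.lakectl.yaml or
# LAKECTL_* environment variables) and that "example-repo" already exists.
repo = lakefs.repository("example-repo")

# Branch from main, write an object, and commit -- an isolated, Git-like
# workflow over the objects in your data lake.
branch = repo.branch("experiment-1").create(source_reference="main")
branch.object("datasets/hello.txt").upload(data="hello lakeFS")
branch.commit(message="Add hello.txt on an isolated branch")

# Merge the experiment back into main once the data looks good.
branch.merge_into(repo.branch("main"))
```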
Go to the `lakefs_enterprise` folder if you want to use lakeFS Enterprise instead of lakeFS open source.
Clone this repository:

```bash
git clone https://github.com/treeverse/lakeFS-samples.git
cd lakeFS-samples
```
You now have two options:
If you have already installed lakeFS or are using lakeFS Cloud, all you need to run is the Jupyter notebook server:

```bash
docker compose up
```
Once the stack's up and running, open the Jupyter Notebook (http://localhost:8888) and check out the catalog of sample notebooks to explore lakeFS.
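From a running notebook you can sanity-check connectivity to your lakeFS installation; since lakeFS exposes an S3-compatible gateway, plain boto3 works. A minimal sketch, with the endpoint and keys as placeholders for your own installation:

```python
import boto3

# lakeFS exposes an S3-compatible gateway, so boto3 can talk to it directly.
# The endpoint URL and keys below are placeholders -- substitute your own.
s3 = boto3.client(
    "s3",
    endpoint_url="https://lakefs.example.com",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# Through the gateway, each lakeFS repository appears as a bucket.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```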
Once you've finished, run the following to remove all the containers:
```bash
docker compose down
```
If you want to provision a lakeFS server, as well as MinIO for your object store, plus Jupyter, then bring up the full stack:

```bash
docker compose --profile local-lakefs up
```
As above, open the Jupyter Notebook (http://localhost:8888) and peruse the catalog of sample notebooks to explore lakeFS.
- Jupyter Notebook is based on the Jupyter PySpark notebook and provides an interactive environment in which to explore lakeFS using Python and PySpark.
- lakeFS can be provisioned as part of this environment, or provided by lakeFS Cloud or your own installation.
- If you run lakeFS as part of this environment, MinIO is provided as an S3-compatible object store. If you run lakeFS yourself, you can use other S3-compatible object stores, including Amazon S3, GCS, and MinIO.
- Jupyter http://localhost:8888/
If you've brought up the full stack you'll also have:
- lakeFS http://localhost:8000/ (access key ID `AKIAIOSFOLKFSSAMPLES`, secret access key `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY`)
- MinIO http://localhost:9001/ (username `minioadmin`, password `minioadmin`)
- Spark UI http://localhost:4040/
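From inside the full stack, a notebook can point Spark at lakeFS through its S3 gateway using the credentials above. A minimal sketch, assuming the Compose-internal hostname `lakefs` and a hypothetical repository named `example-repo`:

```python
from pyspark.sql import SparkSession

# Configure Spark's S3A connector to talk to the local lakeFS server using
# the sample credentials above. The `lakefs` hostname and the `example-repo`
# repository are assumptions, not fixed names in this stack.
spark = (
    SparkSession.builder.appName("lakefs-sample")
    .config("spark.hadoop.fs.s3a.endpoint", "http://lakefs:8000")
    .config("spark.hadoop.fs.s3a.access.key", "AKIAIOSFOLKFSSAMPLES")
    .config("spark.hadoop.fs.s3a.secret.key", "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

# Paths take the form s3a://<repository>/<branch>/<object path>
df = spark.read.option("header", "true").csv("s3a://example-repo/main/data/example.csv")
df.show()
```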
Under the `standalone_examples` folder is a set of examples that need to be run on their own. Some use the repository's Docker Compose file and extend it, while others are self-contained and use their own Dockerfile.
- Airflow (1) - Six examples of using lakeFS with Airflow (see the DAG sketch after this list):
- Versioning DAGs and running a pipeline from hooks using a configurable version of the DAGs
- Isolating Airflow job run and atomic promotion to production
- Integration of lakeFS with Airflow via Hooks
- Troubleshooting production issues
- Integration of lakeFS with Airflow and Databricks
- Integration of lakeFS with Airflow and Iceberg
- Airflow (2) - lakeFS + Airflow
- Azure Databricks
- AWS Databricks
- Databricks CI/CD
- AWS Glue and Athena
- AWS Glue and Trino
- AWS Glue and Iceberg
- lakeFS + Dagster
- lakeFS + Prefect
- Image Segmentation Demo: ML Data Version Control and Reproducibility at Scale
- Labelbox integration
- Kafka integration
- Flink integration
- Red Hat OpenShift AI integration
- How to migrate or clone a repo
- Running lakeFS with PostgreSQL as K/V store
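As a flavor of the Airflow patterns listed above, here is a minimal, hypothetical DAG sketch that isolates a run on its own lakeFS branch and promotes it on success, using a plain `PythonOperator` with the `lakefs` SDK. The repository, branch, and DAG names are all illustrative, and credentials are assumed to be configured on the Airflow worker:

```python
from datetime import datetime

import lakefs
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical, illustrative names throughout; lakeFS credentials are assumed
# to be configured for the SDK (e.g. LAKECTL_* env vars on the Airflow worker).

def create_branch():
    # Run the pipeline against an isolated branch instead of production data.
    lakefs.repository("example-repo").branch("airflow-run").create(
        source_reference="main", exist_ok=True
    )

def promote_to_production():
    # Atomically promote the run's output by merging back into main.
    repo = lakefs.repository("example-repo")
    repo.branch("airflow-run").merge_into(repo.branch("main"))

with DAG(dag_id="lakefs_isolated_run", start_date=datetime(2024, 1, 1), schedule=None):
    create = PythonOperator(task_id="create_branch", python_callable=create_branch)
    promote = PythonOperator(task_id="promote", python_callable=promote_to_production)
    create >> promote
```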
👉🏻 Join the lakeFS Slack group - https://lakefs.io/slack