Documentation | Sample Data Bundles
The Lakeflow Framework is a metadata-driven framework designed to:
- accelerate and simplify the deployment of Spark Declarative Pipelines, and support that deployment throughout your SDLC.
- support a wide variety of patterns across the medallion architecture for both batch and streaming workloads.
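To illustrate what "metadata-driven" means in practice, the sketch below generates table specifications from declarative metadata rows instead of hand-coding each table. The metadata schema (`source`, `target`, `layer`) and the `TableSpec`/`load_specs` names are invented for this example and are not the Framework's actual format:

```python
# Hypothetical sketch of a metadata-driven pipeline definition.
# The fields "source", "target", and "layer" are assumptions for
# illustration, NOT the Lakeflow Framework's real metadata schema.
from dataclasses import dataclass


@dataclass
class TableSpec:
    source: str
    target: str
    layer: str  # medallion layer: bronze, silver, or gold


def load_specs(metadata: list) -> list:
    """Turn raw metadata rows into validated table specifications."""
    allowed = {"bronze", "silver", "gold"}
    specs = []
    for row in metadata:
        if row["layer"] not in allowed:
            raise ValueError(f"unknown layer: {row['layer']}")
        specs.append(TableSpec(row["source"], row["target"], row["layer"]))
    return specs


# Example metadata: each row declares one table in the pipeline.
metadata = [
    {"source": "/landing/orders", "target": "orders_raw", "layer": "bronze"},
    {"source": "orders_raw", "target": "orders_clean", "layer": "silver"},
]

for spec in load_specs(metadata):
    print(f"{spec.layer}: {spec.source} -> {spec.target}")
```

In a real deployment the generated specs would drive the creation of Spark Declarative Pipeline tables; here they are simply printed to show the pattern.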
The Framework is designed for simplicity, performance, and alignment with the Databricks product roadmap, and in such a way that it remains easy to maintain and extend as the SDP product evolves.
Please refer to the documentation for further details and an explanation of the samples. Note that the documentation must be deployed as HTML or Markdown within your organization before it can be used.
Databricks support does not cover this content. For questions or bug reports, please open a GitHub issue and the team will help on a best-effort basis.
© 2025 Databricks, Inc. All rights reserved. The source in this notebook is provided subject to the Databricks License [https://databricks.com/db-license-source]. All included or referenced third party libraries are subject to the licenses set forth below.