Databricks Asset Bundles are the recommended way to promote files, notebooks, jobs, and cluster configurations to the next workspace. Typically you go from dev to test to QA, then perhaps pre-prod, and finally prod. But how do you keep the right code in each environment?
https://github.com/clintgrove/databricks-devops
Enter Databricks Asset Bundles and DevOps!
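A minimal bundle workflow from the command line might look like the sketch below; the `dev` and `prod` target names are assumptions and would be defined in your project's `databricks.yml`:

```shell
# Validate the bundle definition (databricks.yml) before deploying.
databricks bundle validate

# Deploy to the dev target; -t selects a target defined in databricks.yml.
databricks bundle deploy -t dev

# After testing, promote the same code by switching targets.
databricks bundle deploy -t prod
```

The same two commands run in a CI/CD pipeline, with the target chosen per environment.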
**Promote Notebooks to Databricks Using CI/CD (Azure DevOps & GitHub Actions)**
In this video, we walk through how to automate the promotion of your notebooks and Python files to a Databricks workspace using CI/CD tools like **Azure DevOps** and **GitHub Actions**.
You'll learn how to use the **Databricks CLI** (e.g., `databricks workspace import_dir`) to push files from your local repo to any desired folder in Databricks (`/Shared`, `/Users`, etc.). We also cover using the `--overwrite` flag to manage existing files.
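As a sketch, the import might look like this; the `./notebooks` source folder and `/Shared/demo` destination are placeholders. Note that the legacy Python CLI spells the command `import_dir`, while the newer unified CLI uses `import-dir`:

```shell
# Push everything under ./notebooks to /Shared/demo in the workspace.
# --overwrite replaces any files that already exist at the destination.
databricks workspace import-dir ./notebooks /Shared/demo --overwrite
```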
**What’s Included:**
- CI/CD pipeline setup using Azure DevOps and GitHub Actions
- Required SPN (Service Principal) and token setup
- Creating service connections in Azure DevOps
- Adding secrets and variables in both DevOps & GitHub
- Handling Unity Catalog and workspace permissions
- YAML files to automate the whole deployment!
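In the pipeline, the Databricks CLI typically authenticates as the SPN through environment variables populated from your DevOps/GitHub secrets. A sketch, assuming secrets named `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, and `AZURE_CLIENT_SECRET` and a placeholder workspace URL:

```shell
# Values come from pipeline secrets/variables, never hard-coded.
export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
export ARM_TENANT_ID="$AZURE_TENANT_ID"          # Entra ID tenant
export ARM_CLIENT_ID="$AZURE_CLIENT_ID"          # SPN application (client) ID
export ARM_CLIENT_SECRET="$AZURE_CLIENT_SECRET"  # secret from the App Registration

# The CLI picks these up automatically (azure-client-secret auth);
# a quick smoke test confirms the pipeline is authenticated.
databricks current-user me
```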
**Requirements:**
- Two Databricks workspaces (Dev & Prod style)
- Unity Catalog enabled
- Azure DevOps or GitHub with access to App Registrations
- Your Databricks Account ID
Clone/fork the repo and tailor it to your own project! https://github.com/clintgrove/databricks-devops
#Databricks #AzureDevOps #GitHubActions #CICD #DataEngineering #Python #Notebooks