fix: add Databricks to delta file_format condition #2156
themavik wants to merge 2 commits into elementary-data:master from
Conversation
themavik left a comment:
Reviewed the changes; the implementation looks correct and addresses the reported issue well.
Made with Cursor
Added a test for the Databricks target type.
Summary
Fixes #2154.
Root cause: `dbt_project.yml` only checks for the `spark` and `fabricspark` adapter types when setting `file_format` to `delta`, but Databricks uses `target.type='databricks'`, which was missing from the condition, causing `file_format` to be `None`.

Fix: Added `'databricks'` to the Jinja condition list.

Changes
- `elementary/monitor/dbt_project/dbt_project.yml`: Added `'databricks'` to the `target.type` check for the delta file format.

Testing
- Added a test for the Databricks target type (`delta` file format).

Made with Cursor
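The condition at issue can be sketched roughly as follows (a hypothetical minimal excerpt; the exact config keys and model paths in elementary's `dbt_project.yml` may differ):

```yaml
# Hypothetical sketch of the file_format condition in dbt_project.yml.
# Before this PR, 'databricks' was absent from the list, so Databricks
# targets fell through to the else branch and file_format became None.
models:
  elementary:
    +file_format: "{{ 'delta' if target.type in ['spark', 'fabricspark', 'databricks'] else none }}"
```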
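The behavior the new test exercises can be illustrated in plain Python (a standalone sketch, not elementary's actual test code; `resolve_file_format` is a hypothetical stand-in for the Jinja condition):

```python
def resolve_file_format(target_type):
    # Mirrors the dbt_project.yml condition: Spark-family adapters
    # (including Databricks, after this fix) use the delta file format;
    # all other adapter types get no file_format at all.
    if target_type in ("spark", "fabricspark", "databricks"):
        return "delta"
    return None

# Databricks now resolves to delta instead of None.
assert resolve_file_format("databricks") == "delta"
assert resolve_file_format("spark") == "delta"
assert resolve_file_format("snowflake") is None
```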