3 changes: 2 additions & 1 deletion .gitignore
@@ -2,6 +2,7 @@
target/
dbt_modules/
logs/
+dbt_internal_packages/
venv/
dbt_packages/
-test_run_models.sh
+test_run_models.sh
12 changes: 6 additions & 6 deletions README.md
@@ -163,7 +163,7 @@ BigQuery, Snowflake, Redshift, and Postgres. By default a comma `,` is used as a
* `field_to_agg` (required): Field within the table that you wish to aggregate.

----
-### ceiling ([source](macros/ceiling.sql))
+### ceiling ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/ceiling.sql))
This macro allows for cross database use of the ceiling function. The ceiling function returns the smallest integer greater
than or equal to the specified numeric expression. The ceiling macro is compatible with BigQuery, Redshift, Postgres, and Snowflake.
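For illustration, a hedged sketch of how the macro might be called inside a dbt model (the single-argument invocation style and the `stg_orders` model name are assumptions, not taken from this PR):

```sql
-- Hypothetical usage inside a dbt model; model name is illustrative only.
select
    {{ fivetran_utils.ceiling("tax_rate * 100") }} as tax_rate_ceiling
from {{ ref('stg_orders') }}
```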

@@ -187,7 +187,7 @@ This macro extracts a url parameter from a column containing a url. It is an exp
* `url_parameter` (required): The parameter you want to extract.

----
-### first_value ([source](macros/first_value.sql))
+### first_value ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/first_value.sql))
This macro returns the value_expression for the first row in the current window frame with cross db functionality. This macro ignores null values. The default first_value calculation within the macro is the `first_value` function. The Redshift first_value calculation is the `first_value` function, with the inclusion of a frame_clause `{{ partition_field }} rows unbounded preceding`.

**Usage:**
@@ -214,7 +214,7 @@ The data is returned by the path you provide as the argument. The json_extract m
* `string_path` (required): Name of the path in the json object which you want to extract the data from.

----
-### json_parse ([source](macros/json_parse.sql))
+### json_parse ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/json_parse.sql))
This macro allows for cross database use of the json extract function, specifically used to parse and extract a nested value from a json object.
The data is returned by the path you provide as the list within the `string_path` argument. The json_parse macro is compatible with BigQuery, Redshift, Postgres, Snowflake and Databricks.
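As a hedged sketch of an invocation (the `string` argument name, the field names, and the model name are assumptions based on the description above):

```sql
-- Hypothetical usage: extract receipt.totalPrice.value from a json column.
select
    {{ fivetran_utils.json_parse(string="receipt", string_path=["totalPrice", "value"]) }} as total_price
from {{ ref('stg_payments') }}
```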

@@ -266,7 +266,7 @@ This macro builds off of the `json_extract` macro in order to extract a list of
* `list_of_properties` (required): List of the fields that you want to extract from the json object and pivot out into columns.

----
-### string_agg ([source](macros/string_agg.sql))
+### string_agg ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/string_agg.sql))
This macro allows for cross database field aggregation and delimiter customization. Supported database specific field aggregation functions include
BigQuery, Snowflake, Redshift, Postgres, and Spark.

@@ -278,7 +278,7 @@ BigQuery, Snowflake, Redshift, Postgres, and Spark.
* `field_to_agg` (required): Field within the table that you wish to aggregate.
* `delimiter` (required): Character you want to be used as the delimiter between aggregates.
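Using both required arguments, a hedged example invocation might look like the following (field and model names are illustrative assumptions):

```sql
-- Hypothetical usage: aggregate product names per order into one delimited string.
select
    order_id,
    {{ fivetran_utils.string_agg(field_to_agg="product_name", delimiter="', '") }} as products
from {{ ref('stg_order_items') }}
group by 1
```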
----
-### timestamp_add ([source](macros/timestamp_add.sql))
+### timestamp_add ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/timestamp_add.sql))
This macro allows for cross database addition of a timestamp field and a specified datepart and interval for BigQuery, Redshift, Postgres, and Snowflake.

**Usage:**
@@ -291,7 +291,7 @@ This macro allows for cross database addition of a timestamp field and a specifi
* `from_timestamp` (required): The timestamp field to which you are adding the datepart and interval.
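A hedged sketch of an invocation (the `datepart` and `interval` argument names are assumed from the description; field and model names are illustrative):

```sql
-- Hypothetical usage: add 7 days to a timestamp column.
select
    {{ fivetran_utils.timestamp_add(datepart="day", interval="7", from_timestamp="created_at") }} as expires_at
from {{ ref('stg_subscriptions') }}
```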

----
-### timestamp_diff ([source](macros/timestamp_diff.sql))
+### timestamp_diff ([source](https://github.com/fivetran/dbt_fivetran_utils/blob/master/macros/timestamp_diff.sql))
This macro allows for cross database timestamp difference calculation for BigQuery, Redshift, Postgres, and Snowflake.

**Usage:**
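The original usage block is collapsed in this diff view; as a hedged sketch only (the positional argument order of first timestamp, second timestamp, datepart is an assumption, as are the field and model names):

```sql
-- Hypothetical usage: hours elapsed between two timestamp columns.
select
    {{ fivetran_utils.timestamp_diff("created_at", "closed_at", "hour") }} as hours_to_close
from {{ ref('stg_tickets') }}
```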
21 changes: 19 additions & 2 deletions dbt_project.yml
@@ -1,4 +1,21 @@
name: 'fivetran_utils'
-version: '0.4.10'
+version: '0.5.0'
profile: 'default'
config-version: 2
-require-dbt-version: [">=1.3.0", "<2.0.0"]
+require-dbt-version: [">=1.10.6"]
> **@dataders** (PR author, Aug 8, 2025): @b-per maybe we still need a new version to remove the upper bound so that fusion works?
>
> **Suggested change:** `require-dbt-version: [">=1.10.6"]` → `require-dbt-version: [">=1.3.0"]`

> **Reply:** Fusion doesn't look at those today (but might in the future)

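The two pins under discussion differ only in the lower bound; both drop the old `"<2.0.0"` upper bound so that newer dbt releases can install the package. A hedged sketch of the more permissive variant (this is the reviewer's suggestion, not the merged state):

```yaml
# Open-ended lower bound, no upper bound, per the review suggestion above.
require-dbt-version: [">=1.3.0"]
```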
model-paths: ["models"]
analysis-paths: ["analysis"]
test-paths: ["tests"]
seed-paths: ["data"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]


vars:
  fivetran_utils:
    dbt_utils_dispatch_list:
      - fivetran_utils
> **@dataders** (PR author), on lines +15 to +18 — suggested change: delete the `vars:` / `fivetran_utils:` / `dbt_utils_dispatch_list:` block above.

flags:
  require_generic_test_arguments_property: true
4 changes: 3 additions & 1 deletion integration_tests/dbt_project.yml
@@ -3,6 +3,8 @@ version: '0.4.10'
config-version: 2
profile: 'integration_tests'

+flags:
+  require_generic_test_arguments_property: true
clean-targets:
- "target"
- "dbt_packages"
@@ -11,4 +13,4 @@ dispatch:
- macro_namespace: fivetran_utils
  search_order: ['spark_utils', 'fivetran_utils']
- macro_namespace: dbt_utils
-  search_order: ['spark_utils', 'fivetran_utils', 'dbt_utils']
+  search_order: ['spark_utils', 'fivetran_utils', 'dbt_utils']