dbt_fivetran_log

Data models for Fivetran's internal log connector built using dbt.


Fivetran Platform dbt Package (Docs)

What does this dbt package do?

  • Generates a comprehensive data dictionary of your Fivetran Platform connector (previously called Fivetran Log) data via the dbt docs site

  • Produces staging models in the format described by this ERD. These models clean, test, and prepare your Fivetran data from our free Fivetran Platform connector and feed into analysis-ready end models.

  • The above-mentioned models enable you to better understand how you are spending money in Fivetran under our consumption-based pricing model and provide details about the performance and status of your Fivetran connectors. This is achieved by:

    • Displaying consumption data at the table, connector, destination, and account levels
    • Providing a history of measured free and paid monthly active rows (MAR), credit consumption, and the relationship between the two
    • Creating a history of vital daily events for each connector
    • Surfacing an audit log of records inserted, deleted, and updated in each table during connector syncs
    • Keeping an audit log of user-triggered actions across your Fivetran instance

Refer to the table below for a detailed view of all tables materialized by default within this package. Additionally, check out our docs site for more details about these tables.

Tables

  • fivetran_platform__connector_status: Each record represents a connector loading data into a destination, enriched with data about the connector's data sync status.
  • fivetran_platform__mar_table_history: Each record represents a table's free, paid, and total volume for a month, complete with data about its connector and destination.
  • fivetran_platform__usage_mar_destination_history: Each record represents a destination's usage and active volume for a month, including the usage per million MAR and MAR per usage. Usage refers to either a dollar or credit amount, depending on the customer's pricing model. Read more about the relationship between usage and MAR here.
  • fivetran_platform__connector_daily_events: Each record represents a daily measurement of the API calls, schema changes, and record modifications made by a connector, starting from the date on which the connector was set up.
  • fivetran_platform__schema_changelog: Each record represents a schema change (altering/creating tables, creating schemas, and changing schema configurations) made to a connector and contains detailed information about the schema change event.
  • fivetran_platform__audit_table: Replaces the deprecated _fivetran_audit table. Each record represents a table being written to during a connector sync and contains timestamps related to the connector- and table-level sync progress, along with the sum of records inserted/replaced, updated, and deleted in the table.
  • fivetran_platform__audit_user_activity: Each record represents a user-triggered action in your Fivetran instance. This table is intended for audit-trail purposes, as it can be very helpful when trying to trace a user action to a log event such as a schema change, sync frequency update, manual update, broken connection, etc.

How do I use the dbt package?

Step 1: Pre-Requisites

  • Connector: Have the Fivetran Platform connector syncing data into your warehouse.
  • Database support: This package has been tested on BigQuery, Snowflake, Redshift, Postgres, Databricks, and SQL Server. Ensure you are using one of these supported databases.

Databricks Dispatch Configuration

If you are using a Databricks destination with this package, you will need to add the following (or a variation of the following) dispatch configuration to your dbt_project.yml. This is required so that the package searches for macros in the dbt-labs/spark_utils package before falling back to the dbt-labs/dbt_utils package.

dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']

Database Incremental Strategies

Models in this package that are materialized incrementally are configured to work with the different incremental strategies available to each supported warehouse.

For BigQuery and Databricks All Purpose Cluster runtime destinations, we have chosen insert_overwrite as the default strategy, which benefits from the partitioning capability.

For Databricks SQL Warehouse destinations, models are materialized as tables without support for incremental runs.

For Snowflake, Redshift, and Postgres databases, we have chosen delete+insert as the default strategy.

Regardless of strategy, we recommend that users periodically run a --full-refresh to ensure a high level of data quality.

Step 2: Installing the Package

Include the following Fivetran Platform package version range in your packages.yml file:

Check dbt Hub for the latest installation instructions, or read the dbt docs for more information on installing packages.

packages:
  - package: fivetran/fivetran_log
    version: [">=1.9.0", "<1.10.0"]

Note that although the source connector is now "Fivetran Platform", the package retains the old name of "fivetran_log".

Step 3: Define Database and Schema Variables

By default, this package will run using your target database and the fivetran_log schema. If this is not where your Fivetran Platform data lives (perhaps your Fivetran Platform schema is named fivetran_platform), add the following configuration to your root dbt_project.yml file:

vars:
    fivetran_platform_database: your_database_name # default is your target.database
    fivetran_platform_schema: your_schema_name # default is fivetran_log

Step 4: Disable Models for Non-Existent Sources

If you do not leverage Fivetran RBAC, you will not have the user or destination_membership sources. It's also possible that you do not have these sources synced at all. To disable the corresponding functionality in the package, you must add the following variable(s) to your root dbt_project.yml file. By default, all variables are assumed to be true:

vars:
    fivetran_platform_using_destination_membership: false # this will disable only the destination membership logic
    fivetran_platform_using_user: false # this will disable only the user logic

(Optional) Step 5: Additional Configurations

Change the Build Schema

By default, this package will build the Fivetran Platform staging models within a schema titled <target_schema> + _stg_fivetran_platform and the Fivetran Platform final models within a schema titled <target_schema> + _fivetran_platform in your target database. If this is not where you would like your Fivetran Platform staging and final models to be written, add the following configuration to your root dbt_project.yml file:

models:
  fivetran_log:
    +schema: my_new_final_models_schema # leave blank for just the target_schema
    staging:
      +schema: my_new_staging_models_schema # leave blank for just the target_schema

Change the Source Table References

If an individual source table has a different name than the package expects (see this project's dbt_project.yml variable declarations for the expected names), provide the name of the table as it appears in your warehouse to the respective variable, as identified below:

vars:
    fivetran_platform_<default_table_name>_identifier: your_table_name 
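
For example, assuming it is the log source table that was synced under a non-default name (the table name below is purely hypothetical), the override following the pattern above would be:

vars:
    fivetran_platform_log_identifier: my_renamed_log_table # hypothetical name of the log table in your warehouse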

(Optional) Step 6: Orchestrate your models with Fivetran Transformations for dbt Core™

Fivetran offers the ability for you to orchestrate your dbt project through Fivetran Transformations for dbt Core™. Refer to the linked docs for more information on how to set up your project for orchestration through Fivetran.
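
As a rough, non-authoritative sketch: orchestration through Fivetran Transformations for dbt Core is typically driven by a deployment.yml file committed alongside your dbt project, pairing a cron schedule with the dbt commands to run. The job names, schedules, and field layout below are illustrative assumptions; defer to the linked Fivetran docs for the exact supported format.

jobs:
  - name: daily_run # hypothetical job name
    schedule: '0 12 * * *' # placeholder cron schedule
    steps:
      - name: run models
        command: dbt run
      - name: test models
        command: dbt test
  - name: weekly_full_refresh # hypothetical job implementing the periodic --full-refresh recommendation above
    schedule: '0 12 * * 0'
    steps:
      - name: full refresh package models
        command: dbt run --full-refresh --select fivetran_log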

Does this package have dependencies?

This dbt package depends on the following dbt packages, which are installed by default with this package. For more information on the packages below, refer to the dbt hub site.

If you have any of these dependent packages in your own packages.yml file, we highly recommend that you remove them to ensure there are no package version conflicts.

packages:
    - package: fivetran/fivetran_utils
      version: [">=0.4.0", "<0.5.0"]

    - package: dbt-labs/dbt_utils
      version: [">=1.0.0", "<2.0.0"]

    - package: dbt-labs/spark_utils
      version: [">=0.3.0", "<0.4.0"]

    - package: calogica/dbt_date
      version: [">=0.9.0", "<1.0.0"]

How is this package maintained and can I contribute?

Package Maintenance

The Fivetran team maintaining this package only maintains the latest version of the package. We highly recommend you stay consistent with the latest version of the package and refer to the CHANGELOG and release notes for more information on changes across versions.

Contributions

These dbt packages are developed by a small team of analytics engineers at Fivetran. However, the packages are made better by community contributions.

We highly encourage and welcome contributions to this package. Check out this post on the best workflow for contributing to a package.

Are there any resources available?

  • If you have any questions or want to reach out for help, see the GitHub Issues section to find the right avenue of support for you.
  • If you would like to provide feedback to the dbt package team at Fivetran, or would like to request a future dbt package to be developed, then feel free to fill out our Feedback Form.