Data-vault modeling in Rivery?

Hello,
I am new to rivery.io.

How well does Rivery support Data Vault modeling? How reliable is it?

I have tried creating a Data Vault model using Logic river steps, but I mostly had to type the SQL code myself, wrestle with the interface, and double-check where the data was being sourced from and loaded to. It felt like creating the Data Vault model directly in Snowflake would have been less time-consuming.

What is the best practice for Data Vault modeling? How would you approach it? Or is Rivery perhaps not the optimal option?

Any help and insight would be appreciated. Thank you

Hi @mammoth,
While we don’t have an existing kit for Data Vault modeling (e.g. like the one we have for handling slowly changing dimensions), we have seen different implementations of advanced data modeling in Rivery.

These include using Rivery as the orchestrator for modeling that is defined in the target warehouse (in your case Snowflake, but we have customers doing the same in BigQuery and others), using database procedures that are triggered from within Rivery’s logic rivers. The same approach has been used by triggering dedicated transformations in other modern tools, such as dbt jobs.
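To make that concrete, here is a minimal sketch of what the warehouse side might look like: a Snowflake Scripting procedure that loads a hub, which a SQL step in a Logic river would simply call. All object and column names (dv.hub_customer, stg.stg_customer, customer_id) are placeholders for your own model, not anything Rivery-specific.

```sql
-- Hypothetical hub-load procedure; names are placeholders for your own model.
CREATE OR REPLACE PROCEDURE load_hub_customer()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  -- Insert only business keys that are not yet in the hub
  INSERT INTO dv.hub_customer (hub_customer_hk, customer_id, load_dts, record_source)
  SELECT
      MD5(s.customer_id)      AS hub_customer_hk,
      s.customer_id,
      CURRENT_TIMESTAMP()     AS load_dts,
      'crm'                   AS record_source
  FROM stg.stg_customer s
  LEFT JOIN dv.hub_customer h
         ON h.hub_customer_hk = MD5(s.customer_id)
  WHERE h.hub_customer_hk IS NULL;

  RETURN 'hub_customer loaded';
END;
$$;

-- In a Rivery Logic river, a SQL step would then just run:
CALL load_hub_customer();
```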

As for doing it all in Rivery, in that case we see users reusing logic rivers to create modularity and a template-based approach to logic that needs to be repeated, for example calling another logic river to generate the SQL required to create hubs, links, etc. Here’s a quick example of how to use Logic steps to create a list to loop over in subsequent steps: Dynamically create and populate tables using a looping container
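The list that drives the loop can come from a simple query over a metadata table. This is only a sketch under the assumption that you keep such a table (dv.model_metadata here is a hypothetical name); the looping container then iterates over the rows it returns.

```sql
-- Hypothetical metadata table describing the entities to model.
-- A Logic step selects from it and feeds the result to a looping container.
SELECT entity_name, business_key_column, source_table
FROM dv.model_metadata
WHERE entity_type = 'HUB';
```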

In the case of Data Vault, you could use the above methodology to make generic templates for the creation of hub and link tables, so that you are only managing the metadata.
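As an illustration of such a template, the statement below loads any hub from the metadata produced by the loop. The {entity_name}, {business_key} and {source_table} placeholders are assumed to be Rivery variables populated by the looping container; treat the exact variable syntax and names as assumptions to verify in your own environment.

```sql
-- Sketch of a metadata-driven hub-load template; {…} placeholders are
-- assumed to be loop variables, not fixed names.
INSERT INTO dv.hub_{entity_name} (hub_{entity_name}_hk, {business_key}, load_dts, record_source)
SELECT
    MD5(s.{business_key}),
    s.{business_key},
    CURRENT_TIMESTAMP(),
    '{source_table}'
FROM stg.{source_table} s
LEFT JOIN dv.hub_{entity_name} h
       ON h.hub_{entity_name}_hk = MD5(s.{business_key})
WHERE h.hub_{entity_name}_hk IS NULL;
```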

Hope that helps,
Taylor