Hi @mammoth,
While we don’t have an existing kit for Data Vault modeling (like the one we have for handling slowly changing dimensions), we have seen several different implementations of advanced data modeling in Rivery.
These include using Rivery as the orchestrator for modeling that is defined in the target warehouse (Snowflake in your case, but we have customers doing the same in BigQuery and others), with database procedures triggered from within Rivery’s logic rivers. The same approach has been used to trigger dedicated transformations in other modern tools, such as dbt jobs.
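For illustration, here’s a minimal sketch of what that warehouse-side pattern could look like in Snowflake. The procedure, schemas, and column names here are all assumptions for the example, not anything Rivery ships:

```sql
-- Hypothetical example: procedure, schemas, and columns are illustrative only.
CREATE OR REPLACE PROCEDURE raw_vault.load_hub_customer()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  -- Insert only business keys that are not already present in the hub
  INSERT INTO raw_vault.hub_customer (hub_customer_hk, customer_bk, load_dts, record_source)
  SELECT
      MD5(stg.customer_bk),     -- hash key derived from the business key
      stg.customer_bk,
      CURRENT_TIMESTAMP(),
      stg.record_source
  FROM staging.customers stg
  WHERE NOT EXISTS (
      SELECT 1 FROM raw_vault.hub_customer h
      WHERE h.customer_bk = stg.customer_bk
  );
  RETURN 'hub_customer loaded';
END;
$$;
```

A SQL step in a logic river would then just run `CALL raw_vault.load_hub_customer();`, so the transformation logic lives in Snowflake while scheduling and orchestration stay in Rivery.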
As for doing it all in Rivery, we see users reusing logic rivers to create modularity and a template-based approach to logic that needs to be repeated. For example, calling another logic river to generate the SQL required to create hubs, links, and so on. Here’s a quick example of how to use Logic steps to create a list to loop over in subsequent steps: Dynamically create and populate tables using a looping container
In the case of Data Vault, you could use the above methodology to build generic templates for creating hub and link tables, so that you are only managing metadata; a rough sketch of what that could look like is below.
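To make that concrete, a purely hypothetical metadata table plus a templated DDL might look like the following. All names are assumptions, and the `{variable}` placeholders are the kind of thing a looping logic river would substitute from Rivery variables on each iteration:

```sql
-- Hypothetical metadata table: one row per hub you want generated.
CREATE TABLE IF NOT EXISTS meta.hub_definitions (
    hub_name     STRING,  -- e.g. 'hub_customer'
    business_key STRING,  -- e.g. 'customer_bk'
    source_table STRING   -- e.g. 'staging.customers'
);

-- Generic template used by the "create hub" river. On each loop iteration,
-- the variables {hub_name} and {business_key} are filled in from one
-- metadata row before the statement is executed against Snowflake.
CREATE TABLE IF NOT EXISTS raw_vault.{hub_name} (
    {hub_name}_hk  STRING NOT NULL,
    {business_key} STRING NOT NULL,
    load_dts       TIMESTAMP_NTZ,
    record_source  STRING
);
```

Adding a new hub then just means inserting a row into the metadata table, rather than hand-writing new DDL or a new river per entity.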
Hope that helps,
Taylor