Transforming and Loading Data to DWH
In the last chapter, we learnt how to establish a connection to an MS SQL database in Infoveave and use it as a datasource. We also learnt how to create a Star schema-based data model in Infoveave. In this chapter, we will move on to the next step: loading data into the Star schema tables through the ETL (Extract, Transform, Load) process.
We will understand how data flows through the ETL pipeline—from extraction and staging to transformation and loading. We will also learn how to use workflows in Infoveave to organise ETL tasks into an automated sequence, which reduces manual effort and improves efficiency.
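To make the extract, stage, transform, and load stages concrete before we look at Infoveave's workflows, here is a minimal, self-contained sketch of that flow using Python's built-in sqlite3 module. All table and column names (stg_sales, dim_product, fact_sales) are illustrative assumptions, not Infoveave's actual schema.

```python
import sqlite3

# Illustrative in-memory database; in practice the target would be the DWH.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: raw rows landed as-is from the source system.
cur.execute("CREATE TABLE stg_sales (order_id INT, product TEXT, amount REAL)")

# Star schema: one dimension table and one fact table.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, "
            "product TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_sales (order_id INT, product_key INT, amount REAL)")

# Extract: pull rows from the source (hard-coded here for the sketch).
source_rows = [(1, "Widget", 9.99), (2, "Gadget", 24.50), (3, "Widget", 9.99)]
cur.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", source_rows)

# Transform: derive the distinct products into the dimension table,
# letting SQLite assign each one a surrogate key.
cur.execute("INSERT OR IGNORE INTO dim_product (product) "
            "SELECT DISTINCT product FROM stg_sales")

# Load: resolve each staged row's product to its surrogate key
# and insert the result into the fact table.
cur.execute("""
    INSERT INTO fact_sales (order_id, product_key, amount)
    SELECT s.order_id, d.product_key, s.amount
    FROM stg_sales s JOIN dim_product d ON s.product = d.product
""")
conn.commit()

fact_count = cur.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
dim_count = cur.execute("SELECT COUNT(*) FROM dim_product").fetchone()[0]
print(fact_count, dim_count)  # → 3 2
```

A workflow tool such as Infoveave's essentially sequences these steps (and schedules them), so that staging, dimension loads, and fact loads run in the right order without manual intervention.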
Let us start with the concept of data quality in the next section.