The vendor is required to provide a data warehousing automation solution (DWAS). Requirements include:
- Mandatory requirements
• The DWAS must ensure that agency data (including metadata) remains within country and is protected according to agency security and privacy standards.
• The DWAS must, at a minimum, be compliant with the Data Vault 2.0 architecture.
• Data vault supports multi-temporal solutions by providing standard load patterns for implementing them and permits the definition of many timelines in parallel, allowing users to switch between them as needed.
• Furthermore, data vault facilitates distributed solutions, enabling the enterprise data warehouse, lakehouse, or mesh to span different clouds or regions in a multi-cloud setup.
• It seamlessly bridges on-premises and cloud environments.
• The DWAS must include a data modeling and editing solution with a series of graphical interface editors that let users configure the different sources into one integrated data vault model.
• The DWAS must generate all the necessary non-proprietary DDL and ETL code for the creation and loading of a data vault model.
• The DWAS must provide the means to assist in deploying the generated code to our databases and integrate seamlessly with agency CI/CD processes (Git-based deployments for GitHub, Azure DevOps), allowing our teams to automate the deployment of generated code.
• The DWAS must provide the automated management of our data warehouse loading processes by generating code that defines a workflow for Azure Data Factory (ADF), Synapse pipelines, open-source Python, Apache Airflow, or equivalent.
• The DWAS must include delta generations that complement full generations: it must generate only the code representing changes between releases, detect structural and metadata differences, and auto-generate data migration scripts that can be deployed automatically.
• The DWAS must provide the means for metadata generation and export that can extract information about the sources, data vaults, and business vaults to retrieve data lineage information.
• The DWAS must harvest the metadata from any source technology that safely stores data, across all metadata levels (schema, object, and attribute), facilitating abstraction across various physical objects.
• During harvesting no data is copied into the application, only metadata is loaded.
• The DWAS must address how to map large volumes of source metadata onto the prebuilt, predefined default data vault model structures in order to propose a physical target data vault model.
• A graphical user interface (GUI) should allow a data modeler to accept, correct, or enrich the proposed solution; the final decision rests with the user.
• The DWAS must support cloud-native computing architecture with a strong but flexible security model.
• The DWAS must support REST APIs over HTTPS for session connections so that integrity is guaranteed.
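To illustrate the non-proprietary DDL generation requirement, the sketch below renders ANSI-style DDL for a Data Vault 2.0 hub table from metadata. The template, entity name, and business key are hypothetical examples, not the tool's actual output format.

```python
# Hypothetical sketch: rendering portable (non-vendor-specific) DDL
# for a Data Vault 2.0 hub table from harvested metadata.
HUB_TEMPLATE = """CREATE TABLE hub_{name} (
    hub_{name}_hk CHAR(32) NOT NULL,  -- hash of the business key
    {business_key} VARCHAR(255) NOT NULL,
    load_dts TIMESTAMP NOT NULL,
    record_source VARCHAR(100) NOT NULL,
    PRIMARY KEY (hub_{name}_hk)
);"""

def generate_hub_ddl(entity: str, business_key: str) -> str:
    """Render ANSI-style DDL for one hub; no proprietary syntax."""
    return HUB_TEMPLATE.format(name=entity, business_key=business_key)

print(generate_hub_ddl("customer", "customer_id"))
```

A real DWAS would generate the same pattern for links and satellites, plus the ETL code that loads them.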
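The generated workflow requirement amounts to a dependency graph of load steps: stages load before hubs and links, satellites after their parent hubs. A minimal sketch, assuming hypothetical step names, shows how such a graph resolves to an execution order that an orchestrator (ADF, Airflow, or equivalent) could run:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical load graph: each step maps to the steps it depends on.
load_graph = {
    "hub_customer": {"stg_customer"},
    "hub_product":  {"stg_product"},
    "link_order":   {"hub_customer", "hub_product"},
    "sat_customer": {"hub_customer"},
}

# Resolve a valid execution order: stages first, then hubs,
# then links and satellites.
order = list(TopologicalSorter(load_graph).static_order())
print(order)
```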
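The delta-generation requirement can be sketched as a diff between two releases of a table definition that emits only the migration statements for what changed. Column names, types, and the statement syntax here are illustrative assumptions:

```python
def generate_delta(table: str, old: dict, new: dict) -> list:
    """Compare two releases of a table definition (column -> type)
    and emit migration DDL for the differences only."""
    stmts = []
    for col, typ in new.items():
        if col not in old:
            stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {typ};")
        elif old[col] != typ:
            stmts.append(f"ALTER TABLE {table} ALTER COLUMN {col} TYPE {typ};")
    for col in old:
        if col not in new:
            stmts.append(f"ALTER TABLE {table} DROP COLUMN {col};")
    return stmts

old = {"id": "INT", "name": "VARCHAR(50)"}
new = {"id": "INT", "name": "VARCHAR(100)", "email": "VARCHAR(255)"}
print(generate_delta("customer", old, new))
```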
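The harvesting requirement (metadata only, never data) can be illustrated against a database catalog. This sketch uses SQLite purely as a stand-in source technology; note that it reads only the schema catalog and never selects table rows:

```python
import sqlite3

def harvest_metadata(conn: sqlite3.Connection) -> dict:
    """Harvest object- and attribute-level metadata from the catalog.
    No data rows are copied into the application, only metadata."""
    meta = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        # table_info rows: (cid, name, type, notnull, default, pk)
        meta[table] = [{"name": c[1], "type": c[2]} for c in cols]
    return meta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
print(harvest_metadata(conn))
```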
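The HTTPS-only requirement for REST API sessions can be enforced with a simple scheme check before any connection is opened. This is a hypothetical guard, not a full client, and the endpoint URL is an example:

```python
from urllib.parse import urlparse

def open_session(base_url: str) -> str:
    """Refuse any API endpoint that is not HTTPS, so session traffic
    is always carried over TLS."""
    if urlparse(base_url).scheme != "https":
        raise ValueError("REST API sessions must use HTTPS")
    return base_url.rstrip("/")

print(open_session("https://dwas.example/api/"))
```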