ERwin Support for Master Data Management

In this article we’ll discuss how certain ERwin functions, typically seen in Data Warehouse implementations, can play a pivotal role in providing a base for data evolution within the MDM space, and how model metadata can be extended for MDM implementations at both the logical and physical level.

This author has recently been working with a number of customers implementing MDM solutions to evolve the way their transactional systems manage customer, account, and product data. Notice we said transactional systems, not data warehouse/analytics systems: MDM has not yet made a significant push into delivering solutions for data warehouse applications, and most of the focus remains on the operational systems that deliver mission-critical data and functions. Customers working in this operational space need to conform to the strict demands of the MDM implementation, at both the logical and physical level, while extending the model where and how the MDM solution permits. The development team must be able to work with the data as the MDM vendor presents it, and also conform to the process and data needs of the methods to be implemented. Two functional areas of ERwin can help: Data Movement mapping and Design Layers. Let’s look at Design Layers first.

Design Layers assist the data architect by supporting the idea that data modeling can capture data requirements for the enterprise at any level of abstraction. An abstract design can be tied to designs expressed in more detailed logical terms and, at the database level, to the physical interpretation of that information. The data as implemented can thus be traced back through the defined layers to its logical and abstract definitions.

In a classic portrayal, the conceptual enterprise model (logical only) is seeded as the parent of the logical and physical operational models. This is further adapted by introducing the MDM model (logical and physical): the MDM logical layer is linked to the application logical models, and the MDM physical model represents the base MDM logical model plus any conformities introduced through model extensions. For example, the MDM model can be extended with objects such as business keys, which the data architect uses to identify, in business terms, the unique identifier for a customer or product. In other cases business relations (e.g. soft keys) may be implemented as traditional foreign keys or represented by table lookups defined within the business services themselves.


Through the use of Model Layers, specifically the Link Model Source function, we can establish links at the entity and attribute level between objects in the Conceptual Enterprise Model and the MDM Logical Model. This can be extended further to the Application Logical Model and the MDM Physical Model. We then gain a better understanding, at the conceptual level, of the business data being acted upon, and traceability to the business services (defined collections of data and processes) that update the physical model data.
Through the Sync with Model Source function we can identify and migrate changes across the various model abstractions that arise from implementation specifics delivered at the physical level (the MDM Physical Model) within each project.

The Add Model Source function gives us the ability to portray a federated physical implementation, where an SOA implementation involves more than one physical database supporting the application.
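ERwin manages these layer linkages through its GUI, but the underlying idea is simple enough to sketch in code. The following Python fragment (an illustration only, not part of ERwin; all names are hypothetical) models entity- and attribute-level links between layers and shows how traceability queries follow them downward:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModelObject:
    model: str                      # e.g. "Conceptual Enterprise Model"
    entity: str
    attribute: Optional[str] = None  # None means the link is at the entity level

@dataclass
class LayerLink:
    source: ModelObject  # object in the parent (more abstract) layer
    target: ModelObject  # object in the derived layer

# Link the conceptual Party entity to its MDM logical counterpart,
# then drill down to an attribute-level link.
links = [
    LayerLink(ModelObject("Conceptual Enterprise Model", "Party"),
              ModelObject("MDM Logical Model", "PARTY")),
    LayerLink(ModelObject("Conceptual Enterprise Model", "Party", "Party Name"),
              ModelObject("MDM Logical Model", "PARTY", "party name")),
]

def trace(entity: str, links: List[LayerLink]) -> List[ModelObject]:
    """Follow links downward from a conceptual entity for traceability."""
    return [l.target for l in links if l.source.entity == entity]
```

Chaining such links across the Application Logical Model and MDM Physical Model is what gives the end-to-end traceability described above.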

Let’s next examine Data Movement Mapping and how it can play a role in bringing together both physical and logical model metadata.

In Data Warehouse applications we typically need to tie ETL data from operational source systems through the staging areas (perhaps an ODS) into the Data Warehouse, where it is in turn mapped to analytics requirements to produce persistent or transient data marts.
Data Movement mapping gives us the ability to capture the physical model maps from source to target system.

To satisfy this need, we identify the source database tables and columns and import the source from a database, DDL file, or model, using the same engine that drives Complete Compare.

Once the source(s) are defined, we can build a series of 1-to-N mappings between source and target tables and columns, and add a Transform Comment, which typically becomes instructions on how to transform the data, possibly including pseudo code or structured English describing how the task is to be completed.

This information is finally presented to the builders as a Data Browser report, which can serve as the mapping deliverable from the SDLC.
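The mapping rows and their Transform Comments can be pictured as a simple structure. This Python sketch (illustrative only; table and column names are invented) shows a 1-to-N mapping and how it flattens into report rows like those a Data Browser report would carry:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ColumnMapping:
    source_table: str
    source_column: str
    target_table: str
    target_column: str
    transform_comment: str = ""  # instructions / pseudo code for the builder

# A 1-to-N mapping: one source column feeding two target columns.
mappings = [
    ColumnMapping("SRC_CUSTOMER", "CUST_NM", "PARTY", "party_name",
                  "Trim trailing blanks; title-case the name."),
    ColumnMapping("SRC_CUSTOMER", "CUST_NM", "PARTY_AUDIT", "orig_name",
                  "Copy as-is for audit."),
]

def mapping_report(ms: List[ColumnMapping]) -> List[Tuple[str, str, str]]:
    """Flatten mappings into (source, target, comment) rows for a deliverable."""
    return [(f"{m.source_table}.{m.source_column}",
             f"{m.target_table}.{m.target_column}",
             m.transform_comment) for m in ms]
```

Each row in the flattened report corresponds to one source-to-target line in the mapping deliverable handed to the build team.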

Now, within the MDM space, this mapping function can also be employed to identify the Application Logical Model objects, including MDM Logical Model objects that are mapped to the Physical Model. Instead of defining the source as a database or DDL script, we’ll define it as a File Source…

The MDM Model file source is defined as the MDM server…

The Import From .CSV function can be used to import the MDM Logical Model file. The source for the import can be created from the Data Browser or the ODBC/Crystal capability in ERwin.

The format of the CSV file used to import the definitions into ERwin:

/*Entity Descriptions*/,,,,,,,,
1,DWXT,0,0,DWXS,0,PARTY,,
2,DWXT,0,0,DWXS,0,ORGANIZATION,,
/*Attribute Descriptions*/,,,,,,,,
1,DWXC,0,1,DWXT,0,party id,integer,Uniquely identifies a party
2,DWXC,0,1,DWXT,0,party name,varchar(100),The party’s name.
3,DWXC,0,2,DWXT,0,organization id,integer,Uniquely identifies an organization
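To make the layout concrete, here is a short Python sketch that parses this file format. It is an illustration only, not an ERwin feature, and the column meanings (DWXT rows as entity descriptions, DWXC rows as attribute descriptions with the owning entity id in the fourth field) are inferred from the sample above:

```python
import csv
import io

SAMPLE = """\
/*Entity Descriptions*/,,,,,,,,
1,DWXT,0,0,DWXS,0,PARTY,,
2,DWXT,0,0,DWXS,0,ORGANIZATION,,
/*Attribute Descriptions*/,,,,,,,,
1,DWXC,0,1,DWXT,0,party id,integer,Uniquely identifies a party
2,DWXC,0,1,DWXT,0,party name,varchar(100),The party's name.
3,DWXC,0,2,DWXT,0,organization id,integer,Uniquely identifies an organization
"""

def parse_mdm_csv(text):
    """Split the import file into entities {id: name} and attribute tuples."""
    entities, attributes = {}, []
    for row in csv.reader(io.StringIO(text)):
        if not row or row[0].startswith("/*"):
            continue  # section header rows carry no data
        if row[1] == "DWXT":    # entity description row
            entities[row[0]] = row[6]
        elif row[1] == "DWXC":  # attribute row; row[3] is the owning entity id
            attributes.append((entities.get(row[3], row[3]),
                               row[6], row[7], row[8]))
    return entities, attributes
```

Running the parser over the sample yields the two entities and three attributes shown above, with each attribute resolved to its owning entity name.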

Now that we have the MDM Logical Model defined in the Source editor, we can map the logical fields to the physical columns, giving us a solid grasp of the transition from the MDM Logical Model to the implemented database.

Coupled with the layering concept’s ability to identify linkages across multiple models, we now have an end-to-end picture, from the data abstractions all the way down to the detailed data fields of the implemented database.
