White Paper 4: Practical Application of Best Practices
COSOL WHITE PAPER
This paper is the fourth and final in the Business Guide to Data Migration series. It summarises lessons learned from more than 20 years of experience, using practical examples to demonstrate how the combination of people, process and technology platforms delivers a seamless data-driven transformation.
Ideally the first three papers have already been read; if not, this paper links them together through a series of case studies that reinforce the “why” behind the recommendations made.
As a quick reminder:
- Pre-migration: Data Profiling and Remediation (White Paper #1)
- During migration: Data Standardization and Loading (White Paper #2)
- Post migration: Data Reconciliation and Archiving (White Paper #3)
White Paper #1 takeaways
- Data will typically be exposed as a major risk and cost in large-scale digital transformations because its quality is unknown and/or poor.
- Strong data ownership and data governance are critical.
- Commence data profiling and data remediation as early as possible.
Data quality can best be explained along six dimensions.
| Dimension | Description |
| --- | --- |
| Duplication | Duplicate records may exist (e.g. specific master data may exist multiple times, with each instance varying from the original master data naming convention). Any duplicates are to be deleted and relationships amended to link to the surviving row. |
| Redundancy | Data that is no longer current. These records should be identified in source systems and corrected accordingly (e.g. active vendors without an invoice in the last 14 months). |
| Standardisation | The data to be migrated will need to conform to an approved or conventional standard (e.g. a master data standard). Cross-validation of datasets across table structures against the agreed standards will need to be monitored and rectified as appropriate until go-live. |
| Incorrect | The data has an incorrect business value (e.g. an entitlement amount is incorrect, a bank account name is incorrect, addresses inclusive of post codes are incorrect, field values are not aligned to their originally planned usage patterns). |
| Integrity | Relationships are not maintained correctly, such as orphaned records (e.g. an account balance for an account that no longer exists). Relationships across tables and application functional areas must be maintained (e.g. organisational structure, accounts, reports-to, etc.). |
| Completeness | A measure of data content quality, expressed as the percentage of columns or fields that should have values and do, or that should be blank but contain values (e.g. first names not in the preferred name column). |
Figure 2: six dimensions of data quality
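As an illustration only, two of these dimensions, completeness and duplication, can be sketched as simple profiling checks. The vendor records, field names and normalisation rule below are invented for the example and are not part of any COSOL tooling.

```python
# Illustrative profiling sketch: measures completeness of mandatory
# fields and flags candidate duplicate vendor names. All field names
# and the normalisation rule are assumptions for the example.

def completeness(records, mandatory_fields):
    """Percentage of mandatory fields that are actually populated."""
    total = len(records) * len(mandatory_fields)
    filled = sum(
        1 for r in records for f in mandatory_fields
        if str(r.get(f, "") or "").strip()
    )
    return 100.0 * filled / total if total else 100.0

def candidate_duplicates(records, key_field):
    """Group records whose key matches after simple normalisation."""
    seen = {}
    for r in records:
        # Upper-case and collapse whitespace so naming variations match.
        key = " ".join(str(r.get(key_field, "")).upper().split())
        seen.setdefault(key, []).append(r)
    return {k: v for k, v in seen.items() if len(v) > 1}

vendors = [
    {"name": "ACME Pty Ltd", "abn": "123"},
    {"name": "acme  pty ltd", "abn": ""},  # likely duplicate, missing ABN
    {"name": "Best Bolts", "abn": "456"},
]
print(completeness(vendors, ["name", "abn"]))        # 5 of 6 fields filled
print(list(candidate_duplicates(vendors, "name")))   # ['ACME PTY LTD']
```

In practice such checks would run continuously against the source systems, with the results handed to business data owners for remediation decisions rather than being fixed silently in code.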
White Paper #2 takeaways
- The six data quality dimensions need to be remedied throughout the data migration.
- The process of applying remedies to standardise the data for the new system is iterative, consisting of multiple rehearsals; and
- An integrated cutover plan is critical to the overall success of the program.
The objective of data reconciliation and archiving can best be described as ensuring relevant, accurate and auditable production data is loaded into the new system in a way that enables new ways of working.
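To make the reconciliation objective concrete, a minimal sketch is shown below comparing record counts and a control total between a source extract and the loaded target. The record layout ("id", "balance") is an assumption for illustration only.

```python
# Minimal reconciliation sketch: compares record counts and a control
# total between a source extract and the loaded target system.
# The field names ("id", "balance") are illustrative assumptions.

def reconcile(source, target, amount_field="balance"):
    src_ids = {r["id"] for r in source}
    tgt_ids = {r["id"] for r in target}
    return {
        "count_match": len(source) == len(target),
        "missing_in_target": sorted(src_ids - tgt_ids),
        "unexpected_in_target": sorted(tgt_ids - src_ids),
        # A non-zero delta means the loaded values do not sum to the source.
        "total_delta": sum(r[amount_field] for r in target)
                       - sum(r[amount_field] for r in source),
    }

source = [{"id": 1, "balance": 100.0}, {"id": 2, "balance": 50.0}]
target = [{"id": 1, "balance": 100.0}]
print(reconcile(source, target))
# flags id 2 as missing in the target and a -50.0 control-total delta
```

The point of the sketch is that reconciliation produces evidence a business data owner can sign off against, not just a technical pass/fail.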
White Paper #3 takeaways
- Business owners should continually pose the question: if the ways of working are new, why bring “old” data?
- Data reconciliation is the responsibility of the business data owner.
- Placing legacy data in a ‘vault’ has many advantages over retaining it in old ‘read only’ software and hardware.
- Data is an asset, until it is a liability.
The data migration picture
They say a picture is worth a thousand words, and the three white papers in this series can best be expressed by the picture below.
It starts with the business data owners at the top, supported by functional consultants who provide context and recommendations to business owners who may not have carried this responsibility previously in their careers.
Together they oversee the three phases of data migration: profiling and remediating, with the six remedies that must be applied; standardising and loading from the existing systems to the target system; and finishing with reconciling and archiving.
“As a ‘domain’, data should and can be managed before, during and after a major program. This strengthens an organisation’s overall digital capability and mitigates future risks and costs by ensuring data remains evergreen.” – COSOL White Paper #1
With the art of migrating data now covered, attention must turn to the key decision of how this “domain” is organised within transformation programs managing consolidation and standardisation of systems.
The pictures below elevate the focus from the data migration method, to the organisational units, and “assets” of each unit within the scope of the program.
In Figure 3 there is a very simple one-to-one relationship, with each unit having a single asset. Figure 4 is still simple, but a more realistic picture in which one organisational unit has multiple assets; this is where complexity arises, and where program structure and governance become more important.
In a mining context, an organisational unit may be a resource sector such as iron ore, and the asset would be the iron ore mine site. A second organisational unit may be coal and the asset would be a coal mine site, and so on. In a government context, an organisational unit may be a department and an asset may be an agency, or a group of agencies (a hub).
Simplifying the picture by focussing on a single organisational unit, there are really only two options for any program:
- Each asset manages their data migration independently and is accountable for successfully migrating their data into the target system as shown in Figure 5; or
- The program provides data migration as a service and works with each asset to migrate, in this example, both sets of data into the target system as shown in Figure 6.
The factors that influence this decision are user experience, and program complexity and risk.
| | Independent approach | Integrated approach |
| --- | --- | --- |
| User experience | Some users may work across assets and would therefore have to engage with different teams using different methods and tools through the six stages of data migration. | All users experience a common method and approach, independent of which assets they work in. Common communication, training and change management can also be leveraged much more widely. |
| Program complexity and risk | Each asset within an organisational unit, and across units for that matter, works towards the one common target system. Scheduling system access, providing multiple security logins and having multiple parties making changes all increase the risks to the program, and increase the complexity of change management and auditing. | A single stream accountable for remediating and migrating data from the different source systems to the common target system ensures that the ‘funnel’ and flow of clean data into the production system has the appropriate security and quality controls in place. |
Based on this assessment, and takeaway #1 quoted below, it is strongly recommended that a complex program structure establishes the data domain, or ‘workstream’, as a shared service, and that accountability for data migration from multiple source systems to a single target system is centralised and standardised for the reasons highlighted above.
As a “domain”, data should and can be managed before, during and after a major program. This strengthens an organisation’s overall digital capability and mitigates future risks and costs by ensuring data remains evergreen.
Source: COSOL business guide to data migration white paper 1 – data profiling and remediation.
Case study 1 – global diversified mining organisation
A global diversified mining organisation undertook a transformation project to standardise processes and data across disparate organisational units to one centralised SAP system.
The initial approach was not integrated. As seen many times, the organisation made a tool decision first without consideration of the entire process. In this case SAP Master Data Management (MDM) was selected for a subset of master data, resulting in a more complex migration process to ensure master data was not duplicated in the target system.
The organisation also engaged a third party early to remediate source data in preparation for the migration. While this is to be encouraged, the work was done without reference to the target system, and hence the effort focussed on how the existing system worked rather than how the new system was intended to work. This resulted in increased costs for the rework that was subsequently required as the program progressed. As outlined in White Paper #1: “understand the past but plan for the future”.
Once COSOL was engaged to deliver the subsequent data standardising and loading phase of work, the team quickly performed the initial data profiling diagnostic which should have happened up front, to determine the data sources, objects and complexity. This enabled COSOL to de-risk the program for the client by fixing the price.
Using its RPConnect platform, COSOL was able to standardise the data across a very complex program with a single RPConnect technical specialist. Without the platform, other providers would have solved this problem with many technical “experts”.
By simplifying the technical side of data migration, COSOL experts bring functional expertise to the table and work closely with the business owners, providing insights and recommendations as to the quality of their data and how best to remediate it. By understanding both the existing system as well as the target system, the recommendations are aligned to the new ways of working.
This case study shows why it is recommended to take an integrated approach from the outset, and engaging a system agnostic provider with the tools AND the expertise who specialises in the data domain and who can understand the past but plan for the future.
Case study 2 – large state government department
A large state government department undertook an SAP system upgrade from ECC6 to S4 that encountered several data quality risks and issues, causing significant time and cost overruns to the project.
The project had fallen into the trap of treating data as a technical issue. Once again the focus was on a toolset, in this case SAP Information Steward, surrounded with a technical SAP team scoped to build extract, transform and load (ETL) objects to manage the data remediation process.
COSOL was engaged to recover the situation, whereupon it was found that the project:
- Had a data team consisting only of technical resources, and as a result had failed to engage the business in any business-related remediation activities
- Was reacting to data quality issues as they surfaced, rather than taking a holistic and proactive approach to data migration
- Had misaligned roles and responsibilities between the data migration and data cleansing teams, resulting in duplicated tasks and, of greater concern, inconsistent extraction rules between the two teams, which wasted effort remediating non-relevant data
- Had limited coverage of the data quality KPIs, resulting in mock data conversions failing and significant project delays due to inadequate data remediation
To remedy the situation, COSOL quickly established the concept of data owners and stewards within the department to ensure the business took ownership of its data and the remediation activities. Given the limited time remaining in the project, an agile method was adopted, with two-week sprints prioritising critical project requirements for business-led remediation activities.
Case study 3 – global thermal coal organisation
A global thermal coal organisation recently undertook a large-scale program to consolidate seven different ERP systems within their assets to a single SAP system. The data migration stream was extremely complex, needing to ensure that data standards, consistency and integrity were maintained as seven systems became one.
COSOL’s RPConnect platform was adopted, providing the client a consolidated view of data quality within the seven source systems. This allowed them to centrally apply 100 data quality KPIs over all source systems and manage the data remediation process across the seven assets.
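Conceptually, applying a common KPI set across several source systems can be sketched as below. The two example rules and the record layout are invented for illustration and are not the actual RPConnect KPIs.

```python
# Conceptual sketch of running one shared set of data quality KPIs
# over several source systems and reporting a pass rate per system.
# The two rules and the record layout are illustrative assumptions.

KPIS = {
    "has_description": lambda r: bool(str(r.get("description", "")).strip()),
    "valid_status": lambda r: r.get("status") in {"ACTIVE", "INACTIVE"},
}

def kpi_pass_rates(systems):
    """Return {system: {kpi: percentage of records passing}}."""
    report = {}
    for name, records in systems.items():
        report[name] = {
            kpi: 100.0 * sum(check(r) for r in records) / len(records)
            for kpi, check in KPIS.items()
        }
    return report

systems = {
    "ERP_A": [{"description": "Pump", "status": "ACTIVE"},
              {"description": "", "status": "ACTIVE"}],
    "ERP_B": [{"description": "Valve", "status": "unknown"}],
}
print(kpi_pass_rates(systems))
```

The value of the central approach is visible even in this toy form: one rule set, written once, produces comparable pass rates for every source system, so remediation progress can be tracked on a single dashboard.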
COSOL was also able to shorten the data migration project considerably by simulating the mock conversions before the SAP system was available. The business was able to view their data in SAP format prior to design workshops, which bridged the “language” barrier between source and target terminologies and allowed data issues to be resolved in the context of the blueprint.
This is a great case study and example of understanding the past, but planning for the future.
Case study 4 – electricity generation organisation
An energy generation organisation acquired a generation asset from a competitor which resulted in a systems consolidation project. In this case, the acquired asset was managed using SAP while the acquirer used Ellipse.
Being in the same industry, the acquired asset used common suppliers, products and services, and ensuring these merged correctly allowed the acquirer to have a consolidated view of their relationship with key suppliers across all assets.
As RPConnect is system agnostic, it was an obvious choice to apply the same data migration process, using the same data quality dimensions working with business data owners to inform, advise and assist them in data remediation, reconciliation and archiving decisions.
The key challenges overcome in this program were:
- Data mapping and value mapping across the different system types (SAP and Ellipse), and merging these while ensuring no duplication
- Deciding which data to migrate into the target system and what to do with the legacy data. In this instance, RPConnect’s Legacy Data Viewer module was used to store all of the legacy data, giving the business the ability to quickly and simply look up historical data through an intuitive user interface. This allowed the client to retire the SAP software and hardware costs, and mitigated the need to retain SAP-specific skills purely for navigating a legacy system.
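To illustrate the value mapping challenge, a sketch is shown below in which supplier codes from two source systems resolve to one target code so that common suppliers merge rather than duplicate. The system labels and all codes are invented for the example.

```python
# Hypothetical value-map sketch: supplier codes from two source systems
# (labelled "sap" and "ellipse" here) map to one consolidated target
# code so common suppliers merge. All codes below are invented.

VALUE_MAP = {
    ("sap", "V-1001"): "SUP-ACME",
    ("ellipse", "200345"): "SUP-ACME",   # same supplier, different code
    ("ellipse", "200999"): "SUP-BOLTS",
}

def map_supplier(system, code):
    """Resolve a legacy supplier code to its consolidated target code."""
    target = VALUE_MAP.get((system, code))
    if target is None:
        # Unmapped codes surface as errors for the business to resolve,
        # rather than silently creating duplicates in the target system.
        raise KeyError(f"unmapped code {code!r} from {system}")
    return target

# Both legacy codes resolve to the one consolidated target supplier.
print(map_supplier("sap", "V-1001"))      # SUP-ACME
print(map_supplier("ellipse", "200345"))  # SUP-ACME
```

Building and approving these maps is a business activity; the business owner, not the tool, decides that two legacy codes represent the same supplier.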
This is a great case study on the importance of the business objective, and why data migration must be business owned and business led rather than treated as a technical activity. It also shows that a common method and a system agnostic platform are more important, and more effective, than a system specific tool led by technical teams.
Case study 5 – federal government department
A large federal government department is undertaking a significant transformation program to replace over 100 legacy applications with SAP S4.
COSOL has been engaged for its system agnostic RPConnect platform along with its data migration expertise. The cloud based RPConnect platform was security cleared and certified to protected status due to the sensitivity of the data being processed.
The program is currently profiling the data using the data quality KPIs to drive remediation activities from a central location. The data is continually refreshed from the source systems allowing the tracking of remediation progress across different systems and divisions.
Case study 6 – global diversified mining organisation
Another global mining organisation undertook a transformation program to consolidate ERP systems and associated applications to SAP across all organisational units and assets.
The strategy for the data domain was to migrate only master data and open transactions from the source systems to the target, as part of realising the value of the transformation project, and then to “freeze” the source systems.
However, during the value realisation phase of each release the value in the business case was unattainable as the source systems were unable to be truly “frozen” due to:
- Security risks that required patching in the source systems (operating systems, databases, etc.)
- Ongoing hardware, licence and basic support costs in the source systems
- Skills availability to utilise and extract information from the source systems
- The complex underlying data structures that were inconsistent with their target state
COSOL was engaged to provide a legacy data solution for all of their legacy applications that provided access to the business for operational, regulatory and statutory requirements. The RPConnect Legacy Data Viewer provided the client with a resilient, cost effective and self- service data platform that transformed their raw data objects into a business readable format. Additionally, the data migration maps were integrated into the solution providing whole of lifecycle data lineage for historical comparisons.
The client realised the value of the toolset when a warranty claim was rejected due to a lack of works management history for a significant asset with a 20-year useful life. Using the RPConnect Legacy Data Viewer, the client was able to extract the entire works management history for the asset, demonstrating that it had been maintained in line with the OEM’s specifications. This is a great case study on the importance of archiving, and of the key takeaway that placing legacy data in a ‘vault’ has many advantages over retaining it in old ‘read only’ software and hardware.
Case study 7 – clean energy generation organisation
A recently established clean energy generator acquired several generation assets from multiple organisations. Each acquired asset was managed by a different ERP system (Ellipse and SAP ECC6) which the organisation decided to consolidate into a greenfield SAP S4 environment.
COSOL was engaged for its system agnostic platform, RPConnect, and data migration expertise to provide the end to end data management solution.
The RPConnect platform de-risked the time critical project with built-in SAP ECC6 and S4 connectors and standard data mapping objects and associated value maps, resulting in minimal technical project resources and a functionally focused data project.
This client, like others, was able to view all data in one consolidated location, run a common set of quality KPIs across all assets and track the remediation progress and validate data prior to it being loaded into the new system.
This paper is the final in the Business Guide to Data Migration series, and it is hoped that it has enlightened the reader on some of the key considerations to be taken into account when embarking on a complex transformation or system consolidation program.
The key takeaways have been shaped by many years of real-world examples, some of which have been shared in the case studies, of situations that have worked well and others that have not worked so well.
If there was a single takeaway, it is simply that data migration is a business challenge to be solved by business people. If left to technical people many of the aforementioned outcomes can be expected, resulting in potentially significant time and cost overruns and putting whole new ways of working at risk.
Secondly, setting up the program with data as a domain, and finding the right partner with the proven, holistic experience across people, process and tools is the best way to mitigate the risks. In a complex organisation and program, providing this as a shared service across Organisational Units and Assets is the best way to gain consistency and efficiencies when the objective is to deliver a common, integrated platform, whilst preserving a contextual link to the past.
Please, if nothing else, heed these two pieces of advice that come from many, many years of practical experience that have been shared in this series.
Our point of view
Experience shows that one critical success factor for these programs to realise the value in their business cases is data migration. The axiom “garbage in, garbage out” remains as relevant today as it did when computers were first invented. COSOL strongly believes:
- That digital transformation is well underway, and every board is, and should be, worried about how to become a truly digital enterprise.
- Strong enterprise data foundations will be required to enable adoption of digital solutions including advanced analytics, robotic process automation, machine learning and artificial intelligence which are the next frontiers to productivity and market competitiveness.
- Enterprise data is the glue, the fact base, that drives decision making and business improvement, allowing organisations to meet stakeholder expectations in a timely and efficient manner; and
- For organisations to succeed, data must be treated as a mission-critical asset; it is the single biggest success factor in a digital transformation journey, and most organisations are ill-prepared due to many islands of disconnected data of unknown and/or poor quality.