Is bad data wreaking havoc on digital supply chain transformation?

  • March 21, 2023

Data is an essential part of every digital supply chain transformation, and data accuracy determines the quality of your results. Yet data cleansing is often overlooked, and when it is, the ripple effects are felt throughout the process.

What data belongs, and what should be left out?

Which data should you include, which should you reconsider, and which should you leave out altogether? These decisions, more than any others, separate a structured and precise implementation from a perplexing, undisciplined and chaotic one.

Data structuring and mapping are as critical as the output of any planning tool, and both depend on the data you receive from the client. Think of building a Lego model: you don't need every brick to create a simple, stable structure. Deciding which pieces are needed to finish the model is the essential first step; only then should you stack and build. Reverse these steps, and the structure may fail, making a rebuild inevitable.

This is a classic case of an implementation in which decisions about data structuring, mapping and transformation are given lower priority than model building. The data issues vary, but they commonly include invalid master data, inaccurate transactional data and obsolete data structures.

Sorting and evaluating when the data at hand isn’t up to par

Only 5–10% of the data in enterprise resource planning (ERP) systems meets basic industry standards, and you must plan for that shortfall. Data issues typically surface during testing, integration and user acceptance testing (UAT), when business users uncover underlying problems that cause costly delays.

Data analysis is essential to understanding the business process before the project scope is defined and frozen. The following steps are recommended to ensure data quality before starting the implementation process.

  • Prepare a standard data set template and collect the required data
  • Understand the data from a business perspective
  • Evaluate and validate from your end users’ point of view
  • Conduct a gap analysis
  • Recommend capturing additional data for future use

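The template-and-validation steps above can be sketched in a few lines of code. This is a minimal illustration, not part of any specific planning tool: the required fields and sample records are hypothetical examples of a standard data set template and a gap analysis against it.

```python
# Hypothetical required fields from a standard data set template.
REQUIRED_FIELDS = {"material_id", "description", "plant", "lead_time_days"}

def gap_analysis(records):
    """Report, per record, which template fields are missing or empty."""
    gaps = []
    for i, record in enumerate(records):
        missing = REQUIRED_FIELDS - record.keys()
        empty = {k for k in REQUIRED_FIELDS & record.keys()
                 if record[k] in ("", None)}
        if missing or empty:
            gaps.append({"row": i,
                         "missing": sorted(missing),
                         "empty": sorted(empty)})
    return gaps

# Illustrative data collected from a client extract.
sample = [
    {"material_id": "M-100", "description": "Bracket",
     "plant": "P01", "lead_time_days": 5},
    {"material_id": "M-101", "description": "", "plant": "P01"},
]

print(gap_analysis(sample))
# → [{'row': 1, 'missing': ['lead_time_days'], 'empty': ['description']}]
```

A report like this gives the client a concrete list of gaps to own and close before the scope is frozen, rather than discovering them during UAT.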
Data implementation considerations and best practices

Extract valid data from ERP solutions and filter out irrelevant and obsolete data from the ERP itself or the staging tables. Once you extract all master and transactional data, you can further process the data for downstream systems.

  • Extract data and share it with the client for validation by the respective stakeholders. The client needs to take ownership, checking and filtering the data against agreed conditions so that the correct data can be extracted.
  • Data that is duplicated, blocked, not in use or obsolete must be filtered out before it can enter the planning system.
  • Master and transactional data is required for the planning system. It must be shared in a template so it can be understood and mapped to its respective field or column particular to the ERP system, as field names may be different.
  • Send the completed data model, along with its output, for validation and analysis before UAT.
  • The bulk of the data received from the client is tested during the system integration phase. End-to-end testing allows data flow validation from upstream legacy systems to downstream planning and execution modules. Once testing is completed and signed off, the data set on which testing was done should be frozen or locked, and that same data should be used for UAT and go-live. This is extremely critical: if the data used for system integration testing (SIT) and UAT differs, multiple performance, planning and data integrity issues can creep in.
  • Proper research and analysis into which data to use during the data-gathering phase lets the planning systems function effectively, with few exceptions to handle in the data extraction and transformation phase. Such exceptions are primarily caused by incorrect information flowing from the client to the planning tools, which produces missing and wrong data in various UI tabs and forces extra time and effort to fix the issues.
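The filtering step in the list above, dropping duplicated, blocked and obsolete records before they reach the planning system, can be sketched as follows. The status values and field names here are hypothetical and would differ between ERP systems.

```python
def filter_for_planning(records):
    """Drop blocked/obsolete records and duplicates by material_id,
    keeping the first occurrence of each ID."""
    seen = set()
    clean = []
    for record in records:
        # Hypothetical status codes for blocked / obsolete master data.
        if record.get("status") in ("BLOCKED", "OBSOLETE"):
            continue
        key = record["material_id"]
        if key in seen:  # duplicate record
            continue
        seen.add(key)
        clean.append(record)
    return clean

# Illustrative extract with duplicate, blocked and obsolete records.
extracted = [
    {"material_id": "M-100", "status": "ACTIVE"},
    {"material_id": "M-100", "status": "ACTIVE"},   # duplicate
    {"material_id": "M-200", "status": "BLOCKED"},
    {"material_id": "M-300", "status": "OBSOLETE"},
    {"material_id": "M-400", "status": "ACTIVE"},
]

print([r["material_id"] for r in filter_for_planning(extracted)])
# → ['M-100', 'M-400']
```

In practice these conditions would be agreed with the client during validation, so the same filter can be applied consistently in the ERP or in the staging tables.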

Having quality data leads to greater transparency and accountability. It promotes business process understanding for management and better decision making, which reduces the duration of project implementation and associated costs.

— By Sheetal Pusalkar
