IN AN IDEAL WORLD
Figure 1 shows the two major processes needed to get accurate data
for a new system. During implementation preparation, a data cleanup
plan needs to be developed to examine the critical data in the
legacy systems. This plan identifies which data elements must be
corrected before the legacy systems are converted. Some data can be
corrected electronically during the conversion itself, but time must
still be allowed during the actual conversion for the data owners to
verify that the final conversion has been executed successfully.
The second essential ingredient in maintaining accurate data is an
ongoing data management program that begins after the conversion,
once the new system is in use. A data management program enables
continuous monitoring of the quality of the system data.
When developing the new system implementation plan, data cleanup
must be a key part of this plan with its own full-time team. The
data cleanup team is a vital part of any new system implementation
and should be organized in the same manner as the requirements
definition, detailed design, application programming, system/stress
testing, and implementation project teams. One of the first
activities they must perform is to identify any data that is
mandatory in the new system. They then need to identify any
company-mandatory data that may be only an elective data field in
the new system.
The next questions to answer are:
• Does this data currently exist, either in the legacy system
electronically or manually in some hard-copy folders?
• How accurate is this data?
• How are we going to capture any missing data?
An ideal first step is to identify the data element that shows who
is responsible for a part, often "planner code" or "buyer code," and
start by cleaning up this data element. This data element can then
be used to distribute the various exception reports that need to be
cleaned up to the individual responsible for the part. The cleanup
teams then start looking at the data in the legacy systems and
develop the various exception conditions that need to be reviewed
and corrected. The use of a 4GL or any other report writer program
is a quick and easy way to extract data from the current legacy
systems. By way of an example, the sort of "part master" and "bill
of material" data that one would look at includes the following:
Define data elements for review and conditions for cleanup
• Part Master
- Do all the part numbers have a planner code?
- Are the current part numbers too long?
- Do the current part numbers contain any special formatting, and is
this to be carried over into the new system?
- Do the required part number codes exist, such as commodity code,
ABC code, part code, unit of measure, etc.?
- Do all purchased parts have buyer codes?
- Do all parts have a lead time value?
• Bill of Material
- Do all top-level assemblies have a bill of material?
- Are there any purchased parts that have bills of material (there
may be cases when this is correct, but these should be reviewed)?
- Do all manufactured parts have a bill of material?
- Have the units of measure been checked? This includes counts to
identify any unusual UoMs that perhaps should be converted to a more
commonly used UoM, and also errors such as a part with a UoM of
"EA" whose quantity field shows a usage of "1.32".
- How many levels are there in the bills of material?
- Are there any effectivity conflicts?
The questions above reflect the type of information that would be
examined by the functional users as they review the data in the
legacy systems. This same type of information needs to be developed
for all the major databases that will be converted to the new
system, such as work center data, process plans, open shop orders,
open purchase orders, customer data, and supplier data together
with all the financial data. Once the users start this cleanup
process, they will probably make additional report requests to help
analyze further data conditions. These requests should be supported
immediately, as they are a clear indication that the users are
taking this data cleanup task seriously.
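As an illustration of how such exception reports might be generated (this sketch is not from the article; the field names, the part-number length limit, and the list-of-dictionaries extract format are all assumptions about a hypothetical legacy part master):

```python
from collections import defaultdict

# Hypothetical column names for a legacy part-master extract;
# real field names will vary by system.
REQUIRED_CODES = ["planner_code", "commodity_code", "abc_code", "unit_of_measure"]
MAX_PART_NUMBER_LEN = 15  # assumed limit in the new system

def part_master_exceptions(rows):
    """Group exception conditions by planner code so each report can be
    routed to the individual responsible for the part."""
    exceptions = defaultdict(list)
    for row in rows:
        owner = row.get("planner_code") or "UNASSIGNED"
        # Flag any missing mandatory codes.
        for code in REQUIRED_CODES:
            if not row.get(code):
                exceptions[owner].append((row["part_number"], f"missing {code}"))
        # Part numbers too long for the new system.
        if len(row["part_number"]) > MAX_PART_NUMBER_LEN:
            exceptions[owner].append((row["part_number"], "part number too long"))
        # Purchased parts must carry a buyer code.
        if row.get("part_type") == "purchased" and not row.get("buyer_code"):
            exceptions[owner].append((row["part_number"], "missing buyer code"))
        # Every part needs a lead time value.
        if not row.get("lead_time"):
            exceptions[owner].append((row["part_number"], "missing lead time"))
    return dict(exceptions)
```

Because the output is keyed by planner code, each exception list maps directly to the person responsible for correcting it, which is why cleaning up the planner/buyer code first pays off.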
DATA CLEANUP METRICS AND GOALS
To help ensure that the data cleanup task is proceeding on schedule,
goals and metrics should be set up. A sweep of the legacy systems
should be done to size the extent of the cleanup effort. These
reports should then be reviewed with the functional users and a "get
well" plan developed. Once this has been agreed upon with the
functional users, metrics should be developed that highlight whether
the plan is being achieved. If the plan is not being achieved, then
the functional user team can use the plan to help quantify the
additional resources that are needed to get the cleanup task
completed on schedule.
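A "get well" plan of this kind can be tracked with a simple comparison of each sweep's actual exception count against the agreed plan line. The dates and counts below are purely illustrative, as is the function name:

```python
from datetime import date

# Hypothetical "get well" plan: planned remaining exception counts by week.
get_well_plan = {
    date(2009, 3, 6): 5000,
    date(2009, 3, 13): 3500,
    date(2009, 3, 20): 2000,
    date(2009, 3, 27): 500,
    date(2009, 4, 3): 0,
}

def plan_status(as_of, actual_remaining):
    """Compare the latest sweep count against the agreed plan line."""
    due = [d for d in get_well_plan if d <= as_of]
    if not due:
        return "ahead of plan start"
    target = get_well_plan[max(due)]  # most recent milestone that has come due
    if actual_remaining <= target:
        return f"on plan ({actual_remaining} vs target {target})"
    return f"behind plan by {actual_remaining - target} records"
```

The "behind plan by N records" figure is exactly the number the functional user team can use to quantify the additional resources needed to finish on schedule.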
It is important to realize that the corrected data should be
available to support system testing. This helps ensure that when
reviewing the results of system testing, any incorrect results are
not caused by bad data.
In several cases where I have been involved in data cleanup
projects, we summarized by functional vice president how the data
cleanup effort was proceeding in their area. This information was
then presented at the executive steering committee meetings when
reviewing the implementation status.
Due to the size of the data cleanup task, it may be necessary to
prioritize the sequence in which the cleanup activities are
undertaken. Quality data will also be needed to support the system
testing activity, and this should factor into the cleanup
completion date. It
is also important to realize that some cleanup goals may not be zero
but rather at some level above this based on the business process
involved. In these cases the goal chart would include an "upper
control limit," and if a count occurs above this limit, the
functional users must initiate a cleanup plan to get back under it.
The manner in which this process works is summarized in Figure 2. It
is important to remember that while this task is being performed, the
data in the legacy systems is getting better, so some immediate
benefit is achieved.
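The upper-control-limit style of goal described above can be sketched as a simple check (a minimal illustration, assuming exception counts per data element are already available from the sweep reports):

```python
def check_control_limit(data_element, count, upper_control_limit):
    """Flag a data element whose exception count has drifted above its
    agreed upper control limit; not every cleanup goal needs to be zero."""
    if count > upper_control_limit:
        return (f"{data_element}: {count} exceptions exceeds limit "
                f"{upper_control_limit} -- initiate cleanup plan")
    return f"{data_element}: {count} within limit {upper_control_limit}"
```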
To Be Continued
© 2001-2009 Business Basics, LLC