Maintenance Consulting Scope

Part 1: Preliminaries – Maintenance Assessment and Benchmarking


1. Introduction

OMDEC’s business is reliability.  Converting the concept of Reliability into practical benefits for the customer requires three complementary approaches:

  1. Maintenance Management and Technical Training
  2. Consulting that passes on our experience in the components of reliability
  3. Decision-support software tools that provide easy-to-use but sophisticated analysis.

Our training catalogue is available from www.omdec.com under Training/Workshops, and contains details of the many programs that have been successfully delivered around the world over the past few years.  Details of our software tools are also on www.omdec.com under Brochures and Case Studies.  For more details on these, contact info@omdec.com.  This paper focuses on our Maintenance Consulting programs.


2. Where to Start?

OMDEC’s maintenance management methodology follows the principles behind the Maintenance Cube of Excellence – a development from the Pyramid of Excellence originally devised by John Campbell in 1995.  This is shown in Part 2 of this series of Scope papers.  But before looking at the processes and solutions prompted by The Cube, it is important to decide where to start:

1. If you understand the deficiencies that you want to fix, then focus on solving them using the suggestions shown in the appropriate sections which follow.

2. If you are not certain of the deficiencies, then the place to start is with a Maintenance Assessment – an experienced look at your current maintenance business with a view to focusing on priorities.  More on this in Section 3 below.

3. Another way to get started is through a Benchmarking analysis – comparing your current status with other organizations to see where you stand, and then zeroing in on the areas that offer the greatest return on effort.  More on this in Section 4 below.


3. Maintenance Assessment or Gap Analysis

Drawing on many years of maintenance management experience and on numerous previous assessments, our consultants will take a hard, objective look at your current maintenance practices and results, identify gaps compared with best practices experienced elsewhere, and propose a set of priorities for the organization.  This approach can centre on one or several aspects of Maintenance, or can be a broad-brush look at the whole function.  The assessment analysis and results follow the Maintenance Cube of Excellence to ensure that no key elements are missed, and the work generally follows these steps:

1. Preliminary analysis of existing data – strategy, budgets, documented processes, sample reports, work orders, materials analyses etc.  This provides a base of knowledge of your organization before we come on site, and therefore helps to reduce the assessment’s start-up time.

2. Next, a series of on-site interviews, combined with the option of a detailed questionnaire.  Participants are best drawn from a cross-section of maintainers and engineers, managers and operators, materials and admin staff, so as to gain as broad and as deep a view of the current maintenance business as is needed.  The questionnaire and interviews are structured around the categories in the Cube of Excellence – again to ensure no key elements are missed.

3. Analysis of the information gained to date – the initial stage of drawing conclusions and organizing proposals for priorities (a simple illustrative roll-up of questionnaire scores is sketched after this list).

4. Validation of the analysis now takes place – resolving and clarifying conflicting opinions gained to date through a series of follow-up interviews.

5. Preparation of a preliminary report of findings comes next, together with informal discussions with selected client management to start formulating the final proposals and recommendations.

6. The final step comprises presentations and workshops – including feedback analyses, PowerPoint presentations and hands-on workshops, an MS Project plan of precedences, assessments of benefits and costs, etc. – in short, enough to give management clear directions for their future maintenance road-map.
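
To make the analysis in step 3 concrete, the sketch below shows one hypothetical way questionnaire responses could be rolled up by assessment category and compared against a best-practice target to surface the largest gaps.  The category names, the 1-5 scoring scale and the target value are illustrative assumptions only, not OMDEC's actual Cube of Excellence scoring model.

```python
from statistics import mean

# Hypothetical 1-5 questionnaire scores, grouped by assessment category.
# Categories and figures are invented for illustration only.
responses = {
    "Work management":       [3, 4, 3, 2],
    "Materials management":  [2, 2, 3, 2],
    "Planning & scheduling": [4, 4, 3, 4],
    "Data & measures":       [2, 3, 2, 2],
}
BEST_PRACTICE_TARGET = 4.5  # assumed score representing best practice

# Gap = target minus average score; the largest gaps suggest the highest priorities.
gaps = {category: BEST_PRACTICE_TARGET - mean(scores)
        for category, scores in responses.items()}

for category, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{category:<22} gap {gap:.2f}")
```

In practice the roll-up would be weighted and validated through the follow-up interviews in step 4, but even a simple ranking like this helps focus the discussion of priorities.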

Of course the amount of time required for an assessment will vary according to the size of the organization and the scope and depth of the analysis, but for a small to medium organization with a single maintenance function, about three weeks will produce solid results.


4. Benchmarking

Benchmarking can take many forms, but it is essentially designed to compare one organizational unit with another in order to learn what the differences are and how to prioritise areas for improvement.  Examples of Benchmarking are:

  1. comparison among different departments or divisions in the same plant
  2. comparison among plants in the same company
  3. comparison among companies in the same business
  4. comparison against companies seen as Best Practice or World Class

Typically the output will show, for selected KPIs (Key Performance Indicators), the range of results (from highest to lowest), the median result, and where in the range the individual organizational unit sits (a brief illustrative calculation follows the list below).  Success in benchmarking depends upon a series of factors which must be carefully controlled:

    1. confidentiality – the participants must be confident that the data relating to them is kept confidential and that the published results cannot be traced back to them.
    2. the selected KPIs must be meaningful, well defined and documented, with the underlying data relatively easy to obtain.
    3. the data itself needs to be reliable and comparable.  This is a major task, as (for example) one company drawing data from different divisions may unknowingly be using different data definitions even with the same system and process.
    4. while an internal comparison may be successfully performed by internal staff, it is most frequently the case that an external, experienced third-party organization conducts the analysis – thus ensuring impartiality and confidentiality.
    5. some means of cross-checking or verifying the key data must be available – participants have been known to slant the results to make their own position look more favourable.
    6. there must be enough companies able and willing to participate in the benchmarking to allow statistically significant conclusions to be drawn.
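
As a concrete illustration of the benchmarking output described above, the short sketch below computes the range, the median and one unit's position for a single KPI across a set of anonymized participants.  The KPI chosen, the figures and the participant codes are hypothetical and assumed purely for the example.

```python
from statistics import median

# Hypothetical KPI (e.g. maintenance cost as a percentage of replacement asset
# value) reported by anonymized participants; all figures are invented.
kpi_values = {"A": 2.1, "B": 3.4, "C": 2.8, "D": 4.0, "E": 3.1, "F": 2.5}
our_unit = "C"  # the organizational unit being positioned in the range

values = sorted(kpi_values.values())
lowest, highest = values[0], values[-1]
mid_point = median(values)
our_result = kpi_values[our_unit]

# Position in the range: share of participants with a result at or below ours.
percentile = 100 * sum(v <= our_result for v in values) / len(values)

print(f"Range: {lowest:.1f} to {highest:.1f}, median {mid_point:.1f}")
print(f"Unit {our_unit}: {our_result:.1f} "
      f"(at or below {percentile:.0f}% of participants)")
```

Anonymizing the participant codes before the results are circulated reflects the confidentiality condition in point 1 above; whether a high or low position is favourable depends, of course, on the definition of each KPI.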

Despite these quite demanding conditions, many benchmarking projects have been very successfully completed, and have led participants to discover areas of significant strength (on which they can build) or significant weakness, which then form the basis for an improvement program.

Contact OMDEC at info@omdec.com for more details on the application of these consulting techniques to your business.

www.omdec.com