This is the framework I have developed and use for reviewing and auditing failing projects, programmes and systems. As I may have said before, it is a simple, effective framework based on my experience, and although you may have seen approaches like it before, this is one that I have personally used to great effect.
To an extent this framework describes both how I document a review and the process steps that I take; the major difference is that the process itself is likely to be iterative, and you will learn things during the review which generate fresh lines of inquiry.
I get asked to perform these types of review probably because I’ve done a large number of them and have become quite good at them. Originally, though, I think it was because I have an analytical and inquiring mind, I am tenacious enough to chase down what is really happening in a situation, I have a broad and deep appreciation of technology and its implementation, I have a great deal of project and programme experience across a number of industries, and I am good at getting people to tell me about the problems they are experiencing. These are the kinds of qualities you would want to encourage in order to become better at this type of activity; an unkind person might say that being a pedant and a didact can help too.
I separate my reviews into five simple areas:
- Problem(s)
- Fact(s)
- Result(s)
- Conclusion(s)
- Recommendation(s)
I bet you’re thinking “well that’s obvious Wayne”, but simplicity is always an imperative when you set out, because, believe me, the complexity of the issues you are going to find can sometimes seem overwhelming. So, to explain what I mean by these five headings:
- Problem(s)
Or rather, perceived problem(s), because this is what the client thinks is wrong, or at the very least is the effect rather than the cause (unless they know, or think they know, what the problem is and just want an expert to confirm their opinion). If in doubt, this should simply be why you have been commissioned.
This section should not really be that large; if it is, I would expect that items from the following sections have ‘leaked’ into it, most likely from the ‘fact(s)’ section. For instance, if the MD of a company who has commissioned you to review a system starts telling you about all the individual issues they are having, then you are clearly in ‘gathering facts’ mode and much of this should end up in the next section.
I would typically expect this section to be a paragraph or two at most; if it is running to half a page or more I would be concerned, because in a large review clarity matters a great deal. Even in a large-scale, complex review, be careful with this section: if it is too large, it could point to over-focusing on detail that should be drawn out later in the review, or to a problem with the level of abstraction used in the problem description.
Examples of ‘Problems’ I’ve been asked to investigate include:
- We’ve spent £10s of Millions on our IT supplier and the web sites which they have built are still not available when they need to be; what is going wrong?
- We’ve spent £30 Million plus to date on a data centre build out, which should be complete, and our IT supplier keeps telling us that it could be a month or a couple of months until it’s complete, but we have lost confidence in their updates.
- We have spent over £70 Million on a large integration project which has yet to deliver its first release to the business, and I’ve just been told it will need another £10 Million immediately and a further £40 Million to complete.
- We are just under two years into a ten-year, £300 Million a year contract which has already ‘ballooned’ to £800 Million per year, and yet our supplier still hasn’t delivered the ‘Transformation’ they promised; what is stopping them?
- Fact(s)
Gather and document facts. This should easily be your largest section, because data matters and you will need good data to make an appropriate diagnosis of the situation and to ensure you deliver a credible and believable review.
Obviously there are many ways to gather data, especially technical data, e.g. gathering crash dumps, reading through code, measuring network, processing and storage performance and capabilities, etc. For non-technical fact gathering you can review contracts and documentation, investigate online and offline document repositories, and review authorised and freely given email and communication trails and other ‘digital echoes’ as appropriate.
By far the most effective means of gathering facts in a large-scale, complex review is interviewing, and in such a review you should expect the majority of your facts to come from interviews. Inquiry, question development and delivery, structured interviewing, and aware, active listening matter a lot here. Never lead an interview in order to build a case for a theory or ‘pet’ view that you hold; remain impartial at all times.
It is important to be empathetic enough to relate to people and get them to open up during structured interviewing and active listening; if you are too proud or arrogant you can forget interviewing as a method of gathering data, because it is unlikely anyone will open up to you enough, and this will seriously impact your ability to perform reviews and audits in any meaningful manner.
People will be people in the interviews: they will be emotive, some may be reserved, stoic, cynical even, some will care, some won’t, a few may be objective, many are wittingly or unwittingly subjective, and all will have opinions.
Remember, interviewing is the number one way in which good quality data is gathered for system, project and programme reviews and audits; becoming fluent in conducting interviews and capturing the resulting data is key to performing good reviews.
Do not lead the data, and do not start to analyse until a good body of data has been gathered. Often, once facts have been gathered and analysis has begun, more information will be required to make a good quality diagnosis. Be prepared to ask lots of questions, and be prepared to meet people who do not want to answer you. Document everything.
- Result(s)
This is where you relate facts to results; although some analysis and thought will already have gone into deciding which information to gather and how, much of the real analytical ‘foot work’ starts here. This section is where you take the information presented so far and relate it to the issues and problems the client is experiencing. By now the ‘fact(s)’ section should contain documented information which is causing, or could be causing, the problems the client is exhibiting. It is likely that you will be sorting the facts and their associated problems into a basic set of categorisations (the next article in the series deals with those categories).
A simple example: a defined problem might be “the web server keeps falling over, we don’t know why”, whilst a related fact may be that “patches were not applied”; after more investigation it would probably be fair to link the two together thus: “not applying the appropriate patches leads to the web servers being unstable”. The reason you shouldn’t jump the gun and stop at the first thing you come across is that it may not be the root cause; it could be a contributing cause or even unrelated. The good reviewer is appropriately thorough without being needlessly wasteful of the client’s time, money or resources.
An example of a conclusion might be “without implementing and maintaining change control the project will continue to move out of control and will be increasingly difficult to deliver to time and budget, never mind delivering the contractually required document deliverables”.
If the facts and results do not map to the original issue for which the exercise was commissioned, you will need to consider iteratively gathering more data related to the original problem, or alternatively testing the validity of the original problem description and politely questioning, with the sponsor of the review, the original area you were asked to examine (secure a meeting and let them know of the concerns and issues you have). Document here any disparity between the originally identified problems, the facts gathered and the results given.
- Conclusion(s)
Defining conclusions is where you look at the facts and results and conclude what will happen if the situation continues. This is where you make rational predictions about a future state, suggesting what problems might occur if no action (or only the planned action) is taken. It would be dismissive to call this ‘scaremongering’, but it is important to inform, and possibly even warn, the client about further problems and issues they might experience if the situation goes unchecked.
It is important that your conclusions address the original problem. Although you may like to address any additional problems drawn out during the review, it is not imperative to do so; I almost always do, however, because I feel an obligation to the client and I want to demonstrate my delivery focus. You may not find this something you have time, or want, to do.
Again this will probably be a short section, and although you may well have been creative before, this and the next section are where your good ideas need to appear. Ensure you are not too fanciful; personally, I prefer not to be seen to influence any particular recommendation by ‘weighting’ any possible future state too negatively, though I have seen a lot of reviews over the years which lacked impartiality.
If things are bad you must be honest and deliver the difficult news; whatever you do, do not attempt to ‘sugar-coat’ it and detract from the important information and messages you are delivering to the business. I do, however, heavily recommend that you inform your sponsor verbally early on, to ensure you do not deliver any surprises which could have a negative effect and lose or diminish their support.
- Recommendation(s)
This is where you make suggestions to improve the situation; deliver recommendations which relate to the facts, results and conclusions, and to the original problem. If other problems have come to light during the review and you have included them as part of the overall review, then you should include recommendations which address those problems as well.
Making recommendations could well be seen as the easy part by an experienced ‘expert’ within a certain field, and it is always tempting for the inexperienced reviewer to dive in with recommendations before proper analysis has been completed (i.e. “we’ve found these facts; the last project had a similar issue and we fixed it by doing X, Y and Z, so we will try X, Y and Z here”). This behaviour will likely lead to the wrong problem being fixed, or to worsening the current situation, both of which waste the client’s time, money and resources.
With recommendations I like to remember the Pareto principle (the “80-20 rule”): your principal recommendations should be mindful of it and have significant impact on the problem space originally described by the client. Minor recommendations are all well and good, but if they don’t “fix” the problem in the mind of the client it is unlikely you will be asked to review for them again, or that your recommendations will be implemented at all.
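As a hypothetical sketch of the Pareto idea applied to recommendations (the recommendations and impact scores below are invented for illustration, e.g. as they might come out of a scoring exercise with the review team), the point is simply to rank by estimated impact and lead with the few items that cover most of the problem space:

```python
# Hypothetical recommendations with invented impact scores (0-100).
recommendations = [
    ("Re-introduce change control", 40),
    ("Fix patching process", 30),
    ("Rename internal wiki pages", 2),
    ("Consolidate supplier status reporting", 15),
    ("Standardise meeting minutes template", 3),
]

# Rank by impact, then take the smallest set covering ~80% of the total.
ranked = sorted(recommendations, key=lambda r: r[1], reverse=True)
total = sum(score for _, score in recommendations)

principal, covered = [], 0
for name, score in ranked:
    if covered >= 0.8 * total:
        break
    principal.append(name)
    covered += score

print(principal)  # the few recommendations that address most of the problem
```

In this invented example, three of the five recommendations cover the bulk of the impact; those three are the ones to lead with, and the rest become minor recommendations.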
Above all, recommendations are given to improve a situation, not to push any personal agenda; again, it is key to be impartial and objective.
The biggest problem you will likely have in using a framework like this, or any other, is that early on you will find content landing in the section before or after the one it belongs in; as you become more familiar with the framework and more experienced at doing reviews and audits, this will improve.
Also, do not imagine that the only place you bring value is in the ‘Recommendation(s)’; that is grossly incorrect, because the client may well not have gathered the data you have, analysed it in the same way, nor come to the same conclusions. Your work ultimately improves their understanding of a situation and allows them to plan accordingly, and this is the genuine value.
Of course a good review document will contain more than the above (references, appendices, document control, etc.), but the above is the absolute core of a good review in my opinion and experience. If you find yourself arguing with your co-reviewers about the document version control table you are way off the mark: what is paramount is the quality of the review and the effect it brings (hopefully a resolution to the issues for which it was commissioned).
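To make the shape of such a document concrete, the five core sections (plus the extras just mentioned) could be sketched as a simple skeleton; this is purely illustrative, and the class and field names are my own invention, not part of the framework:

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    """Skeleton of a review document: the five core sections plus extras."""
    problems: list[str] = field(default_factory=list)         # the client's perceived problem(s)
    facts: list[str] = field(default_factory=list)            # gathered data: interviews, documents, measurements
    results: list[str] = field(default_factory=list)          # facts related to the problems being experienced
    conclusions: list[str] = field(default_factory=list)      # what will happen if the situation continues
    recommendations: list[str] = field(default_factory=list)  # actions to improve the situation
    appendices: list[str] = field(default_factory=list)       # references, document control, supporting material

# Example: the web-server case from earlier in the article.
review = Review(problems=["The web sites are not available when they need to be"])
review.facts.append("Security patches were not applied to the web servers")
review.results.append("Not applying the appropriate patches leads to the web servers being unstable")
```

The value of even a trivial structure like this is that it makes ‘leakage’ between sections visible: a fact sitting in `problems`, or a recommendation sitting in `conclusions`, is easy to spot and move.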
My friend Chris Loughran, of Deloitte, uses a framework even more stripped down and ‘lean’ than this, delineated into (1) gather facts, (2) relate results, and (3) make conclusions, which is certainly punchier and easier to explain in short order to your typical senior executive or CxO with very limited time. But then Chris is one of the leading business and technology consultants in the UK, so this is to be expected, and he is highly effective with this approach. Personally, as I’ve written in this article, I prefer to document the (perceived) problem and to keep recommendations distinct from conclusions.
As usual, I hope you have enjoyed the article, despite it being rather larger than I hoped. The next one looks at the categories of reasons why projects and programmes fail (although I’ve just decided to deliver, and have subsequently written, a short article documenting an example of the above review and audit framework too).
- Recovered link: https://horkan.com/2009/07/19/project-review-programme-audit-framework
- Archived link: https://web.archive.org/web/20100713051548/http://blogs.sun.com/eclectic/entry/project_review_programme_audit_framework
- Original link: http://blogs.sun.com/eclectic/entry/project_review_programme_audit_framework