On the PMP Certification Exam, Monitor and Control Project Work involves not only identifying variances, but also determining whether a variance is acceptable, what caused it, and what (if anything) you should do about it. This is where expert judgment is useful.
Expertise is used to analyze performance, pinpoint the cause of variances, and determine the appropriate actions to address performance variances. You can use a number of techniques to analyze past and current performance and also forecast future performance.
Analytical techniques
When analyzing data, check the accuracy of the source data and make sure that schedule and cost measurements use the same scope information to report performance. Then analyze what the combined schedule and cost information tells you. Consider a few examples of reviewing combined cost and schedule data:
- Over budget, ahead of schedule: This state could mean that you’re accomplishing work faster than expected, so you’re spending funds faster than expected. This is fine if you’re actually spending the same amount you budgeted for the work; you’re just doing it earlier than you planned. It could also mean that you’re spending money to expedite work; in other words, you’re crashing your schedule to finish sooner. This might be okay, depending on the priorities for the project. If you’re trying to get to market as soon as possible and the sponsor or client is willing to spend more to get there faster, you’re in good shape.
- Over budget, behind schedule: This isn’t good. You might have underestimated, or maybe you’re not doing a good job managing the project. It could also mean that a risk event occurred, and you had to spend money and take time to respond to the situation.
- Under budget, ahead of schedule: This might be the most suspicious situation of all. It rarely happens, so the results should be scrutinized to see whether the cost and schedule estimates were padded, or whether there is an error in the numbers.
- Under budget, behind schedule: Most likely, your project is starved for resources. You’re under budget because your labor costs are lower than expected, but you’re behind schedule because you don’t have the manpower to accomplish the work as planned.
From these examples, you can see that you’re looking at combined information to understand the cause of any variances and understand the implications of the variances.
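The four states above map directly onto the standard earned value indices (CPI = EV / AC for cost, SPI = EV / PV for schedule). A minimal sketch of that classification follows; the function name and boundary handling are illustrative assumptions, not a PMBOK prescription:

```python
def variance_quadrant(cpi: float, spi: float) -> str:
    """Classify combined cost/schedule status from earned value indices.

    CPI = EV / AC (cost performance index); CPI < 1 means over budget.
    SPI = EV / PV (schedule performance index); SPI < 1 means behind schedule.
    Values of exactly 1.0 are treated as "on target or better" here,
    an arbitrary choice for this sketch.
    """
    cost = "over budget" if cpi < 1 else "under budget"
    schedule = "behind schedule" if spi < 1 else "ahead of schedule"
    return f"{cost}, {schedule}"

# Example: earned value 40, actual cost 50, planned value 45
ev, ac, pv = 40.0, 50.0, 45.0
print(variance_quadrant(ev / ac, ev / pv))  # over budget, behind schedule
```

The point is not the code itself but the pairing: a single index in isolation tells you less than the combination does.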
Another source of information you should review is the status of your cost and schedule reserves. If your project is 30 percent complete but you’ve spent 60 percent of your reserves, you should understand the implications for your project. Perhaps the first 30 percent was the riskiest, and the remaining 70 percent is relatively risk free.
In this circumstance, your remaining reserve should be sufficient. However, if the majority of the uncertainty in your project is likely to be toward the end, then it’s a fair assumption that you will overrun your budget.
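A quick way to flag the situation described above is to compare reserve consumption with progress. This ratio is an illustrative health check, not a PMBOK formula, and as the text notes, a high ratio is not automatically bad if the riskiest work was front-loaded:

```python
def reserve_burn_ratio(percent_complete: float, percent_reserve_spent: float) -> float:
    """Illustrative health check on reserves.

    A ratio above 1 means reserves are being consumed faster than work is
    being completed; worth investigating, though not automatically bad if
    the riskiest work came first.
    """
    return percent_reserve_spent / percent_complete

# The example from the text: 30% complete, 60% of reserves spent.
print(reserve_burn_ratio(0.30, 0.60))  # 2.0
```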
Forecasting
Good performance reports include cost and schedule forecasts along with the supporting data to explain the forecasts.
Forecast. An estimate or prediction of conditions and events in the project’s future, based on information and knowledge available at the time of the forecast. A forecast draws on the project’s past performance and expected future performance, and includes information that could impact the project going forward, such as the estimate at completion (EAC) and the estimate to complete (ETC).
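The estimate at completion and estimate to complete can be computed from earned value data. This sketch uses EAC = BAC / CPI, one of several standard formulas (this variant assumes current cost performance continues; others apply when variances are atypical):

```python
def forecast(bac: float, ev: float, ac: float) -> tuple[float, float]:
    """Forecast completion cost, assuming current cost performance continues.

    bac: budget at completion
    ev:  earned value to date
    ac:  actual cost to date
    """
    cpi = ev / ac        # cost performance index
    eac = bac / cpi      # estimate at completion
    etc = eac - ac       # estimate to complete
    return eac, etc

# A project budgeted at 1000 has earned 400 of value while spending 500.
eac, etc = forecast(bac=1000.0, ev=400.0, ac=500.0)
print(eac, etc)  # 1250.0 750.0
```

Here CPI is 0.8, so the forecast says the project will cost 25 percent more than budgeted if nothing changes.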
Of the several different methods of forecasting available, the main three to look at are
- Time series
- Regression analysis
- Expert opinion
Time series
A time series forecasting method uses a model to predict performance based on past performance. In general, time series forecasting is more accurate for a short-term forecast for which future events are expected to be similar to the recent past. Predictions get less accurate the further in the future they fall.
Another consideration with time series forecasting is whether the work or the environment for the future work is different from the past work. If so, time series forecasting is not the best method. You might consider trend analysis instead. Trend analysis entails plotting the progress for past reporting periods and then projecting future performance assuming the same trend continues.
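Trend analysis as described above can be sketched as a straight-line fit over past reporting periods, projected forward. The data and helper below are hypothetical, and a linear fit is only one of several ways to model a trend:

```python
def project_trend(history: list[tuple[int, float]], future_period: int) -> float:
    """Fit a least-squares line to past period values and project it forward.

    history: list of (period, observed value) pairs from past reports.
    """
    n = len(history)
    xs = [p for p, _ in history]
    ys = [v for _, v in history]
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * future_period

# Hypothetical cumulative percent complete reported at periods 1 through 4
history = [(1, 10.0), (2, 22.0), (3, 29.0), (4, 41.0)]
print(project_trend(history, 8))  # 80.5 (projected percent complete at period 8)
```

The caveat from the text applies: the projection is only as good as the assumption that the same trend continues.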
Regression analysis
Regression analysis looks at the relationship among variables that cause fluctuations in outcomes. For example, you might treat the budget as the dependent variable, with resource skills, availability, and cost as the independent variables. You then look at how the likelihood of achieving the budget changes as each independent variable changes.
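A numeric sketch of that idea follows. The data, variable names, and choice of an ordinary least-squares fit are all illustrative assumptions; a real analysis would use real project history and validate the model:

```python
import numpy as np

# Hypothetical history: each row is (avg resource skill level, availability,
# unit labor cost) for a past period. y is that period's actual-cost/budget
# ratio, so values above 1.0 mean the period ran over budget.
X = np.array([
    [3.0, 0.90, 50.0],
    [2.5, 0.80, 55.0],
    [4.0, 0.95, 48.0],
    [3.5, 0.85, 52.0],
    [2.8, 0.75, 58.0],
])
y = np.array([1.02, 1.10, 0.97, 1.01, 1.12])

# Add an intercept column and solve the least-squares fit.
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the budget ratio for a planned staffing scenario.
new = np.array([1.0, 3.2, 0.88, 51.0])
print(float(new @ coeffs))
```

The fitted coefficients show which independent variable moves the budget outcome most, which is the practical payoff of the technique.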
Expert opinion
With this method, a person (or group) estimates cost or schedule performance; this estimate is based on knowledge and expert opinion. Experts may develop probability estimates or build models based on various scenarios. Sometimes this method seems as if it’s based on intuition or an educated guess.
However, the knowledge that experience brings often can’t be fully quantified. A bottom-up estimate by an expert can often be the most accurate method of forecasting.
Other forecasting methods
Other forecasting methods include running simulations and producing probabilistic forecasts (forecasts based on a probability distribution). In practice, combining multiple methods usually yields the best results.
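One common simulation approach is a Monte Carlo cost forecast: sample each task's cost from a distribution, sum the samples, and repeat many times to build a distribution of total cost. The sketch below uses a triangular distribution over hypothetical three-point estimates; the task numbers and percentile choices are illustrative:

```python
import random

def simulate_cost(n_trials: int = 10_000, seed: int = 42) -> tuple[float, float]:
    """Monte Carlo cost forecast from three-point task estimates.

    Each task is an (optimistic, most likely, pessimistic) cost estimate;
    these numbers are made up for illustration.
    """
    tasks = [(8, 10, 15), (18, 20, 30), (4, 5, 9)]
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(n_trials)
    )
    # Report the 50th and 80th percentile total-cost outcomes.
    return totals[n_trials // 2], totals[int(n_trials * 0.8)]

p50, p80 = simulate_cost()
print(f"P50 cost: {p50:.1f}, P80 cost: {p80:.1f}")
```

The output is a probabilistic forecast: instead of one number, you get a cost you expect to beat half the time (P50) and a more conservative figure (P80).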
The PMBOK Guide mentions many analytical techniques, including fault tree analysis, failure mode and effects analysis (FMEA), and grouping methods. Don’t worry! You don’t have to know and understand all those methods of forecasting. You just need to understand the basic idea: you can use past performance, the variables that drive future performance, or expert judgment to develop a forecast.
One way of documenting variances is by using a simple variance analysis form.