Effort accounting is needed in software and IT for the simple reason that labor is generally the most important controllable element of cost. Labor cost (effort hours times prevailing labor rate) is a fundamental element of efficiency measurement. Six Sigma projects in software and IT (especially DMAIC projects) are often focused on driving down the effort associated with a particular process. To know whether an improvement has produced the desired impact, an organization must know the time spent both before and after the improvement.
Rarely will effort accounting be sufficient in itself. Typically, it will be combined with other measures. A company may, for example, normalize by size in order to enable valid comparison across projects of different scope. It also may combine effort data with defect count data to calculate average effort required to find and/or fix a defect, or combine effort data with a count of support requests to calculate average effort required to service a support request. Derived measures such as these combine several elementary measures to develop values that can be used for planning, evaluation and estimation.
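As a concrete sketch of the elementary and derived measures described above, the calculations might look like the following. All figures, rates and function names here are invented for illustration.

```python
# Illustrative sketch of effort-based measures. All numbers are hypothetical.

def labor_cost(effort_hours, hourly_rate):
    """Labor cost = effort hours times prevailing labor rate."""
    return effort_hours * hourly_rate

def effort_per_defect(total_effort_hours, defects_found):
    """Derived measure: average effort to find and/or fix one defect."""
    return total_effort_hours / defects_found

def effort_per_size_unit(total_effort_hours, size_units):
    """Effort normalized by size, enabling comparison across projects of
    different scope (the size unit itself is whatever the organization uses)."""
    return total_effort_hours / size_units

cost = labor_cost(effort_hours=400, hourly_rate=85)                        # 34000
per_defect = effort_per_defect(total_effort_hours=120, defects_found=30)   # 4.0 hours
per_unit = effort_per_size_unit(total_effort_hours=400, size_units=50)     # 8.0 hours per unit
```

The point is not the arithmetic, which is trivial, but that each derived value is only as valid as the elementary effort figures feeding it.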
When applying Six Sigma, it is always wise to know the validity of the data that will be used to make significant decisions. The discipline for determining validity is known as measurement systems analysis (MSA). In transactional environments, including software and IT, MSA poses questions such as the following for assessment of any measurement system:
Accuracy: Unfortunately, most effort accounting data is not very accurate. While the reasons vary widely, the most significant causes include:
Consistency: There also are significant deficiencies in this area, including:
It is critically important that organizations focus their attention on the reliability and usability of any measurement systems involved in Six Sigma efforts. The following list of do’s and don’ts will both improve the quality of effort accounting data and reduce the cost of collection.
1. Don’t collect information you don’t use to make decisions or drive improvement. Neither curiosity nor “we always did it this way” is a reasonable justification for the cost and annoyance associated with effort accounting.
2. Do establish a process to verify the accuracy of the data being collected. This means at least weekly review of data by first-level supervisors. Did the team really work on the projects/phases indicated? Do the hours reported jibe with the time actually worked? Is overtime work reflected in the time reported?
3. Don’t let the fact that the reporting system is used for other purposes interfere with the need to get valid effort accounting. For example, if you need to report eight hours per day for payroll or benefits systems, build a simple export from the actual hours data to the payroll system. An excuse is not a reason.
4. Do keep to a manageable number the categories against which any individual needs to report. Maybe you can get people to report with reasonable accuracy against two or three accounts in one day, but you will never get reliable data across 10 or 12 categories. It also is worth noting that assigning more than two concurrent tasks is a sure formula for low productivity. Studies by leading management consultant Tom DeMarco and others indicate a 50 percent reimmersion penalty for task switching associated with complex tasks. Help your people stay focused and finish one thing before beginning another. They will be much more productive and less frustrated; effort accounting overhead will be significantly lower, and the figures correspondingly more accurate.
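One simple way to see the cost of task switching is to model each switch as losing a fixed block of reimmersion time. The half-hour figure below is an assumption chosen for the sketch, not a measured value; the day length and switch counts are likewise invented.

```python
# Rough model of the reimmersion penalty: each task switch costs a fixed
# amount of refocusing time before real progress resumes. The 0.5-hour
# penalty is an illustrative assumption.

REIMMERSION_HOURS = 0.5

def productive_hours(day_hours, task_switches):
    """Hours of real progress left after paying the reimmersion cost."""
    return max(0.0, day_hours - task_switches * REIMMERSION_HOURS)

focused = productive_hours(day_hours=8, task_switches=0)    # 8.0
juggling = productive_hours(day_hours=8, task_switches=6)   # 5.0
```

Even under this simple model, a person juggling several concurrent tasks loses hours of output per day without working any less hard, which is the point of limiting concurrent assignments.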
5. Don’t let inappropriate incentives or punishments get in the way of the truth about how time is really being used. If people get negative feedback when they tell the truth, you will never find out what is really going on. And consequently you will never be able to get at root causes and find permanent solutions. Management-by-fact is a central feature of a Six Sigma culture.
6. Do make sure to frequently and fully communicate the operational definitions needed for accurate effort accounting. Make sure your people know why they are reporting effort, and what you do with that information. Show them how they will benefit from the process improvements that are possible with accurate data. Communicate the information that is relevant to the target audience. If, for example, process improvements can reduce the frequency of late night support calls, that reduction is probably a meaningful motivator for those involved. Never use the data to punish.
7. Don’t confuse effort accounting with status tracking. For status tracking, all you need to know is which tasks are complete and which are not. Be honest with yourself – have you actually ever used task-level effort accounting data to plan a future project? Most organizations do not, but often claim that is why they collect the data. Unless you are doing earned value analysis, you probably do not make any valuable use of task level effort data.
8. Do consider using a very simple chart of accounts for effort accounting. For example, collect effort data by project, phase and cost-of-quality categories only. Cost-of-quality categories include:
- Value-added effort – in general, the time it takes to do the task the first time
- Appraisal effort – time spent on checking the work done, which includes all reviews, inspections and testing
- Rework – all time devoted to correcting defects identified by appraisal efforts
- Prevention – activities such as training and DMAIC projects – not normally accounted for by project
This approach gives a five-phase project a total of 15 accounts (five phases times the three per-phase categories, since prevention is not tracked by project) – certainly manageable.
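The chart-of-accounts arithmetic above can be sketched directly. The phase names below are illustrative assumptions; as noted, prevention is accounted for outside the project, so it does not multiply the account count.

```python
# Build the simple chart of accounts described above for one project.
# Phase names are hypothetical; prevention is tracked separately, not
# per project, so only three cost-of-quality categories apply per phase.

PHASES = ["requirements", "design", "code", "test", "deploy"]
PER_PHASE_CATEGORIES = ["value-added", "appraisal", "rework"]

accounts = [(phase, category)
            for phase in PHASES
            for category in PER_PHASE_CATEGORIES]

print(len(accounts))  # 15 accounts for a five-phase project
```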
Most software organizations spend 60 to 70 percent of total effort on appraisal and rework, but few realize it because typical effort accounting practices do not bring that fact into the open.
Productivity improvement in software is fundamentally simple – increase the proportion of value-added effort by reducing rework, which, in turn, is accomplished by changing the mix and effectiveness of appraisal activities and by instituting or expanding prevention activities.
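Under a chart of accounts like the one above, surfacing the appraisal-plus-rework share is a one-line rollup. The hours below are invented purely to illustrate the calculation.

```python
# Roll up effort by cost-of-quality category and compute the share of
# total effort spent on appraisal and rework. All hours are hypothetical.

hours_by_category = {
    "value-added": 300,
    "appraisal": 250,
    "rework": 350,
    "prevention": 100,
}

total = sum(hours_by_category.values())
appraisal_and_rework = hours_by_category["appraisal"] + hours_by_category["rework"]
share = appraisal_and_rework / total

print(f"{share:.0%} of total effort goes to appraisal and rework")  # 60%
```

A report like this, produced routinely, is what makes the 60-to-70-percent problem visible instead of buried in undifferentiated project totals.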
9. Don’t confuse sampling with 100 percent reporting. You need 100 percent reporting against a basic chart of accounts such as the one described above, but you can readily drill down when necessary by using much smaller samples taken as needed. For example, you may wish to know the relative cost of finding and fixing different types of defects. The approach proposed above shows the average for all types. If you want to differentiate cost by type, you can certainly do that with a sample – much less overhead, yet fully as effective as 100 percent reporting, provided the sample is properly drawn.
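The drill-down idea can be sketched as follows. The defect types, effort ranges, population counts and sample size are all invented for illustration; the only substantive point is that a properly drawn random sample, not 100 percent reporting, yields the per-type averages.

```python
# Estimate average find-and-fix effort by defect type from a small random
# sample of defect records rather than 100 percent reporting.
# All data here is hypothetical.
import random
from collections import defaultdict
from statistics import mean

random.seed(1)  # fixed seed so the sketch is reproducible

# Hypothetical defect population: (defect_type, hours_to_find_and_fix)
population = (
    [("logic", random.uniform(2, 6)) for _ in range(400)]
    + [("interface", random.uniform(1, 3)) for _ in range(300)]
    + [("requirements", random.uniform(4, 10)) for _ in range(300)]
)

# A properly drawn simple random sample of 100 records.
sample = random.sample(population, 100)

by_type = defaultdict(list)
for defect_type, hours in sample:
    by_type[defect_type].append(hours)

for defect_type, values in sorted(by_type.items()):
    print(f"{defect_type}: {mean(values):.1f} hours on average (n={len(values)})")
```

The per-type averages come from 100 records instead of 1,000, at a tenth of the collection overhead, which is exactly the trade the text describes.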
It is not rocket science, but software and IT organizations will continue to fall short of their objectives without significant improvements in measurement systems, beginning with effort accounting. Yet, the discussion of measurement systems for software companies should not end with effort accounting. Other critical measurement systems also need exploration.