Overtime is a perennially controversial topic in the workplace, and many seem to wish the practice didn’t exist at all.
Frequent criticisms of overtime are that it’s unsustainable, unproductive, error-prone, and detrimental to team morale – leading to increased sick leave and attrition. DeMarco and Lister, authors of the classic ‘Peopleware’, argue strongly against it:
“The positive potential of working extra hours is far exaggerated, and its negative impact is almost never considered.” (1)
In spite of this, overtime remains very prevalent in the UK, with a recent study finding that more than five million employees nationwide put in an average of 7.7 hours of unpaid overtime a week.
Why might a manager request or support overtime in their team?
Given the risks to wellbeing and productivity, it seems counterintuitive. In my experience, there are two main drivers – one quite reasonable, the other more controversial.
The first driver is to achieve numerical flexibility. Numerical flexibility is a measure of the team’s ability to adjust the number of hours worked in order to adapt to variations in demand – for example to recover from a schedule slip or meet a critical customer milestone.
Whilst overtime is not the only source of numerical flexibility, in roles with a high level of prerequisite learning – common in software development teams – it is often the only practical way to respond to short-term spikes in demand.
The second – more controversial – reason for a manager to request extended hours is to demonstrate, perhaps to senior stakeholders, a decisive response to a slipping schedule.
This sets a dangerous precedent, because it may lead to excessive patterns of overtime being supported even when they are recognised to be counter-productive. This could herald what Edward Yourdon characterises ominously as a ‘Death March project’. (2)
How do we minimise the need for overtime?
To minimise overtime, we need to consider the underlying causes. The most common trigger that I’ve observed is poor estimation. If we underestimate the effort required for a project, we can’t plan sufficient resources – meaning we’re more likely to resort to short-term, reactive measures.
I’ve experienced this several times when managing client programmes as a Delivery Director at NTT DATA. Accurate estimation is notoriously difficult, particularly when we’re asked to do it at an early stage, when the project requirements may not be fully clear.
This is often compounded by the natural optimism of the team – wanting to give managers the answer they think they’re looking for – or by senior managers pushing for estimates to be reduced, for example to make them fit a business case.
Learning from these experiences, these are some simple ways we attempt to improve our estimation accuracy at NTT DATA:
- Ensure we are clear and honest about how well we understand the requirements (and cover any uncertainty with appropriate contingency)
- Involve the people doing the work, ideally multiple people in a collaborative format (such as ‘Planning Poker’ in agile software teams)
- Use a structured approach, such as an estimation model, which is progressively refined based on experience
- Don’t ignore key factors relating to the people actually doing the work – such as their level of experience, or whether they’re a new or established team.
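To make the idea of a structured estimation approach with explicit contingency concrete, here is a minimal sketch of the classic three-point (PERT) technique – an illustrative example, not the specific model used at NTT DATA. Each task gets an optimistic, most likely, and pessimistic estimate; the spread between them yields a contingency that reflects how well the requirements are understood:

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
    """Three-point (PERT) estimate.

    Returns the expected effort (a weighted mean favouring the most
    likely value) and a standard deviation derived from the spread
    between the optimistic and pessimistic views.
    """
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev


# Example: a task estimated at 5 / 8 / 15 days.
expected, std_dev = pert_estimate(5, 8, 15)

# Planning at expected + 2 standard deviations builds in contingency
# proportional to the uncertainty: roughly 8.7 + 3.3 = 12 days here,
# rather than the optimistic 5 the team might otherwise promise.
planned = expected + 2 * std_dev
```

The wider the gap between the optimistic and pessimistic figures – i.e. the less well understood the requirements – the larger the contingency this produces, which is exactly the honesty about uncertainty the first bullet above calls for.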
Working smarter – not just longer
Finally, organisations need to foster a culture of continuous improvement in their teams.
In my experience a team that is continually striving to make things better – formalising and refining methods or increasing automation – is more productive and far less in need of overtime.
(1) DeMarco, T. & Lister, T. (1999). Peopleware: Productive Projects and Teams. Dorset House Publishing.
(2) Yourdon, E. Death March: The Complete Software Developer’s Guide to Surviving ‘Mission Impossible’ Projects. Prentice Hall.